Local Computing Infra Key to AI's Future: Meta's Chief AI Scientist Yann LeCun

"AI is going to become a common infrastructure which people will use as a repository of human knowledge. And this cannot be built by a single entity. It has to be a collaborative project," LeCun noted.
You're reading Entrepreneur India, an international franchise of Entrepreneur Media.
Yann LeCun, one of the godfathers of artificial intelligence (AI) and chief AI scientist at Meta, believes that local computing infrastructure will be integral to the future of AI. He believes that training will be dispersed globally in such a manner that models can be trained on worldwide data without copying the data.
According to MarketsandMarkets, the global cloud computing market is expected to grow from USD 626.4 billion in 2023 to USD 1,266.4 billion by 2028. Back home, the Indian public cloud services market is expected to reach USD 24.2 billion by 2028, growing at a CAGR of 23.8 per cent for 2023-28.
"It's crucial for two reasons: one is for having local ability to train models and two is having very low-cost access to inference for AI systems," LeCun said in the latest episode of Nikhil Kamath's WTF podcast. Inference, in machine learning (ML), refers to the process of using a trained model to make predictions or classifications on new data. "It's a lot of infrastructure. It's much bigger than the infrastructure for learning. It's an area where there is scope for a lot more innovation than training," he added. At present, NVIDIA dominates the training segment.
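The distinction LeCun draws can be illustrated with a minimal sketch: inference applies a model whose parameters were fixed at training time to brand-new inputs. The weights and data below are hypothetical stand-ins, not anything from an actual trained system.

```python
# Minimal sketch of inference: scoring new data with an already-trained
# model. The parameters below are hypothetical placeholders for what a
# training run would have produced; inference never updates them.

def predict(weights, bias, features):
    """Apply fixed (pre-trained) parameters to one new input."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return 1 if score > 0 else 0  # simple binary classification

# Frozen at training time.
trained_weights = [0.8, -0.5]
trained_bias = 0.1

# New, previously unseen samples arriving at inference time.
new_samples = [[1.0, 0.2], [0.1, 2.0]]
labels = [predict(trained_weights, trained_bias, s) for s in new_samples]
```

Training, by contrast, is the expensive loop that searches for those weights in the first place; inference is the cheap, repeated application of them, which is why it scales with usage rather than with model development.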
"Inference is going to be 100 times bigger than training. Nvidia is really good at training but very miscast at inference," said Chamath Palihapitiya, a Silicon Valley venture capitalist, who noted earlier this year that AI is really two markets: training and inference.
Inference is witnessing an abundance of innovation, which is driving down its cost. "The cost of inference for large language models (LLMs) has gone down by a factor of 100 in two years," LeCun shared, adding that this was far faster than Moore's Law.
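To see why this outpaces Moore's Law, note that Moore's Law is conventionally read as a doubling roughly every two years, while the figure LeCun cites is a 100x improvement over the same window. A quick back-of-the-envelope comparison:

```python
# Back-of-the-envelope comparison of the cited 100x inference cost drop
# with Moore's Law over the same two-year window.
inference_gain = 100   # factor cited by LeCun over two years
moores_law_gain = 2    # conventional reading: one doubling per two years

# How many times faster than Moore's Law the cost curve is moving.
ratio = inference_gain / moores_law_gain          # 50x

# Annualized improvement implied by 100x over two years.
per_year = inference_gain ** 0.5                  # ~10x per year
```

On these assumptions, inference costs are falling about ten-fold per year, roughly fifty times the pace Moore's Law alone would deliver.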
The fullness of time
The future will be built through collaborative efforts, not individual isolated ones. "AI is going to become a common infrastructure which people will use as a repository of human knowledge. And this cannot be built by a single entity. It has to be a collaborative project," LeCun noted. According to MarketsandMarkets, the global AI market size is projected to touch USD 1,339.1 billion by 2030, at a Compound Annual Growth Rate (CAGR) of 35.7 per cent during 2024-30.
"Currently, the LLMs are trained with a combination of publicly available data and licensed data…It's biased, a lot of it is in English," the AI guru said. According to reports, OpenAI's GPT-3 had over 90 per cent of its training data in English.
He further said that LLMs will need data sets that are more encompassing in nature. Systems trained on a diverse set of languages will be able to understand world languages, cultures, and value systems. Within AI, the LLM market is projected to reach USD 64.9 billion by 2032.
"There's still going to be a need for collecting data and filtering data to keep high-quality data and get rid of the junk," he said, adding that the process would be expensive.
LeCun predicts that five years down the line, the world will be dominated by open-source platforms. "The proprietary engines will not be nearly as important as they are today…a fine-tuned open-source engine like Llama always works better than a non-fine-tuned generic top-performing model," he added.
Asked for advice to entrepreneurs, LeCun offered an academic suggestion. "Doing a PhD or graduate studies trains you to invent new things. It also makes sure that the methodology you use prevents you from fooling yourself into thinking you're being an innovator, but you are not," he concluded candidly.