Source: Business Standard
Facebook parent company Meta Platforms Inc. debuted a new and powerful AI model that Chief Executive Officer Mark Zuckerberg called “state of the art” and said will rival similar offerings from competitors such as OpenAI and Alphabet Inc.’s Google.
The new model, released Tuesday and called Llama 3.1, took several months and hundreds of millions of dollars’ worth of computing power to train. The company said it represents a major update from Llama 3, which came out in April.
“I think the most important product for an AI assistant is going to be how smart it is.”
“The Llama models that we’re building are some of the most advanced in the world. Meta is already working on Llama 4.”
Meta executives say that the model, which is primarily used to power chatbots both within Meta’s own apps and in products built by outside developers, has a wide range of new capabilities, including improved reasoning that can help solve complex math problems or instantly synthesize an entire book of text. It also has generative AI features that can create images on demand from text prompts. A feature called “Imagine Yourself” lets users upload an image of their face, which can then be used to create depictions of them in different scenes and scenarios.
Meta uses its Llama models to power its AI chatbot, called Meta AI, which operates inside its apps, including Instagram and WhatsApp, and also as a separate web product. Zuckerberg said that Meta has “hundreds of millions” of users for its chatbot, and expects it will be the most widely used chatbot in the world by the end of the year. He expects that others outside of Meta will use Llama to train their own AI models.
“It’s just gonna be this teacher that allows so many different organizations to create their own models rather than having to rely on the kind of off-the-shelf ones that the other guys are selling.”
Meta’s investments in AI have been steep. Zuckerberg said that Meta’s Llama 3 models cost “hundreds of millions of dollars” in computing power to train, but that he expects future models will cost even more. “Going forward it’s going to be billions and many billions of dollars of compute” power, he said. Meta in 2023 tried to rein in some of its spending on futuristic technologies and management layers, cutting thousands of jobs in what Zuckerberg dubbed the “year of efficiency.” But Zuckerberg is still willing to spend on the AI arms race.
“I think that there’s a meaningful chance that a lot of the companies are over-building now, and that you’ll look back and you’re like, ‘oh, we maybe all spent some number of billions of dollars more than we had to.’ On the flip side, I actually think all the companies that are investing are making a rational decision, because the downside of being behind is that you’re out of position for like the most important technology for the next 10 to 15 years.”
After all the investment, Meta makes the technology behind Llama available for the public to use for free, so long as they adhere to the company’s “acceptable use policy.” Zuckerberg hopes the open-access strategy will help make the company’s work the foundation of other successful startups and products, giving Meta greater sway in how the industry moves forward.
“If AI is going to be as important in the future as mobile platforms are, then I just don’t want to be in the position where we’re accessing AI through” a competitor, said Zuckerberg, who has long been frustrated with Meta’s reliance on distributing its social media apps through phones and operating systems from Google and Apple Inc. “We’re a technology company and we need to be able to kind of build stuff not just at the app layer but all the way down. And it’s worth it to us to make these massive investments to do that.”
Despite the pledge to make Llama open, Zuckerberg and other top company executives are keeping the data sets used for training Llama 3.1 a secret. “Even though it’s open we are designing this also for ourselves,” he explained. Meta is using publicly available user posts from Facebook and Instagram, as well as other “proprietary” data sets that the company has licensed from others, Zuckerberg said, without sharing specifics.
He also dismissed the idea that training Llama on data from Facebook and Instagram posts is a key advantage.
“A lot of the public data on those services we allow to be indexed in search engines, so I think Google and others actually have the ability to use a lot of that data, too.”
Meta told investors in April that it was planning to spend billions of dollars more than initially expected this year, with investments in AI being a core reason why. The company is expected to have some 350,000 Nvidia Corp. H100 GPUs by the end of the year, according to a company blog post. The H100 chips have become the foundational technology used to train large language models like Llama and OpenAI’s ChatGPT, and can cost upwards of tens of thousands of dollars apiece.
Critics of Meta’s open source approach to AI point to the potential for abuse — or the fear that tech companies from geopolitical rivals like China will piggyback off Meta’s technology to keep pace with their American counterparts.
Zuckerberg is more concerned that closing off the tech from other parts of the world would ultimately be a detriment.
There’s one string of thought which is like: “Ok well we need to lock it all down.”
“I just happen to think that that’s really wrong because the US thrives on open and decentralized innovation. I mean that’s the way our economy works, that’s how we build awesome stuff. So I think that locking everything down would hamstring us and make us more likely to not be the leaders.”
It’s also unrealistic, he added, to think that the US will ever be years ahead of China when it comes to AI advancements, though even a small, multi-month lead can “compound” over time to give the US a clear advantage.