Meta's massive investment in artificial intelligence includes developing an AI system designed to power Facebook's entire video recommendation engine across all of its platforms, a company executive said Wednesday.
Tom Alison, the head of Facebook, said part of Meta's "technology roadmap going into 2026" includes developing an AI recommendation model that can power both the company's TikTok-style short-form video service, Reels, and more traditional, longer videos.
To date, Meta has typically used a separate model for each of its products, such as Reels, Groups and the core Facebook Feed, Alison said on stage at the Morgan Stanley technology conference in San Francisco.
As part of Meta's ambitious foray into artificial intelligence, the company is spending billions of dollars on Nvidia graphics processing units, or GPUs, which have become the primary chips AI researchers use to train the kinds of large language models that power OpenAI's popular ChatGPT chatbot and other AI systems.
Alison said “phase 1” of Meta’s technology roadmap involved migrating the company’s current recommendation systems to GPUs from more traditional computer chips, helping to improve overall product performance.
As interest in LLMs soared last year, Meta executives were impressed by how these large AI models could "handle a lot of data and all kinds of very general-purpose activities like conversation," Alison said. Meta saw the potential for a giant recommendation model that could be used across all of its products, and by last year it had built "this kind of new model architecture," Alison said, adding that the company tested it on Reels.
This new model architecture helped Facebook achieve "an 8% to 10% gain in Reels watch time" on the core Facebook app, which Alison said demonstrated that the model was "learning from the data much more effectively than the previous generation."
"We've really focused on investing more to make sure that we can scale these models with the right kind of hardware," he said.
Meta is now in “phase 3” of the system redesign, which involves trying to validate the technology and push it to multiple products.
"Rather than just powering Reels, we're working on a project to power our entire video ecosystem with this single model, and then we can add our streaming recommendation product to be served by that model as well," Alison said. "If we do this right, not only will the recommendations be more engaging and more relevant, but we think the responsiveness can improve as well."
Explaining how it will work if successful, Alison said: “If you see something you like in Reels and then come back to Feed, we can show you more similar content.”
Alison said that Meta has amassed a huge stockpile of GPUs that will be used for its broader generative AI efforts, such as the development of digital assistants.
Some generative AI projects Meta is exploring include integrating more sophisticated chat tools into its main Feed, so that a person who sees a "suggested post about Taylor Swift" could perhaps "just click a button and say, 'Hey Meta AI, tell me more about what I'm seeing with Taylor Swift right now.'"
Meta is also experimenting with integrating its AI chat tool into Groups, so a member of a Facebook baking group could potentially ask a question about desserts and get a response from a digital assistant.
"I think we have an opportunity to put generative AI into a sort of multiplayer consumer environment," Alison said.
WATCH: The full CNBC interview with Meta's Nick Clegg