Image Credits: TOBIAS SCHWARZ/AFP / Getty Images
Meta, which develops one of the biggest foundational open source large language models, Llama, believes it will need significantly more compute power to train models in the future.
Mark Zuckerberg said on Meta's second-quarter earnings call on Tuesday that to train Llama 4, the company will need 10x more compute than what was needed to train Llama 3. But he still wants Meta to build capacity to train models rather than fall behind its competitors.
"The amount of compute needed to train Llama 4 will likely be almost 10 times more than what we used to train Llama 3, and future models will continue to grow beyond that," Zuckerberg said.
"It's hard to predict how this will trend multiple generations out into the future. But at this point, I'd rather risk building capacity before it is needed rather than too late, given the long lead times for spinning up new inference projects."
Meta released Llama 3 with 8 billion parameters in April. The company last week released an upgraded version of the model, called Llama 3.1 405B, which had 405 billion parameters, making it Meta's biggest open source model.
Meta's CFO, Susan Li, also said the company is thinking about different data center projects and building capacity to train future AI models. She said Meta expects this investment to increase capital expenditures in 2025.
Training large language models can be a costly business. Meta's capital expenditures rose nearly 33% to $8.5 billion in Q2 2024, from $6.4 billion a year earlier, driven by investments in servers, data centers and network infrastructure.
According to a report from The Information, OpenAI spends $3 billion on training models and an additional $4 billion on renting servers at a discount rate from Microsoft.
"As we scale generative AI training capacity to advance our foundation models, we'll continue to ramp up our infrastructure in a way that provides us with flexibility in how we use it over time. This will allow us to steer training capacity to GenAI inference or to our core ranking and recommendation work, when we expect that doing so would be more valuable," Li said during the call.
During the call, Meta also talked about its consumer-facing Meta AI's usage and said India is the largest market for its chatbot. But Li noted that the company doesn't expect GenAI products to contribute to revenue in a significant way.