A close-up look at Groq’s LPU, which is designed to accelerate certain AI workloads. Image Credits: Groq
Groq, a startup developing chips to run generative AI models faster than conventional processors, said on Monday that it has raised $640 million in a new funding round led by BlackRock. Neuberger Berman, Type One Ventures, Cisco, KDDI and Samsung Catalyst Fund also participated.
The tranche, which brings Groq’s total raised to over $1 billion and values the company at $2.8 billion, is a major win for Groq, which reportedly was originally looking to raise $300 million at a slightly lower ($2.5 billion) valuation. It more than doubles the valuation (~$1 billion) Groq achieved in April 2021, when the company raised $300 million in a round led by Tiger Global Management and D1 Capital Partners.
Meta chief AI scientist Yann LeCun will serve as a technical advisor to Groq, and Stuart Pann, the former head of Intel’s foundry business and ex-CIO at HP, will join the startup as chief operating officer, Groq also announced today. LeCun’s appointment is a bit unexpected, given Meta’s investments in its own AI chips, but it undoubtedly gives Groq a powerful ally in a cutthroat space.
Groq, which emerged from stealth in 2016, is creating what it calls an LPU (language processing unit) inference engine. The company claims its LPUs can run generative AI models similar in architecture to OpenAI’s ChatGPT and GPT-4o at 10x the speed and one-tenth the energy.
Groq CEO Jonathan Ross’ claim to fame is helping to invent the tensor processing unit (TPU), Google’s custom AI accelerator chip used to train and run models. Ross teamed up with Douglas Wightman, an entrepreneur and former engineer at Google parent company Alphabet’s X moonshot lab, to co-found Groq close to a decade ago.
Groq provides an LPU-powered developer platform called GroqCloud that offers “open” models like Meta’s Llama 3.1 family, Google’s Gemma, OpenAI’s Whisper and Mistral’s Mixtral, as well as an API that allows customers to use its chips in cloud instances. (Groq also hosts a playground for AI-powered chatbots, GroqChat, that it launched late last year.) As of July, GroqCloud had more than 356,000 developers; Groq says that a portion of the proceeds from the round will be used to scale capacity and add new models and features.
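For readers curious what using GroqCloud looks like in practice, the API follows the familiar OpenAI-style chat-completions pattern. Below is a minimal sketch in Python; the endpoint URL, model name and GROQ_API_KEY environment variable are assumptions based on Groq’s public documentation, not details from this article.

    # A minimal sketch of calling a GroqCloud-hosted model through its
    # OpenAI-compatible chat completions endpoint. The URL, model name and
    # GROQ_API_KEY variable are assumptions, not details from the article.
    import os
    import requests

    response = requests.post(
        "https://api.groq.com/openai/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
        json={
            "model": "llama-3.1-8b-instant",  # one of the hosted "open" models
            "messages": [
                {"role": "user", "content": "Explain what an LPU is in one sentence."}
            ],
        },
        timeout=30,
    )
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])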
“Many of these developers are at large enterprises,” Stuart Pann, Groq’s COO, told TechCrunch. “By our estimates, over 75% of the Fortune 100 are represented.”
As the generative AI boom continues, Groq faces increasing competition from both rival AI chip startups and Nvidia, the formidable incumbent in the AI hardware sector.
Nvidia controls an estimated 70% to 95% of the market for AI chips used to train and deploy generative AI models, and the firm is taking aggressive steps to maintain its dominance.
Nvidia has committed to releasing a new AI chip architecture every year, rather than every other year as was the case historically. And it’s reportedly establishing a new business unit focused on designing bespoke chips for cloud computing firms and others, including AI hardware.
Beyond Nvidia, Groq competes with Amazon, Google and Microsoft, all of which offer, or will soon offer, custom chips for AI workloads in the cloud. Amazon has its Trainium, Inferentia and Graviton processors, available through AWS; Google Cloud customers can use the aforementioned TPUs and, in time, Google’s Axion chip; and Microsoft recently launched Azure instances in preview for its Cobalt 100 CPU, with Maia 100 AI Accelerator instances to come in the next several months.
Groq could consider Arm, Intel, AMD and a growing number of startups as challengers, too, in an AI chip market that could reach $400 billion in annual sales in the next five years, according to some analysts. Arm and AMD in particular have thriving AI chip businesses, thanks to soaring capital spending by cloud vendors to meet the capacity demands of generative AI.
D-Matrix late last year raised $110 million to commercialize what it’s characterizing as a first-of-its-kind inference compute platform. In June, Etched emerged from stealth with $120 million for a processor custom-built to accelerate the dominant generative AI model architecture today, the transformer. SoftBank’s Masayoshi Son is reportedly looking to raise $100 billion for a chip venture to compete with Nvidia. And OpenAI is said to be in talks with investment firms to launch an AI chip-making initiative.
To carve out its niche, Groq is investing heavily in enterprise and government outreach.
In March, Groq acquired Definitive Intelligence, a Palo Alto-based firm offering a range of business-oriented AI solutions, to form a new business unit called Groq Systems. Within Groq Systems’ purview is serving organizations, including U.S. government agencies and sovereign nations, that wish to add Groq’s chips to existing data centers or build new data centers using Groq processors.
More recently, Groq partnered with Carahsoft, a government IT contractor, to sell its solutions to public sector clients through Carahsoft’s reseller partners, and the startup has a letter of intent to install tens of thousands of its LPUs at European firm Earth Wind & Power’s Norway data center.
Groq is also collaborating with Saudi Arabian consulting firm Aramco Digital to install LPUs in future data centers in the Middle East.
At the same time it’s building customer relationships, Mountain View, California-based Groq is marching toward the next generation of its chip. Last August, the company announced that it would contract with Samsung’s foundry business to manufacture 4nm LPUs, which are expected to deliver performance and efficiency gains over Groq’s first-gen 13nm chips.
Groq says it plans to deploy more than 108,000 LPUs by the end of Q1 2025.