Image Credits: Google
On Tuesday, Google announced a number of new additions to Gemma, its family of open (but not open source) models comparable to Meta's Llama and Mistral's open models, at its annual Google I/O 2024 developer conference.
The headline-grabbing release here is Gemma 2, the next generation of Google's open-weights Gemma models, which will launch with a 27-billion-parameter model in June.
Already available is PaliGemma, a pre-trained Gemma variant that Google describes as "the first vision language model in the Gemma family," aimed at image captioning, image labeling and visual Q&A use cases.
So far, the standard Gemma models, which launched earlier this year, were only available in 2-billion-parameter and 7-billion-parameter versions, making this new 27-billion model quite a step up.
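To put that step up in perspective, here is a back-of-the-envelope sketch of the memory needed just to hold each model's weights. It assumes 16-bit (2-byte) weights, a common serving precision that the article does not specify; real requirements also depend on activations and the KV cache.

```python
# Rough weight-memory comparison for the Gemma sizes mentioned above
# (2B, 7B, and the new 27B). Assumption: fp16/bf16 weights at 2 bytes
# per parameter; actual footprints vary with precision and runtime state.
BYTES_PER_PARAM_FP16 = 2

def weight_memory_gb(num_params: float) -> float:
    """Approximate GB needed to store the weights alone at 2 bytes each."""
    return num_params * BYTES_PER_PARAM_FP16 / 1e9

for name, params in [("Gemma 2B", 2e9), ("Gemma 7B", 7e9), ("Gemma 2 27B", 27e9)]:
    print(f"{name}: ~{weight_memory_gb(params):.0f} GB of weights at fp16")
```

Under this assumption, the 27B model needs roughly 54 GB for weights alone, versus about 14 GB for the 7B model, which is why Google's claim that it runs on a single Cloud TPU host is notable.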
In a briefing ahead of Tuesday's announcement, Josh Woodward, Google's VP of Google Labs, noted that the Gemma models have been downloaded more than "millions of times" across the various services where they're available. He stressed that Google optimized the 27-billion model to run on Nvidia's next-gen GPUs, a single Google Cloud TPU host and the managed Vertex AI service.
Size doesn't matter, though, if the model isn't any good. Google hasn't shared a lot of data about Gemma 2 yet, so we'll have to see how it performs once developers get their hands on it. "We're already seeing some great quality. It's outperforming models two times bigger than it already," Woodward said.