[Image: Man sitting on couch, reading. Image Credits: Google]

[Image: Demo of Google’s translation feature on its prototype glasses. Image Credits: Google]

[Image: Close-up of Google’s prototype glasses. Image Credits: Google]

[Image: Demo of Google’s prototype glasses. Image Credits: Google]

Google is slowly peeling back the curtain on its vision to, one day, sell you glasses with augmented reality and multimodal AI capabilities. The company’s plans for those glasses, however, are still blurry.

At this point, we’ve seen multiple demos of Project Astra — DeepMind’s effort to build real-time, multimodal apps and agents with AI — running on a mysterious pair of prototype glasses. On Wednesday, Google said it would release those prototype glasses, armed with AI and AR capabilities, to a small set of selected users for real-world testing.

On Thursday, Google said the Project Astra prototype glasses would run on Android XR, Google’s new operating system for vision-based computing. It’s now starting to let hardware makers and developers build different kinds of glasses, headsets, and experiences around this operating system.

The glasses look cool, but it’s important to remember that they’re essentially vaporware — Google still has nothing concrete to share about the actual product or when it will be released. However, it certainly seems like the company wants to launch these at some point, calling smart glasses and headsets the “next generation of computing” in a press release. Today, Google is building out Project Astra and Android XR so that these glasses can one day be an actual product.

Google also shared a new demo showing how its prototype glasses can use Project Astra and AR technology to do things like translate posters in front of you, remember where you left things around the house, or let you read texts without taking out your phone.

“Glasses are one of the most powerful form factors because of being hands-free; because it is an easily accessible wearable. Everywhere you go, it picks up what you see,” said DeepMind product lead Bibo Xu in an interview with TechCrunch at Google’s Mountain View headquarters. “It’s perfect for Astra.”

A Google spokesperson told TechCrunch they have no timeline for a consumer launch of this prototype, and the company isn’t sharing many details about the AR technology in the glasses, how much they cost, or how all of this really works.

But Google did at least share its vision for AR and AI glasses in a press release on Thursday:

Many tech companies have shared similarly lofty visions for AR glasses in recent months. Meta recently showed off its prototype Orion AR glasses, which also have no consumer launch date. Snap’s Spectacles are available for purchase by developers, but they’re still not a consumer product either.

An edge that Google seems to have on all of its competitors, however, is Project Astra, which it is launching as an app to a few beta testers soon. I got a chance to try out the multimodal AI agent — albeit as a phone app and not a pair of glasses — earlier this week, and while it’s not available for consumer use today, I can confirm that it works pretty well.

I walked around a library on Google’s campus, pointing a phone camera at different objects while talking to Astra. The agent was able to process my voice and the video feed at the same time, letting me ask questions about what I was seeing and get answers in real time. I bounced from book cover to book cover, and Astra quickly gave me summaries of the authors and books I was looking at.

Project Astra works by streaming pictures of your surroundings, one frame per second, into an AI model for real-time processing. While that’s happening, it also processes your voice as you speak. Google DeepMind says it’s not training its models on any of the user data it collects, but the AI model will remember your surroundings and conversations for 10 minutes. That allows the AI to refer back to something you saw or said earlier.
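The mechanics Google describes — one frame per second plus live audio, with roughly ten minutes of memory — amount to a rolling context buffer. The sketch below is a hypothetical illustration of that idea, not Google’s actual implementation; the class name, constants, and simulated capture loop are all assumptions:

```python
from collections import deque

CONTEXT_SECONDS = 600  # Astra reportedly remembers about 10 minutes
FRAME_INTERVAL = 1.0   # one frame per second, per Google's description


class RollingContext:
    """Keeps only the last CONTEXT_SECONDS worth of captured events."""

    def __init__(self, window=CONTEXT_SECONDS):
        self.window = window
        self.events = deque()  # (timestamp, payload) pairs, oldest first

    def add(self, timestamp, payload):
        self.events.append((timestamp, payload))
        self._prune(timestamp)

    def _prune(self, now):
        # Drop anything older than the memory window.
        while self.events and now - self.events[0][0] > self.window:
            self.events.popleft()

    def snapshot(self):
        """What the model could refer back to right now."""
        return [payload for _, payload in self.events]


# Simulated capture loop: a real client would read the camera and
# microphone; here we just feed fake frames with fake timestamps.
ctx = RollingContext()
for t in range(700):  # 700 seconds of 1 fps "frames"
    ctx.add(float(t), f"frame-{t}")

# Frames older than 10 minutes have been dropped.
print(len(ctx.snapshot()))  # 601 frames: seconds 99 through 699
```

The point of the design is that memory is bounded by time, not size: no matter how long a session runs, only the last ten minutes of frames and transcript remain available for the model to reference.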

Some members of Google DeepMind also showed me how Astra could read your phone screen, similar to how it reads the view through a phone camera. The AI could quickly summarize an Airbnb listing, use Google Maps to show nearby destinations, and run Google Searches based on things it was seeing on the phone screen.

Using Project Astra on your phone is impressive, and it’s likely a sign of what’s to come for AI apps. OpenAI has also demoed GPT-4o’s vision capabilities, which are similar to Project Astra’s and have also been teased to release soon. These apps could make AI assistants far more useful by giving them capabilities far beyond the realm of text chatting.

When you’re using Project Astra on a phone, it’s apparent that the AI model would truly be at its best on a pair of glasses. It seems Google has had the same idea, but it might take the company a while to make that a reality.