It’s taken 10 years to develop, but Mark Zuckerberg has finally unveiled the technology he believes will change the world as much as the iPhone.
The Holy Grail of tech was unveiled by Mark Zuckerberg at Meta Connect 2024 in Silicon Valley.
But we’ll have to wait a bit longer before it is made available to the public.
Dubbed the “most advanced glasses ever made” by the Meta founder, the Orion glasses are the culmination of a decade’s research and innovation.
I was at the event in Menlo Park on Wednesday morning (US time) for the announcement, where I also got a close look at Meta’s new virtual reality headset, the Meta Quest 3S, and learned about the AI updates to the Meta Ray-Ban smartglasses.
Orion augmented reality glasses
Zuckerberg saved the biggest surprise for last in his keynote address, bringing out a working prototype of the holographic Orion glasses to the stage.
Looking through the glasses, you can see the physical world with holograms overlaid on it.
“If you want to be with someone who is far away, they’re going to be able to teleport, as a hologram, into your living room, as if they’re right there with you. You’ll be able to tap your fingers and bring up a game of cards or chess, or holographic ping pong. You can work or play,” Zuckerberg said.
He explained the display is not actually a screen but a completely new display architecture: tiny projectors shoot light into waveguides, aided by nanoscale 3D structures etched into the lenses that diffract the light and place holograms at different depths and sizes in the world in front of you.

Emphasising the difficulty of manufacturing the glasses, he revealed the lenses are made from silicon carbide, and cooling is achieved using material similar to that found in satellites.
The battery fits into the arms of the glasses. There’s also a small wireless puck that goes with it to help power it.
Users can interact with the glasses through voice, AI, eye tracking and hand tracking. Incredibly, Orion also has a neural interface that operates through your wrist.
“I think you need a device that allows you to send a signal from your brain to the device,” Zuckerberg said.
“This is the first device that is powered by a wrist-based neural interface.”
Orion means Zuckerberg’s long-held dream of creating a device that will replace the smartphone has finally become a reality — and his presentation was dubbed “Meta’s iPhone moment” by tech writer Rihard Jarc.
Although it’s not one we can buy just yet.
Zuckerberg said the developers are looking to slim down the device, which weighs in at 98g and boasts a 70-degree field of view, before it is released to the public.
Meta Quest 3S
Meta is expecting a big uptick in mixed reality users as it launches the Meta Quest 3S, a cheaper headset with many of the same capabilities as the Meta Quest 3.
“We’ve been working on bringing the Quest 3 family to a lot more people,” Zuckerberg said.
Much like the Quest 3, the Quest 3S lets people experience high-resolution colour mixed reality, vivid passthrough and hand tracking. You can use the headset for social media, gaming, watching videos, working and fitness.
It has the same processor as the Quest 3 but is cheaper thanks to the use of Fresnel lenses rather than pancake lenses.
“We have made so many improvements and optimisations to the technical stack, to the effective resolution, to the latency... mixed reality and handtracking software is actually better in Quest 3S at $US299 today than it was in Quest 3 when we launched it a year ago,” Zuckerberg said. Those advancements have also been added to the Quest 3.
Having tested it out during a demo yesterday, I found one of the best features is the ability to watch YouTube, Netflix and Twitch on a big VR screen with Dolby surround sound.
Gaming on the headset is also a unique experience. I tried Batman: Arkham Shadow, which involved me creeping around sewers, shooting a grappling hook up a wall and gliding across an abyss. It’s something you really have to try yourself to understand what it’s like.
Meta glasses get AI upgrade and new look
There were a lot of AI announcements made at Meta Connect 2024, notably changes to Meta AI and the launch of Llama 3.2, the underlying large language model. It’s now multimodal, meaning the AI can natively understand both images and text.
The Meta AI change is the one that will impact most Australians, as more than 80 per cent of Aussies use Facebook. Other apps including WhatsApp, Instagram and Messenger are also impacted, as I outlined in this article.
However, Meta’s other main hardware showpiece is its Ray-Ban Meta smartglasses.
They’ve actually been so popular that Meta was at one point struggling to keep up production to meet demand.
Zuckerberg today revealed that the smartglasses are getting a major AI update, although it won’t immediately roll out in Australia.
For the uninitiated, you can use the smartglasses to listen to music, take phone calls, and capture photos and video.
Now they’ve been supercharged with new AI capabilities.
“Glasses, they’re kind of the perfect form factor for AI,” Zuckerberg said.
“For letting an AI assistant see what you see, hear what you hear, be able to communicate with you privately.”
Meta has been updating the AI with iterative releases, but there are some big changes coming in the next few months.
Firstly, it will be more natural and conversational: once you start with “Hey Meta”, you can look at fruit in a bowl, ask “What kind of smoothie can I make with this?” and it will answer. You can then keep the conversation going without repeating “Hey Meta”.
The glasses will also be able to help people remember things, like where you parked or what was on your shopping list.
You will also be able to take actions based on what you’re looking at, such as remembering phone numbers, QR codes or billboards.
Multimodal video is also being added, which will allow the AI to help you as you’re doing things, like trying to work out what you want to wear.
The glasses will essentially be able to figure out what’s going on around you and give you feedback as you go.
“This is the beginning of a big thing,” Zuckerberg said.
Another feature that will be particularly helpful in an increasingly globalised world is live translation, which lets you hear a translation in your ear as someone speaks. A live translation app will also be available to translate what you’re saying into a foreign language. Currently only English and Spanish are being tested.
Zuckerberg also revealed that Meta had teamed up with Be My Eyes, a volunteer service that helps people with visual impairments. The smartglasses will allow volunteers to see what the wearer is seeing and give them information directly in their ear.
Andrew Backhouse is at Meta Connect 2024 courtesy of Meta