AR got its ‘killer app’: GenAI


You hear that sound? That’s the sound of augmented reality (AR) fading away as a driving concept in technology.

You could blame Apple, which handed down an edict to Apple Vision Pro developers: “Refer to your app as a spatial computing app. Don’t describe your app experience as augmented reality (AR), virtual reality (VR), extended reality (XR), or mixed reality (MR).”

But blaming Apple would be wrong. Instead, blame artificial intelligence (AI) — specifically the generative AI (genAI) trend of the past year and a half; it’s completely upended and re-directed the purpose and function of the glasses formerly known as AR glasses.

How AR became a four-letter word

Tech giants have been working on “smart glasses” for more than a decade. And these products are finally hitting the market. Their killer app? AI, of course. (I’m defining AI glasses here as glasses with the primary purpose of facilitating fast and easy access to AI.)

The Chinese smartphone and gadget maker Oppo wants to be a leader in AI glasses. It introduced the Oppo Air Glass 3 at the recently concluded Mobile World Congress in Barcelona, Spain. The company is using its own genAI tech, called AndesGPT, for the glasses. They have multimodal capability, meaning the integrated camera can hoover up pictures and run them through AndesGPT for identification and processing. The Oppo glasses excel at offering a visual interface in a small, light design, weighing only 50 grams (roughly the weight of two alkaline AA batteries).
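To make that multimodal flow concrete, here is a minimal sketch, in Python, of how glasses like these typically hand a camera frame to a genAI model: capture an image, encode it, and send it along with a text prompt to a multimodal endpoint. AndesGPT's actual interface isn't public, so the endpoint, credentials, and response format below are illustrative assumptions, not Oppo's real API.

```python
# Minimal sketch of a camera-to-genAI round trip for AI glasses.
# Everything service-specific here is hypothetical: AndesGPT's API is not
# public, so the endpoint, key, and response shape are placeholders.
import base64
import json
import urllib.request

MODEL_ENDPOINT = "https://example.com/v1/multimodal"  # placeholder URL
API_KEY = "YOUR_API_KEY"                              # placeholder credential


def describe_image(image_path: str, prompt: str = "What am I looking at?") -> str:
    """Send a camera frame plus a text prompt to a multimodal genAI model."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("ascii")

    payload = json.dumps({
        "prompt": prompt,
        "image_base64": image_b64,  # the frame captured by the glasses' camera
    }).encode("utf-8")

    request = urllib.request.Request(
        MODEL_ENDPOINT,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(request) as response:
        result = json.loads(response.read().decode("utf-8"))

    # Assume the service returns {"text": "..."}; a real API will differ.
    return result["text"]


if __name__ == "__main__":
    # On real glasses, this frame would come straight from the onboard camera.
    print(describe_image("frame.jpg"))
```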

Google is also planning an entry into the AI glasses market. Since ending its Google Glass Enterprise Edition product a year ago, the company has been focused on smart glasses that can pass as ordinary glasses. Google has a large portfolio of granted and pending patents in this space, and it could be a contender.

In this case, Google is sweating the small stuff. One Google patent, for example, addresses the inherent heat problem in AI glasses: the graphics, AI, and other processing generate a lot of heat right next to the wearer's head. Rather than building its own glasses, Google could also license its patents to OEMs to build devices that support Google AI. Either way, Google's motivation for AI glasses is to make sure the world has a user interface for its LLM-based information tools.

Microsoft is also filing patents for "smart glasses" which, given its deep partnership with OpenAI, would likely focus on access to ChatGPT. The company is working on other issues that will pop up for AI glasses users, including battery life. One patent describes both hot-swappable mini batteries and the ability to connect to an external battery pack worn on the belt or carried in a pocket.

Amazon was actually somewhat early in this space, shipping its Echo Frames back in September 2019. Now in its third generation, the product is getting clobbered in the market by Ray-Ban Meta glasses, which are much better and very close to the same price. Echo Frames give you old-and-busted Alexa, rather than a new-hotness LLM-based chatbot. Amazon will no doubt soon come out with real AI glasses, complete with a generative AI chatbot plus a camera for multimodal capability.

The current leader in audio output AI glasses is the Ray-Ban Meta, a collaboration between the Italian glasses giant Luxottica and the company formerly known as Facebook.

And, of course, Apple is armed to the teeth with patents that would enable it to come out with AI glasses. In fact, the combination of tech and fashion will likely prove irresistible to Apple.

The giant companies are in a mad scramble to get AI glasses to market because they know the real threat will come from dozens or hundreds of smaller companies joining the game.

AI glasses follow the smartphone playbook

One of the hidden catalysts driving the smartphone market over the past 20 years is the background development of all the parts and components of a smartphone. Hundreds of companies now make tiny, low-power cameras, radios, processors, batteries, screens, audio components and other parts, which makes it pretty easy to enter the smartphone market.

That’s now just beginning with AI glasses.

I recently spoke to Ed Tang, CEO of Avegant, a Silicon Valley-based light engine company. (Light engines are tiny projectors that beam visual content onto the lenses of augmented reality (AR) glasses.)

Avegant works with smart glasses makers to design AI glasses that are as small, light and normal-looking as possible.

The company recently announced a partnership with chip giant Qualcomm and computer components giant Applied Materials to develop a range of reference designs for companies that want to build and sell visual-output AI glasses.

In the partnership, Avegant supplies its AG-30L2 light engine, which enables high-quality visuals in one or both lenses; its tiny components deliver a high-resolution heads-up display in glasses that look like regular prescription glasses. The AG-30L2 weighs only 2.7 grams, exactly the weight of a regulation ping-pong ball.

Qualcomm's hardware contribution is the Snapdragon AR1 Gen 1 SoC, the same part that powers Ray-Ban Meta glasses. The AR1 Gen 1 is super lightweight; it processes high-quality graphics and on-device AI, and it offers fast connectivity. It can handle display resolutions of up to 3K per eye. The integrated radios support Wi-Fi 7 and Bluetooth 5.3. And it can process signals from up to eight microphones. (Ray-Ban Meta glasses have five microphones.)

Applied Materials is contributing high-efficiency waveguides, which take the projection from Avegant's light engine and redirect it into the wearer's eyes, all within very thin transparent lenses.

Tang told me: “We think that AI is really going to be the key factor to drive the use and sales of these (AR) type of devices. The reason why you’re buying these is not because it’s a display that you can wear. The reason why you’re buying them is because of the value and application that it’s providing to you. And we think that is primarily going to be driven around AI applications.”

Tech companies, he said, are investing heavily in AI, and Avegant offers a direction for a “human interface device that really is going to drive the use case and applications of AI.”

(While Tang wouldn't tell me which companies Avegant is working with, I wouldn't be surprised at all if OpenAI is working on AI glasses.)

With all this activity, we’re likely to see a wave of AI glasses startups — the hardware that rides on the massive recent VC investments in genAI.

Even very small companies will be able to go shopping for the parts, and pick up the reference designs, to offer a seriously compelling hardware interface for their genAI chatbots.

The coming revolution in AI glasses

AR should now be viewed as an umbrella term, because when someone refers to AR glasses, it's not clear what that means. It could mean a spatial computing device like Apple Vision Pro. Or it could mean a heads-up display like the now-defunct Google Glass. But that's pretty much the range for AR.

Over the next three years, that range is likely to look like a bell curve, with low functionality on the left, high functionality on the right, and sales numbers on the vertical axis. Taking up roughly 80% of the middle will likely be what we call AI glasses, with the left side of that center representing audio output only and the right side audio plus visual output.

Because these devices will often cost very little (less than half the price of an average smartphone, and likely to dip below $250), the appeal will be massive: even the lowest-cost, audio-only devices will still give you the holy-grail feature, instant access to genAI all day, every day.

Within three to five years, I think the number of AI glasses users will be measured in the hundreds of millions. Avegant’s Tang goes even further:

“I feel like the public is just about to see what I would call the minimum viable product in the space,” he said. “And that’s probably happening next year. And if you think that glasses will evolve into smart glasses, then you’re talking about 1.2 billion units a year.”

This is the technology revolution of the decade, and still hardly anyone is talking about it in those terms.

It’s time to stop waiting for the AR glasses revolution and start understanding the AI glasses revolution. It’s all about AI now.

