Meta’s Wearables Have a Long Way To Go
Meta is putting a lot of effort into wearables, but based on our recent testing of the new Meta Display, it has a long way to go before it gets the formula right. We ordered our Meta Display as soon as it was announced and received it 26 days later. Our bottom line is that while the tech is impressive, we find it hard to recommend the product given its limited use cases. That said, we continue to believe Meta is making the right move by investing around $18B annually in Reality Labs, though the return on that investment will take years.

Key Takeaways

The tech will leave you impressed, but the realization that there is not much to do will leave you wanting more.
One of the main selling points, Meta AI, is largely useless compared to ChatGPT or Grok.
Meta is on the right track pursuing the consumer market; just ask Google, Microsoft, and Apple.
1. Initial Impression

We are disappointed to report that our initial reaction to the Meta Display demo still holds true: the Meta Ray-Ban Display shows promise, but mass adoption remains a long way off. The Deepwater team has been experimenting with the glasses and quickly came to a consensus: we were initially blown away by the unique technology, then quickly realized there is not much use for it.

Here is a roundup of our takeaways:

  • Calls, Texts: The glasses are a good medium for previewing notifications, including texts, calls, Instagram DMs, and WhatsApp messages. Then again, so is your phone or watch. Replying via Meta Display takes some effort through verbal dictation, which was buggy most of the time, prompting me to reply with my phone anyway.
  • AI Identification: AI object identification felt incoherent. For example, when I held up a cup with the Minnesota Wild logo and asked what team it was, it replied New York Yankees.
  • Photos: Photo quality is a noticeable step down from the phone.
  • Sound Quality: We were impressed by the sound quality. For not having in-ear or over-ear headphones, the glasses offer crisp audio for calls and music. You can connect to Spotify or Apple Music.
  • Maps: Another good use case is Maps. The on-lens map works well, picking up your exact location and the direction you are facing. It makes walking directions easy, though this is a use case we rarely need.
  • Style: The most fashion-friendly wearable to date, yet we all agreed we still don’t like the look. The glasses feel a bit cheap and clearly read as smart glasses, drawing attention from passersby.

Our bottom line is that the Meta Display’s utility is similar to a watch’s, with the added hassle of wearing the glasses and a neural band at the same time. With your watch you get subtle notifications you can quickly and discreetly preview without pulling out your phone. The glasses provide the same type of notification, but I found myself pulling out my phone to respond anyway.

Part of this is habit, but part of it is that it is simply easier to use your phone. To get consumers to switch, the tech needs to be easier to use.

2. Meta AI

When testing the AI features accessible through Meta Display, we essentially hit a dead end. One use case for AI running inside Meta Display is helping identify the world around you. When asked to identify several specific items I was looking at, here’s what happened:

  • A framed picture of the Minneapolis skyline: “I see a white wall with a framed picture and several other items in the background.”
  • Minnesota Wild logo: “This is the Atlanta Braves.” It told me that three times even when corrected.
  • A show playing on my TV: “I see a TV on a TV stand in front of a wall in a room.”

My take: Compared to ChatGPT, Grok, and Gemini, Meta AI’s image analysis and identification was nonsensical.

3. Consumer Focus

As I tested Meta Display, I kept thinking about Google Glass and HoloLens a decade ago, and Vision Pro more recently. Those companies went after the enterprise, and it didn’t work. I’m glad to see Meta’s focus has been on the consumer for the past four years, and I believe pursuing that market is the avenue to success.

A recap of why an enterprise wearable is hard:

  • Google Glass Explorer Edition cost $1,500, was first released to testers in April 2013, and ended production in 2015. The lesson: the glasses had a high creep factor, a high price, and low utility.

  • Microsoft jumped in with HoloLens in 2015, priced at $3,000 and focused on developers and enterprise. HoloLens 2 came out in 2019 at $3,500, also targeted at enterprise. The platform had some design wins, including the US Army’s IVAS program, but broader traction stayed limited due to high cost and niche apps. Microsoft has since ended HoloLens production, and the IVAS program shifted in 2025 when the Army approved a multiyear contract with Anduril, which now leads the next phase of development.

  • Apple Vision Pro came out in February 2024 at $3,500, positioned as a premium spatial computer for work and entertainment. Apple highlighted business use through partnerships and apps from SAP, Microsoft 365, Zoom, and Webex, plus pilots like KLM’s engine maintenance training, but consumer uptake has been constrained by price and comfort concerns. Vision Pro remains on the market, though it is rumored to be on its way out as Apple shifts focus to glasses as its core vision product.
