My Meta Ray-Ban Display Demo Shows Promise, but Mass Adoption Remains a Long Way Off
The road to purchasing Meta Display led through a local Best Buy, where a mandatory demo was required to ensure proper fitting. In the end, I found the technology impressive, the use case still limited, and the fashion grade below average. The bottom line: Meta is making the right move by investing $20B annually into Reality Labs, but the return on that investment will take years.

Key Takeaways

The Meta Display's demo-first rollout is a reminder that wearables are personal, which remains a barrier to adoption.
My sense is that early demand is modest.
The technology impresses, but comfort, style, and behavioral change remain adoption hurdles.
Limited iPhone integration needs to be fixed, and likely will be.
1. Buying Experience and Rollout Strategy

The Meta Display glasses are the most affordable step toward ambient, wearable AI computing. Ambient computing complements the real world rather than replacing it, and that distinction is a major accomplishment. At $799, they’re roughly 25% of the cost of the Vision Pro.

Meta’s sales model for its Display glasses reflects the fact that users need to be fitted for both the glasses and the Neural Band. As stated on the first line of the Meta Display landing page, “A demo appointment is required to buy Meta Ray-Ban Display.”

The glasses are only available through in-person demos at select Best Buy, LensCrafters, Ray-Ban, and Sunglass Hut stores, with Verizon locations coming soon. This measured rollout indicates that Meta is prioritizing fit accuracy, message control, and early adopter quality over scale.

At Best Buy, the glasses are sold by Meta employees. It appears Meta hired and trained local retail personnel as its own staff, which underscores the degree of control the company wants to maintain over the rollout.

The rep explained that Meta chose a demo-first approach because of the sizing of the Neural Band, a crucial accessory for functionality. If the band is too small or too large on the user’s wrist, gestures do not register properly and can become “highly exaggerated.” The decision to gate access through demos introduces friction, but it also serves to calibrate expectations, ensuring first impressions align with what the product can deliver.

I expect by the end of the year you will be able to order a pair online.

2. Early Demand Signs

The rep had a clear message when it came to utility: the glasses bring the phone to your face without the effort of taking it out of your pocket, along with easy access to AI. I’d rate that pitch as good, but not great. It’s also important to note that the operating system for the Meta Display is brand new, meaning apps are still in development and Meta is actively working to expand use cases.

When asked about the typical demo, he said most users had the same reaction I did: surprised at how good the technology is, yet uncertain about the use case. When it came time to actually buy the glasses, I learned they’re on backorder with no clear delivery date. The rep explained that only stores in California received inventory, which sold out immediately, and mentioned that a few thousand units had been sold.

I don’t want to put too much weight on an off-the-cuff comment, but even if that number were 300x higher, it would still be less than 1 million and fall into the category of modest initial demand.

The takeaway: December unit volumes will be insignificant given the “demo-before-order” requirement and the month-plus wait to receive a pair.

Taking a step back, it feels like Zuckerberg was eager last month to keep pushing the narrative that Meta is the leader in smart glasses, despite the reality that both sales and production capacity are still in their early stages.

3. Product Experience

Technology: My reaction to the demo was the same as when I first saw the product at Meta Connect: I was blown away by the technology. Both the product and its ease of use exceeded expectations. The display is clear, responsive, and comfortably layered over the real world, allowing users to shift focus naturally between digital and physical space. The Neural Band control initially feels foreign but quickly becomes intuitive.

Comfort: The glasses are lighter than they look and comfortable for short wear, yet still visually obvious as a tech product, which is a psychological and social barrier.

Style: Glasses remain an identity product; people choose them for fashion, not function. While Meta’s Ray-Ban models excel in style, the Meta Display misses the fashion mark. Meta needs to take a page from Apple’s playbook with the Apple Watch, making leading-edge technology fashionable.

Changing behavior: An obstacle to adoption is less about the technology and more about human behavior. Altering behavior to wear glasses every day the way you wear a watch is a monumental hurdle, even for those who already wear them; just ask the more than 10 million Americans who have had LASIK.

4. iOS Limitations

If Meta wants to replace the iPhone in your hand with glasses, the device needs to mirror your iPhone. Today, that is not the case and it represents a major barrier to adoption, second only to the glasses form factor itself.

During my demo, I was told the glasses were compatible with my iPhone and iOS. My understanding is that is not true today. Instead, communication runs through Meta apps like Facebook Messenger. The challenge is that most Apple users, myself included, want to use iMessage.

My sense is this is a high-priority issue and will likely be resolved in the coming weeks.

