Zuckerberg Bets on His New Vision of the Metaverse
Zuckerberg is aggressively investing in his new vision of the metaverse, as evidenced by the hire of top talent in Alan Dye, Apple’s head of user interface design. Back in 2021, when Facebook changed its name to Meta, the company defined the metaverse as an immersive digital world you would step into, with Horizon Worlds and Quest headsets at the center. At Meta Connect in September, Zuckerberg revised that definition. The metaverse is now a blend of AI glasses and virtual reality, with the real world as the canvas and an assistant in your line of sight. That’s bad news for VR and good news for the new canvas Dye will now paint.

Key Takeaways

Meta has redefined the metaverse, shifting from a VR world you enter to AI glasses that layer intelligence on the real world.
The reported 30% metaverse budget cuts and the hiring of longtime Apple design leader Alan Dye show Meta is reallocating resources toward AI wearables and integrated hardware, software, and AI design.
Our testing of the Ray-Ban Meta Display glasses shows exactly why Meta needs Alan Dye.
We continue to support Meta’s Reality Labs investment, which is running at more than $15B in annual losses, as a long-term bet, though we expect it will be at least five years before AI glasses see meaningful adoption.

1. Metaverse Pivot

The news of Meta’s hire of Apple’s lead designer, Alan Dye, hit the center of the bullseye, given I’ve long believed that Meta needs Apple design talent to unlock its potential in wearables. But the move raises a bigger question: what does Zuckerberg want to achieve with Reality Labs, and with the metaverse more broadly?

That question led me to compare how Zuckerberg has described the metaverse over the past four years, and the comparison surfaced an insight: Zuckerberg remains dedicated to Reality Labs, but the group’s central goal has evolved. To pin down the change, let’s go back to Zuckerberg’s own words.

In July 2021, during Facebook’s second quarter earnings call, Zuckerberg gave his first clear public definition of the metaverse:

“So what is the metaverse? It’s a virtual environment where you can be present with people in digital spaces. And you can kind of think about this as an embodied Internet that you’re inside of rather than just looking at, and we believe that this is going to be the successor to the mobile Internet.”

That version is clearly VR-centric. The point is that you leave the real world and step into a digital one. Horizon Worlds and Quest sat at the center of that vision.

Fast forward to Meta Connect in September 2025 and the definition has changed:

“AI glasses and virtual reality. Our goal is to build great looking glasses that deliver personal superintelligence and a feeling of presence using realistic holograms. And these ideas combined are what we call the metaverse…Glasses are the only form factor where you can let an AI see what you see, hear what you hear, talk to you throughout the day, and very soon generate whatever UI you need right in your vision in real time.”

The metaverse is no longer primarily a place you go. It is now a blend of AI, holograms, and the physical world, with glasses as the default way you experience it.

That is a profound shift. Meta is effectively saying that the future of computing will be ambient and heads-up, not contained in a headset.

2. Focusing on Both Vision and Profit

Reports that Meta is considering cutting up to 30% of next year’s budget for its metaverse group, which I read as Reality Labs, feel long overdue given the segment has consistently lost more than $15B a year.

Meta has been chasing a VR future that never quite broke out, which implies the cuts will land heavily on Horizon Worlds and parts of the Quest virtual reality business. It is unclear how much will actually be cut from Reality Labs and how much VR spending will be redirected to wearables. If half of the 30% reduction is true cuts and half is redirected spending, that works out to roughly 15% of a $15B–$20B annual budget, or between $2B and $3B in annual savings.
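
As a rough sanity check on that range, here is a back-of-the-envelope sketch; the $15B and $20B budget figures are assumptions drawn from the loss and spending numbers cited elsewhere in this note, not disclosed budget lines.

```python
# Back-of-the-envelope check on the savings estimate above.
# Assumption: Reality Labs' annual budget is roughly $15B-$20B,
# per the loss and spending figures cited in this note.
for budget_b in (15, 20):
    reduction = 0.30 * budget_b   # the reported 30% budget reduction
    savings = 0.5 * reduction     # assume half is true cuts, half redirected
    print(f"${budget_b}B budget -> ~${savings:.2f}B in annual savings")
```

On a $15B budget that is about $2.25B in savings, and on a $20B budget about $3B, which brackets the $2B–$3B range above.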

On the invest-more side, Meta has poached Alan Dye, who has led Apple’s user interface design team since 2015. My guess is that his five-year pay package, including stock, will be valued at around $500M.

Dye will head a new design studio at Meta, responsible for the look and feel of hardware, software, and AI interfaces across the company’s devices. His resume includes the iPhone, Apple Watch, and Vision Pro. Bringing him in, alongside a deputy, signals that Meta understands how critical cohesive design is if glasses are going to move beyond a niche.

The big picture: Meta is pulling back on pure VR and doubling down on AI wearables. The company is shifting some of the same capital and attention that once went to the original metaverse into building consumer-ready AI glasses that people might actually want to wear.

3. Our Meta Display Testing

To gauge the state of Meta’s metaverse efforts to date, look no further than the Meta Display glasses.

Conceptually, AI glasses make sense. If an AI assistant can see and hear what you do, it should be more helpful than one that only lives in your phone. Our testing of the Ray-Ban Meta Display, however, shows that Meta’s AI glasses need more than a display.

In a side-by-side test, Meta AI on the Display trailed ChatGPT’s Video Chat in both understanding and usefulness. Meta got about half of the prompts right, compared to ChatGPT’s 98% success rate. More importantly, the test revealed that the Meta Display often ignored what was right in front of us, giving generic descriptions instead of specific, contextual help. The experience left us impressed by the hardware and underwhelmed by what we could actually do with it.

The takeaway is simple. Today, Meta’s glasses feel closer to a smartwatch than to a smartphone replacement. For the average person to adopt them, the usefulness will need to increase dramatically.

4. The Long Road to Personal Superintelligence

Zuckerberg frequently describes glasses as “the ideal form factor for personal superintelligence” because they let an AI see and hear your world and talk to you throughout the day while you stay present. I agree with the direction. If Meta can build an assistant that genuinely understands your context and can take action on your behalf, it will change how we interact with technology.

The path from here to there will be measured in years, not quarters.

We estimate that it will take about five years before AI glasses reach the scale of 100 million units per year. As a point of reference, roughly 1.2B smartphones are sold a year. That timeline assumes progress on several fronts at once: better models, tighter integration of hardware and software, more comfortable industrial design, and clear use cases that resonate beyond early adopters.
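
To put that estimate in context, a quick back-of-the-envelope comparison; the 1.2B smartphone figure is the approximate annual shipment number cited above.

```python
# Rough scale comparison behind the five-year adoption estimate above.
# Assumption: roughly 1.2B smartphones ship per year (approximate industry figure).
glasses_per_year = 100e6       # AI glasses at the estimated run rate
smartphones_per_year = 1.2e9   # approximate annual smartphone shipments
share = glasses_per_year / smartphones_per_year
print(f"AI glasses at scale ~= {share:.0%} of annual smartphone volume")
```

Even at that scale, AI glasses would ship at less than a tenth of smartphone volume, which underscores how wide the utility gap has to be for mass adoption.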

Just as important, the everyday utility has to beat the smartphone by a wide margin. The phone radically changed the consumer device market because it was so much better than what came before. For AI glasses to follow a similar trajectory, they will need to be five or ten times better in the moments that matter. Until then, most people will stick with the device they already know and love.

For investors, this is why we view Meta’s roughly $20B in annual Reality Labs spending as a long duration bet. The company is one of the few that can afford to invest at that scale for years while the product matures. If glasses and personal AI do become the next computing platform, the payoff could be substantial. If they do not, the spend will look like a costly detour.

My stance is that Meta is doing the right thing by shifting from a VR-heavy metaverse toward AI glasses, trimming parts of the old strategy that are not working, and bringing in top design talent. The moves are positive if glasses are indeed the answer.

