Meta Connect announcements: $2.5M creator contest, Gen AI investments, Meta Horizon Engine and wearable device tools

On day two of the Meta Connect keynote talks, Meta leaders ranged across the landscape of AI, virtual reality, augmented reality and devices like smart glasses.

Personal superintelligence is in sight, and it’s likely to help form the very fabric of computing from here on out, Meta said. From today’s VR headsets to tomorrow’s AR glasses, AI is the unlock that will bring the next computing platform into much sharper focus.

Meta spent a lot of time yesterday talking about glasses as the first AI-native device to be adopted at scale, and today, it shifted gears to dive deeper on VR and the ways in which AI is changing the game.

Meta Horizon Studio: Level Up with Generative AI Tools

The company sees a future where you can build entire virtual worlds without ever touching a line of code. That’s a huge opportunity for VR — not just because it equates to more worlds for people to explore but because it opens up the door for anyone to be a creator.

Generative AI tools are lowering the barriers to entry for people to build compelling, immersive 3D content without sacrificing the ceiling of what's possible. People with zero experience can make some incredible stuff, and seasoned professionals can make much better stuff, faster, Meta said.

Meta already has an expansive set of generative AI tools for Horizon. You can generate 3D meshes, textures, skyboxes, ambient audio, sound effects, and TypeScript code, all with simple text prompts. You can even generate a fully rigged and animated 3D avatar. And Meta's NPC generation system goes beyond that, coming up with a backstory, voice, conversation style, and dialogue based on just a few prompts.
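
To ground what "generating TypeScript" means in practice, here is a minimal, hand-written sketch of the kind of component script such a prompt might produce, written against Horizon's TypeScript scripting API (the 'horizon/core' module). The component name and spin behavior are illustrative, not output from Meta's tools.

```typescript
// Illustrative example only: a simple Horizon world component that slowly
// spins the entity it is attached to. Written against the 'horizon/core'
// TypeScript API; the name and behavior are made up for this sketch.
import * as hz from 'horizon/core';

class SpinningProp extends hz.Component<typeof SpinningProp> {
  // A property the creator can tweak in the editor without touching code.
  static propsDefinition = {
    degreesPerSecond: { type: hz.PropTypes.Number, default: 45 },
  };

  start() {
    // Run every frame and rotate the entity around its vertical axis.
    this.connectLocalBroadcastEvent(hz.World.onUpdate, (data: { deltaTime: number }) => {
      const spin = hz.Quaternion.fromEuler(
        new hz.Vec3(0, this.props.degreesPerSecond * data.deltaTime, 0)
      );
      this.entity.rotation.set(this.entity.rotation.get().mul(spin));
    });
  }
}

hz.Component.register(SpinningProp);
```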

Take Citadel. It's already one of the most popular worlds in Horizon, and when its scripted NPCs were replaced with AI-generated characters, time spent increased by nearly 15%.

Meta said it has seen Horizon creators embrace these new tools. Nearly 60% of eligible creators have used them in the last 30 days, and roughly one in three new worlds in the past month has AI-generated assets.

Meta is also building an agentic creation system that orchestrates all these pieces behind the scenes, letting you create a fully immersive playable 3D world using nothing but natural language prompts. It won’t take months or even days. It’ll take minutes.

These new tools will be available in the new and improved Horizon Studio, and if you want to be among the first to try it out, you can apply for the beta today.

For developers, increased velocity translates to faster iteration and more experimentation. Rather than grayboxing an early prototype, what if you could populate it with vibrant assets and sound? Thanks to generative AI tooling in Horizon Studio, you can.

Meta Horizon Engine: Driving Improved Quality Across the Metaverse

Mark Zuckerberg talks about the Meta Horizon Engine at Meta Connect Day One. Source: GamesBeat/Dean Takahashi

To raise the quality bar for Horizon, Meta completely rebuilt the core architecture. The new standalone Meta Horizon Engine has been fully optimized to bring the metaverse to life. It's based on a scriptable render pipeline that opens the door to new visual styles and effects. You get industry standards like physically based rendering materials and lightmaps. It unlocks more complex genres, like RPGs and sims. And there's now support for fundamental development features like source control.

One developer told us this could be a big disruption to the content creation system on Meta's platform if it meant Meta was getting rid of Unity. We asked Meta about that [update: Meta says it isn't getting rid of Unity].

Thanks to Horizon Engine, Meta has been able to rebuild your Immersive Home in VR. You can customize it with persistent app windows you can pin in place. You might anchor a web browser to your coffee table or put a giant Instagram portal on your wall to enjoy scrolling through 3D versions of the photos and videos in your feed.

When you change home environments, you’ll see a beautiful transition in lighting and colors. Rather than just swapping out the skybox, it’s a living, breathing environment all around you. And your home is designed to be social. Soon you’ll be able to invite up to seven friends over to watch movies or play games together.

Looking for a bigger crowd? Horizon Engine supports higher concurrency in worlds — 100+ in a single space — which opens the door for massively multiplayer experiences. Public spaces like the new Horizon Central will feel more vibrant and active, and thanks to improved load and travel times, you can quickly jump from there to the new Arena to catch a concert or live event.

Improving Distribution & Discovery to Boost Dev Success

With Horizon Studio lowering the barrier to entry for creation and Horizon Engine raising the ceiling of what's possible, there's more opportunity for everyone, including developers, who can now build games that look and play great across VR and mobile and reach a massive audience by using Meta's social apps for distribution.

And Meta said it is making it easier for people to connect with great content, both in-headset and out. Meta has redesigned the Horizon Feed in VR, so people now get personalized app recommendations and see more promos that help them discover new apps. It also redesigned the mobile feed with a simpler structure that encourages more discovery. There are live video previews, trailers, and more.

Meta said it overhauled its search surfaces in the Horizon mobile app and in-headset to improve discovery. Before people enter a search term, they'll see a list of trending searches, categories, and shelves that are relevant to them. And in the Horizon Store, there are more places where apps can be promoted.

Meta is introducing a new Targeted Re-Engagement Tool that helps you reach specific user cohorts with direct messaging, discounts, or promo codes using in-app or in-store notifications, which significantly increase the efficiency and conversion rate of your offers. For example, you could give a 50% discount to top spenders who haven’t purchased an in-app item in the last 14 days. And Meta has added new Organization Profiles, which are designed to help you connect with your community so more people discover your work.

The Meta Horizon+ VR subscription service is another tool for developers to drive revenue. Over the past year, 67 participating apps have cumulatively earned $20 million from the program, Meta said. The VR game Synth Riders alone saw a 60% bump in revenue after being added to the rotating Meta Horizon+ games catalog a year ago.

Social multiplayer games like Gorilla Tag continue to outperform on the platform. Animal Company, a free-to-play game with over 1 million monthly active players, increased its paying user base ninefold in six months and ranked No. 5 in gross first-year revenue as of this summer.

And as of this May, Ghosts of Tabor, a premium paid multiplayer game that launched in Early Access in 2023 before graduating to a full release in 2024, has already made over $30 million in revenue with more than 1 million players across Quest and Steam.

Regardless of the size of your studio, there’s money to be made. Over 300 apps on the Meta Horizon Store have now generated more than $1 million in revenue, with an even 10 apps generating over $50 million each.

Unity & Android Updates

Meta Ray-Ban Display. Source: Meta

If you’re already working with your own AI tooling, Meta said its new Horizon OS MCP Server can help accelerate your workflows. Connect it to your LLM and you’ll have all the relevant context needed to build for Horizon OS.
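
Meta didn't share wiring details for the server in the keynote, but MCP is an open protocol, so the general shape of that connection can be sketched with the open-source Model Context Protocol TypeScript SDK. In the sketch below, the launch command and package name for the Horizon OS MCP Server are hypothetical placeholders; only the SDK calls come from the public protocol library.

```typescript
// A generic sketch of wiring an MCP client to an MCP server over stdio,
// using the open Model Context Protocol TypeScript SDK. The command and
// package name for Meta's Horizon OS MCP Server below are placeholders;
// check Meta's developer documentation for the real launch details.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Hypothetical launch command; Meta's docs define the actual one.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["horizon-os-mcp-server"], // placeholder package name
  });

  const client = new Client({ name: "horizon-dev-assistant", version: "0.1.0" });
  await client.connect(transport);

  // Discover what the server exposes (docs lookups, code generation, etc.).
  const { tools } = await client.listTools();
  console.log("Available tools:", tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);
```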

You can then ask your LLM about Horizon OS and get accurate responses from Meta's knowledge base, covering everything from new platform capabilities to publishing guidance for your app. You can also generate quality code for Unity or the Meta Spatial SDK. That could mean boilerplate code to save time or a new code snippet in a language you're less familiar with. Say you need help understanding how to build a VR-native component: your LLM can now connect to the knowledge base to guide you through all of it.

It can help you through some of the hardest problems in VR development, like performance optimization. You can take a trace and ask your LLM to analyze it for performance issues, make recommendations on fixes, and compare results after your changes are made to see the wins.

And your LLMs and computer vision models can also integrate with Meta's Passthrough Camera API thanks to the new Meta Building Blocks for Unity. This makes it possible to build experiences that capture and understand your players' physical surroundings in much deeper ways.

Growing the VR User Base Through Expanded Use Cases

Oakley Meta Vanguard AI glasses. Source: Meta

Just like smartphones, laptops, and desktop computers are expected to include web browsers, email clients, 2D entertainment, and more, VR needs to deliver the best of today’s computing — and make it better.

That’s where 2D panel apps, or window apps, come into play.

Time spent in this category is up more than 60% over the last year. And in VR, it can be even better than a 1:1 translation of traditional computing. Take the old-school browser: it's better on a massive display that you can take with you anywhere. In fact, time spent in Meta's browser over the past year is up 28%, making it one of the most popular apps on the platform.

And Discord is getting in on the action with the upcoming launch of a new native window app on Quest in 2026. People in-headset will be able to stay tapped into that community during their VR exploits and learn about new games just by seeing what their friends are playing — and maybe even challenge them to a slashing sesh in Beat Saber.

This is a massive opportunity for VR developers thanks to Discord’s huge player base. Think about it: Discord is home to a highly engaged community of more than 200 million monthly active players who spend a combined 1.9 billion hours playing games each month across thousands of titles on PC alone. With the launch of the Discord app on Quest, VR devs will have an incredible discoverability engine at their disposal.

People are using window apps in-headset in droves, which creates a new opportunity for devs to build something special. It's now much easier to port Android apps into VR with the new Meta Horizon Android Studio Plugin. There's the Meta Spatial SDK for spatializing your app. And the Spatial Simulator is coming soon to help you test how your apps will work in VR.

And just like James Cameron and Lightstorm Vision are committed to producing a new generation of cinematic content that’s native to VR, Meta said it sees non-gaming entertainment as a huge opportunity that’ll drive a new wave of adoption in the future. (Yesterday, it introduced something called Meta TV).

Native media apps can deliver content people love in a much more compelling format. For example, Blumhouse’s new app adds 3D special effects to enhance the immersive horror experience. Disney+ is also coming to Quest and bringing along content from Hulu and ESPN.

Investment in Mobile + Continued Ecosystem Growth

Meta Connect 2025 took place at the company's HQ in Menlo Park, California. Source: GamesBeat/Dean Takahashi

Making windowed content available alongside immersive options helps Meta grow the ecosystem in-headset, and for that same reason, Meta said it is just as invested in bringing high-quality Horizon content to people on mobile and the web. The more people who can come together in worlds, the better — no matter what device they’re using.

Meta said it has worked hard over the last year to make Meta Horizon on mobile a best-in-class experience, and it has made it easier than ever for creators to promote their worlds on Facebook and Instagram. A world can appear in someone's feed just like any other kind of post, and they can jump straight into it with a single tap.

Thanks to these changes, more people are using Horizon on mobile than ever before. Monthly active users in mobile worlds are up 4x over the last year, and creators have published more than 5,000 new mobile worlds in that time.

As usage grows across VR and mobile, there's a big opportunity for Horizon creators to turn their worlds into real businesses. Meta's $50 million Creator Fund has paid out millions in prizes since it launched in February, and the company is announcing another $2.5 million contest today. In-world purchases are up 280% this year, and new AI tools make it even easier to create unique items to help you monetize.

Whether you're a seasoned dev, a creator getting started, or somewhere in between, Meta recommends following the advice that GOLF+ CEO Ryan Engle shared on stage during today's keynote: Find your niche. Think long term. And know your audience.

One More Thing…

Booting up the Ray-Ban Meta Gen 2 AI glasses. Source: GamesBeat/Dean Takahashi

While VR is and will continue to be home to some of the deepest and most magical experiences that AI can make possible, Meta also knows that glasses will bring personal superintelligence into our lives in a radically different way. With today’s AI glasses, Meta has already seen that your AI assistant levels up when it can see and hear the world from your point of view and start to understand your context. And with the upcoming launch of Meta Ray-Ban Display and the Meta Neural Band, things are about to get even more interesting.

To help the dev community make the most of this new opportunity, Meta is introducing its Wearable Device Access Toolkit in developer preview. It lets your app access the vision and audio capabilities of Meta's AI glasses, which unlocks a level of contextual awareness and real-time information that simply isn't possible otherwise.

And early results are promising. Disney’s Imagineering R&D team is working on early prototypes to see how AI glasses could help give guests access to tips while in their parks. Major streaming services like Twitch will enable creators to livestream straight from their glasses — creators could even reach multiple platforms simultaneously via streaming software partners like Logitech’s Streamlabs. And HumanWare is building an integration that gives blind and low-vision people live guidance as they navigate the world.

It's incredible to live through a paradigm shift as monumental as this one, both with today's VR headsets and tomorrow's AR glasses. And Meta says it isn't just living through the shift; it's building it together with developers, and it can't wait to see what the next year holds.

Disclosure: Meta shared a demo of the Ray-Ban Meta Gen 2 smart glasses with me for review.