Mentra has raised $8 million and launched MentraOS 2.0, an open-source operating system and app store for smart glasses.
MentraOS 2.0 is a cloud-native, cross-device platform that finally gives smart glasses the software layer smartphones have had for over a decade. Deaf and hard-of-hearing users already wear it more than eight hours a day, and it ships with built-in apps for live captions, translation, notifications and a proactive AI assistant, alongside an app store for third-party apps.
Investors include Rich Miner (co-founder of Android), Jawed Karim (co-founder of YouTube), Eric Migicovsky (founder of Pebble), Paul Graham, Y Combinator, Toyota Ventures, the Amazon Alexa Fund, Alan Rutledge, TIRTA Ventures, and Hartmann Capital.
Users can now install new apps on their glasses, and developers can write a single app that runs on any pair of smart glasses. To enable proactive AI that runs in the background, MentraOS allows multiple apps to access context at the same time.
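The multi-app context model can be pictured as a shared event bus: the OS publishes each context event (say, a speech transcription) once, and every subscribed app receives it simultaneously. The sketch below is purely illustrative; `ContextBus`, the stream names, and all signatures are hypothetical and not the real MentraOS SDK.

```typescript
// Hypothetical sketch of shared context access: several glasses apps
// subscribe to the same stream, and one published event fans out to all
// of them. None of these names come from the actual MentraOS SDK.

type ContextEvent = { stream: string; payload: string };
type Listener = (e: ContextEvent) => void;

class ContextBus {
  private listeners = new Map<string, Listener[]>();

  // An app subscribes to a named context stream, e.g. "transcription".
  subscribe(stream: string, fn: Listener): void {
    const list = this.listeners.get(stream) ?? [];
    list.push(fn);
    this.listeners.set(stream, list);
  }

  // The OS publishes one event; every subscriber on that stream gets it.
  publish(e: ContextEvent): void {
    for (const fn of this.listeners.get(e.stream) ?? []) fn(e);
  }
}

const bus = new ContextBus();
const received: string[] = [];

// Two apps -- live captions and a proactive assistant -- share one stream.
bus.subscribe("transcription", (e) => received.push(`captions: ${e.payload}`));
bus.subscribe("transcription", (e) => received.push(`assistant: ${e.payload}`));

bus.publish({ stream: "transcription", payload: "hello world" });
console.log(received.length); // prints 2
```

The design point is that context is not exclusively owned by the foreground app, which is what lets a background assistant and a captions app run off the same microphone feed at once.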

The company said this marks a pivotal step toward a software ecosystem for lightweight XR devices rooted in practicality, not hype.
“Eight hundred years ago, glasses were invented,” said Cayden Pierce, Mentra CEO (and an MIT dropout), in a statement. “But they didn’t go mainstream until the 1920s — when they dropped below 40 grams and became wearable all day. It’s now the 2020s. Smart glasses are finally under 40 grams. The hardware is ready. The AI is here. But there’s still no OS. That’s what we’re building. MentraOS is like the Android for smart glasses.”
Pierce added, “Seven years ago, I read a study where students with subtitles learned better than those without. That clicked for me: subtitles are intelligence-extending. I built a captions app for smart glasses and realized: there’s no framework to do this. That moment started everything. MentraOS is the result.”
Pierce said that MentraOS upgrades the phone by giving it a new interface — your face. Suddenly your phone can take POV photos, capture thoughts as they’re spoken, answer calls without being touched, and feed real-time info into your view.
“The glasses rely on the phone for connectivity and light compute, while most of the intelligence runs in the cloud. MentraOS doesn’t replace the phone for now,” Pierce said. “It extends it and turns the phone into a real-time hub for AI and communication that stays in your pocket while you interact through your glasses.”
MentraOS runs on multiple hardware platforms, including the Even Realities G1, Vuzix Z100, and Mentra’s own hardware. The company is preparing to ship Mentra Live — camera, mic, speaker, and open SDK — this fall, with more devices coming in 2026.
“Our tools aren’t separate from us — they extend us,” said Pierce. “Just like the neocortex extended the animal brain, technology extends the human mind. That’s what Mentra is about. We’re building the infrastructure for the next personal computer.”
Pierce and Alexander Israelov started Mentra in 2024 in San Francisco and Shenzhen. The company has 10 people and is hiring across engineering, operations and growth. Prior to the $8 million round, the company had raised $75,000.
Origins

In a message to GamesBeat, Pierce said, “I cofounded Mentra as the best way to ensure that human-computer interaction (HCI) advances fast enough to keep up with AI, and to ensure that the next personal computer is open and user-controlled for everyone.”
In 2018, Pierce said, he was working out in his undergraduate university’s gym, jotting down an idea on his phone, when he was struck with a realization.
“I realized that technology is really an extension of our minds, not something separate from ourselves,” Pierce said. “Just like our neocortex extended the animal brain, so are we now extending our neocortex with our technological exocortex. I looked at my phone, the current state-of-the-art mind-extension technology, and realized we’d move the display to the eyes, the speakers to ears, the sensors to capture our world.”
He added, “That day I became an interface engineer and set out to build the next interface. But at the time, I was well aware that the hardware, AI, and software weren’t ready.”
Over the next couple of years, he built custom smart glasses hardware and applications while at school. There he met Steve Mann, the inventor of smart glasses and the smartwatch, at the University of Toronto in Canada. He worked on brain-sensing smart glasses at a startup, and from 2022 to 2023 he went on a cross-country road trip building the OpenSourceSmartGlasses.
During 2023 and 2024, AI matured thanks to OpenAI and the efforts of others.

“I realized it was time to take the interface and combine it with advancements in LLMs to allow for proactive AI agents on smart glasses. So I went to study at the MIT Media Lab under Pattie Maes, a pioneer of proactive AI agents,” Pierce said.
By December 2024, the hardware was ready.
“I used Even Realities G1s with an early beta version of MentraOS, and I saw that the hardware was ready,” he said. “So only the software was left. One day, I decided I’d have to drop out to make this happen.”
He and his cofounder Israelov applied to Y Combinator. They were accepted within 36 hours; Pierce dropped out of MIT, moved out of his house, shipped a dozen beta units and then flew to China.
“Been working on Mentra every moment since,” Pierce said. “Why do I care? Because smart glasses are inevitable, and they’re a technology that has the ability to literally control what you see and what you hear. We want that to be open and user-controlled.”
He added, “I also see a beautiful future where, instead of AI taking off and becoming something else, it instead extends our capabilities. Interfaces like smart glasses will enable us to increase our bandwidth with AI, and eventually merge with AI.”