GitHub: private
Goal: Build open source smart glasses with a plugin framework. A HUD for everyone, not Google's $1,500 beta program for Silicon Valley insiders. Your Glass. urGlass.

The idea had been in my head for years. The vision: a tiny OLED screen and a mirror element that projects text onto real glass lenses, a miniature heads-up display like the HUD you know from cars, but mounted on your face. "Incoming call." "Google Maps: turn left in 200m." "WhatsApp: Mom says hi."

But the real innovation wasn't the hardware. It was the software architecture. I wanted to build an open source framework on GitHub where anyone could write their own extensions. A connectivity layer, a standard protocol, so that today you get phone notifications, and tomorrow some guy codes a plugin for his smart doorbell that goes "ding dong" on your glasses. Basically: MCP before MCP existed. A universal connectivity standard for wearable displays. In 2020. (For the uninitiated: MCP is Anthropic's Model Context Protocol, published in 2024, which does exactly this in a different domain: a standard for connecting AI models to external tools and data sources. Same idea. I was four years early and in the wrong industry.)

The deeper motivation: fewer phones, more reality enhancement. Stop pulling a glass rectangle out of your pocket 200 times a day. Instead, overlay relevant information directly onto your field of vision. Enhance reality rather than replacing it with a 6-inch screen.

This is the same obsession that shows up across multiple projects:
- urGlass: augment vision with information
- Real-O-Tron: augment sensation with hardware
- Realify: strip away illusions, reveal actual reality
- ChordKiller: augment audio beyond what commercial DACs deliver

Different inputs, different outputs, same drive: make reality better than the default settings.
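The framework never got past the idea stage, but a minimal sketch of that plugin contract might have looked like this. C++ to match the Arduino side; every name here is hypothetical, nothing below is from the actual repo:

```cpp
#include <functional>
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch of the urGlass plugin contract: any event
// source (phone, smart doorbell, whatever) publishes short
// notifications onto a shared bus, and the glasses-side driver
// subscribes and renders them. Three fields, one protocol.
struct Notification {
    std::string source;  // e.g. "whatsapp", "doorbell"
    std::string title;   // first line on the HUD
    std::string body;    // remaining lines
};

class NotificationBus {
public:
    using Handler = std::function<void(const Notification&)>;

    // The display driver (or anything else) registers a callback.
    void subscribe(Handler h) { handlers_.push_back(std::move(h)); }

    // A plugin pushes an event; every subscriber gets a copy.
    void publish(const Notification& n) const {
        for (const auto& h : handlers_) h(n);
    }

private:
    std::vector<Handler> handlers_;
};
```

In a real build the subscriber would serialize those three fields and send them over BLE to the glasses instead of keeping them in memory. The point is only that a doorbell plugin and a WhatsApp plugin speak the same tiny protocol, so the display side never needs to know who is talking.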
Stack:
- Arduino (Adafruit ecosystem: Flora, Bluefruit LE, NeoPixels)
- SSD1306 OLED display (128x64 pixels)
- BLE (Bluetooth Low Energy) for phone-to-glasses communication
- iOS / Swift (Adafruit Bluefruit app as foundation)
- Mini mirror/projector optics for HUD effect
What happened:
Started with the projection technology. Got the Adafruit PCB,
the OLED, the BLE module. Experimented with iOS-to-BLE
communication — that part was genuinely cool. Built a proof of
concept that could receive text over Bluetooth and display it
on the tiny OLED. The test message was a simulated incoming call:
"Eingehender Anruf
-----------------
Andre Rickmeyer
Status: Klingelt..."
(German: "Incoming call / Andre Rickmeyer / Status: Ringing...")
It worked. Text from phone to glasses via BLE. The basic
pipeline was proven.
Then I gave it up. Not because the tech didn't work — it did.
But building physical hardware products is a different beast
than building software. The optics (getting the OLED image to
project cleanly onto glass), the form factor (making it look
like glasses, not a science experiment), the power management
(batteries small enough to fit into an arm piece) — these are
industrial design problems, not software problems. And I'm a
software guy who occasionally does hardware, not the other way
around.
The Adafruit PCB is still here. Somewhere. In a drawer. With
the other dreams that needed a mechanical engineer.
Learnings:
- The idea was right. The timing was right. Meta released
their Ray-Ban smart glasses in 2023. The market validated
the concept. I was three years early with a prototype and
no supply chain.
- Open source hardware frameworks need a community, not just
a vision. One person can code a plugin system. One person
cannot manufacture lenses, design hinges, and figure out
where to put a 100mAh battery.
- BLE communication between iOS and Arduino is surprisingly
elegant. The Adafruit Bluefruit ecosystem is genuinely well
designed. Learned a lot about embedded BLE in one day.
- The "reality enhancement" pattern in my work isn't random.
It's a consistent thread: augment what's real, don't replace
it. Glasses, not goggles. HUD, not VR. Sound quality, not
sound effects. This is apparently who I am.
Timeline:
- 2020-08-07: One day. Four commits. Arduino + OLED + BLE +
iOS proof of concept. Incoming call notification displayed
on a 128x64 pixel screen via Bluetooth. Then: drawer.
Status: Hibernating. The PCB is in a drawer. The idea is in the
zeitgeist. Meta, Apple, and Google are spending billions on what
I prototyped in a day with an Adafruit board and spite.