JavaScript DIY Smart Glasses Progress

So I’ve been working on a tool to help people with disabilities browse the web hands-free. Here’s a quick history of the project (links to code, prototypes, etc. in the description if you’re interested…it’s all still buggy, so maybe just watch :joy:):

Because I’m extremely low income, I walk everywhere. A lot. I live on the outskirts of Portland, OR which is a very eccentric city and I’m always running into oddities (people, places, things). I recently had a thought: “instead of pointing the camera at you, what if it was pointed the other way?”

The idea is that when the camera is pointed at you, you’re web browsing - in other words, you’re getting information from the internet. But if the camera is pointed away from you, then you can use the internet to get information about your environment.

So I used tools I had in my apartment to hack a $13 WiFi drone:

And I just managed to get the glasses working! (excuse the poor lighting)

My next step is to use TensorFlow.js to do basic object classification. There’s no screen (it’s basically just a WiFi camera), but because it’s connected to your device, you can get audio feedback through your headphones. Everything above (minus the actual stream) is powered by JavaScript through a web browser.
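To make that classification step concrete, here’s a rough sketch of what I have in mind, assuming the glasses’ stream is playing in a regular `<video>` element on the page (the `camera` id is made up). It uses the off-the-shelf MobileNet model from `@tensorflow-models/mobilenet` and the browser’s built-in Web Speech API to dictate the top prediction:

```js
import '@tensorflow/tfjs';
import * as mobilenet from '@tensorflow-models/mobilenet';

// Assumes the glasses' WiFi stream is playing in a <video id="camera"> element
const video = document.getElementById('camera');

async function describeScene() {
  const model = await mobilenet.load(); // off-the-shelf classifier, no training needed

  setInterval(async () => {
    const predictions = await model.classify(video);
    const top = predictions[0];
    if (top && top.probability > 0.6) {
      // Dictate the top prediction through the user's headphones
      speechSynthesis.speak(new SpeechSynthesisUtterance(top.className));
    }
  }, 3000); // classify every few seconds so the audio isn't overwhelming
}

describeScene();
```

The interval and confidence threshold are just guesses for now; the real version will need tuning so it doesn’t talk over itself.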

I recently got to chat with the founder of the “Seeing AI” app (he now heads Aira, a smart glasses company for the blind) and I’ve been getting flooded with ideas. Ultimately, this is what I’m building (imagine you use these glasses instead of the phone):

The total cost of materials so far has been $13, but that’s because I ripped apart a drone. Sourcing the materials myself, I should be able to put together a DIY parts list for people. And by attaching the components to a sleeve on the glasses arm, you can retrofit any pair of glasses for less than the cost of a Starbucks coffee (minus shipping)!

These don’t actually have a display; the idea is that you’d wear headphones and it would dictate to you, but you could also sign in front of your face to activate commands - sort of like this demo (also done in JavaScript):

Anyways, I’ll update again soon once I actually get the computer vision working on it! Let me know what you think, and what use cases you can imagine for it!


Check it out! I’m using Google’s Teachable Machine code to translate my hand gestures into emoji (no actual machine learning code is required; you just press buttons to train it). I think I might be able to translate sign language in real time with enough work!
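For anyone curious, the glue code is tiny. Here’s a minimal sketch using the real `@teachablemachine/image` library (`tmImage`), assuming it’s loaded from the script tag that Teachable Machine’s export page gives you; the model URL is a placeholder for whatever you export, and the gesture class names are made up:

```js
// Placeholder for the URL Teachable Machine gives you when you export a model
const MODEL_URL = 'https://teachablemachine.withgoogle.com/models/XXXX/';

// Hypothetical class names from training, mapped to the emoji they trigger
const GESTURE_TO_EMOJI = { thumbs_up: '👍', peace: '✌️', open_hand: '✋' };

async function run() {
  const model = await tmImage.load(MODEL_URL + 'model.json', MODEL_URL + 'metadata.json');
  const webcam = new tmImage.Webcam(224, 224, true); // width, height, flip
  await webcam.setup();
  await webcam.play();
  document.body.appendChild(webcam.canvas);

  (async function loop() {
    webcam.update(); // grab the latest frame
    const predictions = await model.predict(webcam.canvas);
    const best = predictions.reduce((a, b) => (a.probability > b.probability ? a : b));
    if (best.probability > 0.8 && GESTURE_TO_EMOJI[best.className]) {
      console.log(GESTURE_TO_EMOJI[best.className]); // show the matched emoji
    }
    window.requestAnimationFrame(loop);
  })();
}

run();
```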

For it to be useful, I’d need earbuds or something, and instead of emoji it would dictate what it’s seeing, since the camera just sees what’s to my left (what I actually see in real life).
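Swapping the emoji for dictation should just be the Web Speech API again - something like this in place of the `console.log` in the sketch above:

```js
// Speak the detected class name instead of printing an emoji
speechSynthesis.speak(new SpeechSynthesisUtterance(best.className));
```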

I might do one more update tomorrow and then switch back to my other project, because I can already see myself going down crazy rabbit holes.

