See that little circle? That’s a camera. | Photo by Vjeran Pavic / The Verge
All around Meta’s Menlo Park campus, cameras stared at me. I’m not talking about security cameras or my fellow reporters’ DSLRs. I’m not even talking about smartphones. I mean Ray-Ban and Meta’s smart glasses, which Meta hopes we’ll all — one day, in some form — wear.
I visited Meta for this year’s Connect conference, where just about every hardware product involved cameras. They’re on the Ray-Ban Meta smart glasses that got a software update, the new Quest 3S virtual reality headset, and Meta’s prototype Orion AR glasses. Orion is what Meta calls a “time machine”: a functioning example of what full-fledged AR could look like, years before it will be consumer-ready.
But on Meta’s campus, at least, the Ray-Bans were already everywhere. It...
Pinhole cameras small enough to hide in glasses frames (and the like) have been around for a decent while. The real change is the continuing shrinkage of computing power, meaning footage can be stored and processed by the glasses themselves rather than streamed to another device over radio or pulled off a removable microSD card. The other change is the increase in video quality now possible with ever-smaller lenses; there's a rough sketch of that on-device shift below.
Edit: basically, in terms of video gathering, intelligence agencies have been able to do this for generations. As far as they're concerned, this really only amounts to improvements in on-board compute, plus a potentially useful shift in public opinion about these devices (so they can use off-the-shelf stuff instead of "spy" stuff).
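A rough way to picture the shift described above: older covert cameras acted as dumb sensors, shipping footage off-device (or onto a removable card) for storage and processing, while current glasses can run that loop on-board. The Python below is purely illustrative; `capture_frame`, `encode`, `store_local`, and `offload_over_radio` are hypothetical stand-ins, not any real glasses SDK.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    """A single captured image, stood in for by raw bytes."""
    data: bytes
    timestamp_ms: int

# --- hypothetical stand-ins; no real hardware API is assumed ---

def capture_frame(t_ms: int) -> Frame:
    """Pretend sensor readout: returns a dummy 1 KB frame."""
    return Frame(data=b"\x00" * 1024, timestamp_ms=t_ms)

def encode(frame: Frame) -> bytes:
    """Placeholder for on-board video compression."""
    return frame.data  # real hardware would encode/compress here

def offload_over_radio(payload: bytes) -> None:
    """Old model: push every frame to a paired device for processing."""
    print(f"radio tx: {len(payload)} bytes")

def store_local(payload: bytes, clips: List[bytes]) -> None:
    """New model: keep the encoded frame in on-board storage."""
    clips.append(payload)

def record(seconds: int, on_device: bool = True) -> List[bytes]:
    """Capture loop: process and store locally, or offload each frame."""
    clips: List[bytes] = []
    for t_ms in range(0, seconds * 1000, 33):  # roughly 30 fps
        frame = capture_frame(t_ms)
        payload = encode(frame)
        if on_device:
            store_local(payload, clips)
        else:
            offload_over_radio(payload)
    return clips

if __name__ == "__main__":
    stored = record(seconds=1, on_device=True)
    print(f"kept {len(stored)} encoded frames on-device")
```

The point of the toggle is the architecture, not the code: once the `on_device` branch is cheap enough to run in the frame budget, the radio link and the removable card stop being requirements and become options.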