
This is a fantastic demo. I've tried to build this type of prototype for all of the Meta headsets in the past, but their very limited APIs/SDKs block you from doing anything meaningful with computer vision. They are too scared of devs getting access to the camera.

I hope the Apple Vision Pro gives developers a more robust API and that this forces Meta to open up.

I will use this approach for a PoC I have in mind. Great job, and thank you for open-sourcing!



> I hope the Apple Vision Pro gives developers a more robust API

I was under the impression that Apple is keeping camera access completely locked down?


Third-party apps can access a single composite "front camera", but only if a "spatial persona" has been set up on the device.

https://developer.apple.com/videos/play/wwdc2023/10094/?time...


I did recently try to reverse engineer the connection Instagram makes to the glasses in order to livestream through them.

However, I'm not too familiar with trawling through decompiled APKs (I also presume there's some sort of internal secret they're using).

No worries!


That’s been on my endless to-do list since I got the first version of the View glasses from FB/Meta ages back.


No jailbreak yet.



