Meta announced the prototype of its AR glasses, called Orion, today. It's a pretty impressive device. Compared to everything else with similar capabilities, Orion is a breakthrough. Like the Apple Vision Pro, it lets the user place windows such as browsers, chats, and video calls into the 3D space around them and interact with them through hand gestures. Compared to the Apple Vision Pro, it's pretty slick, since the glasses are relatively small, lightweight, and see-through. They're a prototype and prohibitively expensive, so Meta won't be launching them commercially.
The problem is, this isn't the direction I'd take AR glasses. See, I don't want to replace my computer. My laptop works fine, and my desktop has three monitors and far more computing power and RAM than will ever fit in something I can wear in the next decade or two. What I really want is an Apple Watch I can wear in my field of view at all times.
Hear me out. While I have an Apple Watch Ultra, I rarely wear it; every time I do, I find the notifications bother me more than the watch actually helps, except when I'm exercising. I also still have to look down at it, just like my phone, and I don't find it significantly easier. But what device do I use far more frequently during the day than even my laptop? My iPhone. For countless reasons, but probably the top ones are:
Reading and replying to messages
Reading email
Making calls
Getting directions, whether walking or driving
Calling an Uber or making an Uber Eats or Instacart order
Checking my calendar and getting reminders and notifications
I'd say those activities comprise over 50% of my daily iPhone usage, or at least of what prompts me to pick it up. And the thing is, all of them fit very well into a relatively simple pair of glasses. I would love to:
See text/iMessages as they come in and reply with my voice
See and read emails without having to pick up my phone
Make a call effortlessly (to an extent AirPods already allow this)
See a map with directions on it at all times, rather than constantly having to turn my phone back on and look at it
See where my Uber or delivery order is
Quickly sneak a peek at my calendar or get a nice notification to join a meeting
Now, the iPhone already handles these pretty well, and in theory so does an Apple Watch, but imagine having a heads-up display, like many cars have, right in front of you at all times. There's so much more it could do, too. Perhaps my favorite concept, within everyone's comfort with privacy, is reminding me of people's names with AR overlays at networking events. It could even pop up some information about them, like where they work and what their specialty is. I'm terrible at remembering names, so this would be a game changer for me. I've always wanted something I could reference like a personal CRM so I could communicate better and more efficiently. I mean, the only thing I use Facebook for anymore is birthday reminders. Imagine if those popped up over someone's head.
But I digress. Again, I don't want a device for sitting on my couch and computing in the air. I want something that moves with me everywhere, keeps me from constantly reaching for my phone, and interacts with the world. The best part of Orion's tech demo was actually when it looked at a bunch of ingredients laid out on a table. The presenter asked Meta's AI to come up with a smoothie he could make with them, and Orion correctly identified the items, put little tags above them, and then gave a recipe with step-by-step instructions and nutrition facts.
I want more of that and less of a browser in my face.
Either way, the hardware looks to be about 5–10 years out. But I sincerely hope we redirect AR toward actually augmenting the world and our connectivity instead of trying to make laptops in the sky.