This is my (Meta) Quest, to follow that star…

2.4

Brief one about VR and explaining tech to people.


🌟 Feature

So I did something crazy and got a Meta Quest 3 VR headset. I’ve had it for about a week and a half, and I have some thoughts.

This thing is cool as hell

Okay. In short, this thing rules. The technology is very, very cool. The controller and hand tracking are unreal. When it works, it truly feels like entering another domain overlaid upon our reality. I downloaded one app where you connect a piano keyboard via MIDI to the headset, and then you can view a sort of Guitar Hero-esque animated scroll of notes descending onto the piano to play along to. That is: onto the real world piano. It overlays the interface on top of your actual piano and helps you play. It literally feels like magic.

When I was about 13 years old there was this insanely buggy augmented reality game you could get on the iPhone. You’d have to print out this piece of paper and then the game would use your camera to overlay a sort of battle on top of the paper. I legitimately could not believe it at the time. And that was almost 15 years ago. Now? To so casually do some of the VR/AR stuff that this device can do is unreal.

The “passthrough” still isn’t quite there

A key facet of the success of these headsets is their ability to display the world outside the goggles to the user inside them. “Passthrough” is the term typically used for a feature in which cameras on the outside of the device display real-time visual information to the user inside. It’s extraordinarily difficult to get right, because, frankly, nothing is faster at processing visual information from your surroundings than your high resolution eyeballs that are hardwired to your brain. Your brain is highly efficient at this, lol. Having high resolution cameras display as close to real time as possible while ALSO rendering other user interfaces like menus or games is a difficult processing task.

All this is to say that the passthrough on the Quest 3 is very good, but still kinda wobbly and weird. It’s not quite as high fidelity as I would like, but it does a good enough job for things like finding the controllers, staying away from furniture, and knowing where Mabel is. I would not use it, as some people I have seen on TikTok do, to cook or clean around the house.

This is not the future

There’s been a lot of marketing and billions of dollars of investment in this idea that VR/AR is the future. In fact the device that most directly inspired this newsletter is Apple’s Vision Pro.

I see, now, why people like Mark Zuckerberg are so convinced this technology won’t actually be isolating and will instead offer new avenues for presence and community with others. And I see how they figure that with more investment, higher fidelity cameras/graphics, and faster processors that use less power, wearing a device like this, even scaled way down, won’t feel as goofy.

But I continue to feel that these devices will not “replace our phones” or become our primary entertainment devices. So long as our lives are intertwined with other actual, flesh and bone human beings in our midst, which I anticipate will be quite a while, people will want devices that make clear not only to the user but also to OTHER people when they are in use. I think “little thing you take in and out of your pocket,” whether it’s glasses or not, will remain the prevailing mode of our connection to the internet, and thus to each other.

If you’ve tried one of these VR things, I would love to hear about your experience. For now, I’m just gonna play Beat Saber to Lady Gaga songs.


📚 Reading list

I love a good engineering vid on YouTube. Sue me.


⚡️ Lightning

Nothing this week.


📕 Glossary

Nothing this week.


📱 Home Screen

Nothing this week.


☎️ Answers

Rosie brought up a great question to me this week that went something like this:

How do I explain to my father the breadth of data surveillance? How do I explain weird tech things to people in my dad’s generation in general?

This kinda gets at one of the missions of this newsletter. I would love to kind of serve as a “consumer tech educator.”

For some context, Rosie mentioned how her dad had received an ad for an airport, and had asked her how the ad might have shown up on his screen. Was it the GPS data that showed that he was around the airport? Was it the search in Google Maps to pull up the route to the airport? Was it emails or texts that he exchanged about the airport?

The short answer here is just a simple “yes.” It is all of these things, all of the time. The entire structure of the current internet is built on these sorts of so-called-free transactions, where you get to email and search for directions for free, but then the companies get to sell the fact that you did that to advertisers.

But it gets at this deeper question. How do we explain to people, of all ages, these sort of massive systems underlying our every tap, swipe, and click?

The movie Everything Everywhere All at Once is actually kind of about this. And it took Mariah pointing it out to me to make that clear. If you haven’t seen the movie, the title alone does enough for me to make this point.

An inherent tension in our modern American existence is our ability to have everything we could ever need or ever know mere taps away on these little rectangles we take with us everywhere. All at once, we can be talking with family, buying a smoothie, reading the Wikipedia page for Paul Simon, and listening to a podcast. This access to literally infinite troves of data sedates us by overwhelming us with choice. There’s an argument to be made that the human brain literally was not meant to deal with as much information as it receives in the way it currently receives it, and it can result in burnout, fatigue, and overall despair. How do we exist in a universe in which it can so often feel like all of us are in our own dimensions, absorbing our own truths, responding to our own reality?

I think the simplest answer I have here is to genuinely and sincerely answer questions like “how does this work?” I have found myself often veering into editorializing in my explanations of the technology around us (what do you think this newsletter is, lol). I describe things as “bad” or “good” in the midst of laying out the basics, I extend metaphors that don’t adequately encapsulate the issue, I talk down to people who don’t know as much as I do about these massive systems that surround us every single day. The most success I’ve ever had in answering questions about the vast, overwhelming reach of our technology today has come when I simply explained, in logical steps, how things work.

The average person knows extremely little about how all of this works. Kids and older people in particular, who have not grown up as completely immersed in these things as people around my age, need the compassion and sincerity of clear, concise answers.

And if I’m being completely honest with you, if you are struggling with how to explain legitimately anything to anyone anywhere, ask ChatGPT to explain the thing to you like an 8th grader, and just parrot that. And then ask ChatGPT to explain to you how ChatGPT explains things to you like an 8th grader.


I know this was kind of a weirdly balanced one today. Hope you enjoyed.
