Two weeks with Ray Ban Meta smart glasses

Two weeks ago, I bought a pair of sunglasses from Ray Ban. They’re glossy black Wayfarers, exactly as you’d expect. I bet you can picture them: the design is iconic.

My pair is a little different, though. There’s a small computer in the arms – along with speakers, microphones and a front-facing camera.

They’re the Ray Ban Meta smart glasses. At £299, they’re not unreasonably priced. Some Ray Ban configurations are more expensive than this without computers in the arms. So I thought I’d give them a go, to test out how they’d actually fare in day-to-day use. I wanted to answer some questions: how are smart glasses supposed to be used? What do they add to everyday life?

I think I have some answers to those questions today.

So let’s talk about glasses that you can talk to, that can take photos for you and that can play music for you.

Partnership goals

The Ray Ban Meta smart glasses are a clever collaboration. It’s partnership at its finest: each brand is borrowing from the other. Meta, once the creepier-feeling Facebook, are borrowing Ray Ban’s cool. They’re also borrowing Ray Ban’s huge sales and marketing channels: their website, physical stores and channel partnerships like Sunglass Hut. Meta wasn’t starting from a standing start – they had a hardware partner with scale, and in sunglasses there is no bigger partner than Ray Ban.

On the Ray Ban side, they’re borrowing Meta’s smarts. Meta provides the technology here: the camera, microphones and computers-in-the-arms. But Ray Ban are also borrowing integration with the world’s biggest social apps – the glasses work directly with Instagram, WhatsApp and Facebook. Livestream, video call or post your point of view to your story, direct from the glasses.

This is a perfect partnership. Ray Ban gives Meta permission to be on people’s faces in a way Meta couldn’t have earned on its own. And Ray Ban get to integrate with the world’s most popular social apps. Win-win. A case study in brand partnership.

Buying and setting them up was easy

I went to a Ray Ban store in a local shopping centre – there I tried on a few different sizes and styles and had a tutorial from a store employee. When I was ready to buy, I was handed a small, grey box and advised to download the Meta View app.

I found a bench near the shopping centre and opened up the packaging. Inside was an iconic-looking leather case holding my new smart glasses. The case has a USB-C port on the bottom and charges the glasses much like AirPods charge in their tic-tac case.

I opened up the charging case and pulled out the glasses. Inside the box was a QR code: I scanned it and was taken to the App Store to download Meta View. The setup process was super easy – before long, my glasses were connected to my phone, my messages, phone calls and WhatsApp. I connected Apple Music for music controls and Instagram for posting Stories.

Meta AI isn’t available in the UK currently, but I found a way around it (if you’re in the UK and want my workaround, email me at jack at speci dot news) – and set up was complete.

I’m really impressed by how Meta has worked around the limitations iOS places on non-Apple devices. The experience felt polished, premium and guided. There are beautiful, full-screen videos showing how to use the glasses, mainly through depictions of cool people doing even cooler things. I learned that I can press the button on top once to take a photo, press and hold it to start and stop a video, and tap the right arm to play or pause my music. Swiping up or down the right arm turns the volume up or down. Photos and videos are exported to your Photos app automatically if you choose, or manually with a few taps in the app.

Meta really cares about making these glasses easy: there when you expect them, and out of the way – just plain glasses – when you don’t.

They do three cool things

I took my Ray Ban Meta glasses for a two-week spin and I thought they made for nice glasses. This is my main takeaway: these are not a piece of breakthrough technology an early adopter would be thrilled by. No – they are good, well-built glasses that let you do three cool things. Here are those cool things.

Cool thing 1: Camera quality

I was very pleasantly surprised by how good the 12MP camera is. As I swipe through my Photos app, many of the photos taken on the Metas are indistinguishable from photos taken on my phone. The only thing that gives them away is the slightly strange point of view: no one takes a photo from where their eyes are, but that’s exactly what these glasses do.

I found that having a camera on my face meant I took more photos of the things I was seeing, and I was more interested in taking them because there was zero friction in doing so. It sounds silly – but without the need to pull a slab of glass out of my pocket, unlock it, point and shoot, I found myself looking at a scene – a river, a field, an interesting building, my cat – and just tapping the top of my glasses to capture it. These are great-quality photos. The Meta View app even applies some fixes to your photos if they’re not framed perfectly or taken in good lighting. Neat.

Cool thing 2: Podcasts

I wore my glasses a lot while walking between places on my own: from my house to a tube station, or from a tube station to the pub. Usually in these scenarios, I’d pop my AirPods in and listen to music or podcasts. First, I tried music. And while the audio quality was fine, the glasses were no match for my AirPods Pro with Adaptive Audio. They made me realise just how much work my AirPods do to cancel loud noises as I navigate London. Where I can always hear my music through AirPods, I sometimes couldn’t hear it through my glasses.

But podcasts sounded perfect. I loved walking without anything in my ears, with the world’s sounds around me, and the sound of Rory Stewart and Alastair Campbell telling me what’s happened in the world of politics that week.

Cool thing 3: Look, and…

You can look at anything and say “Hey Meta, look and…” and then ask a contextual question. I mainly tested it on notable and non-notable buildings, and it did a very impressive job of identifying them. I asked it to look at the Abingdon County Hall, for example – a locally famous building in my hometown – and it knew exactly what it was. I then walked up to a nondescript council building and asked it the same question. Astoundingly, it knew that building too.

I can imagine this being great whilst travelling. Apparently you can also ask it to translate food menus – which I haven’t tested yet, but I imagine is one of the best use cases for it.

The AI isn't there yet

One of the most interesting things for me was having a powerful, conversational AI next to my ears, with the ability to see what I could see. I was excited to try it.

But as I tried it out, Meta AI felt pretty curt. I’m used to AI chatbots going on for too long – ask ChatGPT anything and it’ll wax lyrical for what feels like days. Yet Meta AI in my glasses was the opposite: too brief. I’d ask it to tell me about the historic building in front of me, or to have a conversation with me to prep for a meeting, or to help me brainstorm ideas for a party. And in each of these interactions, where ChatGPT would go on for too long, Meta AI gave me a sentence or two.

This feels like a deliberate product choice: like Meta knows that customers want short, succinct answers when they speak to it. But at times it felt rude, and I felt bad! My instinct was almost that I’d upset the AI and needed to apologise – which I realise sounds a little insane. I expect Meta will play with this over time and create better conversational experiences. Right now, though, it doesn’t feel natural at all.

The thing that made it worse, I think, is that the voices sound natural. Way better than anything I’ve heard anywhere else. It sounded like I really was talking to someone. So when the AI was short with me, it stung all the more.

This tells us something about AI in 2024

Conventional commentary tells us that AI is still waiting for its perfect form factor, much as mobile technology was before the iPhone.

And I don’t think these glasses are it. I don’t think Meta does either. But what I love about these glasses is that Meta is trying. They’re putting things out there and seeing what sticks. And while these glasses might not be the perfect AI device, they allow Meta to ship and learn and test and build.

They clearly had some hypotheses: that partnering with Ray Ban would be a way in, for one. That one has been validated. Meta AI in your ears is another hypothesis to test. And while it’s interesting, they haven’t landed it yet.

In the meantime, these are good glasses. They’re really good glasses. They’re better than any glasses I’ve ever owned. They help me capture more moments whilst staying in the moment. I can listen to podcasts without AirPods. That’s cool. And next time I’m on holiday, I’ll use them to translate the menu. And they protect my eyes in the sun!

So that’s what these glasses are. Great glasses. And if you want good glasses with more to them, you should buy them. They do some of the things your phone can do, without the phone. And that is very cool.

But they’re not a breakthrough AI device. They’re not going to change the world. And that’s okay. They don’t need to.