
Meta's AI glasses now hear you better and play Spotify by looking


Why it matters: This update to Meta's AI glasses helps wearers follow conversations in noisy settings and pull up music on the fly, improving social connection and personal enjoyment for people who use the technology.

Meta's Ray-Ban and Oakley smart glasses just got an update that makes them slightly less like wearing a camera and slightly more like having a smarter pair of ears.

The v21 software update rolls out two features this week: one that amplifies the voice of whoever you're talking to when you're in a noisy room, and another that plays Spotify music based on what you're looking at. Neither is revolutionary, but both solve small, real problems that people with glasses actually face.

Hearing conversations without the noise

You know that moment at a crowded café or holiday party when someone's talking to you but you can barely hear them over the background chatter? Conversation Focus is designed for exactly that. The glasses' open-ear speakers amplify the voice of the person in front of you while leaving the ambient noise at its normal volume—think of it like a directional microphone pointed at their face.


You adjust the amplification by swiping the right temple of the glasses or through your phone settings. It's rolling out now to people in Meta's Early Access Program in the US and Canada, which means it's still being tested before a wider release. The feature was announced at Meta's Connect conference earlier this year, so this is the first time it's actually hitting people's faces.

Music that matches what you see

The Spotify integration is the more playful addition. Point your glasses at an album cover, a snowy landscape, a holiday decoration—anything visual—and say "Hey Meta, play a song to match this view." The glasses' camera analyzes what you're looking at, Spotify's algorithm considers your taste, and a playlist appears.

It's the kind of feature that sounds gimmicky until you imagine using it: standing in a record store and wanting the vibe of an album without committing to a full listen, or walking into a friend's decorated living room and wanting music that fits the mood. The integration is available in English across 19 countries, including the US, UK, Canada, Australia, and much of Western Europe.

These aren't the kind of updates that make headlines on their own. But they point to where smart glasses might actually become useful rather than just novel—solving the friction points of real conversations and real moments, one small feature at a time.


Originally reported by Meta Newsroom · Verified by Brightcast
