
sleepyguy
My elderly mother-in-law is slowly going blind. She relies on Meta glasses to read print on everything — from the back of a can to the mail. She also uses them to help locate items around the house, whether it’s something on the counter or in the living room.

I’ve tried the glasses myself, and I’m convinced that wearable eyewear like this will eventually replace the mobile phone. With ongoing advances in miniaturization, it’s only a matter of time before AR and VR are fully integrated into everyday wearables.


Janicc
I believe it's going to replace smartphones like smartphones replaced computers or more specifically laptops.
const_cast
I doubt it; these devices have a serious user-input problem. The cornerstone of computers is human-computer interaction. That's what makes these pieces of silicon useful. They're tools for humans, meaning it doesn't matter if the tool is better if it can't be used as easily.

Smartphones were a step back in a lot of ways. Typing is slower. No mouse. Fingers are fat and imprecise. The result is most applications were severely dumbed down to work on a smartphone.

The trade-off was portability. Everyone can carry a smartphone, so it's okay that the human interaction is worse in a lot of ways. Then, when we need that richer interaction, we can reach for a laptop.

The problem with smart glasses is they go even a step further in how poor the interaction is. Speech is perhaps the worst interface for computers. Yes, it's neat and shows up in sci-fi all the time. But if you think about it, it's a very bad interface. It's slow, it's imprecise, it's wishy-washy, it's context-dependent. Imagine, for example, trying to navigate your emails by speech only. Disaster.

Smart glasses, however, are not more portable than phones, or not by much. Everyone already has a phone. So what do we gain from smart glasses? IMO, not very much. Smart glasses may become popular, but will they replace the smartphone? In my opinion, fat chance.

What I think is more likely, actually, is smartphones replacing smart glasses. They already have cameras, so the capabilities are about the same, except smartphones can do WAY more. For most people, I imagine, the occasional "look at this thing and tell me about it" use case can be satisfied by a smartphone.

MailleQuiMaille
> The result is most applications were severely dumbed down to work on a smartphone.

Good point, and it could be argued the users soon followed that dumbification, with the youngest generations not even understanding the file/folder analogy.

I think we can go dumber! Why need an analogy at all? It will all be there, up in your face, and you can just talk to it!

goda90
Voice is slow, but it can be sped up with vocal macros. One syllable/non-word noise commands.

There are also touch pads on the sides of smart glasses as another input option. And I could imagine some people liking little trackball-esque handheld controllers (like from the Black Mirror episode "The Entire History of You").

And there's also air gestures using cameras on the smart glasses to watch what your hands are doing.

I don't think any of these has the raw data input bandwidth that a keyboard has, and for a lot of use cases even a touchscreen could be better. But maybe that can be made up by the hands-free, augmented reality features of smart glasses.
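The one-syllable command idea above can be sketched in a few lines. Everything here is hypothetical: it assumes a speech recognizer has already turned audio into short tokens, and the macro names and actions are invented for illustration.

```python
# Sketch of "vocal macros": short, distinct utterances mapped to commands.
# Assumes an upstream recognizer has already produced these tokens;
# the token names and command names are made up for this example.

MACROS = {
    "tap": "select",
    "nex": "next_item",
    "bak": "previous_item",
    "zup": "scroll_up",
    "zed": "scroll_down",
}

def dispatch(tokens):
    """Translate a stream of recognized syllables into command names,
    ignoring anything that isn't a known macro."""
    return [MACROS[t] for t in tokens if t in MACROS]

print(dispatch(["tap", "nex", "umm", "bak"]))
# -> ['select', 'next_item', 'previous_item']
```

A real system would also need to debounce repeated tokens and fall back to full dictation for free-form text.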

itsdrewmiller
Eye tracking is a UI in its infancy but should be as fast as manual manipulation. Either form factor could use it but glasses are more motivated to figure it out. Headwear is also well situated for neural interfaces.
imiric
> Smartphones were a step back in a lot of ways.

I was among the nerds who swore I'd never use a touch keyboard, and I refused to buy a smartphone without a physical keyboard until 2011. Yes, typing on a screen was awful at first. But then text prediction and haptics got better, and we invented swipe keyboards. Today I'm nearly as fast and comfortable on a touch keyboard as I am on a physical one on a "real" computer.

My point is that input devices get better. We know when something can be improved, and we invent better ways of interacting with a computer.

If you think that we can't improve voice input to the point where it feels quicker, more natural and comfortable to use than a keyboard, you'd be mistaken. We're still in very early stages of this wave of XR devices.

In the past couple of years alone, text-to-speech and speech recognition systems have improved drastically. Today it's possible to hold a nearly natural sounding conversation with AI. Where do you think we'll be 10 years from now?

> Imagine, for example, trying to navigate your emails by speech only. Disaster.

That's because you're imagining navigating a list on a traditional 2D display with voice input. Why wouldn't we adapt our GUIs to work better with voice, or other types of input?

Many XR devices support eye tracking. This works well for navigation _today_ (see some visionOS demos). Where do you think we'll be 10 years from now?

So I think you're, understandably, holding traditional devices in high regard, and underestimating the possibilities of a new paradigm of computing. It's practically inevitable that XR devices will become the standard computing platform in the near future, even if it seems unlikely today.

tyg13
For me, voice input is an immediate no-go because I don't want to have to talk to myself while I'm in line at the grocery store, or waiting for my oil change, or in the dozens of other situations where I typically use my smartphone to do things.
bandoti
Curious to see how this goes. It seems to me it’s hard to match reality—for example, books, book shelves, pencils, drafting tables, gizmos, keyboards, mouse, etc. Things with tactile feedback. Leafing through a book typeset on nice paper will always be a better experience than the best of digital representations.

AR will always be somewhat awkward until you can physically touch and interact with the material things. It’s useful, sure, but not a replacement.

Haptic feedback is probably my favorite iPhone user experience improvement on both the hardware and software side.

However, I will never be able to type faster than on my keyboard, and even with the most advanced voice inputs, I will always be able to type longer and with less fatigue than if I were to use my voice—having ten fingers and one set of vocal cords.

All options are going to be valid and useful for a very long time.

imiric
> It seems to me it’s hard to match reality—for example, books, book shelves, pencils, drafting tables, gizmos, keyboards, mouse, etc. Things with tactile feedback. Leafing through a book typeset on nice paper will always be a better experience than the best of digital representations.

There's nothing tactile about a glass pane. It's simply a medium through which we access digital objects, and a very clunky one at that. Yet we got used to it in a very short amount of time.

If anything, XR devices have the possibility to offer a much more natural tactile experience. visionOS is already touch-driven, and there are glove-like devices today that provide more immersive haptics. Being able to feel the roughness or elasticity of a material, that kind of thing. It's obviously ridiculous to think that everyone will enjoy wearing a glove all day, but this technology can only improve.

This won't be a replacement for physical objects, of course. It will always be a simulation. But the one we can get via spatial computing will be much more engaging and intuitive than anything we've used so far.

> I will never be able to type faster than on my keyboard, and even with the most advanced voice inputs, I will always be able to type longer and with less fatigue than if I were to use my voice—having ten fingers and one set of vocal cords.

Sure, me neither—_today_. But this argument ignores the improvements we can make to XR interfaces.

It won't just be about voice input. It will also involve touch input, eye tracking, maybe even motion tracking.

A physical board with keys you press to produce single characters at a time is a very primitive way of inputting data into a machine.

Today we have virtual keyboards in environments like visionOS, which I'm sure are clunky and slow to use. But what if we invent an accurate way of translating the motion of each finger into a press of a virtual key? That seems like an obvious first step. Suddenly you're no longer constrained by a physical board, and can "type" with your hands in any position. What if we take this further and can translate patterns of finger positions into key chords, in a kind of virtual stenotype? What if we also involve eye, motion and voice inputs into this?
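That virtual-stenotype idea can be illustrated with a toy chord table. This is purely a sketch: the finger labels, the chord-to-character assignments, and the hand tracker assumed to feed it are all invented for illustration.

```python
# Toy "virtual stenotype": patterns of simultaneously-curled fingers
# map to characters. A real system would learn the table per user.

CHORDS = {
    frozenset({"index"}): "e",
    frozenset({"middle"}): "t",
    frozenset({"index", "middle"}): "a",
    frozenset({"index", "ring"}): "o",
    frozenset({"thumb", "index", "middle"}): " ",
}

def decode(chord_stream):
    """Turn a sequence of finger-position sets (as reported by a
    hypothetical hand tracker) into text, skipping unknown chords."""
    return "".join(CHORDS.get(frozenset(c), "") for c in chord_stream)

print(decode([{"index"}, {"middle"}, {"index", "middle"}]))
# -> "eta"
```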

These are solvable problems we will address over time. Thinking that just because they're not solved today they never will be is very shortsighted.

Being able to track physical input from several sources in 3D space provides a far richer environment to invent friendly and intuitive interfaces than a 2D glass pane ever could. In that sense, our computing is severely constrained by the current generation of devices.

const_cast
I'm not saying I don't believe you. But I am saying that, as a programmer, if you told me I had to only use an iPhone at work I'd probably set myself on fire.

> It's practically inevitable that XR devices will become the standard computing platform in the near future

Yeah I mean I just really doubt it. I'm not seeing a whole lot of benefit over smartphones, which are already ubiquitous. At best, I'm hearing that it won't suck that much. Which... okay not really high praise.

I'm sure, like the smartphone, it will replace SOME use cases. The difference is that the use cases the smartphone replaced were really important ones that cover 80% of the common stuff people do. So now everyone has a smartphone.

Will that be the case with XR? I doubt it. The use cases it will cover will be, at absolute best, incremental compared to the smartphone. And, I presume, the smartphone will cover those use cases too. Which is why I think it's more likely smartphones swallow these glasses thingies than the other way around.

imiric
> I'm not saying I don't believe you.

I'm not trying to convince anyone. Believe what you want to believe :)

> But I am saying that, as a programmer, if you told me I had to only use an iPhone at work I'd probably set myself on fire.

Sure, me too. But that's a software and ergonomics problem. There's no way you will ever be as productive on a 6" display, tapping on a glass pane, as you would on one or more much larger displays, with a more comfortable physical keyboard and far richer haptics. Not to mention the crippled software environment of iOS.

But like I mentioned in other threads, it would be shortsighted to think that interfaces of XR devices will not be drastically better in the future. Everyone keeps focusing on how voice input is bad, ignoring that touch, eye and motion tracking in a 3D environment can deliver far richer interfaces than 2D displays ever did. Plus voice input will only get better, as it has greatly improved over the last 2 years alone.

> I'm not seeing a whole lot of benefit over smartphones, which are already ubiquitous. At best, I'm hearing that it won't suck that much. Which... okay not really high praise.

Have you seen the user avatars in visionOS 26? Go watch some demos if you haven't.

Being able to have a conversation with someone that feels like they're physically next to you is _revolutionary_. Just that use case alone will drive adoption of XR devices more than anything else. Video conferences on 2D displays from crappy webcams feel primitive in comparison. And that is _today_. What will that experience be like in 10 years?

I'm frankly surprised that a community of tech nerds can be so dismissive of a technology that offers more immersive digital experiences. I'm pretty sure that most people here own "battlestations" with 2+ screens. Yet they can't imagine what the experience of an infinite amount of screens in a 3D environment could be like? Forget the fact that today's generation of XR displays are blurry, have limited FoV, or anything else. Those are minor limitations of today's tech that will improve over time. I'm 100% sure that once all of those issues are ironed out, this community will be the first to adopt XR for "increased productivity". Hell, current gen devices are almost there, and some are already adopting them for productivity work.

So those are just two examples. Once the tech is fully mature, and someone creates a device that brings all these experiences together in a comfortable and accessible package, it will be an iPhone-like event where the market will explode. I suspect we're less than a decade away from that event.

int_19h
What is your wpm with a touch keyboard (however fancy) vs an actual physical one?
CamperBob2
Something that needs to be considered before answering that question is that current predictive text engines are ridiculously stupid compared to what an LLM (or even an "SLM") with access to all of your previous texts could do.

When somebody finally gets a clue and implements that, no typist on Earth will be able to keep up with it.
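As a toy illustration of that personalization idea (not any shipping product's behavior), even a bigram frequency table trained on someone's own past messages will predict their habitual next word; a real implementation would use a small language model instead. All the sample messages here are invented.

```python
# Toy personalized next-word predictor: a bigram table built from a
# user's own message history. Stands in for the "LLM with access to
# your previous texts" idea; real systems would use an actual model.

from collections import Counter, defaultdict

def train(history):
    """Count which word this user typically types after each word."""
    model = defaultdict(Counter)
    for message in history:
        words = message.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def predict(model, word):
    """Most frequent word this user typed after `word`, if any."""
    counts = model.get(word.lower())
    return counts.most_common(1)[0][0] if counts else None

model = train(["running late be there soon", "running late again sorry"])
print(predict(model, "running"))  # -> "late" for this user's history
```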

kalleboo
Since iOS 17, Apple already uses a transformer language model that trains on your input in the keyboard.
kgwxd
i'll never wear them but i'm sure they'll have wireless conn for a keyboard, mouse, and other sane inputs, just like phones. for me the worst part of touchscreen is having to hold the device like a fancy glass egg (on a sane device i'd look up how to spell the word for that) no matter what i'm doing out of fear the wrong thing will happen if i don't. at least a plain monitor strapped to my face doesn't have that concern.
naveen99
To play devil's advocate: speech is how humans delegate to other humans. It's usually faster and clearer to communicate with an employee via voice, in person or over the phone, than over email.
Eddy_Viscosity2
> Usually faster and clearer to communicate with an employee via voice in person

That's because the communication is going from a person to a person and both are very highly tuned to not only hear the words, but the tone, context, subtext, and undertones. There can be all kinds of information packed in a few words that have nothing to do with the words.

Machines, even LLMs, can't do this. I don't think they ever will. So typing, shortcut commands, and the like are far more efficient for interacting with a computer.

naveen99
That’s my point. It’s not the interface that’s the bottleneck. AI needs to get a lot better and faster…
A lot of people spend hours consuming auto-playing short-form video content. I would guess the majority of young people, in the West.
bee_rider
Smartphones are not even that similar to laptops. Smartphones wiped out beepers, old cellphones, and PDAs, and decimated MP3 players and cameras.

Laptops, of course, have the much bigger screen and keyboard, not really replicated by smartphones. They have use cases that smartphones can't cover well for hardware reasons. So they've stuck around (in a notably diminished form).

If good AR glasses become a thing… I dunno, they could easily replace monitors generally, right? Then a laptop just becomes a keyboard. That’s a hardware function that seems necessary.

What niche is left for the smartphone?

Talanes
>Smartphones are not even that similar to laptops.

I believe that was the entire point of the comparison. Smartphones replaced SOME use cases of laptops in the same way ubiquitous smart glasses could replace SOME use cases of smartphones.

jazzyjackson
A large plurality of young people rarely use a laptop if they’re not so called knowledge workers, most everything can be done by phone. Maybe clubhouse style group audio chats will make a comeback and people will jump on the ambient computing trend as clearly better than interacting with screens all day
foobarchu
The screen itself isn't really the problem people are talking about when they refer to "too much screen time". Suddenly having the screen be your entire field of vision sounds like an even worse situation for the average person's attention.
jazzyjackson
Totally agree as far as AR goggles but the meta glasses have no screen, they’re just a voice in your ear
jerlam
Not just young people, I see a lot of elderly adults using tablets and phones as their primary computing device. They're cheaper and more user-friendly if you don't care about performance or multitasking. At the same price, a tablet is a far better choice than a laptop.

If you are afraid of technology, Android or iPadOS is light-years ahead of Windows or macOS.

mrweasel
It always seems insane to me when people book plane tickets, do taxes, do banking, write emails, and stuff like that on a phone. Those are big-screen tasks; you can't do them on your phone, there's not enough space to navigate safely.
999900000999
A lot of lower income people might only have a cheap android phone.

It's more than enough to handle paying bills, applying for jobs, etc. Hell, with a Bluetooth keyboard, a bit of grit, and GitHub Codespaces you can develop applications.

You can also cast your screen to a TV or, on a handful of phones, use USB-C to HDMI.

bee_rider
I’m not sure how to respond to your post, because it seems to ignore the vast majority of mine, including the parts that look at pretty similar ideas to what you’ve brought up.
Talanes
Well, I wasn't sure how to respond to yours missing the entire conceit of the post before it, so I guess we're even.
bee_rider
I don’t think I missed anything. Maybe if I’d only posted the bit you quoted that would make sense.

But I also don’t think either of us is gaining anything through this interaction, so... shrug

airstrike
You're missing the fact that the original comparison with laptops was a bit tongue in cheek
bobthepanda
Smartphones are mobile. Glasses with a keyboard would require either being fixed to a keyboard location or a keyboard with the form factor of a smartphone, and if that’s the case why do you need the glasses?
int_19h
The idea is that you'd use smart glasses without keyboard most of the time, mostly in the same scenarios you'd use a smartphone today. But unlike a smartphone, smart glasses can also replace a laptop if and when needed by pairing with a keyboard.
shortrounddev2
Smartphones replaced laptops. A huge amount of people don't own a laptop or desktop PC - they do all computing via smartphone or maybe tablet. My wife almost never opens her laptop, nor does my mom
layer8
Global PC shipments haven’t decreased over the last 20 years. It’s more like smartphones have expanded the number of people who do computing.
bigfatkitten
Millions of smartphone users never owned a laptop (or even a desktop computer) to start with. Smartphones are their only real exposure to computing.
bee_rider
But people still do buy some laptops.

It is hard to say when the peak of laptops in circulation was, right? Because simultaneously the tech has been maturing (longer product lifetimes) and smartphones have taken some laptop niches.

I’m not even clear on what we’re measuring when we say “replace.” Every non-technical person I know has a laptop, but uses it on maybe a weekly basis (instead of daily, for smartphones).

shortrounddev2
I don't know any non-technical people who own laptops, personally. Other than work laptops
XorNot
Sure but lots of people use work laptops as general laptops. Which is why we keep having to advise people not to do that.
Henchman21
You have missed the point utterly. “AR glasses will replace smartphones the same way smartphones replaced laptops” — they didn’t replace laptops. Therefore AR glasses won’t replace smartphones in the same way smartphones didn’t replace laptops.
bee_rider
I’ve already responded to this sentiment in another thread. I do kind of find it puzzling that folks are reading my post and coming to the conclusion that I missed the point, but hey, if I confused enough people then I guess I’ll take the blame. I’ve tried to address in the follow up.

https://www.hackerneue.com/item?id=44330537

sandcat_
FWIW when I first skimmed your comment I came to the same conclusion as everyone else. I don’t think people are reading closely.
I don't think "replaced" is the right word, just like with smart glasses. The form factor and user experience are key attributes when choosing a device, independent of raw hardware power. It's likely we'll continue to live with multiple device types coexisting.

BTW, I have to consciously turn off my cybersecurity mindset when thinking about smart glasses. It's hard not to see all the new attack vectors they introduce.

paul7986
It won't replace the phone: you can't take selfies with smart glasses!

I wear my Ray-Ban Metas a lot (bought in 2023) and love them, but I can't take selfies with them. I have to pull out my phone. They are complementary to the phone, though I do enjoy not having my phone on me to take pics and vids and ask for the time (add 5G and they will do more, like stream music).

Whatever OpenAI is working on to replace the iPhone, it will need to be able to take selfies! I'm betting it's just an AI phone with the experience of the movie Her, where almost everything is done from the lock screen and it takes the best selfies of you (gets you to the best lighting) and everything under the sun.

nsxwolf
Just stand in front of a mirror.
paul7986
Huh, so no selfies will show people's complete face (eyes), and for outdoor selfies you need to carry a mirror?
nsxwolf
Not all solutions are perfect.
1659447091
> you can't take selfies with smart glasses!

Sounds like a value proposition for society, to me!

derwiki
Why are selfies so important?
paul7986
60% of all Americans take selfies. I'm way out of the "selfie" demographic yet take a good amount of selfies, especially when traveling.

Selfies are a part of culture now... that won't change!

kepano
In what way did smartphones replace laptops?
acuozzo
OP is trying to say it'll only be a partial replacement.
TiredOfLife
Ordinary people do everything on smartphones nowadays.
racl101
Yeah, I can see that. Teenagers are especially adroit at doing most computer-related work from their phones. My niece owns a new MacBook and barely cracks it open. She prefers to do most things on the iPhone and actually manages it.

Me (an old millennial)? I can't even conceive of getting any real work done on just a smartphone. But I'm a power user. I need to log onto Linux servers and administer them, or crack open Excel files and use spreadsheets. Not an ordinary user.

andoando
For most people, what's the use case of a laptop?

You only really need one for doing some type of work

fnord77
it's gotten to the point where Gen Z doesn't know how to use laptops/desktops
kube-system
Outside of this tech bubble that we are all in, many use them as their primary (or only) computer. More than 60% of internet traffic is from mobile phones.
That's exactly the point.
shortrounddev2
Lots of people don't own laptops or desktops. They do all computing through a smartphone.
iancmceachern
Smartphones didn't replace laptops.

Laptops and tablets replaced desktops. Nobody sits down in an office and does work on a smartphone.

Smartphones replaced phones, pagers, music players and cameras.

mulmen
The smartphone completely replaced the personal computer for most people.

10 years ago all my non-tech friends and family had laptops. Now they all use their smartphones as primary computing devices. My nephew who just graduated from high school and works in IT doesn’t even own a personal laptop.

bobthepanda
This makes sense; a personal computer at this point is either a phone or a desktop for high performance niches, and laptops are in the unsatisfying middle. Particularly anything in the netbook or ultrabook segment.
freehorse
I used to agree with this statement, before Apple Silicon came along.

Also, mini PCs are a new trend nowadays. I wouldn't say that's the direction things are going anymore.

bee_rider
My dad worked from the 90's until recently and never owned a laptop until he retired; he went out and bought one almost immediately upon retirement, hah.
BurningFrog
Smartphones replaced laptops, but not for everything.

Smart glasses will probably do the same to smartphones.

Things are rarely completely replaced, at least not quickly.

gopher2000
Smart glasses will have the potential to cover more use cases than a smartphone ever did, thanks to an AR-enabled display.
dehrmann
> Nobody sits down in an office and does work on a smartphone

Now that we have USB-C monitors, phones have USB-C, and high-end phones have CPU performance similar to low-end desktop CPUs (A18 vs Intel 14100), we could actually start replacing laptops with phones for some use cases.

freehorse
The biggest hindrance to this is Apple itself, with iOS.

I would be glad to only have to take an external monitor to use with my phone while traveling, but there is little I can do, and the iPhone is not very friendly for that kind of use.

gwbas1c
> and I’m convinced that wearable eyewear like this will eventually replace the mobile phone

Once there is an actual usable in-glasses screen, I will agree.

A few years ago I tried someone's smartglasses with a screen. It basically had similar functionality to my first Fitbit: it would show texts, notifications, caller ID.

I really want one of those and went looking, but couldn't find it.

themanmaran
You might be interested in the EvenRealities G1 [1]. It's the absolute best form factor I've seen for just the text HUD

[1] https://www.evenrealities.com/

stickfigure
Looks amazing. Unfortunately from a Chinese company, and given how deeply integrated with my email, calendar, etc it would be... no interest.
alex1115alex
I have a pair - they're not as integrated as you'd think. It's essentially a BLE device that projects text/data sent from your phone, so any data transmission depends on the companion app you use.

Shameless plug: We build an open source OS for glasses that works with them. AugmentOS.org

stickfigure
Oh, then you must be familiar with the landscape! Which do you like best, and which companies do you think will still be around in a year?
alex1115alex
In terms of all day wearable HUD glasses:

Even Realities G1 are the best HUD glasses on the market right now. They’re the first pair (with prescription) that I can wear all day without pain, and without looking like a dork.

My team used to main the Vuzix Z100 glasses, starting with the Vuzix Ultralite reference design that predated them. We won’t touch them these days (and recently stopped selling them on our store).

Others… Meizu StarV Air 2 and INMO GO2: both lack a public SDK, and the GO2 is too heavy. Brilliant Labs Frame: cool prototyping toy, awful glasses.

For “AI glasses” that have camera, no display:

You have the RayBan and a number of companies making these. The only one I can recommend is our upcoming Mentra Live (https://mentra.glass). It has the same camera sensor as the RayBan, but runs open source software & has an SDK.

For more sci-fi glasses that run Android and have display + camera, see the INMO Air 3, TCL RayNeo X3. These are too heavy to be worn as regular glasses, but are fun prototyping tools.

All these companies will exist in 2026. As for a 5 year horizon, I’d place my bets on Even Realities, and Vuzix (as a waveguide supplier, not consumer HW). Meizu and TCL will stick around as Chinese megacorps, though I’m 50/50 they will continue developing consumer smart glasses. Brilliant Labs is cooked unless they turn things around with their next pair of glasses.

Google & Android XR: I don’t expect their glasses to be competitive for at least a few HW generations at minimum. In terms of public information, we know they’re monocular and heavy (>45g), which is an immediate killer for the majority of users.

dontlaugh
Do you live in China? If not, why would you care?

Meta, Microsoft, Apple, etc. are far more likely to snitch on you to the government you actually live under.

ethersteeds
You seem to assume that the risk begins and ends with government persecution/prosecution.

I'm not the gp, but for me, there are several bigger concerns:

First, the possibility that access could be leveraged for intelligence gathering or industrial espionage. The goal might be geopolitics, but I still don't want my data to be fodder, nor do I want to explain to my employer that I'm responsible for their breach.

Second, the possibility of becoming collateral damage during an escalation of hostilities between my country and China. If I've grown dependent on a device, I face significant disruption if they block cloud services or even outright remotely brick it. The war in Ukraine demonstrated this isn't limited to the other country's exports, but those are still at the greatest risk.

So yeah, a company snitching on me to big brother I live under is just one threat I have to consider when giving access to all my data.

stickfigure
Despite the trainwreck that is the current presidential administration, we have a hell of a long way to fall before our government is as malevolent as China's.
gwbas1c
Yeah, that's what I'm looking for.

Do you have any experience with their progressives? The ones I'm trying are so lousy that I'm going to try multifocal contacts next week. According to the order form, their progressive lenses seem somewhat decent.

sroussey
I want that with a camera so it can do facial recognition from my LinkedIn when I’m at a networking event.
Just walk up and ask them their name like a normal person rather than doing some creepy fucking surveillance on them from across the room.
What style was it, nreal (bulky) or something like Frame (though lower end in quality)
hn_throwaway_99
> I’ve tried the glasses myself, and I’m convinced that wearable eyewear like this will eventually replace the mobile phone.

I sure as f* hope not. I already struggle with my cellphone addiction and all of its constant distractions and assaults on my attention span, the last thing I want is something from one of the largest advertising companies on the planet glued to my face.

erikig
I'm optimistic: I can only hope that the limits of what one can wear on the face for long periods will put a corresponding limit on the distracting features that can be packed into a daily-use device.
redeeman
You are largely able to control what you do on your phone yourself.
hn_throwaway_99
Sure, I'm not blaming anyone else. But some of the smartest, most highly paid people in the world have as their sole job looking at data and feedback loops to build more successful ways to hijack your attention.

It's sort of like blaming the obesity epidemic on lack of willpower. Yes, any individual is responsible for himself. At the same time, companies have found better and more ingenious ways to addict lots of us using food. When I look back at pictures from the 1950s and see that nearly everyone is skinny/normal weight, am I just supposed to think that they had so much more willpower than today's people?

SoftTalker
I don't think so. You still would have to wear glasses, which is annoying.
cshimmin
some of us have to wear glasses anyway :/
SoftTalker
Yes, and I am one of "us" but I still think they are annoying. I wear contacts most of the time. Glasses are just awkward in many situations. In the heat when you get sweaty they slide down your nose or completely fall off, in the cold when you walk in to a warm house they fog over, in the rain they get water spots, the frames are always visible and interfere with peripheral vision. I just don't care much for them.
eloisant
Then they'll have to find a way to separate the "smart" frame from the prescription lenses, so you can change the glasses when your sight changes without having to buy smart frame each time - or the other way around, upgrade your frames without having to buy prescriptions lenses again.
xnorswap
Maybe we'll work out how to stimulate the optic nerve directly and skip to bionic eyes for both corrective vision and AR.

We'll need to overhaul the concept of limited liability before we do that though, the thought of someone being left without their eyes because a company goes bankrupt and no-one is at fault is pretty horrifying.

terribleperson
Unfortunately the unmaintained bionic problem is already real.
sleepyguy OP
With Lensology, you tell them the frames and upload your prescription, and they send you lenses to pop in. It's called reglazing, and millions of people do it all the time.

Ray Ban does it for their Meta glasses, but Lensology can handle stronger prescription lenses.

I often get updated lenses for my frames. Is that not what you mean?
Izikiel43
> Then they'll have to find a way to separate the "smart" frame from the prescription lenses, so you can change the glasses when your sight changes without having to buy smart frame each time - or the other way around, upgrade your frames without having to buy prescriptions lenses again.

Ehh, there is nothing special about the lens; all the magic is in the frame, and the Ray-Ban and Oakley frames look very similar to their standard versions. Getting new lenses for sunglasses is very common.

Have you never had prescription sunglasses?

barbazoo
Anecdotally, I haven't found it possible to buy lenses for a particular frame other than when you buy both new at the same time. Good luck getting the same lenses next time the prescription changes.

The frame will probably change slightly over time to make them incompatible.

Very confused by this. As far as I know it's standard for lenses to be custom made for your frame even when you purchase them at the same time.

I just sent an old pair of glasses to eyeglasses.com for new lenses. I never considered this to be a big deal.

> The frame will probably change slightly over time to make them incompatible.

This is probably true.

The rest of your comment is probably not true for most people.

It just depends on how strong your prescription is and how willing the shop/website is to do special orders.

If you are almost blind, your choice of lenses/frames will be much more limited than if you are only slightly nearsighted (most people). Any reputable eyeglass/optician shop should be able to make custom lenses for pretty much any frame. They can't always do the super sleek shades that some people like to wear.

dmarcos
And contact lenses and Lasik are popular because many people don't want to wear glasses. I see head-mounted displays being useful in constrained scenarios (e.g. a construction site, or tasks where you already wear safety glasses and need free hands). I have a harder time seeing a world where people ditch phones and voluntarily start wearing glasses, which are often uncomfortable and inconvenient. I just finished a 5-mile run on the treadmill, went to the sauna, and did some bouldering. There's no room for glasses in any of that, but I can occasionally check my phone.
someuser2345
> I have a harder time seeing a world where people ditch phones and start voluntarily wearing glasses which is often uncomfortable and inconvenient

I see this world all the time at the beach; lots of people wear sunglasses there.

dmarcos
To be able to see, and they take them off as soon as they can. And even in those scenarios, not everybody wears them. I've run my own little study at beaches, concerts, and other outdoor activities, and noticed that fewer people wear glasses than I expected, even in ideal conditions to do so (<50%).
mollerhoj
I don't think you're very representative of the general population.
dmarcos
Contacts and especially lasik are growing in popularity. Strong signal people don’t enjoy wearing glasses if they can avoid it
SirMaster
Right, I have no interest wearing glasses.
vinoveritas (dead)
And what is the non-verbal input method for these glasses that isn’t painful to use?

Because it’s not going to ever be socially acceptable to just start talking to your glasses vs silently typing on a phone in most public places/situations.

terminatornet
Gonna assume you're not in the US. Here it seems to be encouraged to watch TikToks at full volume on the bus at 6 in the morning.
cortesoft
Brain interface
Lorin
At that point we might as well skip the glasses entirely and have it output directly to the visual/audio cortex (with shut offs, of course)
patapong
It's possible that communicating brain > computer is much simpler than the other direction. I would expect that to be the case.
paulcole
1. Ever is a long time.

2. How confident would you have been about predicting the smartphone’s effects on society today back in say 1995?

mrweasel
Even if AR glasses can replace smartphones, I think there will be a bigger pushback than there was against smartphones. A lot of us have seen what smartphones have done to society, and will be reluctant to adopt any new technology that could be similarly disruptive. It's the same as with e.g. Facebook or Twitter/X: I've seen what these social media companies have inflicted on humanity, and I will never sign up for another one.
I’m glad your mother-in-law found a use case for them, but honestly, for day-to-day interactions on the street... if you think you can just walk around filming people 24/7 with no sense of consent while beaming all of that back into Meta’s digital surveillance machine, I don’t know what to tell you other than to expect violence.
I’m very glad your mother in law has use for them.

With that said, I don’t think these can replace phones until they’re quite a lot smaller and lighter. And to make it worse, you’d need at least two pairs - regular and sun. Possibly three if you’re someone who regularly uses safety glasses.

criddell
I don't think I would be super comfortable walking around with Meta cameras seeing everything I see in my home. I'm not sure I'd trust any of the companies likely to build the product with that kind of access to my personal life.
PaulHoule
The MQ3 is crawling with cameras for ‘inside-out’ tracking, which could hypothetically be used in privacy-violating ways. Currently these are locked down so that you can’t build interesting AR apps: you should be able to look at a QR code and access a ‘location-based’ XR app, but they don’t allow it. You have to scan with a phone and transfer it to your headset with Meta’s janky app, which shows all the “carelessness” of someone who doesn’t care to make money.

Meta says they will open it up though.

criddell
I might trust individual developers. I don't trust Meta though so as long as the XR app is running on Meta hardware, I'm not interested.
PaulHoule
I’m more worried that shoddy development practices will cause the video to freeze up, cause me to fall or crash into something and experience “VR to ER” myself.
nhecker
Photochromic coatings -- https://en.wikipedia.org/wiki/Photochromic_lens -- have existed for a while and are sold on safety glasses, at least according to a cursory look at a large online retailer's site.

That said, I'm not sure I'd want smart glasses. Being stuck on a computer for work, I try to take some time every day to be completely free of digital things. It's hard enough to do that with a smartphone in my pocket vying for my attention. I imagine it would only be harder with smart glasses over my eyeballs.

foobarian
They may not replace the current gamut of phone features. However, I question how much of current phone functionality users actually need/want, versus how much is pushed by big tech. It would be pretty great if a small core feature set, done well in-glass, turned out to be enough to kick off large-scale adoption. Ultimately I think input is probably going to be the hardest issue.
Loughla
Read and send messages. Make phone calls. Navigation and maps. Set reminders. Navigate a basic Google search, even if it's just a top level summary.

Those things on glasses and I ditch my phone immediately.

What would be the interface, talking? I know they have pinching and hand tracking; I guess it's no different from people talking "to themselves" while wearing earbuds.
amazingamazing
I don't understand how she's using Meta glasses to read print. You mean they're dictating it, or are they prescription? If the former, do you need Meta glasses for that? If the latter, wouldn't any glasses work?

I also don't understand how they're used to locate items around the house. Is there some sort of GPS? Or do you mean it helps by virtue of seeing (e.g. prescription)?

AR glasses will be a hit, no doubt, but I don't see what's so special about glasses with a mic, camera and speaker on them. Seems especially for an older person that it would be more useful getting a phone with a screen and pointing at things and seeing it on a display.

Dfiesl
Yeah, the glasses will be dictating the text. For identifying objects, the cameras in the glasses substitute for her failing eyesight; no GPS or prescription needed.

A phone you have to hold in your hand whereas glasses you don't. Therefore glasses are superior for these use cases.

amazingamazing
Seems scary. If you’re using it to read some prescription and it says the wrong thing then game over I guess - or if internet goes out.

I’m very curious what this person did before these glasses were released.

rocketpastsix
they probably used a magnifying glass to help read.
ackfoobar
Yeah, same thought here. When I got the glasses and was ready to be disappointed by the AI feature, I asked it to tell me the sweetener from the ingredient list on a can of Coke Zero. It hallucinated a whole bunch, so I took a photo to see for myself what the LLM saw. The resolution was very low.
> I don't understand how she's using Meta glasses to read print.

The glasses have a camera, and small speakers near your ears. They also have a microphone, so you can give them voice commands. Like Amazon Alexa, but in the glasses.

sleepyguy OP
The glasses need to be connected to your smartphone, and then you ask.

Hey Meta, read the text on this label and tell me what it says.

Hey Meta, do you see the keys on the counter?

Hey Meta, can you tell me what is in front of me?

It projects the sound into your ear.

deadbabe
I have perfect vision; I have no interest in wearing fake glasses all day.

And mobile phones aren’t going anywhere because mobile computing has peaked: there are no use cases that require a device with a different form factor, it’s just a matter of lifestyle preference.

If we’re abandoning screen based devices, I’d rather have a small 2000s style flip phone with all the latest tech and LLM features built in, than something like glasses, which clash too much with fashion choices. Bonus if the battery life is insane.

TheGRS
I haven't seen a lot of progress on it, but I would definitely jump on whatever device lets me not have this chunky block in my pocket all the time. The concept I saw years ago was like a slap bracelet that you could remove from your wrist and unwrap into a tablet form-factor.
andoando
Just get rid of screens entirely and focus on software for the blind :]
deadbabe
A device slightly bigger than a car key would be perfectly fine.
leptons
If you don't currently wear glasses all day every day because you need them to see, I can assure you that wearing glasses all day every day is not the luxury you might think it is.

It tends to wear on the bridge of the nose after a while. And I'm sure these e-glasses are going to be heavier than normal glasses with a battery and electronics in addition to the normal things glasses have.

racl101
That's pretty cool. My mom is experiencing a lot of eye issues lately. So this is encouraging to hear.

I can't say whether I would use these things myself (at least not now, because I'm OK with my smartphone and don't really want to get a Meta account), but this definitely changes my perspective a little.

layer8
I think this underestimates how many people dislike wearing glasses, and how much people don’t like interacting with people wearing non-transparent/colored glasses. You can flip a smartphone in and out of your pocket very quickly. The same is less practical (where do you put it?) and takes longer with glasses.
jazzyjackson
They come with a charging case that’s pretty pocketable. I had some with transition lenses, so they weren’t full-time sunglasses.

I returned them because I didn’t like forcing a camera into everyone’s face unannounced, and the photos they took weren’t very good (vertical pics cut off most of my field of view, and the focal length was a weird choice). Maybe with two cameras they could have a wide angle and a telephoto, but the Ray-Bans at least just had the one.

pizzathyme
I agree. Some threshold has been passed in the past few years. I wear mine every day (and I don't normally wear glasses). Music, photos, videos: all super useful. The AI is lacking but will get better. They feel cool and not embarrassing in public.

I think that they've done it, this is Meta's iPod

ghostpepper
> I think that they've done it, this is Meta's iPod

I would love to try these types of devices but there is no way I'd ever give money to Meta or put my personal information into their systems or encourage my friends and family to do so either.

Hopefully Meta puts in a bunch of R&D to see what works in this space and then someone else (Apple?) just copies it.

bredren
Me too, I’d never trust that company with anything personal. It is bad enough that they can track health metadata via the Quest.

Meta running the show is a non-starter.

1shooner
Do you encounter people that would rather you not point a Meta camera at them as a condition of interacting with you? Or is it more task-specific?
dustbunny
I'm interested in hearing more use cases! Anyone else got one?
Makes sense why Meta invested in Scale AI.
