Google co-founder Sergey Brin spotted wearing Project Glass prototype IRL (engadget.com)
130 points by jasondc on April 6, 2012 | 94 comments



You'd have thought the company would be capable of subtler PR than pimping an AR headset at a blindness awareness event.


Blindness is defined as worse than 20/400 with the best correction possible. At the low end of the scale you could probably still focus on an HUD. Even with only the ability to sense light from dark I could see these being useful.


You know, now you've got me wondering. If glasses like this could put the display at a distance where a legally blind person can focus, could they show a video feed of what's ahead and let them see more clearly?

Or am I misunderstanding just how bad "worse than 20/400" is?


This might work. My vision is probably in the 20/600 or much worse range (though correctable with lenses), and without corrective lenses I have maybe a 2-3 inch maximum focal distance. Though I do find that with my corrective lenses I lose that focal distance and end up with a 6-inch-ish minimum focal distance.


Glasses like this could take whatever you are looking at and magnify it in the heads-up display. They could probably even be selective about it, e.g. only magnify and sharpen text that is at the center of your field of vision.

There are already assistive devices for people with visual impairments, but they are clunky and cost a LOT.
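
To make the magnification idea concrete, here's a rough sketch assuming an OpenCV-style pipeline reading the headset's camera (everything here is my own guess, not anything Google has shown):

    # Rough sketch: magnify the centre of each camera frame for the HUD.
    # Assumes OpenCV and a webcam as a stand-in for the headset camera.
    import cv2

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        h, w = frame.shape[:2]
        # Crop the central quarter of the frame...
        roi = frame[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]
        # ...and blow it up to full size for the display.
        magnified = cv2.resize(roi, (w, h), interpolation=cv2.INTER_CUBIC)
        cv2.imshow("magnified centre", magnified)
        if cv2.waitKey(1) == 27:  # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()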


If a HUD could focus it, then why couldn't corrective glasses?


Glasses focus the light into your eye, still relying on your eye to focus on distance. The HUD in my example would flatten the distance and show a 2D picture in front of your eye at a point where you can naturally focus.

Like I said, I don't know if this is possible.


Yeah, seems like a more salient benefit to the visually impaired from a HUD would be for a patient with significant damage to regions of their visual field. Then the computations available with the wearable could be used to apply a real-time transformation to the video so as to push more of the video stream's information into the regions of their visual field which still work.
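
As a toy illustration of the kind of transformation I mean (hypothetical, and glossing over the hard calibration problem), you could simply squeeze the whole camera frame into whichever half of the display the patient can still see:

    # Toy example, assuming OpenCV: compress the full frame horizontally
    # into the left half of the display, leaving the damaged right half unused.
    import cv2
    import numpy as np

    def squeeze_into_left_half(frame):
        h, w = frame.shape[:2]
        out = np.zeros_like(frame)
        out[:, : w // 2] = cv2.resize(frame, (w // 2, h))
        return out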


I suspect this comment is largely misunderstood by those responding about the usefulness of such a device to the legally blind.

The point is that it's not subtle as far as PR goes, not that "legally blind people have no use for this thing".


No, it's not insensitive. These glasses could be a huge breakthrough for the vision or hearing impaired: http://news.ycombinator.com/item?id=3806640


But, but... outrage!


I think that's exactly why he used it there. This device could be amazing for blind people if it has accessibility features built-in (like the voice controlling part).


An ear piece would suffice for the blind, unless they need to use the nose bridge as a stand.


You mean like the Opti Grab? We learned a lot about blindness from having guardianship of our blind teenage nephew for a year. True, the ear piece would be the big benefit for him (though one eye was very slightly sensitive to light). Others at his school had varying degrees of low vision and might at least benefit from bright flashes of light. But the big gain would be something that describes his environment out loud. Blind people do still aim their faces toward things. And it's much better than a similar capability in a handheld device such as the iPhone or iPod Touch, because for the blind, orienting a handheld device correctly can be a major problem. We turned on the accessibility features for our nephew's iPod, but the thing would switch from landscape to portrait as he turned his hand slightly. Head-mounted is the way to go. And for the sighted, it will be great for games, as I found out from the prototype glasses I got: http://youtu.be/lQGpOibUsyo


That's true, but the advantage of a generic device is that the same applications can be used by both sighted and blind people with (hopefully) a little tweaking on the app's part. For example, if it is a route-finding app, a tweak to provide more frequent and detailed audio when set in disabled/blind mode will make it usable by a larger user base.

If they release some device only for the blind, then you will have to find developers who are willing to develop for a niche market.
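
To be concrete about what I mean by a tweak, something as small as this (hypothetical names, obviously) would serve both audiences from the same code path:

    # Hypothetical sketch: the same navigation logic, with announcement
    # frequency and detail driven by an accessibility setting.
    def announcement_settings(blind_mode):
        if blind_mode:
            return {"interval_m": 10, "detail": "full"}    # speak every 10 m
        return {"interval_m": 100, "detail": "summary"}    # sighted default

    def maybe_announce(metres_since_last, settings, speak):
        if metres_since_last >= settings["interval_m"]:
            if settings["detail"] == "full":
                speak("Continue straight for 200 metres, then turn left onto Main Street.")
            else:
                speak("Turn left ahead.")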


Well there is something questionable there, but I could imagine it being useful for partially sighted people. Being legally blind does not necessarily mean you 100% cannot see.


Augmented vision technology makes perfect sense for blind and vision impaired users. For example, the latest version of Google Goggles (http://www.google.com/mobile/goggles) performs image recognition on the live camera feed and speaks out the results as it recognizes items such as banknotes. Running the same software on a headset would allow the user to receive an aural interpretation of text or images around them.
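
The pieces are nearly off the shelf already. A rough sketch of the whole loop, assuming Tesseract for OCR and a local TTS engine (my choices for illustration, not necessarily what Google would use):

    # Rough sketch: grab a camera frame, OCR it, speak the result.
    # Assumes pytesseract (Tesseract OCR) and pyttsx3 (offline text-to-speech).
    import cv2
    import pytesseract
    import pyttsx3

    engine = pyttsx3.init()
    cap = cv2.VideoCapture(0)  # stand-in for the headset camera

    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        text = pytesseract.image_to_string(gray).strip()
        if text:
            engine.say(text)
            engine.runAndWait()
    cap.release()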


I'm responding 9 hours after your comment. A healthy discussion has evolved over all of the utility a legally blind person could find in this device. I would say this was a major PR success.


Am I the only one who wants these only to write new software for them? Voice control is nice and flashy, but I doubt it would suffice for any "real" work. I, personally, like the idea of gloves and a windowed interface on HUDs: the kind of gloves that let you tap your fingers to do certain actions, along with some identifier for tracking so you could "click" on the screen. I was also thinking about looking into a 3D windowed interface, relative to your glasses but detachable into the environment, so you could put your virtual keyboard on your desk and go get coffee; when you came back, it'd still be there.


Absolutely. The concept video they released did nothing for me, but I'd absolutely love to tinker with them.

Let's hope they aren't going to be too expensive, and that they will be open to modifications (though knowing that this is Google, I'm not too worried about it).


Data-input gloves honestly sound like an awful idea. At least when typing you have something to rest your arms on. Just waving them in the air for however long sounds like hard work.

But your general point is something I strongly agree with. I can't wait for better man-machine interfaces and a better metaphor than "the desktop" or whatever is being used for mobile nowadays.


I really hope we can remove the gloves and use image recognition instead.


The reason I like the gloves is that they provide you with the 8 click-like functions, more if you use more than two fingers, and you don't even have to have them in the field of vision of the camera to use them, letting you keep your hands by your sides and maybe use eye-tracking for "mousing".
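
In other words, treat each combination of tapped fingers as a chord. A purely hypothetical sketch of the mapping:

    # Hypothetical sketch: map finger-tap "chords" to actions.
    ACTIONS = {
        frozenset({"index"}):                    "left_click",
        frozenset({"middle"}):                   "right_click",
        frozenset({"index", "middle"}):          "scroll",
        frozenset({"index", "ring"}):            "open_menu",
        frozenset({"index", "middle", "ring"}):  "dismiss",
    }

    def handle_chord(fingers_down):
        # fingers_down: the set of fingers currently tapping the glove sensor.
        return ACTIONS.get(frozenset(fingers_down), "no_op")

    print(handle_chord({"index", "middle"}))  # -> "scroll"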


Gloves would be a '90s throw-back solution when the Google Glass device should be using the camera to detect your finger/hand positions.


This is exciting. It shows how different Google is to Microsoft and other companies that produce these "future product" videos. Google appears to have made real world progress, rather than just a great marketing video. I can't wait to find out what functionality these early prototypes have!


Hey, don't bad-mouth Microsoft out of all giant tech companies. Have you seen what Microsoft Research has been putting out? (Not to mention they finally brought 3D scanning/mo-cap into the consumer price range)


This is probably the general public's idea of Microsoft's version of augmented glasses: http://www.youtube.com/watch?v=ZwModZmOzDs


Not really. I know they publish a lot of good papers, though. The only exception is Kinect; that was a game-changer.


MSR developed the idea further, but Kinect originally came from an acquisition.


Head mounted displays have been available for years. The fact that Sergey Brin is able to test a prototype demonstrates nothing.

Admittedly, if they do have the hardware nailed, then these would be cool.


> Head mounted displays have been available for years.

Yes, similar hardware already exists. What about the software and services? Google has an OS, voice recognition, navigation, etc., already coded. Most importantly, it has the brand recognition so it can seriously attempt to make this technology mainstream. Look at all the media coverage it's already gotten.

Of course, we have no clue if they've managed to mess up or not because we haven't really seen anything yet, but Google really looks like they have a chance at this. Imagine if these became popular and there was an app store!


That OS they have 'already coded' is hardly designed for this kind of device. But yes, I'll agree they have done a lot of work on key facilities that would be needed for this.

Media coverage doesn't mean anything. Look at the media coverage Google Wave got.

As for an App store, yes - they might follow the Android model and make these things somewhat open. Equally likely is that they might make them as open as Google+ or Siri - which is very much what the videos seemed to be illustrating.


> Media coverage doesn't mean anything. Look at the media coverage Google Wave got.

I still think media coverage counts for something here. Without media attention, most people might not hear about this device. Of course, if the product sucks, then simply knowing it exists won't save it. It's easy to get something noticed if it is from Google, but that's not going to guarantee quality.

> As for an App store, yes - they might follow the Android model and make these things somewhat open

I hope that they do keep it open. The really interesting apps for this type of device are probably going to be third-party apps. A closed ecosystem is one of the biggest mistakes that Google could make on the software side. There are all sorts of niche or unexpected things you could do with this hardware if it were popular, given the right apps.


I can imagine the security risks with this, though, being much greater than with a phone. Software can already track your location without GPS by picking out landmarks (I think Google does this, and I know Microsoft and Facebook do). Malicious apps on this would be killer to privacy: recording video, sound, and location on a device that, unlike a phone, is primed at all times to be recording.

Right now it's too early to tell exactly what this project will amount to, but I really hope Google makes some serious security considerations when/if they open access to an app store.


In the photo, the design of the glasses looks like the prototype images that were shown when the glasses were announced.

But head-mounted displays made by Google haven't been available for years. This is coming from the company that is massively pioneering self-driving cars. Those were around before Google became involved, but since then Google has arguably made much more progress than anyone else (at least to the extent that their cars are the most tested and have garnered the most publicity).


"But head mounted displays made by Google haven't been available for years."

Sure, but this makes your point into a circular argument.

In your first comment you indicate that the fact that these glasses have been spotted shows that Google is able to make progress.

Now you're saying that the reason that progress has probably been made on the glasses is because they're made by Google.

I do agree with you that they are probably pouring a lot more resources into this than others have. I just don't think that a sighting of the glasses in public indicates anything about what progress they've made.


The real sign of progress in this area is here: http://www.psfk.com/2012/04/movie-projection-glasses.html

This market is about to bloom, I guess.



How do these work? Do they project on the lens, or on your retina? Is there any possibility these might be good for people who already wear glasses?


They project on the lens, on a peripheral area that doesn't obstruct vision. I guess you can wear contact lenses behind them.


Given that your eyes move around Google would need some very advanced eye-tracking to pull this off... seems unlikely.


Maybe there's a camera somewhere in the front housing.


Neither; there are no lenses. Really, we should call this a "headset" rather than "glasses". There's a small rectangular glass screen which is projected onto. It seems like there would be no problem wearing contact lenses with them, but not glasses.


How does that work? If I hold a screen that far from my eyes, it's just a blurry mess (and I'm near-sighted!)


The image won't be focused on the reflecting glass, and the glass won't be flat.

If you are near-sighted, I suppose you have glasses. To figure out what's happening, just try to look at the reflections on the convex (exterior) side of your glasses, and you'll see that you manage to see a clear image much nearer than usual.
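
A back-of-the-envelope way to see it, using nothing more than the ordinary mirror equation (generic optics, not anything specific to these glasses): a convex reflector parks a small virtual image a few centimetres behind the surface, almost regardless of how far away the object is.

    # Mirror equation 1/do + 1/di = 1/f; f < 0 for a convex mirror,
    # and a negative result is a virtual image behind the surface.
    def image_distance(d_object, f):
        return 1.0 / (1.0 / f - 1.0 / d_object)

    f = -0.05  # 5 cm focal length, convex (negative by sign convention)
    for d_object in (0.5, 2.0, 10.0):  # object distances in metres
        d_image = image_distance(d_object, f)
        print(f"object at {d_object} m -> virtual image "
              f"{abs(d_image) * 100:.1f} cm behind the glass")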


1) Scoble wrote Brin's glasses had lenses.

2) If it were just the tiny rectangle, wouldn't they be very inconvenient? I mean, I would have to look up constantly, instead of having the image projected on my field of view. Pretty tiresome in the long run. Plus, the "screen" would be too tiny, even this close to the eye.

I know the demo video was just CG, but it did suggest that the image would be projected across the whole field of view. (Well, the "main" part at least - i.e. whatever your typical glasses cover.)

So I'm really hoping this tiny rectangle is just a reflector / prism that projects the picture on glass lenses or the eye itself. Otherwise it's just not good enough.

(And I disagree this would obstruct your normal vision. The projected image would be semi-transparent and it would affect just one eye.)


Scoble was mistaken - there are no lenses in the photos, take a look at the hi-res version at http://mobile.theverge.com/2012/4/6/2929927/google-project-g...

It does seem like that lens/prism/screen is doing something clever though, as it appears to be too close to the eye to focus on - but who knows?


Well, I didn't see anything in the article where Scoble mentioned whether they had glass lenses or not, but did you look at the pictures in the article? There were very clearly no lenses.


It's not in the article, but in the comments on Robert Scoble's G+ post, he does indeed state that they had lenses. Which, yes, is very odd, since I can't see any lenses either.


At this point we don't know that they do anything. The blue light could be just for effect for all we know. It might just be a way to hype the product before they have a functional prototype.

I think if we want them to demo the actual product and show us what it really does, we should start telling Brin and the rest that we don't believe there is a real product.

Why should we believe that anyway unless they demo what it can do? That marketing video doesn't prove anything. It probably isn't anything like that actually.


Do you mean your eye's lens, or an external lens? I think there is some confusion in these child comments over which you mean. At any rate, it seems it projects onto your eye's lens.


Also, it would probably be possible to use lenses in the glasses with a certain dioptre strength applied, so that people could use those instead of their regular glasses.


But there are no lenses in these "glasses"...


I'm not sure now where I read it, but I've seen speculation that the glasses contain a flat holographic optical element which takes the place of a lens. There seem to be existing products that use this approach:

http://www.konicaminolta.com/about/research/core_technology/...

http://www.digilens.com/Head_Mounted_Display.html


For all we know those might've been regular glasses without any functionality.


There was a blueish light flashing on Brin's eyes.


Maybe he had a little too much spice that night?


He who controls the spice...


Or Google is branching out into real androids.


I'm pretty excited for these glasses, but I'm curious. Is there any speculation on the potential damage this could do to an individual's vision?


I can only speculate on the damage to your social life.


I got out of the office at 11pm last night, and I moved to the other side of the Charles River... so, not much to damage :D


Anyone seen the spoof? http://youtu.be/5vrxfiXU5lo


Yeah, it's quite violent. But I loved it.


He wore this at a "Dining in the Dark" charity event for the Foundation Fighting Blindness? Does anyone think that seems a bit… insensitive? I mean, I guess they have audio as well, but is version 1.0 going to be blind-accessible?


"Blind" doesn't necessarily mean "zero vision". There are a lot of people with vision impairment (legal blindness, but partial sight) who could benefit from these glasses.

For example, the glasses could automatically magnify small text that you are reading. They could speak out directions as you walk around a city. They could help you identify products at the store.

That's just a handful of things I came up with off the top of my head. Consider all of the assistive devices (mostly expensive btw) that are currently being used by the blind (and don't forget the deaf!), and think of how these glasses could augment or even replace what's available now. It could be huge.


I certainly know that blindness isn't binary as my neighbor while growing up was legally blind yet mowed her yard every day with a hand pushed mower. She used a giant magnifying glass a few inches from her eye to read. However, I understand and appreciate what you're getting at.

I'm specifically asking if they are going to go the extra mile and make this technology blind-accessible. The iPhone is an extremely successful product for the blind, for example, because it has a well-thought-out alternate input mode that they spent a lot of time and effort developing.

If Google isn't going to actually add accessibility provisions, then this seems like rolling up to an environmental charity event in a custom Hummer.


It would be fantastic if mainstream tech went that extra mile and had well-thought-out accessibility stuff built in from the beginning.

Imagine these glasses being able to tell you what colour that shirt is; no more garish colour clashes for people with colour blindness. Or a bajillion other ideas.
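
The colour one is almost trivially doable today. A naive sketch with a hard-coded palette (purely illustrative; a real version would use a proper colour-naming dataset and a perceptual colour space):

    # Naive sketch: name a colour by nearest-neighbour match in RGB.
    PALETTE = {
        "red":    (255, 0, 0),
        "green":  (0, 128, 0),
        "blue":   (0, 0, 255),
        "yellow": (255, 255, 0),
        "white":  (255, 255, 255),
        "black":  (0, 0, 0),
    }

    def name_colour(rgb):
        def dist2(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(PALETTE, key=lambda name: dist2(rgb, PALETTE[name]))

    print(name_colour((200, 30, 40)))  # -> "red"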


There might be some good stuff to come out of this. If the glasses do "augmented reality", they could do stuff like read things, locate obstacles (cars? pedestrians? Google's car is doing this, right?), give directions, and so on.

If the information is there, some kind of speech assist plugin could be a literal lifesaver for the blind.


Do you stop using your eyes around blind people just to avoid being insensitive?


Yes, if it's a charity event for blind people and the most media attention the event gets is for a product that is possibly the coolest thing to ever happen to your eyeballs.


You think if he'd gone to the charity before and said "hey, I'm going to be wearing a prototype pair of glasses to this thing, it's likely to get you a whole bunch of publicity, would you rather I took them off?", they'd have said anything other than "Hell no, keep them on"?

Everyone I've ever met who works for a charity is massively pragmatic about these things. They would understand that he's not doing it to mock the blind and, accepting that, would very quickly move on to the potential good for the charity.


I find your lack of imagination disturbing. I guess it's kind of in vogue to be a politically correct whiner these days. Plus, it has been quite a few years since Star Trek: The Next Generation was on, so let me help you here. Let's say we are both blind and we have a pair of glasses like these with a camera, a computer, some image recognition, perhaps even the ability to read. Now, we're blind, so none of this technology is useful to us, right? Siri, please help this HN reader.


Ugh, I got downvoted for my sarcasm. It also seems like the original message to which I replied was changed, so I guess my message was received, but my reply does seem out of place.

Anyway, if you're blind, your lenses will be opaque and you'll be using audio, and maybe... just Google Geordi from Star Trek. This type of device will change your life if you're blind. You could "read" HN, for example.


True. Though "Dining in the dark" does involve largely not using your eyes - that's kind of the point of the "in the dark" part.


That's fine but when the shot was taken the lights seemed to be on.


Maybe the glasses could help people with impaired vision, for example by highlighting things that would otherwise be difficult to see.


Indeed. Pairing this sort of hardware up with something like [http://www.bristol.ac.uk/vi-lab/projects/casblip/] could work well for the partially-sighted -- I mean, that research group already have prototype hardware, but Google's setup looks a lot more compact.


I don't speak for the blind, but I guess if I were blind, I would dislike people not using what I wished I could use just because I couldn't.


Yeah, I guess you'd see it as people being ungrateful for the fact they had sight.


Oh, I was ambiguous. I meant to say I'd dislike that they do not use something, not that I'd dislike the people.


In the image he looks like a Borg.


This was a big thing in the 1990s. A lot of people walked around with heads up displays attached to their glasses. I'm sure technology has gotten smaller, but I've not seen anything indicating that Google has made a breakthrough. The Google promo video is very much like the kinds of promo videos people made in the 1990s.

Here's a picture of what they looked like back then: http://www.sciencephoto.com/media/349371/enlarge


> A lot of people walked around with heads up displays attached to their glasses.

No they did not. You make it sound like this was a popular trend. Wearing an awkward head-mounted display connected to a crappy laptop was something only a few super-geeks with disposable money did.


Yes. Share your location, check in, record everything. Enjoy your yoke. Big brother is watching, idiots.


You use the internet?

You use a credit card?

You pay taxes?

You have a driver's license or state ID?

You have a social security number?

You have a mortgage, or pay rent?

Your birth was recorded with a birth certificate?

You went to a public school?

You have a credit rating?

You use a bank?

You use customer loyalty cards from vendors?

You use email?

You walk on streets with security cameras on them?

You have an employer?

You have a CELL PHONE?!?!

GET OFF THE GRID, MAN!

You don't need to share your location, you don't need to check in, you don't need to record everything - your location is already known, and everything is already recorded. Unless you've already rejected all of the technologies I've listed above, objecting to this new one is pretty absurd.


To be fair, that data is distributed across many entities. What people worry about is any single company aggregating too much data. Google can get all your searches, email, some purchases (through Wallet), social connections and interests, phone calls, voice mails and location data. Now they could literally record everything you see. I think that's why some people might be hesitant about using this.


Let's say you had all that information about me. What would you do with it?


Sell it to the highest bidder!


Who would then do what?

I'll start: given my credit card transaction log, thieves could show up at my apartment and steal my Amazon packages before one of my neighbors brings them inside.


So? Even if you don't trust the software, the hardware is still extremely interesting. I would love to get my hands on these, if only to start coding drivers and an interface.

One thing I can see becoming big with AR is gloves that let the glasses use your finger taps and pinches as a user interface. That's something I see lacking from Google's implementation, which seems mostly voice based. Great for finding a movie, but I doubt you'd be able to get any real work done.


These Google links have become worse than Flash for me. I would really rather just read the article and look at the pics and not have to "sign in" and everything that entails.


The article is on Engadget, not on Google.


Then Google should pick better people to leak their products to.




