Google Glass: Making Its Presence Felt
It was being written off as a ‘has-been’ or a ‘failed experiment’, but trust Google to surprise you. The recent teaser showing how the company plans to use augmented reality in eyewear drew plenty of appreciation.
It was almost a decade ago, in April 2012, that Google launched Google Glass, and although it had a wild but brief run, sales stopped in 2015. During its tenure it had fans as well as haters: some loved it, some thought it was ugly, and then there were privacy concerns that considerably dented the product’s popularity.
Not much was heard of Google Glass after that, and people often referred to it as a ‘failed experiment’. Others, however, tried to pick up where Google left off, and some interesting products made smart glasses cool for a while without ever proving useful in the long run. Whether it was audio-focused eyewear like the BOSE sunglasses or the recently launched Facebook Ray-Ban, nothing pushed the category beyond an initially interesting launch.
That is, not until recently, when Google I/O 2022 held a pleasant surprise for everyone, one that could have a long-term impact on the optical industry.
The key feature Google showed off was the ability to see languages translated right in front of your eyes, which strikes us as a very practical application for AR glasses. While a big part of Silicon Valley is heavily invested in making AR glasses a reality, thus far no one has suggested a truly “killer” app for AR that would let you overlook the wide variety of privacy concerns inherent in the tech. Live translation of the spoken word would definitely be a killer feature.
The company didn’t share any details about when they might be available and only demonstrated them in a recorded video that didn’t actually show the display or how you would interact with them. But what was shown in the video painted a very cool picture of a potential AR future.
The wearer sees what someone is saying to them, transcribed in real time, like subtitles for the world. The translated text appears live in the wearer’s line of sight.
Until these become a real product we can try, we won’t know how well they might work in practice. But Google’s vision shown at I/O, if it pans out, would be incredibly useful.
The smart-glasses market has so far positioned its products as accessories to the smartphone; they have rarely been seen as independent devices. But with this new development, things might just change!
Google CEO Sundar Pichai shared some context about how the company views AR and based on what he said, it seems the company believes that AR can exist in many places that aren’t a smartphone.
One of the most interesting parts of the new glasses initiative is its focus on practical utility. The ability to understand and be understood is actually useful. These glasses aren’t about floating dinosaurs or magic experiences; they’re trying to assist. Meta’s recent smart-glasses ambitions also aim at providing utility, but Google’s experience and tools seem well suited to the challenge.
As always, there are two sides to everything, and countering the optimism around Google is the pessimistic view about how much accuracy is really possible in real-time translation.
Is Translating Like A Human Possible?
Like many people, we’ve used Google Translate and largely think of it as a very impressive tool that nonetheless makes plenty of embarrassing misfires. While we might trust it to get us directions to the bus, that’s nowhere near the same thing as trusting it to correctly interpret and relay our parents’ childhood stories.
Then there’s the issue of the text’s visibility, and whether real-world conditions allow it to be read clearly.
As if to confirm the limits of its translation, Google showed over half a dozen backwards, broken, or otherwise incorrect scripts on a slide during its Translate presentation. Nobody really expects perfection in real-time translation, but those kinds of mistakes from Google itself make it seem unlikely that an acceptable level of translation will arrive soon.
Google is trying to solve an immensely complicated problem. Translating words is easy; figuring out grammar is difficult but possible. But language and communication are far more complex than just those two things. So when someone speaks multiple languages and jumps from one language to another and throws in words in between that are from a completely different language, that type of thing is relatively easy for a human to parse, but could Google’s prototype glasses deal with it?
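To see why code-switching trips up machines, consider a deliberately naive word-by-word translator. This is a toy sketch with a made-up mini-dictionary, nothing like Google’s actual models, but it shows how word-level lookup breaks down the moment a speaker mixes languages or uses a fixed phrase:

```python
# Toy Spanish->English dictionary, purely for illustration.
ES_TO_EN = {
    "quiero": "I want",
    "un": "a",
    "cafe": "coffee",
    "por": "for",
    "favor": "favor",
}

def naive_translate(sentence: str) -> str:
    """Translate word by word; leave unknown words untouched."""
    return " ".join(ES_TO_EN.get(w.lower(), w) for w in sentence.split())

# Pure Spanish works tolerably:
print(naive_translate("Quiero un cafe"))  # "I want a coffee"

# Code-switched Spanish/English mangles the result: "por favor" is a
# fixed phrase meaning "please", and the English words pass through
# untouched, so word-level lookup yields something no human would say.
print(naive_translate("Quiero un iced latte por favor"))
# "I want a iced latte for favor"
```

A human parses the mixed sentence effortlessly; a system that treats words as independent tokens does not, which is why real translation models must reason over whole phrases and context.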
It’s not that Google’s goal isn’t admirable. We absolutely want to live in a world where everyone gets to experience what the research participants in the video do, staring with wide-eyed wonderment as they see their loved ones’ words appear before them. Breaking down language barriers and understanding each other in ways we couldn’t before is something the world needs way more of; it’s just that there’s a long way to go before we reach that future. Machine translation is here and has been for a long time. But despite the plethora of languages it can handle, it doesn’t speak human yet.
A Dose Of Optimism
The video Google shared to demonstrate the product took an emotional angle: a mother and daughter who spoke different languages were finally able to communicate using Google Glass. The concept was touching and gave an insight into possible use cases for the product.
One of the first use cases that comes to mind is travel. For most tourists in a foreign country, this could be a boon, making communication with locals much easier. So it’s one more item on the packing list for those travelling abroad.
Learning a Language
It could be a really handy tool for beginners learning a language who want to practise real-time conversation. There’s no better way to learn than having real conversations, but most learners shy away from them for fear of embarrassment. The glasses could ease such situations and make learners more comfortable at first. Given that the translations are not yet 100% accurate, though, this would be a beginner’s aid rather than a tool for serious learning.
This list could be extended, but there has been little news from Google since the teaser, and we are all waiting to see when and how Google Glass can help us see and understand the world better.