Gemini Live will learn to peer through your camera lens in a few weeks

Well, I watched the demonstration videos.

I, too, can pick out an outfit to wear and identify things around me that I see.

This is supposed to be worth $20/month to me, and a net loss to Google? Who exactly is this for again?
Amazon tried the what to wear thing. How did that turn out?
 
Upvote
-1 (1 / -2)

perrosdelaguerra

Ars Scholae Palatinae
868
Subscriptor
Google glass + Gemini vision would be a special gift to all blind people.
They would be able to be more independent: reading books, shopping, exploring environments...
And Google could tell them which books to read, where to shop, how to navigate there...all because of ad keywords and sponsored businesses. Talk about a captive audience! Even if it's a sliver of the general population, I would not subject blind people to only knowing what Google wants them to know. Plus, with enshittification en vogue, it would quickly become unusable.

Speaking of Google's business strategy acumen, has the kill date been announced yet?
 
Upvote
8 (10 / -2)

JamesW

Ars Tribunus Militum
1,932
Subscriptor
Well, I watched the demonstration videos.

I, too, can pick out an outfit to wear and identify things around me that I see.

This is supposed to be worth $20/month to me, and a net loss to Google? Who exactly is this for again?
The pottery one was OK, but the jeans one was terrible.

a) The thing to associate with those jeans is a bonfire.
b) Never go double denim.
 
Upvote
1 (1 / 0)
Years ago I had this dream of an assistive device that could help people with early symptoms of Alzheimer's. The system could look around, be programmed to know typical daily routines, and would have infinite patience to explain things. "Your husband left for the supermarket, but he will be right back" (repeated ten times throughout the 40 minutes). It could help with things like "where did I put my sweater" ("you just took it off and left it in the bathroom"). It could warn the other partner "Your wife is trying to leave the house," etc.

Present-day elderly might not be that welcoming to such audio cues, but future generations that got used to talking with an AI/LLM might actually use it. Depending on how rapidly the patient's disease is progressing, this could make a partner feel more confident leaving someone at home for a while.

Edit: forgot to add: a system like the one in the article could be adapted to do exactly this. I get the skepticism about the current AI craze, but there may be some genuinely beneficial applications that could improve people's quality of life.

I love this idea and I don't want this to come across as a reason it can't work, just a general caution:

You know how people joke about an LLM telling people to drink bleach? There are actual documented case reports of people with dementia confusing bottles of bleach, drain cleaner, and other substances for orange juice. I mention this because it's a real risk, and in theory one that such an AI could help with by scanning the label on the bottle...

But it also highlights that there are potentially deadly and unpredictable consequences if an LLM "hallucinates" its advice to vulnerable people with neuropsychiatric difficulties.
 
Upvote
13 (13 / 0)

perrosdelaguerra

Ars Scholae Palatinae
868
Subscriptor
Wow, so many negative comments. So much lack of imagination.
The negative comments are there because we have all imagined how badly this could go, and weighed that against how much good we imagine it could do. The scales clearly tip toward the negative. Including in a business sense.
 
Upvote
8 (10 / -2)

Ianal

Ars Scholae Palatinae
971
I genuinely don’t understand the appeal of this technology.

Setting aside the obvious privacy concerns, and the fact that the companies pushing this idea never saw a bit of personal information that they didn't want to slurp up, I just don't grok (in the original sense) why folks would want to hand over this kind of trivial decision-making to a computer.

It’s like the Copilot ad that Windows has been pestering me with in which it suggests that I let Copilot recommend a movie for a cosy night in, or some such.

Why? Why do I need a computer to decide this for me? Why would I want a computer to decide this for me?

I mean no disrespect to the folks talking about dementia assistants - that sounds like an excellent use of this kind of technology, and I can absolutely see the benefit there.

But mass-market AI assistants for everyone? To me that sounds like something being sold by some techbro or other who can't get past their late-'60s, early-'70s version of science fiction.
 
Upvote
7 (8 / -1)

ocramc

Wise, Aged Ars Veteran
106
Well, I watched the demonstration videos.

I, too, can pick out an outfit to wear and identify things around me that I see.

This is supposed to be worth $20/month to me, and a net loss to Google? Who exactly is this for again?
Wall Street. It’s the next growth market! Any company that isn’t pumping untold billions into the AI bubble is doomed to become the next Yahoo or BlackBerry.

At least until the growth slows and someone sensible stops and asks whether you can actually make any money off the vast amounts of computing (and actual) power thrown at it, at which point the bubble bursts and wipes 10-20% off the stock market and triggers another recession.

Or who knows, maybe one of the companies is actually far enough along the AGI path that they can release it before the bubble bursts. In which case we get… the Hunger Games?
 
Upvote
1 (2 / -1)

graylshaped

Ars Legatus Legionis
61,464
Subscriptor++
Years ago I had this dream of an assistive device that could help people with early symptoms of Alzheimer's. The system could look around, be programmed to know typical daily routines, and would have infinite patience to explain things. "Your husband left for the supermarket, but he will be right back" (repeated ten times throughout the 40 minutes). It could help with things like "where did I put my sweater" ("you just took it off and left it in the bathroom"). It could warn the other partner "Your wife is trying to leave the house," etc.

Present-day elderly might not be that welcoming to such audio cues, but future generations that got used to talking with an AI/LLM might actually use it. Depending on how rapidly the patient's disease is progressing, this could make a partner feel more confident leaving someone at home for a while.

Edit: forgot to add: a system like the one in the article could be adapted to do exactly this. I get the skepticism about the current AI craze, but there may be some genuinely beneficial applications that could improve people's quality of life.
My grandmother at one point had been sweet-talked by phone marketeers into 87 magazine subscriptions back in the day. It took my mom three months to get them all canceled.

In what world does your scenario not lead to rampant abuse of the elderly?
 
Upvote
10 (11 / -1)

123username

Smack-Fu Master, in training
43
I feel like every AI product update or launch is just the adult tech fan version of getting another sweater or pair of socks for Christmas. The feeling is the same, but the sweater and socks are infinitely more useful.

Google will start charging us 20 bucks a month to be able to turn it all off.

How much longer is this hype train?
 
Upvote
2 (3 / -1)
We are witnessing the birth of the hype Ouroboros.
And lo, I beheld a great serpent, tail clasped in its mouth, and its name was Hype Train.
"Serpent!", I called to it, "You are consuming yourself! This is folly! Any sustenance you find now only hastens your death!"
It regarded me with black, hooded eyes, and around a mouthful of tail it hissed one word: "HODL"
 
Upvote
5 (6 / -1)

Nemexis

Smack-Fu Master, in training
24
Geeeez the cynicism in this comment section :rolleyes: I think this looks pretty cool. I'd be keen to try it out.

Well, here's what's up, buddy :eng101:

1) The tech industry has done nothing but encourage cynicism through their actions, so fuck them if now I'm a cynical bastard

2) I would be less critical of it if this tech wasn't being actively pushed down our throats (like they tried with NFTs)

3) This is a genuine privacy issue and, as various actors in the industry already demonstrated multiple times, their security is for shit

4) Regardless of the previous point, can you be 100% sure they would not abuse this software to do whatever in the future? (e.g., supercharged digital profiling for ads, data sold to brokers, your data used to train AI, etc.)

5) Considering the energy cost (for both training and general use), using AI for shit like this is absolutely unjustifiable, especially if you remember that we are in both an energy-generation and an environmental crisis
 
Upvote
7 (8 / -1)

MagicDot

Ars Scholae Palatinae
878
Subscriptor
...long-awaited Gemini AI feature
Long-awaited by whom? Don't these bozos realize consumers are on to their tricks? This is marketed as a wonderful new feature when in reality it will be secretly scraping more data to generate more ads and make your life a bit less livable.
I see more and more videos on "the soc" of young people saying how boring and annoying tech is becoming. That's the kiss of death. Much like E.T. is blamed for crashing the video game market, AI will one day be blamed for crashing the tech market...except with E.T. you could actually win.
 
Upvote
0 (0 / 0)

Robin-3

Ars Scholae Palatinae
736
Subscriptor
Years ago I had this dream of an assistive device that could help people with early symptoms of Alzheimer's. The system could look around, be programmed to know typical daily routines, and would have infinite patience to explain things. "Your husband left for the supermarket, but he will be right back" (repeated ten times throughout the 40 minutes). It could help with things like "where did I put my sweater" ("you just took it off and left it in the bathroom"). It could warn the other partner "Your wife is trying to leave the house," etc.

Present-day elderly might not be that welcoming to such audio cues, but future generations that got used to talking with an AI/LLM might actually use it. Depending on how rapidly the patient's disease is progressing, this could make a partner feel more confident leaving someone at home for a while.

Edit: forgot to add: a system like the one in the article could be adapted to do exactly this. I get the skepticism about the current AI craze, but there may be some genuinely beneficial applications that could improve people's quality of life.
I think this is yet another reason that the immediate profit-maximization business culture, combined with a lack of any real protections for users' privacy or data rights, are a tragedy.

Sure, there are a number of details to work out with something like this (as comments have flagged already). But it's a great example of a potentially beneficial use of an emerging technology, specifically aimed at a population that could use innovative solutions.

....BUT, it requires users to allow tech companies access to huge amounts of data about their private everyday lives: everything from food preferences (advertising $$$!) to medical issues to bathroom habits (even if you keep devices out of the bathroom, certain conclusions can be drawn from behavior patterns and conversation). Especially if this is being used by people who don't fully understand the implications (say, someone with memory or cognitive issues), there has to be a high level of trust that this data is being handled sensitively and not exploited.

And that trust is entirely absent.
 
Upvote
4 (5 / -1)

TimeWinder

Ars Scholae Palatinae
1,764
Subscriptor
Edit: forgot to add: a system like the one in the article could be adapted to do exactly this. I get the skepticism about the current AI craze, but there may be some genuinely beneficial applications that could improve people's quality of life.
Up until the "helpful" AI tells the partner to eat rocks, set the house on fire, jump out a third-story window, or microwave the cat, because the AI matched a linguistic pattern that made those the correct answers: They (the AIs) still don't have the slightest idea what they're talking about and cannot reason. Giving one of them any degree of authority over a vulnerable person is criminal negligence at best.
 
Upvote
0 (1 / -1)
Geeeez the cynicism in this comment section :rolleyes: I think this looks pretty cool. I'd be keen to try it out.

Edit: and downvotes. Ars used to be a pretty fun place....
There’s a huge difference between cynicism and pattern recognition.

I would rather get testicular cancer than trust google with my data.
 
Upvote
0 (0 / 0)

real mikeb_60

Ars Tribunus Angusticlavius
12,177
On the bright side, at least it's paywalled. For now.
Paywalled or not, it's collecting data. And if it's from Google and part of the System, you probably can't turn that off (at least completely). You just have to pay for the privilege of using it. Ah well, guess I'll have to start covering the cameras on the phone, much like the post-it covering the old laptop camera. Perhaps phones could come with little sliding shutters over the camera(s) like the one on my Lenovo Flex laptop?
 
Upvote
0 (1 / -1)
I think this is yet another reason that the immediate profit-maximization business culture, combined with a lack of any real protections for users' privacy or data rights, are a tragedy.

Sure, there are a number of details to work out with something like this (as comments have flagged already). But it's a great example of a potentially beneficial use of an emerging technology, specifically aimed at a population that could use innovative solutions.

....BUT, it requires users to allow tech companies access to huge amounts of data about their private everyday lives: everything from food preferences (advertising $$$!) to medical issues to bathroom habits (even if you keep devices out of the bathroom, certain conclusions can be drawn from behavior patterns and conversation). Especially if this is being used by people who don't fully understand the implications (say, someone with memory or cognitive issues), there has to be a high level of trust that this data is being handled sensitively and not exploited.

And that trust is entirely absent.
Yeah. New tech needs to be evaluated not on the possible or even probable benefit it can bring, but on the potential for abuse and the cost of nefariousness-ity. New word, heard it here first.
 
Upvote
0 (1 / -1)