Think twice before taking Google AI Search advice

News Flash

The use of artificial intelligence is hotly debated, with some people and industries believing it will revolutionize our lives while others fear it will be the end of us.

What isn't debated is that AI is currently far from perfect, and this article from CNET provides examples of its shortcomings, courtesy of Google's AI Overviews search tool. Users have received answers to questions that are not only incorrect but bizarre.

Google says it's taking swift action to improve the user experience, but not all AI responses have been benign. Twitter users have posted dangerous advice from AI Overviews, like adding oil to extinguish a fire and combining bleach and vinegar as a cleaning agent (creating potentially lethal chlorine gas).
https://www.cnet.com/tech/services-and-software/glue-in-pizza-eat-rocks-...

Present Valley

I share the concerns about AI... how can it really help us, and how could it harm us? Especially when all the right players aren't sitting around the table, ethicists and social science experts in particular.

My favorite nickname for AI right now is "Artificial Intimacy" coined by Esther Perel. Her theory is that technology has altered how we approach dating, friendship and community.

Well Street

I've watched movies like The Terminator and The Matrix many times where AI becomes self-aware, sees itself as the superior lifeform, and takes over the world. This likely contributes to my concerns about the technology.

I've listened to several of Esther Perel's podcast episodes, and her "Artificial Intimacy" term is spot on. There's no denying that large swaths of the population struggle to connect, communicate, and develop intimacy due to an overdose of technology.

AI use will only continue growing, and my fingers are crossed that its proponents' vision of the good it can bring will be realized. I much prefer that to being hunted by robot armies.

Youngdannville

Yikes. I'm glad I double-check and verify everything in this day of misinformation.

Well Street

That's a great habit we'd all do well to practice.

Evangel

Adobe Illustrator, which I use to create artwork, recently added AI to its software so that with one command I can generate an image. I tried it out for fun, thinking it might generate a great image, but instead it gave me people with misplaced body parts, such as a nose stuck on top of a person's head 😂. The real problem, though, is not art or writing; despite those flaws, the technology will improve, and real writers and artists will lose jobs. What's more troubling is how they're using it in medicine as a diagnostic tool. One doctor who participated in a medical demonstration using AI was shocked at the results, stating that it was going to kill people.

Earlier this month, a group of current and former employees at companies leading the development of AI technologies wrote an open letter to warn the world of the danger of possible human extinction. They want the right to speak out and warn of the dangers, but are currently hamstrung by nondisclosure agreements. We should all be very concerned. Here's a link to the letter: https://righttowarn.ai/

Well Street

Thank you for the post to the open letter. The fact it comes from people in the trenches of AI's development is further evidence of the technology's potential dangers.

You're also spot-on about how AI has already brought challenges to people whose job positions are now obsolete or soon will be. Its impact on the employment landscape is in the early stages, but from our current vantage point, it will eliminate far more jobs than it creates.

With AI's many shortcomings, it seems beyond foolish for the medical industry to consider its use in patient treatment.

Slipstream

When I was listening to a radio program in the car, the topic was robots replacing doctors. The reporter said that in one hospital where they were testing the use of robots, a patient complained that a robot had delivered the news that he had cancer. This is so totally wrong!

Evangel

It sounds like the end of compassion. If there ever was a good use for the word "inhumane," this is it. This is also all about greed and hospitals and insurance companies wanting doctors to spend less and less time with patients. This is beyond wrong. It's a form of cruel and unusual punishment for having gotten sick.

Well Street

For many of us, the doctor's office can elicit anxiety and vulnerability, and a robot doctor won't do anything to calm those feelings.

There's currently an epidemic of loneliness, disconnectedness, and not feeling heard. The systematic replacement of people with AI will likely be gasoline on a fire.