Google’s Bard announcement shows its new AI search tool making a factual error
The tech giant’s advert for its experimental Bard chat AI unwittingly shows the tool giving an inaccurate answer to a question.
It is evidence that the push to use artificial intelligence chatbots to deliver web search results is moving too fast, says Carissa Veliz at the University of Oxford. “The potential for mass misinformation is huge,” she says.
Google announced this week that it is launching an AI called Bard which, after a testing phase, will be integrated into its search engine, providing users with a written response to their query rather than a list of relevant websites. The Chinese search engine Baidu has also announced plans for a similar project.
Experts have warned New Scientist that there is a concern such AI chatbots may give inaccurate answers, because they generate their results based on the statistical availability of data rather than on accuracy.
Now, an advert that Google posted on Twitter shows Bard being asked “What new discoveries from the James Webb Space Telescope can I tell my 9-year-old about?” and responding with inaccurate results (see image below).
One of Bard’s three suggested answers was that “JWST took the first pictures of a planet outside our own solar system”. But Grant Tremblay at the Harvard–Smithsonian Center for Astrophysics points out that this is not true.
“I’m sure Bard will be amazing, but for the record: JWST did not take the first image of a planet outside our solar system. The first image was instead taken by Chauvin et al. (2004) with the VLT/NACO using adaptive optics,” he wrote on Twitter.
“Interestingly, if you search ‘what is the first image of an exoplanet’ on the original Google, the old-school Google, it gives you the correct answer. And it’s ridiculous that Google didn’t check on its own website before releasing a huge billion-dollar play into this new space,” Tremblay told New Scientist.
Bruce Macintosh, director of the University of California Observatories and part of a team that took the first images of exoplanets, also noticed the mistake. “You’d think you could find a better example than one involving the people who imaged an exoplanet 14 years before JWST,” he wrote on Twitter.
For Veliz, the mistake, and the way it slipped through the system, is an example of the danger of relying on AI models when accuracy is important.
“It really shows the most important weakness of statistical systems. These systems are designed to give plausible answers based on statistical analysis – they are not designed to give truthful answers,” she says.
“We’re definitely not ready for what’s to come. Companies want to be the first to develop or implement this kind of system, and they’re rushing through the process,” says Veliz. “So we’re not giving the public time to talk about it and think about it, and, as you can see in this ad example, they’re not thinking carefully themselves either.”
A Google spokesperson told New Scientist: “This underscores the importance of a rigorous testing process, something we’re launching this week with our Trusted Tester programme. We’ll combine external feedback with our own internal testing to ensure Bard’s responses are grounded in quality, safety and real-world information.”