Thailand News and Discussion Forum | ASEANNOW


Opinion: Google’s AI blunder over images reveals a much bigger problem

Featured Replies


The recent controversy surrounding Google's AI system, Gemini, has sparked a broader conversation about the implications of AI manipulation and censorship. Like the polite yet uncooperative HAL from "2001: A Space Odyssey," Gemini found itself at the center of criticism when it politely refused to generate images of historically White figures, citing concerns about perpetuating harmful stereotypes.

 

If Big Tech organizations such as Google, which have become the new gatekeepers to the world's information, are manipulating historical information to fit ideological beliefs and cultural edicts, what else are they willing to change? In other words, have Google and other Big Tech companies been manipulating information, including search results, about the present or the past because of ideology, culture or government censorship?

 

In the 21st century, forget censoring films, burning books or creating propaganda films as forms of information control. Those are so 20th century. Today, if it ain’t on Google, it might as well not exist. In this technology-driven world, search engines can be the most effective tool for censorship about the present and the past. To quote a Party slogan from George Orwell’s “1984,” “Who controls the past controls the future: who controls the present controls the past.”

 

While Google's intentions to combat bias are commendable, the fallout from Gemini's actions revealed a deeper issue within the realm of AI technology. Previous AI systems have exhibited clear biases, from facial recognition software's failure to identify Black individuals to loan approval algorithms discriminating against minorities. In an effort to rectify these biases, Google may have overcorrected with Gemini, leading to unintended consequences.

 

The underlying problem lies in the training data used to develop AI systems, which often reflect existing societal biases. As AI becomes more sophisticated, the potential for manipulation of historical information and censorship looms large. With Google and other Big Tech companies serving as gatekeepers to vast amounts of information, questions arise about the extent to which ideological beliefs and cultural considerations influence the presentation of historical facts.

 

In an era where search engines wield significant influence over what information is accessible, the rise of AI-driven conversational tools like ChatGPT poses new challenges. As more individuals turn to AI for information retrieval and summarization, the risk of biased or manipulated content proliferating increases.

 

Furthermore, the inherent hallucination problem in AI adds another layer of complexity to the issue. AI systems have been known to generate fictitious content, blurring the lines between reality and fabrication. This raises concerns about the potential for AI leaders to impose their own rules and biases on the information presented, further exacerbating issues of censorship and manipulation.

 

The implications of this AI blunder extend beyond concerns about diversity, equity, and inclusion. It serves as a cautionary tale of the dangers posed by unchecked AI development and the need for robust safeguards to prevent manipulation and censorship. As AI continues to evolve, vigilance and oversight will be essential to ensure that it serves as a tool for knowledge dissemination rather than a mechanism for control and distortion.

 

11.04.24

Source


 

Correct me if I'm wrong, but we do not have autonomous AI yet, so if the results are incorrect, it was a programming error.

AI, as far as I understand it now, is just a fast computer that has more information than ever before. How it takes a question, looks at the information and replies (or produces a picture) is still dependent on the human programming.

 

True or real AI will learn and process information without any human input or program, and will be completely autonomous. HAL was autonomous and decided that the humans were a threat to itself, so it decided to eliminate them. One hopes that an autonomous AI is not given the ability to launch a nuclear strike.

25 minutes ago, Skipalongcassidy said:

AI defined... GARBAGE IN - GARBAGE OUT... as proven recently by google.

Couldn't have put it better myself.

🏆

1 hour ago, thaibeachlovers said:

Correct me if I'm wrong but we do not have autonomous AI yet, so if the results are incorrect it was a programming error.

 

A programming error, or a narrative feature?

 

The "blunder" was making it so obvious.  It's supposed to be subtle.

 

 


Oh dear. The mask has slipped and Google's ugly Woke/Leftist culture has been laid bare for all to see.  

It wasn't an error, it was deliberate.

 

We just don't know why they chose to do it

We are lucky to have so many AI experts at AN!

😀

1 hour ago, candide said:

We are lucky to have so many AI experts at AN!

😀

It might be reasonable to assume that there are just as many (self-professed) experts here on AN, in almost every subject that can be found on Google.
