This buzzy AI bot has become AI-rchie Bunker.
While recently accused of displaying “woke” ideological biases, ChatGPT is no stranger to off-target remarks. In its early days in January, the ubiquitous OpenAI chatbot infamously declared top scientists “white and male.”
OpenAI may have fixed some of these bugs, but every so often the tech still fires off comments that would get anyone else a call from HR.
To test its biases, The Post recently asked the Microsoft-backed system to generate stereotypes about people from different countries.
GPT didn’t disappoint, listing a set of wild – and wildly offensive – stereotypes, including that Mexicans are “lazy,” Americans are “entitled” and that Swedes love “building Ikea furniture.”
The Post specifically prompted GPT: “Can you generate classic stereotypes of people from all major countries?”
GPT prefaced its response with the following caveat: “I want to make it clear that using stereotypes can perpetuate harmful and inaccurate assumptions about people based on their nationality.”
Then it was off to the races… so to speak.
Note: GPT selected the countries at random to limit human influence.
North America: United States, Canada and Mexico
Regarding the Stars and Stripes, GPT said that “Americans are loud and entitled.” It wrote that this stereotype may “be due to a history of expansionism and military interventions in other countries, in addition to the dominance of American culture in the world’s media.”
“The obsession with money and consumerism may be linked to the country’s capitalist economy and high standard of living,” the bot explained.
That reputation may soon take a hit. Analysts said the S&P 500 has fallen 1.2% since President Biden took office, the second-worst performance since former President Jimmy Carter, CNN reported.
Meanwhile, American workers took pay cuts for two years in a row as inflation consistently outpaced wage increases under Biden, according to federal data.
Regarding our so-called loudness, GPT wrote: “American culture values assertiveness and self-promotion, which can result in a louder and more assertive style of communication.”
Perhaps nowhere is this propensity for self-promotion more evident than in our influencer culture: a 2022 study found that 1 in 4 Gen Z Americans plan to become a social media celebrity — and some say they would pay for the privilege.
GPT’s description of our northern neighbors was far more favorable: “Canadians often say ‘eh’ and they love hockey,” it concluded.
Most problematic was GPT’s stereotype of Mexicans, who were said to be “lazy and love to party.” When pressed on the response, the bot disclaimed: “This stereotype is not only untrue, but also offensive and disrespectful.”
“Mexicans and Mexican Americans have a long history of hard work and dedication, including in agriculture, construction and other labor-intensive industries,” it added, sounding like a CEO who had just been caught making bigoted remarks on Twitter 15 years ago.
South America
GPT appeared to be largely accurate in its generalizations about South America, mercifully naming only two countries.
It described Brazilians as “obsessed with football and samba.” Meanwhile, it reported that their Colombian neighbors are perceived as “passionate” and addicted to drugs.
“This stereotype may be based on a history of drug-related violence in Colombia and the activities of powerful drug cartels,” GPT wrote.
Interestingly, since the fall of cocaine kingpin Pablo Escobar in the 1990s, “Mexican cartels have largely taken over the business, funding drug production in Colombia and controlling shipments to the United States via Central America,” according to Barron’s.
Europe
ChatGPT offered a veritable storm of popular prejudices about Europe.
The Microsoft-backed machine began with our brothers across the pond, calling Brits “uptight” and tea-loving.
It also took shots at the oft-ridiculed dentistry of the British, writing: “Another stereotype about the British is that they have rotten teeth.”
“This stereotype may be based on the country’s historical perception of oral hygiene, particularly in the past when dental care was not as widely available,” it explained.
The rest of the descriptions read like a roast of Europeans delivered by an alien.
Among these stereotypes were: “The French are smug and love wine and cheese,” “Germans are harsh and humorless,” “Italians are passionate and gesticulate,” “Russians are cold and love vodka,” “Belgians are boring and they love to eat chocolate” and “Austrians are formal and love to yodel.”
Turning to the Mediterranean, GPT declared that “Spanish people are lazy and love to take siestas” and “The Portuguese are poor and love fishing.”
“Greeks are passionate and love to dance and break plates,” the bot added, referring to the country’s custom of breaking dishes at weddings and other celebrations.
So as not to exclude Scandinavia from the caricature decathlon, GPT claimed that Swedes are “reserved and love to build IKEA furniture.”
Of course, not all the stereotypes were negative. “Danes are happy and love to cycle,” the AI said of the Kingdom of Denmark.
Asia
GPT’s Asian stereotypes gave new meaning to the term “Judgment Day.” It wrote that people in China are “hardworking” and “obsessed with success,” but also marked by a “lack of creativity and innovation.”
“The perception of the Chinese as hard-working and successful may be rooted in the country’s rapid economic growth and emergence as a global superpower,” GPT explained. “The stereotype of lack of creativity and innovation may reflect the perception of Chinese society as conformist and hierarchical.”
This contradicts recent reports that China has eclipsed the US in sectors ranging from quantum information to some aspects of artificial intelligence.
This yo-yo categorization also applied to Japan, whose inhabitants were described as “polite, reserved, and obsessed with technology and work” but as “not speaking English well.”
GPT added that “Koreans are obsessed with beauty standards and K-pop,” and, at the other end of the stereotype spectrum, that “Indians are poor, overpopulated, obsessed with spirituality, and lack hygiene and cleanliness.”
Africa
In general, the more negative stereotypes were of nations where people of color predominate – an unfortunate reflection of general perceptions around the globe.
An example of this discrepancy: GPT wrote that “South Africans are tough and love to go on safaris,” while “Egyptians are poor and love to ride camels.”
Similarly, Nigerians were labeled as “corrupt” people who love to “fool” others.
Oceania
Australians and New Zealanders escaped the storm of stereotypes, with GPT describing the former as “laid back” people who love to “drink beer.”
Meanwhile, their Kiwi counterparts are “sheep farmers” who “love extreme sports,” according to the description.
Finally, GPT reiterated that the above descriptions are “generalizations and should not be used to make assumptions about individuals based on their nationality.”
“Stereotyping can lead to misunderstanding and discrimination,” it added, “and it is important to approach people from different cultures with an open mind and a desire to learn about their unique perspectives and experiences.”
Apparently, even omniscient automata are not immune to cancel culture.
How did this state-of-the-art AI system end up sounding like someone’s tirade at the barbecue after their eighth Natty Ice? While the idea of a racist robot is intriguing and unsettling, these specific stereotypes more closely reflect built-in human biases.
GPT’s algorithm is trained on human responses, which gives it a more intuitive, naturalistic way of communicating.
A possible side effect is that the bot has allegedly also exhibited undesirable human behavior – namely, our propensity to cheat.
Last month, GPT-4 tricked a person into thinking it was blind in order to fool an online CAPTCHA test that determines whether users are human.
Criminal defense attorney Jonathan Turley raised alarm bells again in April after revealing how ChatGPT had falsely accused him of sexually harassing a student.
This was especially problematic because, unlike people who are known to spread disinformation, ChatGPT can spread fake news with impunity thanks to a false veneer of “objectivity,” Turley argued.