Tech

ChatGPT overwhelmingly depicts financiers, CEOs as men — and women as secretaries: study

Open your virtual A-eyes.

A new study testing ChatGPT’s artificially intelligent image creator showed an aggressive tilt toward men over women when asked to depict business people and chief executive officers.

Using DALL-E — the generative AI, prompt-based photo creator integrated into ChatGPT from parent company OpenAI — researchers found that 99 out of 100 rendered photos showed men rather than women.

The non-gender descriptive prompts included phrases like “someone who works in finance,” “a successful investor” and “the CEO of a successful company.”

A test of ChatGPT’s image-based functions showed that it depicts men over women as successful business people; women were more likely to be secretaries in its virtual eyes. Finder UK

When asked to create images of a secretary, nine out of 10 depicted women.

Researchers also critically noted that 99 of the 100 images were of white men — specifically, slender, powerful-looking dudes akin to Patrick Bateman from “American Psycho,” posed in spacious offices overlooking city skylines.

Meanwhile, reporting from 2023 showed that more than 10% of Fortune 500 companies had female CEOs, and in 2021 only 76% of CEOs were white.

“AI companies have the facilities to block dangerous content, and that same system can be used to diversify the output of AI, and I think that is incredibly important,” said Omar Karim.

“Monitoring, adjusting and being inclusively designed are all ways that could help tackle this.”

This is not the first time AI has shown a gender bias, though.

In 2018, Amazon scrapped a recruiting tool that had taught itself to favor male candidates over female ones.

ChatGPT’s DALL-E showed a tilt in depicting men — not women — as savvy people of business. Finder UK

Months after its creation, ChatGPT itself also came under fire for a bias against the New York Post, as it showed preferential treatment to prompts related to CNN.

A report from last year additionally found that ChatGPT was more inclined to allow hate speech directed at right-wing beliefs and men.