Microsoft Engineer Says Company’s AI Tool Generates Sexual And Violent Images

Mr Jones claims he previously warned Microsoft management but saw no action.

A Microsoft AI engineer, Shane Jones, raised concerns in a letter on Wednesday. He alleges the company's AI image generator, Copilot Designer, lacks safeguards against generating inappropriate content, such as violent or sexual imagery.

Microsoft Worker Says AI Tool Tends To Create “Sexually Objectified” Images

A Microsoft Corp. software engineer sent letters to the company's board, lawmakers and the Federal Trade Commission warning that the tech giant is not doing enough to safeguard its AI image generation tool, Copilot Designer, from creating abusive and violent content. Shane Jones said he discovered a security vulnerability in OpenAI's latest DALL-E image generator …