Microsoft Engineer Warns AI Tool Creates Violent, Sexual Images, Ignores Copyrights

Shane Jones, an artificial intelligence engineer at Microsoft, has raised concerns about the company's AI image generator, Copilot Designer, which has been producing inappropriate and potentially harmful images, CNBC reports.

While testing the product, Jones discovered that it generated images inconsistent with Microsoft's responsible AI principles, including depictions of violence, sexualization, underage drinking, and drug use, as well as copyright violations featuring Disney characters.

Although Jones reported his findings to Microsoft, the company has not taken the product off the market. He escalated his concerns by writing letters to the Federal Trade Commission and Microsoft's board of directors, urging them to address the issues with Copilot Designer.

Jones highlighted the potential harm caused by AI, particularly around misinformation and election-related content. He emphasized the need for stronger safeguards and accountability mechanisms to prevent the spread of harmful images generated by AI tools like Copilot Designer.