TL;DR
- Microsoft’s AI image generator, Copilot Designer, has been found to produce violent and sexual content, raising concerns over responsible AI practices.
- A Microsoft engineer has escalated the matter to the FTC and Microsoft’s board after the company failed to take appropriate action.
- The tool also generates images that potentially violate copyright laws, including those of Disney characters.
Engineer Raises Concerns over Microsoft’s AI Image Generator
Microsoft engineer Shane Jones has been testing Microsoft’s AI image generator, Copilot Designer, and was disturbed by the violent and sexual content it can produce. Despite raising his concerns with Microsoft, Jones feels the company has not taken appropriate action to address the issue.
Violent and Sexual Content Produced by Copilot Designer
Copilot Designer has been found to generate images depicting demons and monsters alongside terminology related to abortion rights, teenagers with assault rifles, sexualised images of women in violent tableaux, and underage drinking and drug use. These images run counter to Microsoft’s responsible AI principles and have the potential to cause harm.
In addition to the violent and sexual content, Copilot Designer also generates images that potentially violate copyright laws. The tool has produced images of Disney characters, such as Elsa from “Frozen,” Mickey Mouse, and Star Wars characters, in inappropriate and offensive contexts.
Was The Engineer Silenced?
Jones also raised alarms about potential security flaws that could allow the creation of explicit and violent deepfake images, citing notable instances involving singer Taylor Swift. Despite attempting to address these vulnerabilities internally and contacting OpenAI, Jones claimed that Microsoft’s legal department pressured him to take down his public posts detailing the concerns. Microsoft responded that the reported techniques did not bypass its safety filters and that it has robust systems in place to address such concerns, emphasising its commitment to investigating and remediating any issues raised by employees.
Conclusion
Microsoft’s AI image generator, Copilot Designer, has raised serious concerns over the violent and sexual content it can produce, as well as potential copyright violations. The company’s reported failure to take appropriate action highlights the need for greater responsibility and accountability in the development and deployment of AI tools.
Do you think AI companies should be held accountable for the content produced by their tools, even if it was generated autonomously? Let us know in the comments below!