Apple has delisted several AI-powered apps from its App Store that were generating explicit images. These apps were allegedly capable of producing explicit images of people without their consent, and Apple has taken action against them because the company does not allow the creation or distribution of nude images through its platform.
Generative AI technology is a useful assistive tool for designers and photographers, who can create images from text prompts. However, the same technology has been misused to create deepfakes and non-consensual imagery. In a report by 404 Media, several AI image-generation apps were identified on the App Store, offering features such as face swaps on adult images and virtual stripping. After being alerted to these apps and their related advertisements, Apple removed them from the App Store.
How does Apple determine which apps should be delisted?
Apple determines which apps to delist based on various factors outlined in its App Store Improvement process. This process evaluates aspects such as:
- Titles
- Content quality
- Compliance with guidelines
- Safety
Additionally, Apple may remove apps that have not been updated for an extended period, apps containing restricted content, or apps that pose privacy or security risks.
Conclusion
The company aims to maintain high standards of quality, safety, and compliance within its App Store ecosystem while ensuring that apps meet the criteria necessary to provide a positive user experience.
This recent decision also reinforces Apple's effort to prohibit the creation and sharing of this type of content within its app ecosystem.