Friday, December 13, 2024

Apple is warning that artificial intelligence (AI) systems could manipulate real-world images to create deceptive and potentially harmful ‘deepfakes’.

As Apple prepares to debut its AI-powered photo features, the tech giant is once again confronting the question of what constitutes a photograph. At a recent conference, Apple’s senior vice president of software engineering, Craig Federighi, outlined the company’s plans to develop AI-driven image editing tools that preserve photographic integrity.

Apple’s products and phones are widely used, Federighi noted, so “it’s crucial for us to provide accurate information, rather than perpetuating myths.”

Do we want to make it easy to remove that water bottle or microphone? Federighi used the water bottle as an example of the kind of object Clean Up, the feature that removes unwanted items from the background of an image, is designed to handle. “The demand to clean up seemingly irrelevant details in a photo has been overwhelming, and so we’ve taken a small but meaningful step.”

Unlike some competitors that offer such features, Apple Intelligence does not currently allow customers to apply generative AI manipulations to photographs. Photos edited with the object removal feature are marked as “Modified with Clean Up” in the Photos app and carry metadata indicating their modified status.
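Apple has not published the exact format of that edit metadata, and the sketch below does not read the “Modified with Clean Up” label itself. As a rough illustration of the general mechanism, PhotoKit already lets an app check whether a library asset carries any adjustment (edit) data at all; the helper names here are hypothetical.

```swift
import Photos

// Rough illustration only: an edited asset in the photo library stores an
// adjustment-data resource alongside the original, while an unedited photo
// has no such resource. (Requires photo library read authorization.)
func assetHasEdits(_ asset: PHAsset) -> Bool {
    let resources = PHAssetResource.assetResources(for: asset)
    return resources.contains { $0.type == .adjustmentData }
}

// Example usage: count how many of the most recent photos have been edited.
func countRecentlyEditedPhotos(limit: Int = 50) -> Int {
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    options.fetchLimit = limit

    let assets = PHAsset.fetchAssets(with: .image, options: options)
    var edited = 0
    assets.enumerateObjects { asset, _, _ in
        if assetHasEdits(asset) { edited += 1 }
    }
    return edited
}
```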
