Facebook wants to use Meta AI on photos in your camera roll that you haven't shared

Facebook is asking for access to users' camera rolls so Meta AI can suggest edits and themed creations via cloud processing, raising data privacy concerns.

Facebook is prompting users who create new Stories in its app to opt into "cloud processing" for AI-edited photo suggestions. The feature uploads unshared photos to Meta's cloud so it can automatically suggest collages, recaps, and restylings. Users must agree to Meta's AI Terms, which permit analysis of their media and facial data. While the move underscores Meta's AI ambitions, it also raises privacy concerns about user consent and data handling.

Meta is prompting some Facebook users to enable a "cloud processing" feature that uploads photos from their camera roll to Meta's servers, even if those images have never been shared. Meta AI can then analyze the images and their metadata, such as location, timestamps, and what they depict, to generate personalized features like recaps, stylized edits, and birthday collages. Users must agree to Meta's AI Terms of Service to activate the feature, and the terms' language suggests Meta can retain and use these photos and their data, raising privacy concerns.

Unlike past AI training practices that drew on publicly shared content, this move marks a shift toward private, unpublished images on a user's device. It also differs from Google's approach: Google explicitly states that it does not use Google Photos data for AI training. Meta has not clarified whether it will use the uploaded private photos to train its AI models, despite the broad language in its terms of service.

Meta claims the photos are used only to generate suggestions visible to the user and are not currently used to improve its AI models. However, the uploaded content is stored for 30 days, and users can have it deleted by disabling cloud processing. These controls are available in the app's settings, but many users are unaware of the scope of what they're agreeing to.

Critics say the feature could mislead users, as the opt-in process is framed as a creative tool rather than a data-sharing agreement. Some have already noticed AI-generated restyles of their photos, including anime versions of wedding images, without fully realizing what permissions they had granted. This raises ethical questions about informed consent and the opacity of Meta’s data handling.

Privacy advocates remain concerned about how this sets a precedent for deeper AI access into personal media libraries. Though the rollout is currently limited to regions like the U.S. and Canada, there are fears this approach could become standard globally, blurring the line between user creativity and surveillance.

Sources: The Verge, TechCrunch, AppleInsider