Does Snapchat scan or analyze images locally before saving to Memories or My Eyes Only, and how do its AI suggestions work?
Executive summary
Snapchat’s public documentation makes clear that its AI features use images and text supplied by users to generate responses, and that some AI tools are powered by on-device research models and third-party LLMs. However, the company’s support pages do not state unequivocally whether images saved to Memories or stored in My Eyes Only are scanned locally before being written to device storage; instead, the documentation describes how submitted images are processed by models and how users can opt out of training [1] [2] [3].
1. How Snapchat frames “Generative AI” and image processing
Snapchat’s support pages define generative AI as models that learn from large datasets and state explicitly that “when you share text and images with generative features, the AI uses that information to provide the generated … response to you,” confirming that images sent into AI flows are processed by models to produce outputs [1]. Snapchat also documents dedicated generative features in Memories (AI Snaps) that create personalized images from selfies users submit, which demonstrates that submitted images feed the generation pipeline [2].
2. What the company says about My AI, storage and model backends
Snapchat states that My AI is powered by large language models such as GPT and Gemini variants and that content shared with experimental bots can be stored and used in the app’s database, implying server-side handling of chats and images sent to that bot [4] [5]. Support guidance on reply suggestions also confirms that the app generates suggested replies during chats, a behavior consistent with either server-side or on-device inference producing suggestions from conversation context [6].
3. The local‑vs‑cloud ambiguity in the official reporting
There is no direct statement in the provided Snapchat support material that Memories or My Eyes Only are locally scanned with on‑device AI before being saved; the company explains what AI features do with content users submit but does not publish a clear checklist of which operations occur purely on device versus in Snap’s cloud [1] [2]. Separately, reporting notes Snap has developed a mobile text‑to‑image research model intended for devices, but the company declined to confirm whether that specific model powers new lenses, illustrating that Snap blends on‑device research with cloud services and has not fully documented which feature runs where [3].
4. What users can control and what Snap discloses about training data
Snapchat’s help pages say users can opt out of having their public content used to train generative AI going forward, while cautioning that opting out doesn’t erase past training and that content posted publicly by others may still be processed; this disclosure concerns use-for-training rather than the immediate mechanics of local scanning or transient inference [1]. The transparency covers data use and training preferences, not a step-by-step technical audit of Memories write workflows or My Eyes Only handling [1].
5. How AI suggestions and Lens recommendations behave in practice
Feature descriptions and third-party guides show that AI suggestions surface as caption or reply recommendations, and that Lens suggestions appear based on the context of a snap or chat. These are presented to users as tap-to-use options and disappear once users start typing, behavior that aligns with real-time inference that could plausibly run either on device or on a server depending on the model used [6] [7]. Snapchat also places visible markers (sparkle icons, watermarks) on AI-generated content to help users identify machine-created images, signaling product choices to surface provenance even where processing location is not made explicit [8].
6. Practical takeaway and limits of the reporting
The evidence in official Snap documentation confirms that images shared with AI flows are processed to produce outputs, that some AI capabilities draw on on-device research models, and that My AI content can be stored server-side. The sources do not, however, provide a definitive, feature-by-feature statement that Memories or My Eyes Only items are scanned locally before being saved, so a conclusive claim about automatic local pre-scanning of all saved items cannot be supported from the provided materials. Users seeking firm technical guarantees must consult Snap’s privacy whitepaper or engineering disclosures beyond these help pages [1] [2] [3] [4].