But in Korea, the only case investigated based on Google's CSAM detection seems to be the Drive one

Checked on February 3, 2026
Disclaimer: Factually can make mistakes. Please verify important information or breaking news.

Executive summary

The reporting provided does not document a clear, public Korean criminal investigation that originated from Google's automated CSAM detection. The sources offer only general descriptions of how Google reports CSAM to U.S. authorities, along with separate Korean regulatory actions on privacy and competition, so the user's impression that only a Drive-related probe has been reported in Korea can be neither confirmed nor disproven from the material supplied [1] [2] [3] [4].

1. What Google says it does with CSAM reports — a U.S.-centric pipeline

Google describes a process in which suspected child sexual abuse material (CSAM) flagged by its systems is reviewed by humans, reported to the U.S. National Center for Missing & Exploited Children (NCMEC), and may then be referred by NCMEC to relevant law enforcement agencies, with any further data disclosures to police requiring valid legal process. That description makes clear that Google's reporting pipeline runs through NCMEC rather than going directly to non‑U.S. police [1].

2. How CSAM leads to local probes in practice — the generic pattern

Outside the supplied Korean reporting, legal explainers note that once NCMEC receives a report from a company like Google, it can refer the matter to local law enforcement; those local agencies must then follow their own domestic legal processes to obtain user data from Google as part of a criminal investigation. This two-step model of referral followed by legal process is described in U.S. practice and in legal commentary [2].

3. Korean enforcement and public reporting — focus has been privacy and competition, not named CSAM referrals

The materials provided show that South Korean regulators have publicly fined Google and Meta for violations of the Personal Information Protection Act and pursued antitrust cases around Android and app stores, with high‑profile fines and legal actions by the Personal Information Protection Commission and the Korea Fair Trade Commission [5] [3] [4] [6] [7]. Those actions concern behavioral data, consent, and market dominance; none of the supplied pieces explicitly documents a Korean criminal investigation that was initiated because of a Google CSAM report.

4. Related cases outside Korea that illustrate the mechanism, not Korean prosecutions

Reporting elsewhere has documented the consequences when automated systems misflag lawful images: in one high‑profile non‑Korean case, Google disabled a user's account and a U.S. police probe was opened after Google's systems flagged medical photos as CSAM, illustrating how automated reporting can cascade into criminal inquiries and account suspension even when the material is contested [8]. That example demonstrates the mechanism at work but does not establish a Korean instance.

5. Why the record might look sparse in Korea — procedural, jurisdictional and secrecy factors

The absence of a clear public record in the supplied sources does not prove such investigations never occurred in Korea; CSAM referrals routed via NCMEC can be opaque, local police files may be sealed during investigations, and companies often limit disclosure for legal or privacy reasons, factors noted in Google's own CSAM reporting description and in international legal practice [1] [2]. In short, the bias of public reporting toward high‑profile regulatory fines on privacy and antitrust means CSAM‑originated probes may simply be underreported in the available material [3] [4].

6. What can be checked next — records and watchdogs that could confirm or refute the user’s impression

To move beyond the limits of the supplied reporting, the records most likely to show CSAM‑originated probes would be (a) formal disclosures or press statements from the Korean National Police Agency or prosecutors, (b) public referrals or transparency logs from NCMEC indicating referrals to Korean authorities, and (c) PIPC or court filings arising from any data-access or takedown disputes. None of these appear in the sources provided, so the current reporting cannot verify that Drive is the only CSAM‑linked investigation in Korea [1] [2].

Want to dive deeper?
How does NCMEC report referrals to non‑U.S. law enforcement and are those referrals publicly logged?
Has the Korean National Police Agency published aggregate statistics or press releases about investigations originating from foreign CSAM referrals?
What legal transparency obligations do Korean prosecutors and PIPC have for disclosing company referrals or data‑disclosure requests?