ImageGuesser Daily AI Detection Game
M04. TOOLS & DATA

Digital Forensics

When your eyes fail you, use tools. Digital files carry invisible fingerprints that can reveal their synthetic origins.

Context Verification

The most powerful tool isn't a complex algorithm—it's Reverse Image Search. When a controversial image appears (e.g., "Explosion at the Pentagon"), verify it immediately.

The "Context Void" Theory

Real major events generate thousands of photos from hundreds of angles. Deepfakes usually start as a single image with no corroborating angles. If only one picture exists of a massive event, it is almost certainly fake.

  • Google Lens: Great for finding the exact source or debunking articles.
  • TinEye: Excellent for finding the oldest version of an image (see the sketch below).
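
If you script your checks, the lookups above can be built as plain URLs for an image that is already hosted somewhere. Here is a minimal Python sketch; the query patterns are assumptions about how each service accepts a source-image parameter (verify them before relying on this), and the example image URL is hypothetical.

from urllib.parse import quote

def reverse_search_urls(image_url: str) -> dict:
    """Build reverse-image-search lookup URLs for a publicly hosted image."""
    encoded = quote(image_url, safe="")
    return {
        # Google Lens: search by the image's URL
        "google_lens": f"https://lens.google.com/uploadbyurl?url={encoded}",
        # TinEye: sort results oldest-first on the site to find the original upload
        "tineye": f"https://tineye.com/search?url={encoded}",
    }

if __name__ == "__main__":
    for name, url in reverse_search_urls("https://example.com/pentagon.jpg").items():
        print(f"{name}: {url}")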

Metadata & EXIF

Real photographs contain EXIF data: Camera Model, Shutter Speed, ISO, Date Taken. AI images are born in a void: no camera, no lens, no capture settings to record.
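
A minimal sketch of that check using Pillow: read the base EXIF directory and print a few tags a real camera usually fills. "suspect.jpg" is a hypothetical filename; an empty result means the image is either synthetic or has had its metadata stripped (see the caveat below).

from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(path: str) -> dict:
    """Return human-readable EXIF tags, or an empty dict if none exist."""
    with Image.open(path) as img:
        exif = img.getexif()
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

if __name__ == "__main__":
    tags = read_exif("suspect.jpg")
    for key in ("Make", "Model", "DateTime", "Software"):
        print(f"{key}: {tags.get(key, '<missing>')}")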

However, be warned: social media platforms strip EXIF data on upload. A photo pulled from Twitter will have no metadata whether it's real or fake, so this technique only works on original files (sent via email, Discord, or direct download).

Typical AI Header Tags

Creator: "Midjourney v5.2"
Parameters: --ar 16:9 --s 750 --v 5.2
Software: "Adobe Firefly"

Error Level Analysis (ELA)

When a JPEG is saved, the whole image is compressed. If you photoshop an object into an existing photo, that new object will have a different "compression level" than the background, because the pasted region has been through fewer save-and-compress cycles than the rest of the file.

ELA tools visualize these compression differences. In an untouched photo, the error noise should be roughly uniform. In a manipulated photo, the fake object will glow brightly (or sit pitch black) compared to everything else.
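
A minimal sketch of how an ELA tool does this, using Pillow: re-save the image as a JPEG at a fixed quality, take the per-pixel difference against the original, and brighten it so uneven compression stands out. The quality (90) and brightness scale (15) are arbitrary starting values, and the filenames are hypothetical.

import io
from PIL import Image, ImageChops, ImageEnhance

def error_level_analysis(path: str, quality: int = 90, scale: float = 15.0) -> Image.Image:
    """Return an amplified difference image that highlights uneven compression."""
    original = Image.open(path).convert("RGB")

    # Re-compress in memory at a fixed JPEG quality
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    recompressed = Image.open(buffer).convert("RGB")

    # Per-pixel difference, amplified so small errors become visible
    diff = ImageChops.difference(original, recompressed)
    return ImageEnhance.Brightness(diff).enhance(scale)

if __name__ == "__main__":
    error_level_analysis("suspect.jpg").save("suspect_ela.png")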

Note: Fully AI-generated images (not edits) often pass ELA because the entire image was generated at once, creating uniform noise. ELA is best for spotting "bad photoshop" edits rather than full AI generations.