When you share photos from a protest, you might think you are just documenting history or raising awareness. What you are actually doing is potentially putting people at risk. Every photo you post contains hidden information that can be used to identify and track protesters, with consequences ranging from job loss to criminal charges.

This guide will show you how to protect yourself and others when sharing protest photography.

The surveillance threat is real

During the 2019 Hong Kong protests, authorities had access to facial recognition technology that could match faces from video footage to police databases, and protesters responded by donning masks and using umbrellas to shield themselves from surveillance cameras. The Hong Kong movement made privacy its strongest weapon, with protesters concealing their identities and using the slogan "be water" to remain shapeless and formless, outside the reach of authorities.

In Miami, police used the facial recognition programme Clearview AI to identify a woman accused of throwing rocks during a 2020 protest. In 2024, the NYPD circumvented its own ban on facial recognition by asking a fire marshal to use the FDNY's access to Clearview AI to identify a pro-Palestinian student protester at Columbia University. The case was later dismissed, but the damage was done.

Clearview AI offers its customers the ability to secretly target and identify any of us, and then to track us, whether we are going to a protest, a religious service, or a doctor, and even to reach back in time to find us in old selfies, school and college photos.

This is not theoretical. This is happening now.

What is hiding in your photos

Before you can protect yourself, you need to understand what you are up against. Every photo contains layers of hidden information called metadata.

When you take a photo with your phone, it embeds EXIF data that can include GPS coordinates showing exactly where you were standing, the timestamp of when you took it, and details about your device. This information stays in the file even after you post it online.
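
If you want to see this for yourself, here is a short sketch using the Python Pillow library that prints every EXIF tag a photo carries, including the GPS block. The file name is just a placeholder.

```python
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

image = Image.open("protest_photo.jpg")  # placeholder file name
exif = image.getexif()

# Print every standard EXIF tag embedded in the file
for tag_id, value in exif.items():
    print(f"{TAGS.get(tag_id, tag_id)}: {value}")

# GPS data sits in its own sub-block (IFD 0x8825); this is where
# the exact latitude and longitude of the shot are stored
gps = exif.get_ifd(0x8825)
for tag_id, value in gps.items():
    print(f"{GPSTAGS.get(tag_id, tag_id)}: {value}")
```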

But metadata is just the start. The photo itself contains identifying information. Faces are the obvious risk, but there are others. Street signs reveal locations. License plates can be traced. Name badges expose identities. Even text visible in the background can give away details you never meant to share.

Research has shown that facial recognition algorithms tend to put more emphasis on the eye region, because it deforms less with facial expression than the mouth region. This means that even partial face coverings might not be enough to evade facial recognition systems.

Clean your photos before sharing

The solution is straightforward. You need to remove the hidden information and blur the identifying details before you share anything.

Start by stripping the metadata. ClearShare does this automatically. Open the app, select your photo, and you will see everything that is hidden in the file. Location data, timestamps, camera details. You choose what to remove.
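
For the technically inclined, the underlying idea is simple: rebuild the image from its pixels so none of the metadata blocks come along. A minimal do-it-yourself sketch with Pillow follows; this is the general principle, not a description of how ClearShare works internally, and the file names are placeholders.

```python
from PIL import Image

original = Image.open("protest_photo.jpg")

# Rebuild the image from its raw pixels; EXIF, GPS and other
# metadata blocks are simply never copied into the new file
clean = Image.frombytes(original.mode, original.size, original.tobytes())
clean.save("protest_photo_clean.jpg", quality=95)
```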

Next, blur faces. Not just the faces of people you know, but everyone in the photo. Peaceful protest may be protected, but that protection evaporates the moment prosecutors decide a crime was committed, and throwing objects or blocking roads can be classified as crimes. You cannot predict which actions authorities will later decide to prosecute.

ClearShare's face detection feature finds faces automatically and lets you blur them with a single tap. Do not skip this step.
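
To give a sense of what automatic face blurring involves, here is a rough sketch using OpenCV's bundled Haar cascade detector. It stands in for whatever detector ClearShare actually uses, and like any automated detector it can miss angled or partially covered faces, so always review the result by hand.

```python
import cv2

img = cv2.imread("protest_photo_clean.jpg")  # placeholder file name
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Haar cascade face detector bundled with OpenCV
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Replace each detected face with a heavily blurred version of itself
    roi = img[y:y + h, x:x + w]
    img[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)

cv2.imwrite("protest_photo_blurred.jpg", img)
```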

Then look for other identifying information. License plates need to be blurred. Street signs that show locations should be obscured. Name badges, protest signs with names, and any text that could reveal personal details all need attention.

Use ClearShare's text blur feature to handle these elements. The app detects text automatically, but you should review the photo manually as well. Automated detection is not perfect.
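
For anything the automated pass misses, the fallback is blurring hand-picked regions. A minimal sketch of that idea, with placeholder coordinates you would choose yourself during manual review:

```python
import cv2

img = cv2.imread("protest_photo_blurred.jpg")  # placeholder file name

# (x, y, width, height) boxes picked by hand during manual review;
# these coordinates are illustrative placeholders, not real values
manual_regions = [
    (120, 340, 200, 60),   # a license plate, for example
    (480, 90, 150, 40),    # a street sign, for example
]

for (x, y, w, h) in manual_regions:
    roi = img[y:y + h, x:x + w]
    img[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)

cv2.imwrite("protest_photo_final.jpg", img)
```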

What photographers need to know

If you are documenting protests, you have a responsibility to the people you are photographing.

Wire service photographers are instructed to identify themselves and ask for names whenever possible, particularly if there are fewer than three people in a photo. This is good practice, but it is not enough.

The ethical framework around protest photography is shifting. Concerns are growing over the misuse of facial recognition technologies used to target and potentially endanger activists, forcing photographers to navigate privacy concerns whilst documenting history.

Some argue that blurring faces makes images less true and removes the emotional power that drives social change. Historic images of protests galvanised the general public because we could see protesters' faces and feel the full impact of what they were feeling. This is a valid point.

But consider the alternative. Several major cities used facial recognition technology to identify Black Lives Matter protesters in 2020. Emails between police officers show some discussing whether to omit mention of Clearview AI in official reports to keep their use of the technology vague.

The technology has outpaced our ethical frameworks. You cannot rely on traditional journalistic principles when the surveillance apparatus has changed this fundamentally.

If you are shooting protest photos, blur faces before you publish. If the photo loses its impact without visible faces, reconsider whether you need to share it at all. Document the movement, not the individual protesters.

Protect yourself in the field

Prevention is better than remediation. Think about what you are capturing before you press the shutter.

Frame your shots to avoid capturing faces when possible. Photograph from behind, focus on hands and signs, capture the scale of the crowd rather than individual people. Some photographers deliberately take photos of people's backs or ask for permission before photographing faces.

Turn off location services on your camera before you leave for the protest. This prevents GPS coordinates from being embedded in your photos in the first place.

Use a burner phone or a separate camera that is not linked to your personal accounts. If authorities seize your device, they should not be able to access your main photo library or social media.

When you get home, immediately process your photos through ClearShare before uploading them anywhere. Do not post raw photos to social media, even to private accounts. Because biometric identifiers are often used to control access to secure locations and information, capturing faceprints without notice or consent creates security risks that extend well beyond the protest itself.

The technical details matter

Here is what ClearShare removes from your photos:

GPS coordinates that show exactly where you were standing. Device make and model that could identify your phone. Timestamps that reveal when you were at the protest. Camera settings that create a unique fingerprint of your device. Software information that can be traced back to you.

The app works offline, which means your photos never leave your device. Nothing gets uploaded to the cloud. Everything happens locally on your phone.
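
You can also double-check a cleaned file yourself before posting it. A quick verification sketch with Pillow, using the placeholder file name from earlier:

```python
from PIL import Image

exif = Image.open("protest_photo_final.jpg").getexif()
if len(exif) == 0:
    print("No EXIF metadata found")
else:
    print(f"Warning: {len(exif)} metadata tags still present")
```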

For face blurring, ClearShare uses on-device detection. The processing happens entirely on your phone, maintaining your privacy whilst protecting others.

The same applies to text detection. License plates, street signs, name badges: all of these can be detected and blurred without sending your photos anywhere.

What about video

Video presents additional challenges. The metadata is similar to photos, but there is more of it. Video files contain frame rates, codecs, and editing software information that can all be used for identification.

More importantly, video captures movement and sound. Even if you blur faces, gait analysis can identify individuals by the way they walk, voices can be matched to specific people, and background conversations can reveal identifying details.

If you are sharing video from protests, be even more cautious than with photos. Blur faces throughout the entire video, not just in selected frames. Consider muting audio or adding music to obscure voices.

ClearShare currently focuses on photos, but the same principles apply. Remove metadata, blur identifying features, and think carefully about what you are sharing.
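
If you need to strip video metadata yourself in the meantime, one approach is to call the widely available ffmpeg tool, which can drop container-level metadata without re-encoding. The sketch below uses placeholder file names, and it does nothing about gait, voices, or visible faces; those still need editing.

```python
import subprocess

# Drop all container-level metadata and copy the audio and video
# streams untouched, so nothing is re-encoded
subprocess.run([
    "ffmpeg", "-i", "protest_video.mp4",
    "-map_metadata", "-1",
    "-c", "copy",
    "protest_video_clean.mp4",
], check=True)
```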

This is not paranoia

You might think this level of caution is excessive. You might believe that if you are not doing anything wrong, you have nothing to hide.

Facial recognition technology allows tracking across physical locations, photographs, and videos, painting a complete picture of lives and associations, and this threat of surveillance also chills speech.

The point is not whether you have done something wrong. The point is that surveillance itself changes behaviour. When people know they are being watched, they self-censor. They avoid protests. They stop speaking out.

This is the purpose of surveillance. Not to catch criminals, but to discourage dissent.

The Legal Aid Society has accused the NYPD's Special Activities Unit of secretly working outside the bounds of its own regulations and purposefully avoiding documentation of its illicit activities. Facial recognition technology from other companies has been plagued by claims of racial bias and false identifications, leading to innocent people being accused of crimes, with at least three known instances of people being jailed after being falsely identified.

The system is not designed to protect you. It is designed to track you.

Start protecting yourself today

Every photo you share from a protest is a potential security risk. Not just for you, but for everyone visible in the frame.

Download ClearShare. Process your photos before sharing them. Remove the metadata. Blur the faces. Think about what you are posting and who might use it against you or others.

Your right to protest depends on your ability to do so safely. That safety starts with the photos you share.

The surveillance state is real. Your defence against it can be too.


Start Sharing Safely Today

Share photos and documents without accidentally sharing your personal information.

Get it on Google Play