
Here's why you shouldn't publicly post photos of your kid online

When it comes to the public internet, I think you should take the maximalist route. Here are four reasons why.

Happy Father’s Day! I’m going to celebrate my first one as a dad, and my first real post with this newsletter, by throwing a bomb: four reasons why you shouldn’t publicly post any photos of your child.

My intention today is service journalism. A lot of parents I know instinctively shy away from posting photos of their kids online. Plenty of others see no real harm in it. Both camps, however, seem generally guided by vibes. So I sought out reasons why it is or isn't a good idea. (Spoiler: I couldn't really find any good argument for sharing a kid's image with everybody on the planet.)

Some housekeeping: You can subscribe to the Cyber Dad newsletter here. The RSS feed is here. If you don't subscribe already, know that this newsletter is free.

We all have a high-resolution camera with us at all times, and adoring your baby and wanting to capture every moment is the most natural thing in the world. So it's a given that we're going to want to take and share a lot of pictures.

But I'm here to argue — and privacy and cybercrime experts resoundingly agree with me here — that you should make a significant effort to not post any photos or videos of your young child's face to the public internet. Period.

To some of you, I'm preaching to the choir, and you may find this post useful to send to friends and family members who might not think twice about posting a photo they snapped of your kid.

To others, this will sound extreme. But note that I said public internet. I'm going to follow this post with one about safer ways to share photos of your kids: in group chats with friends and family, to private social media channels, or with kids' faces covered.

Once you post a child's face to a public social media channel — like to an Instagram, Twitter, or TikTok account with default public settings, or even potentially to a large Facebook or WhatsApp group — you are instantly and effectively permanently ceding that photo, and your child's likeness, to the whole world. Even if you don’t have many followers, there likely are more bad actors watching than you might think. I find that generally unsettling, but I've got four specific reasons why it's crucial to avoid.

  • Online predators lurk even on seemingly innocuous channels where parents post everyday photos of their children.

  • We are currently in the middle of an unprecedented gold rush of new artificial intelligence companies scooping up photos wherever they can for training data. These companies face very little regulation, and we have no idea what will happen with the images they grab, save that you probably can't stop them once they have your kid's pictures.

  • Speaking of AI, an alarming trend is surging in which AI apps make it easy for creeps — whether they're adult pedophiles or your kid's classmates being awful — to generate nude images from a normal picture of a clothed young person. Being teased or bribed with even fake nude photos can cause a child untold emotional turmoil, and has even led to deaths by suicide.

  • No young child can give consent to their photo being shared forever on the internet, by definition. If you think a child should have some agency over their own image — and I do — it simply isn't your call to make.

I realize that some people who read this will have already posted photos of their kids or grandkids or nieces or nephews to a public channel. You're not a terrible person and it's not the end of the world. But it does mean you should take them down or lock the accounts that share them if you can (again, I'll explain that more in depth soon), and avoid doing so in the future.

In other words, the best time to vow to not post pictures of your kid online is the first time you take one. The second-best time is today. Here's why.

1: Predators stalk photos of kids posted to social media.

There are more predators searching for photos of children than you can account for.

A profoundly disturbing New York Times investigation earlier this year found that among parents who devote public Instagram accounts to their young girls, posts that were more suggestive were more likely to receive likes and comments.

In some cases, Instagram will remove accounts that are repeatedly flagged as inappropriate. But accounts that are less overtly exploitative generally survive, often with pedophiles brazenly commenting about the girls' bodies.

The predators aren't always so obvious, though. Take the viral saga of Jacquelyn Eleanor on TikTok. An influencer mother who tirelessly documented her young daughter on the platform, Eleanor saw that posts showing her child doing things like eating a corndog or drinking from a straw garnered more likes and saves than the rest of her content.

Will the creeps descend on your child even if you aren't a popular influencer? They might. A report published earlier this year by the Australian Institute of Criminology found that among surveyed Australian parents who posted pictures or other information about their children online, 4.8% had been contacted by online predators asking them sexualized questions about the child and/or requesting CSAM depicting them.

[Style note: Most American newsroom style guides no longer use the term "child pornography," and instead use the term Child Sexual Abuse Material, or CSAM, a convention I will use in this newsletter.]

It seems likely that if you don't have many followers and you aren't posting suggestive photos of your child, your posts will attract fewer predators. But the only way to definitively halt them en masse is to not give them access to pictures of your kid in the first place.

2: AI companies are currently scouring the internet for whatever videos and photos of people they can get. Do you want them to have your kid's likeness?

Right now, we're smack dab in the middle of a generative AI gold rush. Companies funded with untold investor riches are racing to put out programs to imitate human faces, speech, and bodies.

No one can predict where this is headed. But as I write this, that industry is relentlessly gobbling up online words and images to use as training data. They'll take anything that isn't explicitly prohibited — companies like Reddit and Slack have recently changed their terms of service to tell their users that their accounts are now fair game to be mined unless they opt out — and it's probably safe to assume some AI companies are scraping stuff they legally aren't supposed to.

A few weeks before I launched this newsletter, I spoke with Zach Edwards, a senior threat analyst at Silent Push and an advisor at the nonprofit Internet Safety Labs, about what I should tell privacy-minded parents.

"The thing that most parents need to not do is upload unedited photos of their children to major social networks, especially with a public tag on it," he told me.

"Basically all of the platforms — if you 'make your content public' — all of them are ingesting them. OpenAI ingests content that's public. Facebook, Twitter, any of the social networks, if you make a piece of content public, terms of service basically allows them to parse it for their own means," he said.

When I spoke with Zach, that was a theoretical concern. In a sign of just how quickly this stuff is moving, this week Human Rights Watch put out a first-of-its-kind report that found that LAION, a major provider of AI training data sets, included at least 170 pictures of Brazilian children in one of its packages. The kids had no knowledge of this and could not consent to being used for AI training, the report found.

I was curious, by the way, whether the rapid rise of facial recognition technology meant the technology will soon be able to identify an adult's baby photos, as that would be yet another reason to hold off. So I asked Kashmir Hill, author of the best current book on the subject, Your Face Belongs To Us. (Also, she's a parent too.)

"These algorithms aren't that good yet at linking our five-year-old selves, our 13-year-old selves, to our 30-year-old selves. But I just don't know what's going to happen over time as the algorithms are going to get better," she said.

3: Predators can use AI to generate fake nude images from normal photos, which can have devastating effects on minors.

One of the darkest uses of AI so far is the rise of nonconsensual deepfake porn: programs that take a photo of a fully clothed person, usually a woman, and spit out a synthetic but convincing image of what they might look like naked. There's already a glut of programs that do this, only a web search away.

As you might imagine, these are also already being used against minors. Another recent, horrifying New York Times investigation found high school boys sharing deepfaked nude pictures of their female classmates. While AI didn't invent that sort of sexualized bullying, it makes it far easier.

A 2017 Canadian Centre for Child Protection survey of CSAM survivors found that more than two-thirds of them worried that once abusive images of them were circulated, their exploitation would never end, and that they worried "constantly" about being recognized.

In fact, cybercriminals using deepfaked nude images to bully victims — often underage ones — is the basis for one of the most insidious scams active today. In some cases, underage victims have died by suicide.

4: You don't have your young child's consent to share their likeness with the whole world.

This one is simple and straightforward, but it often doesn't occur to people who were teenagers or adults when they first started putting their own photos online.

A baby or a young child obviously cannot give consent to having their image irrevocably shared to the public internet. They don't know what it means, so it simply isn't your call to make. And you don't know whether that image will one day be weaponized against them: distorted, used to bully or embarrass them, or used to take away an important part of their agency.

Maybe once they're older they'll tell you they wouldn't have minded. At that point they are free to upload the photos you were so careful with.

In addition to this being, in my opinion, a clear moral call, it may prove to be a legal one too. One state, Illinois, has already cleared the way for the children of influencers to sue their parents for exploiting their childhood likenesses, and several other states are considering similar legislation.

Here's something I don't know: when you should let a child make the choice to upload their own image to the internet. According to a Pew poll last year, the overwhelming majority of teenagers use social media.

Do those older kids fully understand the risks? That's beyond what I'm capable of addressing today, or perhaps ever. But I do know that no young child can give consent to having their pictures posted online, and that means nobody has it.