This post is a summary of Episode 24 of The Nebraska Governance & Technology Center’s (NGTC) Podcast Series, Tech Refactored. In this special episode, guest hosts and NGTC student fellows Jasmine Alexander and DeAndre’ Augustus were joined by Valerie Jones, Associate Professor of Advertising & Public Relations at the University of Nebraska College of Journalism and Mass Communications; Elsbeth Magilton, Executive Director of Space, Cyber and Telecom Law and the NGTC; and Elana Zeide, Assistant Professor of Law, the latter two both of the University of Nebraska College of Law.
One thing that the last year has made clear is that parenting can be a joyful, yet at times lonely, pursuit. Social media might seem ideally suited to fill that void: from a place of complete social distance, parents could commiserate with one another, as well as share the simple moments of transcendent joy that punctuate the repetitiveness of pandemic parenting. However, commentators have begun to wonder about the costs of a practice known as “sharenting.” At its core, sharenting is a parent’s “oversharing of their children’s private moments in a way that violates their children’s privacy and autonomy.”
In this episode of Tech Refactored, Alexander and Augustus were first joined by NGTC Executive Director Elsbeth Magilton to discuss how her thinking has evolved about what she is comfortable sharing about her children via social media. Magilton noted that, after a period during which she was more open about sharing her children’s lives, she eventually realized that the process of sharing was:
“really about me and not about them. I became very cautious of their privacy and decided that my desire to brag about them wasn’t as important as their getting to tell their own story as they grow up about who they are. Who they are at nine is not going to be who they are at 30, and they should have the right to develop that themselves.”
Magilton told a vivid story about the moment she realized that oversharing was problematic. She was at the grocery store, and her young son, walking a little ahead of her, started to round the aisle. From where she was standing she could hear someone say “Hi, Max!” In that moment, her son realized that this was a person who recognized him and knew his name. The person was a friend who had gone to college with Magilton but had never met her son in person. That led her to question whether she wanted to put her son in a position where someone who had never met him could recognize him. So while the moment that triggered her thinking centered on safety, it was really a jumping-off point for considering her children’s online privacy more broadly.
When asked whether she has noticed a change in the way fellow parents have begun thinking about how they represent their children on the internet, Magilton said that honestly, she hadn’t. In her experience, the issue hadn’t even occurred to most of her friends until they learned about the decision she had made, and as she recognizes, it didn’t occur to her until it was staring her in the face. So while people have grown accustomed to the upsides of sharing the joys and trials of parenting, they haven’t really reckoned with the ways in which those decisions might affect their children.
Turning to the guests, Jones first noted that there could indeed be value in seeking support and community on social media. She described a friend with a child born with a rare birth defect, and another with a child on the autism spectrum; both parents were able not only to receive emotional support from friends, but also to raise awareness about the conditions affecting their children.
In contrast, however, there are parents who actively integrate not only their identities as parents, but their children specifically, into their “branding identities.” As Jones explains, “a brand is a shortcut to an idea, and like it or not, I think a lot of us have this in mind when we share online (...) we’re using it as a form of self-representation and trying to build this image of who we are, what we care about, and what we stand for.”
When it comes to minor children, parents are exercising their own discretion about how their children are portrayed, with potential consequences down the road, ranging from embarrassment over a childhood video a parent shared on the internet, to college admissions counselors or employers having access to information about a minor child that might affect the child’s college or employment prospects. In France, this has led to a law whereby children, once they become adults, can successfully sue their parents so long as they can show that they didn’t meaningfully consent to the depiction (a context in which a child could meaningfully consent to such a depiction personally escapes me).
How information about children might, in the future, be used by complex computer algorithms to draw correlations is something we can, at present, only guess at. According to Zeide, “anything you put online is capable of being scraped by some entity somewhere and that information can be used to make assumptions about you, and those assumptions are based not on the things that parents usually think about when it comes to their kids. It may be whether you use two exclamation points or three exclamation points - if there is a correlation between people who do that, then machine learning systems can infer that you are perhaps high income, low income, that you have certain personality traits - or at least they allegedly can predict this. And if they are used for purposes for consumer profiling - (to make decisions about) loans or housing - these can have significant effects on children as they move forward in life.”
A skeptic might ask, what college admissions officer is going to take the time to look through all these images? But such human analysis is no longer necessary. According to Zeide, “there are automated systems that can look at these and catch suspicious words, and seniors have been unaccepted from college because they have posted things that were seen as racist or inflammatory on social media. So they can have serious consequences later in life.” A child might incur those costs even with regard to content that is not posted by the child themselves, but by a parent. Take the case of a child with a learning disability - that may very well be something that they do not want a potential employer to be aware of down the road.
One potential partial solution to this problem is what are known as “erasure laws.” Erasure laws allow courts to order search engines like Google to remove search results about a given person from their indexes. There is one law in California that does permit people, when they turn 18, to erase things that were previously posted about them online, but it is a very narrow law, drafted that way to avoid First Amendment issues.
What incentives line up that might lead to over-disclosure on the part of parents? Jones characterized the issue in terms of neurochemistry. “There are immediate outcomes or effects and longer term outcomes and effects. And I think, as humans, we’re pretty bad about thinking about longer term outcomes and effects and posting on social media, like most social stimuli, creates dopamine, right? So there is this neurological chemical release.” This is especially true when you receive some sort of positive reward after posting online, which creates a feedback loop that leads you to believe that sharing your child’s experience is okay. The challenge, then, is to get people to think in terms of the longer-term consequences of their actions, both for themselves and for their children.
Generally we think of the interests of parents and the interests of children as not being in sharp tension, but as we have seen in the case of individuals who use their children to promote their brand, enormous tensions can exist, with potentially disastrous consequences. One particularly vivid example involved parents who built a YouTube following of 750,000 subscribers through a series of videos in which they “played pranks” on their children, including telling them that they were being put up for adoption. Ultimately child protective services were called in, a determination was made that several of the children had been substantially harmed by the videos, and the children were eventually removed from the home.
That is of course an extreme example, but it illustrates the broader point that, every time a parent posts something about their minor child online, they are making an irreparable decision for that child; we all know that on today’s internet, everything lives on forever. So parents would do well to consider: when I post this about my child, am I doing it for their benefit, or for my own?
Tags: Tech Refactored Review