What porn research for the Boy Scouts taught me

The Real Risk to Children: Exploited By Commercial Algorithms

I have spent my career working with research on the effects of technology and content on human beings. For decades, I have been on the front line of this work, analyzing and applying what we learn, and since 2003 I have focused particularly on child health and wellness online. So it was natural that I was asked to direct a program to help Boy Scout troops build healthy online habits and experiences. They wanted a program grounded in data and evidence that would prepare parents and leaders to support the Scouts online, including healthy ways to respond to traumatic content like the sexual violence commonly streamed in online pornography. So I started by reviewing current research in these areas, particularly sexual health and pornography. The alarming thing was that my online “search” for this project profoundly shaped me and my online experience, and not because of what I learned while developing the curriculum for the Boy Scout project.

I was shocked at how data from my unique device use, processed by commercial algorithms, shaped everything I saw online. Worse, it was what I didn’t see that upset me the most.

Before the Boy Scout project, I thought I understood the importance of data protection and privacy, especially for children. Now I have an entirely new perspective. I am not only driven to educate everyone on why we must separate children from their unique identifiers, such as browser fingerprints, keystrokes, eye patterns, and IP addresses; I also want new data models and tools that empower individuals to choose and control where their data goes and how it is used.
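To make those unique identifiers concrete, here is a minimal, illustrative Python sketch of how a site could combine ordinary device signals into a stable browser fingerprint. The signal names and values are invented for this example, and real fingerprinting scripts use many more signals, but the idea is the same: no cookie or login is needed to recognize the same device again.

    import hashlib

    def browser_fingerprint(signals: dict) -> str:
        """Combine ordinary, individually harmless browser signals into one
        stable identifier; the same device tends to produce the same hash
        on every visit, with no cookie or login involved."""
        canonical = "|".join(f"{key}={signals[key]}" for key in sorted(signals))
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

    # Hypothetical signals a page can read without asking permission.
    visitor = {
        "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
        "screen": "1366x768x24",
        "timezone": "America/Denver",
        "language": "en-US",
        "fonts": "Arial,Calibri,Georgia,Verdana",
        "ip_prefix": "198.51.100",  # coarse network location
    }

    print(browser_fingerprint(visitor))  # same device, same ID, visit after visit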

Here is why. My new “customer profile” threw a rod into the algorithms that had been serving efficient, credible information to my small laptop screen. Instead, I was served an endless string of time-wasting, unhealthy ads and recommended videos. It didn’t just affect my searches on Google or YouTube; it spread across nearly every website and service connected to my Chrome browser. My new “algorithmic profile” even showed up in my work life. Imagine how mortified I was when my child online safety presentation ended with YouTube recommending a video of four women being objectified, with titles like Big Boob Pilates and Betty Boop’s Big Bootie Bun Workout. When a YouTube video is embedded in Google Slides, four new “recommended” videos pop up when it finishes. The only online experiences left unaltered were fixed, closed websites such as my online banking.

Here is how it happened: the Boy Scout curriculum needed to explain to leaders and youth the importance of having productive conversations about pornography, porn’s impact on sexual health, and how to foster lifelong habits of using interactive media and technology wisely.[1] So I began with basic Google searches for data sets and studies on online child safety policies, children and online safety, and cybercrime. Then I moved on to the effects of exposure to extreme violence and sexually explicit content on a child’s developing mind.

What the algorithms processing my personal data could not capture was how highly I value the research and content from my partners on this project: practitioners at The Center for Media on Child Health, skilled research librarians, and Penn State’s Megan Moss, a sexual health educator and researcher. Instead, my new algorithmic profile predicted that I no longer wanted credible content, but that I must be looking for sexually explicit content.

For nearly six weeks, every time I sat down at my computer to search for the studies I was working with, I could not find them. The algorithms used by the software companies had decided I might be interested in online sexual violence and explicit content. So rather than return the scientific research, my searches pulled up articles, blogs, and other second- and third-tier sources; they no longer led to the actual studies. I had been tagged as a person who sees the world through this lens, one who isn’t concerned with credible, hard data. To find the research and data sets, I had to dig deeper; they were buried three or more pages down in the search results. In other words, my whole view of the world had been altered.

It took me a few days to understand what had happened and why my devices buried the quality content. To reverse the damage, I took matters into my own hands and actively worked to return to a healthy, learning algorithmic profile. I used DuckDuckGo as a privacy-preserving search engine. While using Chrome, I searched university sites for academic studies, and I shopped on several online shopping platforms, placing work clothes such as pencil skirts, three-quarter-sleeve tops, and comfortable dress shoes into their carts. It took conscious effort to change the algorithms.

We are at the mercy of commercial algorithms that predict what we want, what we believe, and what we will find interesting. We are not picking our own view of the world; the system is picking it for us, and with every interaction it confirms and entrenches that view. Beyond the risk of children encountering harmful content, the real danger lies in what they won’t see: the truth and knowledge they won’t find, the videos and voices they can’t hear because they are crowded off device screens, buried deep in search pages, or muffled completely by algorithmic bias.

Why we need to fix this: when children’s personal data is processed by commercial algorithms, they lose the opportunity to understand and articulate important ideas and concepts. They are exposed to content that colors their view of the world, sometimes with violence, sometimes with bias, and without their understanding. In this context, a child’s worldview, beliefs, attitudes, and behaviors are socially normed not by what they search for, but by what is served up by the commercial algorithms built into digital platforms. There is no digital protection against this.

At home, parents and other adults help shape a child’s view of the world; online, these algorithms indiscriminately teach something else. They dictate what a child sees and does not see every time the computer is on. Her IP address, behavioral patterns, and a myriad of other data about her online presence are collected, she can be identified by these markers, and then the system decides what she will and will not see. This, in turn, directly shapes what she believes, along with her attitudes and behaviors. So although we often think everyone has the same experience online, the truth is that the algorithms build a different view for every user, and this puts children at risk.
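To see how such a system can work, here is a small, purely illustrative Python sketch of a ranking step, assuming a hypothetical stored interest profile; it is not any real platform’s code. The point is that the stored profile, not the words a child types, decides which results come first, so two people entering the same search see different worlds.

    from dataclasses import dataclass

    @dataclass
    class Item:
        title: str
        topics: dict  # topic -> how strongly this item matches that topic

    CATALOG = [
        Item("Peer-reviewed study on adolescent media use", {"research": 0.9, "health": 0.6}),
        Item("Sensational workout video", {"clickbait": 0.9, "fitness": 0.5}),
        Item("Blog post summarizing the study secondhand", {"research": 0.4, "clickbait": 0.3}),
    ]

    def rank(profile: dict, query_topics: dict) -> list:
        """Score items by the stored interest profile plus query relevance.
        Because the profile outweighs the query, users with different
        profiles see different results for the same search."""
        def score(item: Item) -> float:
            profile_score = sum(profile.get(t, 0.0) * w for t, w in item.topics.items())
            query_score = sum(query_topics.get(t, 0.0) * w for t, w in item.topics.items())
            return 2.0 * profile_score + query_score
        return sorted(CATALOG, key=score, reverse=True)

    query = {"research": 1.0}                       # the same search for both users
    researcher_profile = {"research": 1.0, "health": 0.5}
    mislabeled_profile = {"clickbait": 1.0}         # the profile I was assigned

    print([item.title for item in rank(researcher_profile, query)])  # the study ranks first
    print([item.title for item in rank(mislabeled_profile, query)])  # the study drops to last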

This is why I am an advocate for new and evolving internet design philosophies, data models, and architectures that lift children’s internet experiences off of commercial data algorithms. I want to be sure that children are not vulnerable to algorithms that exploit information from their personal devices and behavioral identifiers. We can give children the best of web technology without letting others serve them a worldview artificially created for financial gain, at the expense of their education, health, and wellbeing.

 

Marsali Hancock, MediaX Stanford, IEEE Standards for Child and Student Data Governance Working Group Chair, marsalih@gmail.com

[1] When I was CEO at IKeepSafe, we developed a full training program with Dr. Megan Moss from Penn State University on Wise Tech Choice. It taught skills and competencies for choosing healthy content online, preparing healthy responses for exposure to harmful content, and keeping a healthy balance between screens and face-to-face, in-person activities.
