The importance of protecting and regulating children’s personal data

EP3 Foundation

EP3 Foundation discusses the modern-day challenge of protecting children's personal data online in a growing era of influencers and expanding technology

Personal data

Children are up against unknown professional influencers when online. These include marketers,1 political activists,1,2 criminals,2,3,6 and sexual predators3: adults with experience and resources. At heart, these adults want to change children's behaviours and attitudes to serve their own agendas of economic, political or sexual exploitation. They will use any digital means to achieve their goals, but how they access and use children's data is important to note. Gathered from social media, toys and digital devices, the data create digital profiles that are used for data-driven, tailored services and targeted marketing. These profiles, gathered and curated by algorithms, may be helpful; often, however, they are not. More importantly, we do not know how this data collection, and the profiles it generates, will affect children when they are older.

Social engineering

Part of the problem is that children's data is used to identify potential victims: those who are easily influenced and susceptible to social engineering. Because their brains are not fully developed, children cannot understand how targeted marketing shapes their attitudes, beliefs and behaviours, and so their lives.4 In addition, current policies do not give children choice or control over how their personal information is used and accessed. Something needs to change. To protect their health and wellbeing, we must protect children's data.

Of course, regulation can't be effective unless people understand why it's important that children's data is protected. Currently, there is little public understanding of how information gathered from email responses, daily events, searches and other digital interactions is used to serve people content and predict their behaviour.1 Internet safety training exists, but it currently focuses on stranger danger and the information children directly give out, such as name, age, gender and location. What it misses is indirect data (information captured through unique identifiers such as browser fingerprints, IP addresses and geolocation) and inferred data (information derived from what children directly give out, used to predict or redirect what they want to see, do or have).1 Indirect and inferred data are what create digital profiles. In and of themselves, these practices are not necessarily harmful. Some may even be beneficial: identifying and redirecting suicidal youth, for example.
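
To make the distinction concrete, the short Python sketch below shows how indirect data becomes an identifier. It is not any platform's actual code; the signal names and values are invented for illustration. The point is that a child never types any of this information, yet combined it can single out one device among millions.

    import hashlib

    def browser_fingerprint(ip: str, user_agent: str, screen: str, timezone: str) -> str:
        """Hash passively collected signals into a stable device identifier."""
        signals = "|".join([ip, user_agent, screen, timezone])
        return hashlib.sha256(signals.encode("utf-8")).hexdigest()[:16]

    # Illustrative values a browser exposes on every page load, no typing required.
    device_id = browser_fingerprint(
        ip="203.0.113.7",                                  # from the connection
        user_agent="Mozilla/5.0 (iPad; CPU OS 13_3 ...)",  # from request headers
        screen="1024x768",                                 # from JavaScript
        timezone="Europe/London",                          # from JavaScript
    )
    print(device_id)  # same inputs -> same ID, so visits can be linked over time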

Understanding the links

Patrick Berlinquette wrote an article demonstrating how redirect ads can change attitudes or behaviours. Most ads on Google take users where they want to go, but redirect ads take users somewhere completely against the words they used to search.5 Berlinquette used a blueprint that Google put out to capture the searches of suicidal people. His ultimate goal was to change their behaviour, and about one-third of the people who saw his ad dialled the National Suicide Prevention Lifeline. With every click on his website, he could see the exact words users had typed before they clicked on his ad, allowing him to reach more suicidal people and potentially change their behaviour. His campaign had benign goals, but he points out that others may not. He went on to create another campaign, to de-radicalise school shooters, but without a proper vetting process a malicious redirect-ad campaign could instead further radicalise whoever clicks on the ad, using information from their data profiles. And therein lies the danger. Our current data practices leave children defenceless. We are unable to detect or intervene when vulnerable youth are groomed for abuse, targeted with illegal drugs or alcohol, or served ad-sponsored content promoting fake news and propaganda.
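
As a rough sketch of the mechanics Berlinquette describes, consider the Python below. It uses no real advertising API, and the phrases, URLs and names are hypothetical; it simply shows how a campaign that bids on worrying search phrases learns the exact query behind each click and decides where to send the visitor.

    # Hypothetical redirect-ad campaign; no real ad-platform API is used.
    TARGETED_PHRASES = ["i feel hopeless", "i want to disappear"]

    click_log: list[dict] = []  # every click reveals the exact search behind it

    def handle_ad_click(search_query: str, visitor_id: str) -> str:
        """Log the triggering search, then choose a landing page for the visitor."""
        if any(phrase in search_query.lower() for phrase in TARGETED_PHRASES):
            click_log.append({"query": search_query, "visitor": visitor_id})
            # A benign campaign redirects to help; a malicious one could just
            # as easily return a link to harmful or radicalising content.
            return "https://suicidepreventionlifeline.org"
        return "https://example.org/landing"

    print(handle_ad_click("i feel hopeless and alone", visitor_id="a3f9"))

The asymmetry is the point: the person searching reveals everything about their state of mind, while the campaign reveals nothing about its intent.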

Educating children and guardians about their digital profiles won't stop the profiling, or the ads served because of those profiles, but it will make them aware of how their personal information is used and of the risks surrounding the large amounts of data being collected from and about them. After all, if they aren't aware that there is a problem, how can it be fixed? Enforceable new policies and data-regulation laws can then act as a layer of protection around how indirect and inferred data are used.

Company compliance

Creating enforceable transparency, accountability and regulatory frameworks gives users an expectation of what will happen when a company fails to comply, and creates the possibility for users to advocate for themselves.6 Companies say they implement internet safety, but users often have a different experience. There are no unifying standards for how user concerns should be handled, and companies may never respond when concerns are raised.3 When children's personal data is involved, it is paramount that we have clear standards about how it can be used and clear consequences when it is not used appropriately.

The example of CloudPets

One example of children and their guardians being put at risk is CloudPets. As a "connected toy", a CloudPet connects to the internet and can record and play messages to and from the child. In February 2017, a security flaw was found in the way these messages were stored. Because the information wasn't secured online, 2 million CloudPets voice messages were vulnerable to malicious individuals.7 The potential for harm from unsecured, sensitive data about children's thoughts, fears and daily activities is more than worrying. Because of the lack of transparency surrounding how data is used, the full risks and potential harms are unknown. Without strong, clear, enforceable data-governance policies, we will never be able to fully protect children from malicious acts.
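
To show what "not secured" means in practice, here is a minimal Python sketch, under the assumption (reported at the time) that the recordings could be fetched without credentials. The class and its methods are invented for illustration; this is not CloudPets' actual code.

    import secrets

    class MessageStore:
        """Toy model of a voice-message store where authentication is optional."""

        def __init__(self, require_auth: bool) -> None:
            self.require_auth = require_auth
            self._messages: dict[str, bytes] = {}
            self._tokens: set[str] = set()

        def put(self, message_id: str, audio: bytes) -> None:
            self._messages[message_id] = audio

        def issue_token(self) -> str:
            token = secrets.token_urlsafe(16)
            self._tokens.add(token)
            return token

        def get(self, message_id: str, token: str | None = None) -> bytes:
            # With require_auth=False (the CloudPets situation), anyone who can
            # reach the store and guess an ID can download a child's recording.
            if self.require_auth and token not in self._tokens:
                raise PermissionError("valid token required")
            return self._messages[message_id]

    exposed = MessageStore(require_auth=False)
    exposed.put("msg-001", b"...audio bytes...")
    print(exposed.get("msg-001"))  # retrieved with no credentials at all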

Positives and negatives

Having large amounts of data on children does have advantages. Programs can be tailored to children's educational needs, and guardians and health professionals can monitor children's health and development to make sure everything is on track.1 Inferred data can help social workers predict and flag potential dangers children may face.

However, algorithms are biased: they reflect the human biases of those who created them and cannot capture the nuances that are a staple of human life.1 The potential for misuse and unintended consequences must not be ignored. For example, college test companies have been criticised for giving high school students optional questionnaires that ask for sensitive personal information. Once given, with permission, this information is no longer protected as "sensitive personal information" and can be used by college recruiters, employment services and data marketers. Joel Reidenberg, in a report on the student data market, highlights: "The harm is that these children are being profiled, stereotyped, and their data profiles are being traded commercially for all sorts of uses — including attempts to manipulate them and their families."8
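
One invented but representative illustration of how such bias creeps in, shown in Python: no protected attribute is ever named in the scoring rule below, yet postcode quietly acts as a proxy for one. The postcodes, weights and function are all hypothetical.

    # Invented example of proxy bias in a profile-scoring rule.
    FAVOURED_POSTCODES = {"SW1A", "EC2R"}  # hypothetical "high-opportunity" areas

    def recruiter_score(test_score: int, postcode: str) -> float:
        """Score a student profile from a test result and a home postcode."""
        score = test_score / 100
        if postcode.upper() in FAVOURED_POSTCODES:
            score += 0.2  # the designer's assumption about "fit", applied silently
        return score

    # Two students of identical ability get different chances.
    print(recruiter_score(85, "SW1A"))  # 1.05
    print(recruiter_score(85, "BD5"))   # 0.85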

While we don’t know every risk caused by “black-box”, opaque data practices that leave children’s data to be sorted and sold as “profiles,” we can still reduce the following potential harms:

  • Continual surveillance and monitoring may prevent children from exploring and pushing boundaries, a healthy and normal part of growing up.1
  • Vulnerable youth, particularly when socially isolated, can be identified by their online profiles and behaviours, making them easy targets to be fed unhealthy content, groomed for abuse and radicalisation, or drawn into crime.3
  • Being constantly required to provide personal information may become normalised, making it nearly impossible for a child to understand why their personal information is valuable and their data needs protection.1
  • Social media updates by parents inadvertently give criminals the three key things used in identity theft: name, date of birth, and home address.1
  • A child’s digital profile can negatively affect their life chances, such as not being accepted to a college or for employment, or impact their credit score when applying for a loan.1

Balancing the risks

The only way to balance the risks against the potential good that children's data can provide is to have enforceable transparency, accountability and governance laws. The Science and Technology Policy Center for Development and the EP3 Foundation are working with industry leaders and policymakers to foster new, improved data models and to implement control protocols and standards. By empowering data controls, we can ensure that data is used responsibly and securely. By protecting and regulating children's data, we can create a safer environment both on and offline.

We invite you to participate.

Contact info@ep3foundation.org to learn more.

 

By Marsali Hancock and Sierra Hawkins

 

Endnotes

  1. Wineburg, Sam, Sarah McGrew, Joel Breakstone and Teresa Ortega. "Evaluating Information: The Cornerstone of Civic Online Reasoning." Stanford Digital Repository, 2016, http://purl.stanford.edu/fv751yt5934
  2. Ghosh, Dipayan and Ben Scott. "#DigitalDeceit: The Technologies Behind Precision Propaganda on the Internet." New America, January 2018, https://drive.google.com/file/d/1CeQG3_emVYZSRmTl3CM7UYu8NOhAqACI/view?usp=sharing
  3. "Dangerous Connections: You Face a Risk of Sextortion Online." FBI.gov, 30 May 2019, https://www.fbi.gov/news/stories/stop-sextortion-youth-face-risk-online-053019?utm_source=twitter-fbi&utm_medium=twitter-tweet&utm_campaign=Sextortion&utm_term=sextortion&utm_content=Story%20-%20Sextortion
  4. Longfield, Anne. "Who Knows What About Me? A Children's Commissioner Report into the Collection and Sharing of Children's Data." Children's Commissioner's Office, November 2018, https://www.childrenscommissioner.gov.uk/wp-content/uploads/2018/11/who-knows-what-about-me.pdf
  5. Berlinquette, Patrick. "I Used Google Ads for Social Engineering. It Worked." The New York Times, 7 July 2019, https://www.nytimes.com/2019/07/07/opinion/google-ads.html?te=1&nl=the-privacy%20project&emc=edit_priv_20190716?campaign_id=122&instance_id=10940&segment_id=15259&user_id=91f36a71f4adaeebd55a52ccb7e9e550&regi_id=71266615
  6. Wright, Jeremy and Sajid Javid. "Online Harms White Paper." UK Government, April 2019, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/793360/Online_Harms_White_Paper.pdf
  7. "Children's Messages in CloudPets Data Breach." BBC, 28 February 2017, https://www.bbc.com/news/technology-39115001
  8. Singer, Natasha. "For Sale: Survey Data on Millions of High School Students." The New York Times, 29 July 2018, https://www.nytimes.com/2018/07/29/business/for-sale-survey-data-on-millions-of-high-school-students.html

 
