As adults, we are somewhat aware that the technology we use is monitoring or tracking us. While we are often unaware of what our cell phones, notebooks or digital handhelds actually collect, and of whom it is collected by and for, we are at least, as adults, part of the conversation about our own surveillance.
Children, on the other hand, are not. While some may think technology providers are the parties children are most at risk from, I would also raise a point about the role of parents and guardians.
Young people are rarely asked (especially by their parents) whether being digitally monitored is okay with them. When they are involved in the decision, depending on their age and how it is explained to them, they also might not fully comprehend the implications of the activity for their future.
For our conversation, I decided to unpick how the plugin works, to evaluate its possible human impact, and to consider the importance of empathy not just in its design, but in its data management and end use.
Behavioral tracking of individuals through technology is far from innovative. Today we are surveilled more often, and more smartly, than at any other time in history. Data is today’s currency: data about adults and, now, about our children.
Retailers, media and credit card companies, and data consortia like Experian have long tracked and analyzed the transactional and media behavior of adults. They use this data to build profiles of what we like based on what we do. Absent from many of these multi-billion-dollar data warehouses are profiles of children.
Mobile and Internet service providers, digital media companies, and app developers have learned from and followed these data practices. Our learning institutions can also be included, given the growing practice of learning analytics packaged with most learning technologies in schools, colleges, and universities.
This plugin, in particular, appears to collect profile data (who the children and their parents are); media behavioral data (their web activity in click-stream format, i.e., what content they click on, and in narrative format, i.e., what chat comments they make and respond to); and a form of social network data (who they are interacting with, when, and how frequently).
Inferred from this data is an indicator score of the assumed impact this activity might have on the child’s mental health.
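To make these categories concrete, here is a minimal sketch in Python of how such a record might be structured. Every name, field and the risk score here is my own assumption drawn from the categories described above, not the plugin’s actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

# Hypothetical illustration of the data categories described above.
# Field names and the risk score are assumptions, not the plugin's real schema.

@dataclass
class ProfileData:
    child_id: str           # who the child is
    parent_ids: List[str]   # who the parents or guardians are

@dataclass
class ClickEvent:           # click-stream: what content is clicked, and when
    url: str
    timestamp: datetime

@dataclass
class ChatEvent:            # narrative format: chat comments made and responded to
    text: str
    in_reply_to: Optional[str]
    timestamp: datetime

@dataclass
class SocialTie:            # social network data: who, when, and how frequently
    contact_id: str
    interaction_count: int
    last_interaction: datetime

@dataclass
class ChildRecord:
    profile: ProfileData
    clicks: List[ClickEvent] = field(default_factory=list)
    chats: List[ChatEvent] = field(default_factory=list)
    ties: List[SocialTie] = field(default_factory=list)
    # Inferred from the behavior above, not reported by the child.
    inferred_mental_health_risk: Optional[float] = None
```

Nothing in a record like this captures how the child says they feel; the final field is computed about them rather than asked of them, which is exactly the distinction I want to draw out below.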
You will see I have highlighted the terms inferred, assumed and might. I did this because, while the data does report what a child (and possibly a parent) does in digital terms, when, where, and with whom, the plugin appears to make an inference about the activity’s mental health impact.
The plugin doesn’t directly ask the child how the web content makes them feel, or why. How and why are two very important questions for our ability to empathize with another’s experience and to directly make sense of their practices.
In 2009, my colleagues and I published a study related to this. We asked over four hundred 13-19 year olds how web technology makes them feel. We didn’t monitor or track them over time, and we didn’t infer their feelings from the content they were viewing: we asked them. We shared with them postcards on which was written a single statement, “The web makes me feel …”. We asked them to give a one-word response. Below the statement was the prompt, “Because”, with room on the cards for them to write as much or as little as they wished to explain this feeling.
We identified that while there were no differences between boys and girls in how positive, neutral or negative the web made them feel, the younger group of 13-15 year olds reported more positive feelings, whereas those closer to 18 or 19 years of age reported more negative feelings. Albeit with its own limitations, the study provided some insight into how web technology makes young people feel. We didn’t infer or assume it from their behavior or the content they viewed, as the plugin does; and we reported their feelings in their words, not in our words or as numbers.
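For readers curious what summarizing such responses involves, here is a rough sketch of the kind of tallying one could do; the sentiment lexicon and sample responses are invented for illustration and are not the study’s actual instrument or results. Even then, the coding step is only a summary: the “Because” explanations remain in the young people’s own words.

```python
from collections import Counter
from typing import Dict

# Illustrative only: a toy lexicon and toy responses, not the study's data.
SENTIMENT = {
    "happy": "positive", "connected": "positive", "informed": "positive",
    "indifferent": "neutral", "ok": "neutral",
    "anxious": "negative", "overwhelmed": "negative", "watched": "negative",
}

responses = [
    {"age": 14, "word": "happy"},
    {"age": 15, "word": "connected"},
    {"age": 18, "word": "overwhelmed"},
    {"age": 19, "word": "anxious"},
]

def age_band(age: int) -> str:
    # Mirror the bands discussed above: younger (13-15) vs. older (16-19).
    return "13-15" if age <= 15 else "16-19"

tallies: Dict[str, Counter] = {}
for r in responses:
    band = age_band(r["age"])
    tallies.setdefault(band, Counter())[SENTIMENT.get(r["word"], "neutral")] += 1

for band, counts in sorted(tallies.items()):
    print(band, dict(counts))
```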
One’s feelings are, after all, messy and complex, and are best understood through words, color and images, through empathy.
Questions I have about how the plugin infers risk from a child’s web activity closely align with further questions I have about the design of the plugin’s data management. Questions like:
- How does it sync with other social technologies that a child might use?
- Who has access to all the data?
- How will or could the data be used by third-party developers, parents, and other likely interested parties? Insurance and drug companies spring to mind here.
- Also, like many services targeting children, should its design and use be regulated in some way?
These questions come from a place of empathy, a practice that is critical for the design and management of any digital data or information about people, especially children. Without empathy, tracking technologies become just a tool, and the data collected is just an asset to be owned, mined, shared and sold.