Children and the Internet at UN Internet Governance Forum: Issues around gaming, online harms, education

The COVID-19 pandemic has pushed children into spending more time online, whether for education, recreation, or simply by virtue of being stuck indoors. It has also increased challenges for parents, policymakers, and companies alike, since children are exposed to online harms as they spend more time online. At the same time, the pandemic has exposed a deep digital divide: millions of children do not have access to a device or an internet connection.

The United Nations Internet Governance Forum 2020, held earlier in November, discussed trends and challenges around children’s relationship with the internet, and what a governance framework should address. The following is a summary of the key issues discussed at the Forum.

Challenges in children accessing education online

The pandemic has sharply increased the amount of time children need to spend online, but it has also highlighted the deep digital divide: millions of children don’t have access to their own devices or an internet connection. At the same time, ed-tech companies are amassing personal and sensitive data from children.

In the context of the explosion of ed-tech and the consequent collection of children’s data, what are the implications of educational platforms for children’s freedom right now, and for their freedom when they grow into adults, at a time when data from their childhood education on these platforms may still be stored?

How the GDPR deals with it, and the need for protection against marketing and profiling: The GDPR requires special protection for children’s personal data, as children might not be aware of the risks, consequences, and safeguards. These protections should apply particularly to the use of personal data for marketing and profiling, and indeed to any digital service provided to children, since children may be unable to critically assess the content they are encouraged to engage with, and may be convinced to buy certain products or encouraged to adopt an unhealthy lifestyle. For example, privacy statements have to be adapted so that they are understandable to people of any age. Additionally, people aged 14 years should have privacy settings that are different from others. — Nick Couldry, London School of Economics and Political Science

  • At the same time, some provisions intended to protect children may themselves infringe children’s rights, especially their right to privacy: children need the consent of their parents up to the age of 16, which is incredible. — Jutta Croll, Stiftung Digitale Chancen, Germany

Use cases of AI in education

AI can be used in education beyond the classroom, across the system as a whole: some use cases concern the learning process itself, others the infrastructure of schools, said Doaa Abu-Elyounes from the Berkman Klein Center for Internet and Society. AI can be used for threat detection; facial recognition has been used in some schools to identify whether someone has permission to be in the hall on campus or at a school event. “It can be used for another kind of threat detection, which is: is someone likely to fail out of school? And by identifying who you think is likely to drop out, can you then make an intervention?” she asked. Abu-Elyounes laid out some other use cases for AI in education:

Adaptive and personalized learning: For instance, can we learn more about how individuals are learning things and then change lessons to improve our teaching? How often, if you’re learning a new language, do you need to be reminded and tested on a certain vocabulary word?  What is the optimal repetition time on average and for you as an individual? 
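The question of optimal repetition timing is what spaced-repetition scheduling tries to answer. A minimal, purely illustrative sketch, loosely in the spirit of the well-known SM-2 family of algorithms (the function name and all constants below are invented for illustration, not taken from any ed-tech product):

```python
# Toy spaced-repetition scheduler: the interval before the next review
# grows when the learner recalls a word and resets when they forget it.
# Loosely modeled on the SM-2 family; all constants are illustrative.

def next_interval(prev_interval_days: float, ease: float, recalled: bool):
    """Return (next_interval_days, new_ease) after one review."""
    if not recalled:
        # Failed recall: retest soon and make the item slightly "harder".
        return 1.0, max(1.3, ease - 0.2)
    # Successful recall: stretch the interval by the ease factor,
    # which itself grows a little with each success.
    return prev_interval_days * ease, ease + 0.1

# A learner who keeps recalling a word sees it less and less often:
interval, ease = 1.0, 2.5
for review in range(4):
    interval, ease = next_interval(interval, ease, recalled=True)
    print(f"review {review + 1}: next test in ~{interval:.0f} days")
```

Runs of successful recalls push the interval out geometrically, from a couple of days to well over a month within a few reviews, matching the intuition in the question above: the better consolidated a word is, the less often it needs to be tested.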

Proctoring: AI is being used for identity verification and proctoring in remote exams; you can, for instance, try to identify where someone is looking, to enforce that they are looking at the screen and not at something off camera. Some colleges are using it to track how early and how often people engage with the college website.

  • Is automated proctoring surveillance? A Youth IGF participant from Portugal, João Pedro Martins, asked how AI-based proctoring is not simply enforcing surveillance on young people from an early age. He proposed a solution: “Perhaps we should be rethinking the questions that are being asked of the students rather than trying to over-watch everything they are doing, capturing their webcam, sound, and, on a deeper level, the processes that are run on our own machines. What are the things that we should change at the educational level that would allow for more transparent use of these technologies without undermining, for instance, the trust that should exist between the different stakeholders in the education process?” Martins asked.

Teacher evaluation: Emotion and affect detection is being used both online and in some classrooms in China: for instance, looking for joy and engagement based on a student’s facial expressions or voice. You can imagine different ways this can be used: to ascertain if a student is paying attention, or if intervention is required. Or to evaluate teachers: is a teacher commanding attention from their students? Are they keeping them engaged? And should we use that to evaluate teachers?
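The dropout-risk use case in the list above is, at its core, a risk-scoring model over student features. A hypothetical sketch (the features, weights, and logistic form are all invented for illustration; a real system would learn its weights from historical data):

```python
import math

# Toy dropout-risk score: combine a few hypothetical student features
# into a single 0-1 risk value via a logistic function. The weights are
# hand-picked for illustration, not learned from any real data.

def dropout_risk(attendance_rate: float, avg_grade: float,
                 assignments_missed: int) -> float:
    z = (3.0
         - 4.0 * attendance_rate      # good attendance lowers risk
         - 0.03 * avg_grade           # higher grades lower risk
         + 0.4 * assignments_missed)  # missed work raises risk
    return 1 / (1 + math.exp(-z))     # squash to a 0-1 score

# A school might flag students above some threshold for intervention:
at_risk = dropout_risk(attendance_rate=0.6, avg_grade=55, assignments_missed=8)
steady = dropout_risk(attendance_rate=0.95, avg_grade=80, assignments_missed=1)
print(at_risk > steady)  # True: the struggling student scores higher
```

As the discussion above suggests, the hard questions are less about the arithmetic than about what happens downstream: which interventions a high score triggers, and on what data the model was trained.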

Data breaches and privacy

Big companies, toy makers, YouTube, and children’s hospitals are trusted and assumed not to have data breaches, but they do, which raises the question: can we prevent such events before they happen? Many companies are aware of breaches, but we have to bridge the gap between awareness and action. And if attendance records are breached at the school level, who do I report it to: a teacher, the principal, the authorities? — Varunram Ganesh, MIT Digital Currency Initiative

Children and Online Gaming

Motivations for children to game: Children’s motivation to game is primarily about their social relationships with peers and classmates, as well as in the neighborhood and family. According to work done at the University of California, most children game to kill time; the second motivation is friendship-driven; the third is recreation; and the final motivation is knowledge-seeking and cultural production, including creative activities. — Jing Sun, Director of Game Research Center, Perfect World, China

Parents also game: There is now a new generation of parents who are also entering the field. A recent survey shows that 59% of parents spent roughly the same amount of time as their children playing games, and 24% confessed they spend more time playing games. But this doesn’t mean these parents are critical about games, or that they actually know which games to choose for their children, or what good games can do and what negative things games can do. We have to work with both groups of parents: parents who have never played a game and are very worried about gaming, and parents who play a lot of games but are not media literate. — Manisha Shelat, Center for Development and Management and Communication, MICA, India


Games are changing: Games have expanded from violent shooting games to games for education, learning professional skills, advertising, the gamification of marketing, games with avatars and gambling, and serious e‑sports. We have to make sure we address this entire landscape when talking about media literacy. There is definitely violence, sexualisation, and gambling in gaming; such risks are present, but not in every game. Media literacy now has to extend to teachers, game designers, media owners, and policymakers. — Manisha Shelat

Online gaming exposes children to several harms, according to Dr. Amanda Third of Western Sydney University. While gaming helps children deploy critical thinking and other skills to solve problems and strategize, it also comes with a range of risks, she said. The stereotype of the teenage gaming addict tends to dominate our thinking in policy and programming about online gaming; this can be a concern for the small minority of children who game compulsively. Some games do aim to keep children engaged, to compel them, and we do need to regulate and make sure that these practices are ethical, she added.

  • While challenges around excessive gaming preoccupy most people, children often game in heavily commercialized and underregulated digital spaces, exposing them to forms of economic exploitation. These include exposure to loot boxes, gambling, and advertising, as well as the harvesting of their data by third parties for purposes that don’t always benefit the child.
  • Many children participate in gaming platforms that are not designed specifically for them, raising questions about the extent to which they might be exposed to inappropriate content, practices and relationships with others, and whether they have the skills and capacities to deal with these risks of harm. These include cyberbullying and concerns that some games reproduce forms of toxic masculinity. 
  • It is widely acknowledged that online gaming may simultaneously infringe and support children’s rights. Online gaming perhaps most obviously impacts children’s right to protection from harm, but also, of course, and importantly, their right to rest, leisure, and play. But remembering that the digital environment is complex, there is much more at stake than these two rights. Indeed, online gaming potentially impacts a very wide range of children’s rights, both positively and negatively.

According to Third, some directions for a governance framework are:

  1. The UN Convention on the Rights of the Child, the most widely ratified human rights treaty in history, and the supporting mechanisms for interpreting the Convention are important tools in guiding decision-making when it comes to governance of online gaming. The Convention lays out three dimensions of children’s rights: provision, participation, and protection.
  2. Minimum safety standards incorporated in gaming: It’s not possible to anticipate the effects of online play on different children, but we can ensure that minimum safety standards are built into games and gaming communities. Safety by design, pioneered by the Australian Office of the eSafety Commissioner, lays out principles to guide developers, whether small startups or multinational companies, in embedding safety tools. We need enforceable standards around children’s data collected via gaming platforms, and standards that acknowledge and protect children’s rights. “I think safety by design, or rights by design, is what I would ultimately like to see: gaming companies picking up this idea that we embed children’s rights at the very heart of gaming platforms,” she said.

Recommendations for a governance framework for children and gaming:

  • The most immediate thing we can do is start sharing data with researchers. Scientists still struggle to answer basic questions about why people game and what the resulting effects are. There is rich diversity in gaming. — Pete Etchells, Professor of Psychology, Bath University
  • Girl gamers need to be taken more seriously, and that does not just mean designing stereotypical games for girls, like dolls and homes, but making these spaces more empowering for girl gamers: more safety, less harassment. They should be able to reap the real benefits of gaming. — Manisha Shelat
  • In China, there are some problems around commercialised promotion and production, but there is higher game literacy among both male and female gamers. With evolving tastes among gamers, developers and production teams will find that commercial promotion is not the only way to attract gamers, and that they have to provide better games that help with education, health and care, and other social responsibilities. — Jing Sun

Lessons from the pandemic: child rights and online harms 

The pandemic brought into clear focus a terrible normal in which we are asking children to be in places that are simply not fit for them. At the same time, it highlighted the digital divide: 1.7 million children do not have a laptop of their own for education, said Baroness Beeban Kidron, filmmaker and founder of the 5Rights Foundation. The United Kingdom government has brought in an emergency law making remote learning compulsory in UK schools, so any child not in school during the next wave of lockdown must be given access to remote learning by their school. But the government has done nothing, nothing, to make that access safe, Kidron said.

  • OnlyFans and online harms legislation in the UK: The UK is bringing in online harms legislation, and the government is considering exempting small companies from the law. OnlyFans has fewer than 50 employees, but has 50 million subscribers and 700,000 content creators, and sells access to naked photographs and sexual content. In April 2020, it was found that on a single day, a third of Twitter profiles globally advertising nudes for sale or similar appeared to belong to underage users, and many of those were using the British firm OnlyFans, which would, Kidron believes, be out of scope of the proposed legislation. What the pandemic has brought into sharp focus, and what the general public, families, and even policymakers have not understood, is that we are putting children at the center of a very toxic environment. In a world in which it is normal for a child to make some pocket money by selling a nude picture of themselves, it also becomes normal for a child to share it on the school’s remote learning platform, and then normal to pick up admiration and friends through that process. — Kidron

Dr Amanda Third from Western Sydney University highlighted some trends and challenges during COVID-19:

Children don’t distinguish the online and offline in the ways that adults often do, but rather they move flexibly across online and offline spaces and they often interact, learn, and participate both in offline and online platforms simultaneously. Children have emphasized that during the pandemic, while technologically mediated relationships have been important to them, they can’t replace their face-to-face interactions.

More time online raises encounters with harms: Research shows that increased time online does, indeed, increase the likelihood that children will encounter risks of harm. And while more time online also enables children to develop their digital literacies and their capacities to manage these risks, arguably they have not had the opportunity to do so with the right structures of support and guidance around them, given that they have been largely reliant on family members who are themselves in crisis.

Children’s privacy during COVID-19: One effect of this pandemic has been to challenge most people’s sense of privacy; living in close quarters is very challenging. But, again by virtue of intensified time online, children are ever more exposed to the not always explicit or well-explained data collection practices of technology companies, with potentially significant implications for their right to privacy. These incursions on privacy have only been compounded by contact tracing and other surveillance technologies implemented to protect populations. And if surveillance of the general population has increased, the privacy rights of those children who live in abusive families are even more severely compromised.

Jutta Croll, a long-time advocate for digital rights of children and currently with the Digital Opportunities Foundation had the following insights: 


Increase in reports of violence and cyberbullying: The German helpline for children saw an increase in reports of violence: physical, emotional, and also sexual. Children also had a lot of questions about love and relationships online, and about cyberbullying. It was not only about harms; there were also questions around positive experiences.

Parents’ concerns around children spending time online: There were also questions from parents about gaming, data protection and excessive use. We got calls from parents asking about possible risks of children being more and more online and how to deal with these risks. “Many parents felt overwhelmed with the situation and they were frustrated and got feelings like am I a bad parent?  What shall I do?” 

Law enforcement and child sexual abuse online

First action at Europol: identify the victim: In its operational response to child sexual abuse, the first thing Europol does is identify the victim. In any investigation, the first question we ask is: where is the victim and how can we help them? This is ascertained using conventional methods of investigation as well as Europol’s victim identification function, which is used to determine where a child is so that the national law enforcement agency with jurisdiction can reach them. Doing this involves intelligence analysis and technical support, drawing on information stored in Europol’s databases about offenders and their activities. Because of this, Europol was able to support several operations remotely during lockdown. — Cathal Delaney, Team Director, Europol

  • This was done in a case where Europol staffers in Australia discovered child abuse material online and found that it was taking place in Italy. “And ultimately that led to the identification of the suspect and the victim, a young girl, a child who had been abused by him, within 10 days of those videos being posted online, which is quite a significant achievement in those particularly difficult circumstances,” Delaney said.

Europol also has a video analysis system for victim identification; the system currently holds 50 million unique images and videos of child sexual exploitation and abuse, with multiple copies of each. With its partners and its annual victim identification task force, the agency has managed to examine just 20% of that material, showing the scale of the problem in relation to the amount of material out there. For any child whose abuse has been recorded and distributed, there is every possibility that they will at some stage be fearful of being recognized on the street by someone who has seen the material. This fear has been established by survivor surveys by C3N in Canada and others. — Cathal Delaney

There was a significant increase during the pandemic in the exchange of child abuse materials, including videos and images, on peer‑to‑peer file sharing networks. There was also an increase in discussions on dedicated pedophile forums, which exist in abundance, especially on the dark net. We saw an increase in self‑generated material, wherein minors produced child abuse material at the request of offenders contacting them online, and such material being distributed. — Uri Sadeh, INTERPOL

Use of PhotoDNA for child sexual abuse content 

PhotoDNA is an image-identification technology, developed by Microsoft, used to detect child sexual abuse imagery and other illegal content. It was at the center of a debate at the Forum over whether such scanning amounts to censorship or surveillance.

  • “We are talking about fighting child sexual abuse material with tools like PhotoDNA. That is not about suppressing freedom of expression. On the contrary, it is about ensuring children’s rights, not only to safety: children also have the right to freedom of expression, and we need to ensure that they can exercise that right in a safe environment. So I don’t think we can achieve something if we are talking about whether it is censorship or not,” Croll said. “PhotoDNA has nothing to do with mass surveillance. It is not like surveilling or monitoring the interaction of people. It is discovering files and images,” Croll added.

The purpose of tools like PhotoDNA is to detect material relating to child sexual exploitation and abuse on the platforms where it is being distributed. They are not being used to suppress freedom of speech but for a very particular, defined purpose: to detect this material and to inform law enforcement about it. Such reports reach law enforcement in all 18 Member States through Europol, via referrals made under the mandatory reporting regime in the U.S. through the National Center for Missing and Exploited Children there. And those numbers have been increasing to a significant extent over the last couple of years. — Cathal Delaney
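PhotoDNA’s actual algorithm is proprietary, but the class of techniques it belongs to, robust perceptual hashing, can be illustrated with a toy average-hash: an image is reduced to a compact bit fingerprint that survives small edits, and fingerprints are compared by Hamming distance against a database of hashes of known material rather than by exact byte equality. This sketch is only an analogy; PhotoDNA itself is far more sophisticated:

```python
# Toy "perceptual hash" illustrating the general idea behind robust
# image-hashing schemes like PhotoDNA (whose real algorithm is
# proprietary and far more sophisticated than this average-hash).

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each pixel contributes one bit: above or below the mean brightness.
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A 4x4 stand-in "image" and a slightly brightened copy of it.
img = [[10, 200, 30, 220],
       [15, 210, 25, 230],
       [240, 20, 250, 10],
       [235, 25, 245, 15]]
edited = [[p + 5 for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(edited)
print(hamming(h1, h2))  # 0: a uniform brightness shift changes no bit
```

This fingerprint-matching design is what Croll points to when she says the tool “is discovering files and images” rather than monitoring people’s interactions: the system compares hashes against known illegal material instead of interpreting the content of communications.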

On the legislation proposed by the Commission earlier this year: Europol obviously applauds that this legislation was produced and respects the role of the European Parliament and Council in deciding how it is to be implemented. I’m not defending the legislation. What I understand is that it seeks to maintain the status quo, not to go beyond the existing situation, and to close a gap between legislation which comes into force on the 20th of December this year and other legislation that they hope to introduce next year; basically, to close a gap that would otherwise exist in the law. That is our position on it, and the technical aspect that I think needs to be understood about it and about the other technologies being used to do the same thing. — Cathal Delaney

Encryption and child sexual abuse online

The police very often depend on encrypted service providers for information about abusers actively offending against children online. There is increased usage of encryption, which blinds not only law enforcement but service providers themselves to abuse that they know, and we know, is occurring in enormous numbers. Going encryption-blind doesn’t solve the problem; it only serves the offenders. European legislation around protecting privacy must take into consideration the need to protect vulnerable populations online, which are predominantly children, and leave some leeway for industry to work with law enforcement to detect child abuse and filter out child abuse material.

Indications of online payments being used for abuse: There were also indications of more online payment for child exploitation material, linked to the phenomenon of offenders sitting in one country and paying an adult elsewhere to abuse children live on camera for them in exchange for payment.

Written By

I cover health, policy issues such as intermediary liability, data governance, internet shutdowns, and more. Hit me up for tips.





MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
