Opinion
Can an EU law save children from harmful content online?
July 14, 2022 | 10:26 PM
A new EU law to rein in tech giants could serve as a benchmark for worldwide legislation to protect children online, as concern grows globally about the impact of social media on young people, children’s rights campaigners say.

The bloc’s Digital Services Act (DSA) includes a ban on targeted advertising aimed at children and prohibits the algorithmic promotion of content that could be harmful for minors, such as videos related to eating disorders or self-harm.

Jim Steyer, founder of Common Sense Media, a US nonprofit focused on children and tech, said the act signed by European lawmakers last week could help usher in similar rules for big tech companies elsewhere, including the United States.

“The DSA is a landmark legislation, and what you’re going to see is it will also lead to similar legislation in the United States,” Steyer said, adding that it could bolster various state-led efforts to regulate social media networks on issues ranging from child safety to political bias.

By stipulating hefty fines for firms that fail to remove illegal content — such as child sexual abuse images — from their platforms, the DSA effectively ends an era of voluntary self-regulation by the companies, campaigners said.

“The importance of this legislation is (to say): ‘No, it’s not voluntary, there’s certain things you must do’,” said Daniela Ligiero, co-founder of Brave Movement, a survivor-led organisation fighting to end childhood sexual violence.

“We believe it can not only help protect children in Europe, but then it can serve as an example ... to the rest of the world,” she added.

Between 2010 and 2020, there was a 9,000% jump in abuse images online, according to the US National Center for Missing and Exploited Children, a nonprofit, and Covid-19 lockdowns saw a surge in reports of online child sexual abuse.

While detailed European Union regulations on child sexual abuse material have yet to be drawn up, the DSA sets out fines of up to 6% of global turnover for platforms that fail to remove illegal content.

Survivors of child sexual abuse and other online crimes, such as so-called revenge porn, say the resharing of videos or images online forces them to relive the abuse and can have a devastating impact on mental health.

Leading tech companies believe the new EU legislation brings “necessary clarity” and could help foster trust in the digital world, said Siada El Ramly, director general at Dot Europe, a lobby group for tech giants including Apple and Google.

She added, however, that tech companies still wanted clarity from regulators on how they should balance protecting users’ privacy with demands for greater transparency.

“We can’t be pulled in both directions,” she said.

Despite praise for the legislation from rights campaigners, there is concern about enforcement.
The European Commission has set up a task force, with about 80 officials expected to join, a staffing level critics say is inadequate.

Some have pointed to the poor enforcement of the bloc’s privacy rules governing big tech, known as the General Data Protection Regulation (GDPR). Four years after it came into force, the EU data protection watchdog lamented the stalled progress in long-running cases and called for an EU-level body, rather than national agencies, to take on cross-border privacy cases.

But children’s rights advocates say the speed with which the DSA was agreed shows policymakers are committed to accelerating measures designed to protect children using the Internet.

The legislation is “one piece of the puzzle”, said Leanda Barrington-Leach, head of EU affairs at 5Rights, an advocacy group for children’s online safety. It will set the tone for European regulations in areas of particular concern such as artificial intelligence (AI) and child sexual abuse material, both of which are currently in the works at EU level.

Barrington-Leach said another key step for Europe would be enshrining “the age appropriate design code” — a kind of rule book for designing products and handling children’s data in order to prevent minors from being tracked and profiled online.

Britain pioneered this approach with its Children’s Code, which requires online services to meet 15 design and privacy standards to protect children, such as limiting collection of their location and other personal data.

US efforts to pass similar legislation are progressing at a slower pace and facing significant industry pushback. In Minnesota, for example, a bill that would prevent social media firms from using algorithms to decide what content to show to children failed to pass the state’s senate this year. But Steyer said a push by California lawmakers to pass a bill enshrining an age appropriate design code by the end of 2022 could get a boost from the EU’s lead.

Crucially, said Barrington-Leach, the child protection measures contained in the DSA highlight an acceptance of the need for legal safeguards online.

“We keep saying (children) are digital natives, they understand all this, they’ve got it all sorted. No they haven’t,” she said.

“The tide is changing and tech companies realise that they’re being looked at more closely now.” — Thomson Reuters Foundation