Research Report on digital safety for children & youths: Better design instead of blanket ban

C C
https://www.eurekalert.org/news-releases/1122595

INTRO: US courts have ruled against platform providers for failing to protect children, and the debate over age restrictions for social media has gained momentum. An international group of experts from academia, children’s rights organizations and non-profit institutions is convinced that bans would be the wrong approach.

In the journal Science, they advocate for new strategies for the digital safety of children and youths aged 13 and older. Prof. Sandra Cortesi and Prof. Urs Gasser from the Technical University of Munich (TUM) explain when artificial intelligence could intervene on smartphones, what role peer groups can play, and why children should be involved in shaping their digital education.

Q: In the US, Meta and Google were ordered to pay substantial fines just a few days ago for failing to adequately protect children and youths on their social media and video platforms, respectively. What significance do these rulings hold in light of your working group’s findings?

Urs Gasser: These rulings could mark a turning point because they underscore that child safety in the digital world is not simply a matter of harmful content, but also a matter of platform design. The courts have examined how platforms are built, what kinds of risks their features generate, and whether companies can be held responsible when those risks are foreseeable and insufficiently addressed. These questions strike at the heart of our working group’s recommendations: designing digital spaces to ensure the safety, agency, and well-being of children and youths from the outset. In the context of the cases heard in the US, this means excluding features that can be addictive and providing protection against abuse by adults.

Q: Several countries have banned social media for children under a certain age or are planning to do so. Why are you opposed to a ban?

Urs Gasser: Our argument is not against regulation. Legal requirements are indispensable. However, we believe that policymakers should do more than just establish red lines: they should require providers to design their platforms and products in a child-friendly manner. That is more demanding than a blanket ban, but also more promising. After all, what we really want is for children and youths to be able to learn how to use media autonomously and in a way that has a positive impact on them... (MORE - details)

