Summary: US lawmakers introduce bill to tackle social media addiction and amplification of harmful content

The bill will require social media platforms to release half-yearly transparency reports with month-wise breakdowns of their content moderation stats.

US lawmakers on February 10 introduced a new bill that aims to tackle addiction and the amplification of harmful content on social media platforms with content-agnostic interventions.

The bipartisan bill, titled the Nudging Users to Drive Good Experiences on Social Media (Social Media NUDGE) Act, requires the National Science Foundation and the National Academies of Sciences, Engineering, and Medicine to identify potential content-agnostic interventions, and the Federal Trade Commission (FTC) to formulate regulations based on their findings.

Content-agnostic interventions are actions taken by a platform to alter the user experience that do not rely on the substance of user-generated content and do not alter the core functionality of the platform, the bill states.

The bill follows Facebook whistleblower Frances Haugen's revelations in late 2021 about Instagram's impact on teenagers' mental health and Meta's inadequate efforts to tackle hate speech and harmful content across its platforms.

Senator Amy Klobuchar, who co-authored the bill with Senator Cynthia Lummis, said:

“For too long, tech companies have said ‘Trust us, we’ve got this.’ But we know that social media platforms have repeatedly put profits over people, with algorithms pushing dangerous content that hooks users and spreads misinformation. This bill will help address these practices, including by implementing changes that increase transparency and improve user experience. It’s past time to pass meaningful reforms that address social media’s harms to our communities head-on.”

Notably, unlike other proposed legislation, this bill does not require any amendments to Section 230 of the US Communications Decency Act, which provides immunity for platforms hosting user-generated content.

US lawmakers have been aggressively going after Big Tech companies in recent weeks. Earlier this month, the US Senate Judiciary Committee gave the go-ahead to the Open App Markets Act, which would force Google and Apple to open up their app stores to alternative payment processing systems and allow app downloads from non-official sources. In January, the same committee voted to pass another tech antitrust bill, the American Innovation and Choice Online Act, which accomplishes some of the same goals as the Open App Markets Act while targeting a wider set of anticompetitive practices and tech companies.

Academia should conduct studies to identify interventions

The bill proposes that the Director of the National Science Foundation enter into an agreement with the National Academies of Sciences, Engineering, and Medicine to conduct studies to identify content-agnostic interventions that platforms could implement to reduce the harms of algorithmic amplification and social media addiction. The studies should:

  1. Identify ways to define and measure the negative mental or physical health impacts related to social media by reviewing a wide variety of studies, literature, and reports from academic institutions, civil society groups, and other appropriate sources. Internal research that platforms carry out, either themselves or by engaging third parties, may also be submitted for the study, but any other research in a platform’s possession that is closely related to such voluntarily submitted research must also be submitted if the Academies request it.
  2. Identify research-based content-agnostic interventions such as reasonable limits on account creation, content sharing, etc.
  3. Provide recommendations on how platforms may be separated into groups for the purpose of implementing content-agnostic interventions. The classification should take into consideration factors such as:
    • the number of monthly active users and the growth rate of such number
    • how user-generated content is created, shared, amplified, and interacted with on the platform
    • how the platform generates revenue
    • and other relevant factors for creating groups of similar platforms
  4. Provide recommendations on which interventions apply to which group of platforms
  5. Provide recommendations on how platforms should implement the interventions without altering the core functionality of the platform, considering:
    • whether the intervention should be offered as an optional setting or enabled with appropriate default settings
    • any other means by which the intervention may be implemented
  6. Define metrics applicable to platforms in a group to measure and publicly report, in a privacy-preserving manner, the impact of any content-agnostic interventions
  7. Identify data and research questions necessary to further understand the negative mental or physical health impacts related to social media.

Timelines:

  1. Initial study report: The Academies must submit their initial study report concerning the above aspects to the government within 1 year of the enactment of this Act.
  2. Updates: Every 2 years after the regulations are promulgated, the Academies must submit updated reports on the aspects mentioned above based on new research and also include the impact of the interventions implemented by the platforms as well as an analysis of any entities that have newly met the criteria to be considered a covered platform under this Act.

“By empowering the National Science Foundation and the National Academy of Sciences, Engineering, and Medicine to study the addictiveness of social media platforms, we’ll begin to fully understand the impact the designs of these platforms and their algorithms have on our society. From there, we can build guardrails to protect children in Wyoming from the negative effects of social media. We can build a healthier internet without the federal government dictating what people can and can’t say.” – Senator Cynthia Lummis, co-author of the bill

FTC should formulate regulations to implement the interventions

  1. FTC forms regulations based on study reports submitted by the Academies: Within 60 days of the Academies submitting their initial study report, the FTC should initiate a rulemaking proceeding for the purpose of promulgating regulations to:
    • determine how covered platforms should be grouped together
    • determine which content-agnostic interventions identified in the report shall be applicable to each group of covered platforms identified in the report
    • require each covered platform to implement and measure the impact of such content-agnostic interventions
  2. FTC issues notice to covered platforms with interventions: Within 30 days of forming the regulations, FTC should provide notice to each covered platform of the content-agnostic interventions that are applicable to the platform.
  3. Platforms submit plans on how they will implement the interventions: Within 60 days of receiving the notice from FTC, the platform should submit a plan to implement each content-agnostic intervention applicable to them. If the platform reasonably believes that an applicable intervention is not technically feasible, would substantially change the core functionality of the covered platform, or would pose material privacy or security risk to consumer data stored, held, used, processed, or otherwise possessed by the platform, it should include in its plan evidence supporting these beliefs.
  4. FTC studies plans submitted by platforms and determines if they are satisfactory: Within 30 days of receiving the implementation plan from a platform, the FTC should determine whether such plan includes details related to the appropriately prompt implementation of each intervention applicable to the platform or whether the platform can be exempted based on the reasons filed by the platform.
  5. If FTC rejects the plan, companies can file revised plans or appeal: If the FTC determines that a plan does not satisfy the requirements or that an exemption cannot be granted, the platform has 90 days to submit a revised plan or appeal the FTC’s determination to the United States Court of Appeals for the Federal Circuit. However, if a platform submits 3 revised plans and the FTC determines that none of them is satisfactory, the Commission can deem that the platform is not acting in good faith and mandate that it implement a plan developed by the Commission.
  6. Platforms publish periodic statements of compliance: Platforms should make compliance statements publicly available on their website at least once a year. These statements should be in a machine-readable format, presented in a privacy-preserving manner, and contain the following:
    • the platform’s compliance with the required implementation of content-agnostic interventions
    • the impact of the interventions on reducing the harms of algorithmic amplification and social media addiction on covered platforms.

Semi-annual transparency reports on content moderation efforts

Platforms must publish a publicly available, machine-readable report every six months about their content moderation efforts with respect to each language spoken by at least 100,000 of their monthly active users in the United States. The report should include the following:

  1. How many content moderators? The total number of individuals employed or contracted by the platform during the reporting period to engage in content moderation for each language, broken down by the number of individuals retained as full-time employees, part-time employees, and contractors of the platform and reported in a privacy-preserving manner.
  2. A random sample of viewed content: Each month, for publicly visible content with over 1,000 views, platforms must draw a random sample in which the probability of a piece of content being sampled is proportionate to its total number of views during the month (see the sketch after this list). For each piece of sampled content, the platform must share the text, images, audio, video, or other creative data associated with the content, the details of the account that posted it, and its total number of views during the month.
  3. High reach content: For each language, platforms should reveal the 1,000 most viewed pieces of publicly visible content each month.
  4. Removed and moderated content: Platforms should publish month-wise aggregate metrics of their content moderation efforts broken down by language, the topic of the content (such as bullying or hate speech), and whether the account that shared the content is a prominent (verified) user. The data should include the number of pieces of user-generated content and the number of views of such content that were:
    • reported to the platform by a user
    • flagged by an automated content detection system
    • removed from the platform and not restored
    • removed from the platform and later restored
    • labelled, edited, or otherwise moderated by the platform following a user report or flagging by an automated content detection system
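
To make the views-proportional sampling in point 2 concrete, here is a minimal Python sketch. The record fields, sample size, and the choice to sample with replacement are assumptions for illustration; the bill only specifies that sampling probability be proportionate to monthly views for publicly visible content with over 1,000 views.

```python
import random

# Hypothetical monthly records of publicly visible content; the field names
# ("id", "views") and the figures are illustrative, not taken from the bill.
monthly_content = [
    {"id": "post-001", "views": 1_500},
    {"id": "post-002", "views": 40_000},
    {"id": "post-003", "views": 900},       # under the 1,000-view cutoff, excluded
    {"id": "post-004", "views": 250_000},
]

def sample_by_views(items, k, min_views=1_000):
    """Draw k pieces of content so that each piece's chance of being picked
    is proportionate to its total views during the month.
    Note: random.choices samples with replacement; an actual report would
    presumably de-duplicate or sample without replacement."""
    eligible = [c for c in items if c["views"] > min_views]
    weights = [c["views"] for c in eligible]
    return random.choices(eligible, weights=weights, k=k)

print(sample_by_views(monthly_content, k=2))
```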

Why do we need these interventions?

“Research on social media addiction continues to paint a grim picture. Facebook’s own internal research found that ‘Young people are acutely aware that Instagram can be bad for their mental health, yet are compelled to spend time on the app for fear of missing out on cultural or social trends,’ and a 2018 Pew Research Center survey found that 54 percent of teens say they spend too much time on their cell phones.” – Office of Senator Amy Klobuchar

  1. Negative impacts of social media platforms: Social media platforms can have significant negative impacts on users, including on their mental and physical health, and design decisions made by platforms, such as what content a user sees, may exacerbate these negative impacts, the bill states.
  2. Amplification of harmful content: Harmful content spreads virally on platforms, and platforms do not consistently enforce their terms of service to remove it, leading to prohibited content often being amplified, the bill states.
  3. Automated content moderation systems do not fully address harmful content: The bill states that social media platforms rely heavily on automated systems for content moderation, but these systems do not fully address the harmful content on the platforms.
  4. Research shows that content-agnostic interventions work: The bill states that significant research has shown that content-agnostic interventions, such as the following, work in mitigating the negative impacts of social media platforms:
    • Screen time alerts and grayscale settings, which reduce addictive platform usage patterns
    • Alerts requiring users to review user-generated content before sharing such content
    • Prompts to help users identify manipulative and micro-targeted advertisements
  5. Platforms hesitant to independently adopt content-agnostic interventions: “Evidence suggests that increased adoption of content-agnostic interventions would lead to improved outcomes of social media usage. However, social media platforms may be hesitant to independently implement content-agnostic interventions that will reduce negative outcomes associated with social media use,” the bill states.

Key definitions

The following are some of the key terms that the bill defines:

  1. Algorithmic amplification: “Algorithmic amplification means the promotion, demotion, recommendation, prioritization, or de-prioritization of user-generated content on a covered platform to other users of the covered platform through a means other than presentation of content in a reverse-chronological or chronological order.”
  2. Covered platform: “The term covered platform means any public-facing website, desktop application, or mobile application that—
    • is operated for commercial purposes;
    • provides a forum for user-generated content;
    • is constructed such that the core functionality of the website or application is to facilitate interaction between users and user-generated content; and
    • has more than 20,000,000 monthly active users in the United States for a majority of the months in the previous 12-month period.”
  3. Privacy-preserving manner: “The term privacy-preserving manner means, with respect to a report made by a covered platform, that the information contained in the report is presented in a manner in which it is not reasonably capable of being used, either on its own or in combination with other readily accessible information, to uniquely identify an individual.”
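
As an illustration of the "covered platform" threshold above, a minimal check of the monthly-active-user prong might look like the sketch below. The function name, inputs, and the reading of "a majority of the months" as more than six of the previous twelve are assumptions for illustration, not language from the bill.

```python
# Hypothetical check of the 20 million US monthly-active-user prong of the
# "covered platform" definition; names and inputs are illustrative only.
def meets_mau_threshold(us_mau_last_12_months, threshold=20_000_000):
    """Return True if US monthly active users exceeded the threshold in a
    majority (more than six) of the previous 12 months."""
    months_over = sum(1 for mau in us_mau_last_12_months if mau > threshold)
    return months_over > len(us_mau_last_12_months) // 2

# Example: above the threshold in 7 of the last 12 months -> prong satisfied.
print(meets_mau_threshold([25_000_000] * 7 + [15_000_000] * 5))  # True
```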
