Here’s how US lawmakers could finally rein in Facebook

By Clare Duffy, CNN Business

“Change is going to come. No question.”

“We are serious about taking action.”

“Here’s my message for Mark Zuckerberg: Your time of invading our privacy, promoting toxic content, and preying on children and teens is over. Congress will be taking action.”

Facebook, now known as Meta, has faced scrutiny on Capitol Hill for years, with executives — including Zuckerberg — repeatedly grilled in Congressional hearings. But if these and other comments from lawmakers during hearings in recent months are any indication, 2022 could shape up to be a make-or-break year in the long-running effort to regulate Facebook.

Congress is currently considering around a dozen proposed bills targeting Big Tech, some of which could force Meta to change how it handles algorithmic recommendations and user data collection, as well as its ability to make acquisitions. A bipartisan group of 10 state attorneys general launched an investigation into Meta late last year, focused on the potential harms of its Instagram platform to young users.

And last week, a federal judge said the Federal Trade Commission could move forward with a lawsuit seeking to break up Meta, after the company had argued the complaint should be dismissed. (The case could drag on for years.) The FTC and several state attorneys general are also reportedly investigating Meta’s Oculus virtual reality unit over antitrust concerns, according to a Bloomberg report Friday citing people with knowledge of the matter.

Some industry watchers have pointed to recently appointed federal officials such as FTC Chair Lina Khan, a vocal tech industry critic, and the sharper focus of lawmakers as cause for optimism that something may happen on the regulatory front.

“You’re seeing a lot less of the politicized commentary and a lot more focus and coordination on these issues, the underlying technology behind them and the business model,” said Katie Paul, director at the tech advocacy group Tech Transparency Project. “It’s clear that a lot of these members of Congress have done their homework and they understand what they’re looking at.”

Still, after years of talk and glimmers of progress, it remains unclear if or when US lawmakers and regulators might take successful action — as their EU and UK counterparts have — that would limit Meta’s power, as well as that of Big Tech more broadly. And the window of opportunity may be limited as preparations for the US midterm elections could divert attention from advancing new legislation.

Recent revelations from former Facebook employee and whistleblower Frances Haugen and the hundreds of internal documents she leaked have galvanized bipartisan support for new legislation related to protecting children online. But the likelihood of success for the many other Meta-related proposals is murkier, and not just because of the company’s immense lobbying power.

Despite their agreement that something should be done to address Big Tech’s dominance — and to crack down on Meta in particular — Democrats and Republicans are divided on what the core problem really is. Republicans accuse Facebook of anti-conservative bias, despite a lack of evidence, while Democrats are concerned that the company doesn’t do enough to protect against hate speech, misinformation and other problematic content.

The stakes for action, or inaction, are only growing. The “Facebook Papers” revealed a wide range of potential real-world harms and consequences from Meta’s platforms. Yet lawmakers are largely still playing catch-up in understanding and regulating the company’s older platforms, even as Meta pushes to transition into a “metaverse company” and perhaps shape a whole new generation of user experiences.

“Congress must seize this historic moment — a pivotal turning point for reining in Big Tech,” Sen. Richard Blumenthal, the Connecticut Democrat who chairs the Senate Commerce Subcommittee on Consumer Protection, told CNN Business. “Having seen Big Tech’s harms and abuses, in our hearings and their own lives, Americans are ready for action — and results.”

Here are a few of the approaches lawmakers could take.

Section 230

One of the first places lawmakers and experts often look when considering new rules for tech companies like Meta is a piece of federal legislation called Section 230 of the Communications Decency Act.

The 25-year-old law prevents tech companies from being held liable for the content that users post on their platforms. For years, big tech companies have leaned on the law to avoid being held responsible for some of the most controversial content on their platforms, using it to dismiss lawsuits over messages, videos and other content created by users.

Momentum has grown on Capitol Hill around the idea of scrapping or updating Section 230, which could expose tech platforms to more lawsuits over hate speech and misinformation. Proposed changes include making platforms liable for hosting child abuse content. President Biden has also suggested platforms should be held responsible for hosting misinformation related to vaccines. (Social media companies and industry organizations have lobbied hard against changes to Section 230.)

But there’s one big hurdle to this approach, experts say: the First Amendment. Even if lawmakers got rid of Section 230 and, for example, Meta faced lawsuits over misinformation on its platforms, that speech is protected by the First Amendment. That means the company would probably still win out in the end, according to Jeff Kosseff, cybersecurity law professor at the US Naval Academy and author of a book about Section 230 called “The Twenty-Six Words that Created the Internet.”

“Where Section 230 really makes a difference is in things like defamation lawsuits,” Kosseff said. “But that’s not really what’s driving the debate around Facebook and other social media sites — it’s more of this lawful but awful types of content.”

Kosseff also raised the concern that trying to hold tech platforms responsible for certain types of speech — such as health misinformation — could give the government significant leeway in determining what content falls into those categories.

“There have been some countries that have passed fake news laws, and they’ve misused them just as you would expect that you would,” he said.

Algorithms

Haugen, meanwhile, has encouraged reforming Section 230 to hold platforms accountable for how their algorithms promote content. In that scenario, Meta and other tech companies still would not be responsible for user-generated content, but could be held liable for the way their algorithms promote and cause that content to go viral.

Bipartisan legislation introduced in the House in November would take a slightly different tack by forcing large tech companies to allow users to access a version of their platforms where what they see isn’t shaped by algorithms at all.

Perhaps anticipating such a law, Meta-owned Instagram has said it will bring back the option for users to access a reverse-chronological version of their feed (one not manipulated by its algorithm) later this year. Facebook already offers this option, but it can be frustrating to use — rather than appearing in settings, where users might expect to find it, it’s toggled via a button buried in a long menu on the left side of the News Feed screen, and it resets every time you close the site.

Privacy

Lawmakers have also used recent hearings about Meta to call for updated privacy laws.

“We have not done anything to update our privacy laws in this country, our federal privacy laws. Nothing. Zilch,” Minnesota Democrat Sen. Amy Klobuchar said during Haugen’s hearing.

Currently, progress on this front is coming more at the state level than the federal level.

California’s Consumer Privacy Act, which went into effect last year, gives consumers the right to demand that large companies disclose what data they have collected on them. Under the law, consumers can also ask companies to delete their data and, in some cases, sue companies for data breaches. Meanwhile, Virginia’s Consumer Data Protection Act (set to take effect in 2023) also gives consumers more control of their online data, but it includes more exceptions than the California law and doesn’t give consumers the option to sue companies. A federal bill could help provide consistent, nationwide standards for how data can be collected and sold online.

Congress is considering the KIDS Act, which aims to protect internet users under 16 in various ways, including by prohibiting the use of age verification data for commercial purposes, as well as the SAFE DATA Act, which would give consumers more choice in how their data is collected and used.

A new tech regulatory body

In his testimony before a Senate subcommittee earlier this month, Instagram head Adam Mosseri proposed the creation of an industry body that would set standards for “how to verify age, how to build age-appropriate experiences, how to build parental controls,” and other social media best practices.

But lawmakers didn’t seem enthusiastic about the idea of leaving standards-setting and oversight to industry players. “Self-policing depends on trust and the trust is gone,” Blumenthal said during Mosseri’s hearing.

Instead, lawmakers and advocates are pushing for the creation of a new federal regulatory body responsible for overseeing Big Tech. The group could be tasked with developing the framework and structures needed to regulate the tech industry, similar to the mechanisms within the government that help oversee the banking industry, TTP’s Paul said. It could also, as Haugen testified, serve as “a regulatory home where someone like me could do a tour of duty.”

Such a group would help supplement the limited existing accountability structures surrounding Meta. The Facebook Oversight Board — which says it acts independently, although its members are appointed and paid by the company — is only responsible for weighing in on content moderation decisions. Even then, the group has recently focused on smaller, one-off flubs, rather than the many broader, structural problems the company faces (although it has made larger calls for transparency).

The role of the FTC

If Congress does pass any Big Tech laws, the FTC will play a key role in enforcing them. And even if we don’t see new legislation in the next year, Meta won’t necessarily be off the hook.

The judge’s Tuesday ruling in the FTC case opens the door to perhaps the most existential threat yet to Meta: the FTC is looking to unwind Meta’s acquisitions of Instagram and WhatsApp. (Meta previously said it was confident “the evidence will reveal the fundamental weakness of the [FTC’s] claims.”)

The case will give Khan, the FTC chair, a chance to make her mark in her first turn as a federal regulator — and there is some reason to believe Meta is nervous. Last July, company officials wrote to the FTC asking Khan to recuse herself from all matters related to the social media giant (she has not done so). Meta also argued that the FTC’s suit should be dismissed on the grounds that Khan should not have been able to vote to approve the updated complaint; however, the judge sided with the FTC.

In addition to the agency’s lawsuit, Khan said last month that the FTC is considering drafting new rules that would more strongly regulate how US businesses can use data and algorithms. The effort could lead to “market-wide requirements” targeting “harms that can result from commercial surveillance and other data practices,” Khan said in a letter to Blumenthal. That could deal another potential blow to Meta’s business model.

And Friday’s report that the FTC is also working with state attorneys general to investigate potential anticompetitive practices by Meta’s Oculus — a key unit in its plans for the metaverse — indicates that its future ambitions are at risk of a regulatory crackdown, too.

–CNN’s Brian Fung contributed to this report.

