Meta, TikTok, and YouTube may finally need to start sharing data with researchers

On Wednesday, Congress was faced with the unfamiliar spectacle of highly intelligent people speaking with nuance about platform regulation. The occasion was a hearing titled “Platform Transparency: Understanding the Impact of Social Media,” which served as an opportunity for members of the Senate Judiciary Committee to consider the need for legislation that would require major technology platforms to make themselves available for study by qualified researchers and members of the public.

One such law, the Platform Transparency and Accountability Act, was introduced in December by a (if only slightly) bipartisan group of senators. One of those senators, Chris Coons of Delaware, chaired Wednesday’s hearing; another, Senator Amy Klobuchar of Minnesota, also attended. During a delightfully lively hour and forty minutes, Coons and his assembled experts examined the need to require platforms to disclose data and the challenges of obliging them to do so in a constitutional manner.

On the first point – why is this necessary? – the Senate heard from Brandon Silverman, co-founder of the transparency tool CrowdTangle. (I interviewed him here in March.) CrowdTangle allows researchers, journalists, and others to see the popularity of links and posts on Facebook in real time and understand how they’re spreading. Researchers studying the impact of social networks on democracy say we would benefit tremendously from similar insights into content distribution on YouTube, TikTok, and other major platforms.
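For a sense of what that kind of access looks like in practice, here is a minimal sketch of how a researcher might pull fast-spreading posts from a CrowdTangle-style service. The base URL, endpoint, parameter names, and response shape are assumptions made for illustration, not CrowdTangle’s documented API.

```python
# Minimal sketch of querying a hypothetical CrowdTangle-style transparency API
# for posts that are spreading faster than expected. The base URL, endpoint,
# parameter names, and response shape are illustrative assumptions only.
import os
import requests

API_BASE = "https://api.example-transparency-tool.com"   # hypothetical endpoint
API_TOKEN = os.environ.get("RESEARCH_API_TOKEN", "")      # researcher's access token

def overperforming_posts(count: int = 10) -> list:
    """Return the top `count` fast-spreading posts (hypothetical schema)."""
    resp = requests.get(
        f"{API_BASE}/posts",
        params={"token": API_TOKEN, "sortBy": "overperforming", "count": count},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("posts", [])

if __name__ == "__main__":
    for post in overperforming_posts():
        print(post.get("url"), post.get("interactions"))
```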

Silverman eloquently described how Facebook’s experience of acquiring CrowdTangle, only to find that it could be used to embarrass the company, made other platforms less likely to take similar voluntary actions to improve public understanding.

“The biggest challenge in particular is that in the industry right now, you can just get away with not doing any transparency at all,” said Silverman, who left the company, now known as Meta, in October. “YouTube, TikTok, Telegram and Snapchat represent some of the largest and most influential platforms in the United States and offer almost no functional transparency in their systems. And as a result, they avoid almost all of the scrutiny and criticism that comes with it.”

He continued, “This reality has industry-wide ramifications and has often led to conversations within Facebook about whether it’s better to just do nothing because it’s an easy way to get away with it.”

When we hear about what’s happening at a tech company, it’s often because a Frances Haugen-type employee decides to leak it. The overall effect is to paint a very selective, patchy picture of what’s happening inside the largest platforms, said Stanford Law School professor Nate Persily, who also testified today.

“We shouldn’t have to wait for whistleblowers to whistle,” Persily said. “These kinds of transparency laws are about empowering outsiders to get a better picture of what’s going on at those companies.”

So what would the legislation now under consideration actually achieve? The Stanford Policy Center had a nice summary of its core features:

* Allows researchers to submit proposals to the National Science Foundation. If the NSF approves a proposal, social media platforms would have to provide the necessary data, subject to privacy protections, which could include anonymization or “clean rooms” for researchers to review sensitive material.

* Grants the Federal Trade Commission the power to require platforms to regularly disclose certain information, such as ad targeting data (a rough sketch of what such a record could look like follows this list).

* The commission could also require platforms to build basic research tools for examining which content is succeeding, similar in spirit to the Meta-owned CrowdTangle.

* Prevents social media platforms from blocking independent research initiatives; both researchers and platforms would be given a legal safe harbor on privacy concerns.
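To make the FTC disclosure item above a little more concrete, here is a purely illustrative sketch of the shape an anonymized ad-targeting disclosure record could take. The bill itself does not specify any format; every field name here is invented for the example.

```python
# Purely illustrative: the Platform Transparency and Accountability Act does not
# define a disclosure format. Every field below is invented to suggest the kind
# of anonymized, bucketed record regulators might require for ad targeting.
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class AdTargetingDisclosure:
    ad_id: str                          # platform-assigned identifier
    advertiser: str                     # public advertiser name
    spend_range_usd: Tuple[int, int]    # bucketed to avoid revealing exact spend
    impressions_range: Tuple[int, int]  # likewise bucketed
    targeting_criteria: Dict[str, str] = field(default_factory=dict)

example = AdTargetingDisclosure(
    ad_id="ad-0001",
    advertiser="Example PAC",
    spend_range_usd=(1_000, 5_000),
    impressions_range=(50_000, 100_000),
    targeting_criteria={"age": "25-44", "region": "US-TX", "interest": "politics"},
)
print(example)
```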

To date, much of the focus on regulating technology platforms has found members of Congress seeking to regulate speech, at both the individual and corporate level. Persily argued that starting with this kind of forced sunlight instead might be more effective.

“Once the platforms know they are being watched, that will change their behavior,” he said. “They won’t be able to secretly do certain things that they have been able to do up until now.” He added that platforms would also likely change their products in response to increased scrutiny.

OK, fine, but what are the trade-offs? Daphne Keller, director of Stanford’s platform regulation program, testified that Congress should carefully consider what kind of data platforms are required to disclose. Among other things, new requirements could be exploited by law enforcement to circumvent existing limits.

“Nothing about these transparency laws should change the protections Americans have under the Fourth Amendment or laws like the Stored Communications Act, and I don’t think that’s the intention here,” she said. “However, clear elaboration is essential to ensure the government cannot effectively circumvent Fourth Amendment restrictions by using the unprecedented surveillance power of private platforms.”

There are also First Amendment concerns with these kinds of platform regulations, she noted, pointing to the failure in court of two recent state laws aimed at forcing platforms to carry speech that violates their policies.

“I want transparency mandates to be constitutional, but there are serious challenges,” Keller said. “And I hope you put really good lawyers on it.”

Unfortunately, into every Senate hearing a little Ted Cruz must fall. The Texas senator was the only attendee on Wednesday to use up his speaking time without asking a single question of the experts in attendance. Cruz expressed great confusion as to why he had gained relatively few new Twitter followers in the days before Elon Musk said he would buy the platform, but then gained many more after the acquisition was announced.

“It’s obvious someone flipped the switch,” the Texas Republican said. “The governors who had been silencing conservatives freaked out. That’s the only rational explanation.” (I know the word “governors” is used a bit unconventionally here, but I’ve listened to the tape five times and that’s what I heard.)

The real explanation is that Musk has a lot of conservative fans, they flocked back to the platform when they heard he was buying it, and from there Twitter’s recommendation algorithms kicked into gear.

But even I have to sympathize with Cruz here, for all the reasons today’s hearing was called in the first place. Without legislation requiring platforms to explain in more detail how they work, some people will always believe the dumbest explanations possible. (Especially when those explanations serve a political purpose.) Cruz is what you get in a world where transparency on the part of platforms is only voluntary.

Still, we should keep our expectations in check – there are limits to what platform disclosures can do for our discourse. It seems entirely possible that you could explain to Ted Cruz exactly how Twitter works and he would either not understand or willfully misunderstand you for political reasons. And even people who are trying in good faith to understand recommender systems may not be able to follow technical-level explanations. “Transparency” is not a panacea.

But it’s a start? And it seems a lot less fraught than many other proposed tech regulations, many of which find Congress trying to regulate speech in ways that are unlikely to survive First Amendment scrutiny.

Of course, where other countries hold hearings as a prelude to passing legislation, in the United States we hold hearings instead of passing laws. And despite some Republican support for the measure – even Cruz said it sounded good to him – there’s no evidence that it’s gaining any particular momentum.

But, as always, Europe is much further ahead of us. The Digital Services Act, which regulators agreed to in April, includes provisions that would oblige large platforms to share data with qualified researchers. The law is expected to come into force next year. And even if Congress hesitates after today, transparency is coming to the platforms one way or another. We hope it can begin to answer some very important questions.
