
Watch out for virus misinformation – The New York Times

Two decades ago, Wikipedia emerged as a quirky online project that aimed to collect and document all of humanity’s knowledge and history in real time. Skeptics worried that much of the site would consist of unreliable information, and they often pointed out its mistakes.

But now, the online encyclopedia is often seen as a place that, on balance, helps combat the misinformation and disinformation spreading elsewhere.

Last week, the Wikimedia Foundation, the group that oversees Wikipedia, announced that Maryana Iskander, a social entrepreneur in South Africa who has worked for years in nonprofits addressing youth unemployment and women’s rights, will become its chief executive in January.

We spoke with her about her vision for the group and how the organization works to prevent misinformation and disinformation on its sites and around the web.

Tell us about your direction and vision for Wikimedia, especially in such a fraught information landscape and in this polarized world.

There are a few core principles of Wikimedia projects, including Wikipedia, that I think are important starting points. It is an online encyclopedia. It’s not trying to be anything else. It certainly isn’t trying to be a traditional social media platform in any way. It has a structure led by volunteer editors. And as you probably know, there is no top-down editorial control. This is a volunteer-led community, which we support and enable.

The lessons to learn, not just from what we’re doing but from how we continue to iterate and improve, start with this idea of radical transparency. Everything on Wikipedia is cited. It is debated on our talk pages. So even when people may have different points of view, those debates are open and transparent, and in some cases really allow for the right kind of give-and-take. I think that’s what is needed in such a polarized society: you have to make space for that exchange. But how do you do it transparently, in a way that ultimately leads to a better product and better information?

And the last thing I’ll say is, you know, this is a community of incredibly humble and honest people. As we look to the future, how do we build on those attributes so that this platform can continue to serve society and provide free access to knowledge? How do we ensure that we are reaching the full diversity of humanity in terms of who is invited to participate and who is written about? How do we make sure our collective efforts reflect more of the Global South, more women, and more of the full diversity of human knowledge?

What are your thoughts on how Wikipedia fits into the widespread problem of misinformation online?

Many of the platform’s core attributes are very different from those of traditional social media platforms. On misinformation about Covid, the Wikimedia Foundation partnered with the World Health Organization. A group of volunteers came together around what’s called WikiProject Medicine, which focuses on medical content and creates articles that are then carefully monitored, because these are the kinds of topics where misinformation poses a risk.

Another example is that the foundation assembled a task force ahead of the U.S. elections, again trying to be proactive. [The task force supported 56,000 volunteer editors watching and monitoring key election pages.] And the fact that there were only 33 reversions on the main U.S. election page is an example of being really focused on the key topics where misinformation poses a real risk.

Then another example that I think is really interesting is a podcast called “World According to Wikipedia.” On one of the episodes, a volunteer was interviewed, and she has essentially made it her job to be one of the main watchers of the climate change pages.

We have technology that notifies these editors when changes are made to any page they watch, so they can see what the changes are. If there is a real risk that misinformation could creep in, there is an opportunity to temporarily lock a page. No one wants to do that unless it’s absolutely necessary. The climate change example is useful because the talk pages behind it host vigorous debate. Our editor is saying: “Let’s have the debate. But this is a page I am watching and monitoring closely.”
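The notify-and-lock mechanism she describes can be sketched as a small toy model: editors register pages on a watchlist, each incoming revision generates notifications, and a page can be temporarily protected so further edits are rejected. All names below are invented for illustration; MediaWiki’s real watchlist and page-protection features work quite differently.

```python
# Toy model of a watchlist: watchers get notified on every revision,
# and a "protected" page rejects edits outright.
from dataclasses import dataclass, field


@dataclass
class Revision:
    page: str
    user: str
    summary: str


@dataclass
class WatchlistMonitor:
    watchers: dict = field(default_factory=dict)   # page -> set of editor names
    protected: set = field(default_factory=set)    # pages locked against edits

    def watch(self, editor: str, page: str) -> None:
        """Add a page to an editor's watchlist."""
        self.watchers.setdefault(page, set()).add(editor)

    def protect(self, page: str) -> None:
        """Temporarily lock a page, e.g. during a misinformation surge."""
        self.protected.add(page)

    def submit(self, rev: Revision) -> list[str]:
        """Accept a revision if the page is unprotected; return messages."""
        if rev.page in self.protected:
            return [f"edit to '{rev.page}' rejected: page is protected"]
        return [
            f"notify {editor}: {rev.user} edited '{rev.page}' ({rev.summary})"
            for editor in sorted(self.watchers.get(rev.page, ()))
        ]
```

For example, once an editor watches “Climate change,” any submitted revision to that page produces a notification for her, and protecting the page causes later submissions to be rejected instead.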

A major debate currently playing out on social media platforms is the question of censoring information. There are those who argue that one set of views takes precedence on these platforms and that more conservative views are taken down. When you think about how to handle these debates as the head of Wikipedia, how do you make judgment calls with that in the background?

What has inspired me about this organization and these communities is that there are core pillars that were established on Day One of Wikipedia’s founding. One of them is the idea of presenting information with a neutral point of view, and neutrality requires an understanding of all sides and all perspectives.

That goes back to what I said earlier: There are debates on the talk pages, but then the articles come to an informed, documented, verifiable conclusion. I think this is a core principle that, again, could offer something for others to learn from.

Coming from a progressive organization fighting for women’s rights, do you worry about people who might use your résumé against you, claiming it could affect the calls you make about what’s allowed on Wikipedia?

I will say two things. The really relevant aspect of the work I’ve done in the past has been with volunteer-led movements, which are probably harder than others might think, and I’ve played a real role in understanding how to build the systems, the cultures, and the processes that I think would serve an organization and a community trying to grow in scale and reach.

The second thing I want to say is, once again, I am on my own learning journey and invite you to join it with me. The way I choose to be in the world is that we interact with others with a presumption of good faith and we engage in respectful and civil ways. That doesn’t mean others will do the same. But I think we have to hold on to that as an aspiration and as a way to be the change we want to see in the world.

When I was in college, I did a lot of my research on Wikipedia, and some of my professors would say, “You know, that’s not a legitimate source.” But I used it all the time anyway. I wonder if you have any thoughts on that!

I think most professors now admit that they also sneak into Wikipedia looking for things!

You know, this year we’re celebrating Wikipedia’s 20th anniversary. In the beginning, it was something people scoffed at and said would never go anywhere. And now it has become one of the most referenced sources in human history. I can tell you just from my own conversations with academics that the narrative about citing and using Wikipedia has changed.

