VentureBeat: Facebook should be forced to shut down WhatsApp and fix it


Defects in the design of Facebook’s WhatsApp platform may have led to as many as two dozen people losing their lives in India. Because its communications are encrypted end-to-end, there is no way for anyone to moderate posts, so WhatsApp has become “an unfiltered platform for fake news and religious hatred,” according to a Washington Post report.

WhatsApp is not used as broadly in the U.S. as in countries such as India, where it has become the dominant mode of mobile communication. But imagine Facebook or Twitter without any filters or moderation — the Wild Wild West they were becoming during the heyday of Cambridge Analytica. Now imagine millions of people who have never been online before becoming dependent on such a platform and trusting everything they read there. That gives you a sense of the kind of damage the messaging platform can do in India and other countries.

Earlier this month, India’s Ministry of Electronics and Information Technology sent out a stern warning to WhatsApp, asking it to immediately stop the spread of “irresponsible and explosive messages filled with rumours and provocation.” The Ministry said the platform “cannot evade accountability and responsibility specially when good technological inventions are abused by some miscreants who resort to provocative messages which lead to spread of violence.”

WhatsApp’s response, according to The Wire, was to offer minor enhancements, public education campaigns, and “a new project to work with leading academic experts in India to learn more about the spread of misinformation, which will help inform additional product improvements going forward.” The platform defended its need to encrypt messages and argued that “many people (nearly 25 percent in India) are not in a group” — in other words, only 75 percent of its users are affected!

One of the minor enhancements WhatsApp offered was to put the word “Forwarded” at the top of forwarded messages. But this label gives no information about the source of the original message, and even highly educated users could be misled into thinking a source is credible when it isn’t.

WhatsApp owner Facebook is using the same tactics it used when the United Nations found it had played “a determining role” in the genocide against the Rohingya in Myanmar: pleading ignorance, offering sympathy and small concessions, and claiming it was unable to do anything about it.

Here is the real issue: Facebook’s business model relies on people’s dependence on its platforms for practically all of their communications and news consumption, setting itself up as their most important provider of factual information — yet it takes no responsibility for the accuracy of that information.

Facebook’s marketing strategy begins with creating an addiction to its platform using a technique that former Google ethicist Tristan Harris has been highlighting: intermittent variable rewards. Casinos use this technique to keep us pouring money into slot machines; Facebook and WhatsApp use it to keep us checking news feeds and messages.

When Facebook added a news feed to its social-media platform, its intention was to become a primary source of information. It began by curating news stories to suit our interests and presenting them in a feed we would see occasionally. Then it required us to go through this news feed in order to get to anything else. Once it had us trained to accept this, Facebook started monetizing the news feed by selling targeted ads to anyone who would buy them.

It was bad enough that, after its acquisition by Facebook, WhatsApp began providing the parent company with all kinds of information about its users so that Facebook could track and target them. But in order to make WhatsApp as addictive as Facebook’s social-media platform, Facebook added group-chat and news-sharing features to it — something it was not designed to accommodate. WhatsApp started off as a private, secure messaging platform; it wasn’t designed to be a news source or a public forum.

WhatsApp’s group-messaging feature is particularly problematic because users can remain anonymous, identified only by a mobile number. A motivated user can create or join an unlimited number of groups and share hate-filled messages and fake news. What’s worse is that message encryption prevents law-enforcement officials and even WhatsApp itself from viewing what is being said. No consideration was given in the design of the product to the supervision and moderation necessary in public forums.

Facebook needs to be held liable for the deaths that WhatsApp has already caused and be required to take its product off the market until its design flaws are fixed. It isn’t making its defective product available only to sophisticated users who know what they have signed up for; it is targeting first-time technology users who are ignorant of the ways of the tech world.

Only by facing penalties and being forced to do a product recall will Facebook be motivated to correct WhatsApp’s defects. The technology industry always finds a way of solving problems when profits are at stake.
