
COLUMN: Media distributors are right to reduce harmful content


Amazon just pulled two pseudoscience books on autism from its shelves. One of those books recommends curing autism by drinking and bathing in chlorine dioxide, a bleach harmful to humans in high concentrations.

Because the books are baseless and dangerous, removing them was a good move on Amazon’s part. But Amazon, along with other major tech companies, must continue to do more to combat the flood of false or dangerous content from denialists, conspiracy theorists and extremists.

These platforms have made a number of health and safety threats worse, in part because they make it easy for misinformed or harmful communities to form. When I search “vaccines” on Facebook, numerous anti-vaccination groups pop up with depressingly high followings. Vaccines Exposed has more than 13,000 members. Vaccine Resistance Movement has over 35,000.

It is no surprise that Ethan Lindenberger, the 18-year-old who went against his mother’s wishes and got himself vaccinated, said his mother gets all of her vaccine information from Facebook.

Individuals such as Alex Jones of Infowars have also used social media to kindle conspiracy theories and hate.

Because of Jones’ claim that the Sandy Hook massacre was a hoax, parents and relatives of the victims have been harassed and threatened by people who believed they were crisis actors, performers hired to pose as disaster victims. And his promotion of the conspiracy theory that John Podesta was running a child sex ring out of the restaurant Comet Ping Pong led to the restaurant suffering both a shooting and an arson attack.

The algorithms these tech companies use tend to push users toward extreme content, conspiracy theories and misinformation, as researchers have found with YouTube’s recommendation system.

Extremists such as the Islamic State group, commonly known as ISIS, and the alt-right have weaponized social media and its algorithms to attract followers. Most recently, in New Zealand, a white extremist attacked two mosques with the express intention of circulating video of the attack on social media.

It’s time these tech companies did something about the problems they are contributing to.

Fortunately, we are seeing some action. Apple, Google, Facebook and Spotify have taken steps to remove content from Alex Jones, and Twitter has permanently banned him from its platform. Facebook and YouTube say they purged more than a million reposts and thousands of re-uploads of the New Zealand shooter’s video within the first 24 hours after the attack.

And to address the international anti-vaxxer problem, which has become a global health threat, Facebook is planning to reduce the visibility of anti-vax groups and cut off advertising for anti-vax content.

However, this power is a slippery slope. Since these companies can take down nearly any content, they should remove material only when it meets certain criteria. First, the content should be demonstrably false: misinformation contradicting the well-established fact that vaccination is far safer and healthier than avoiding the needle, for example, should be reduced.

Second, it must be clear that the content could directly harm certain groups or endanger public health and safety. This category again includes anti-vaxxer messages, the New Zealand extremist’s video and violent hate speech.

Above all, these tech companies should be transparent about what content they remove, and if it ever turns out they are taking down accurate and trustworthy content, the owners of that content should speak up.

Unfortunately, what happens on the internet stays on the internet, and harmful content will always resurface through re-uploading and reposting. While removing harmful content is a positive step, the bigger problem lies with the recommendation algorithms themselves, which tech companies need to watch carefully to keep harmful content from spreading in the first place.
