Paper: Platforms and misinformation

A brief history of the Internet

In the beginning of the Internet, there was a tug of war over what type of Internet society wanted. Would it be a closed system, curated and organised by professionals who could help you navigate the wealth of information? Or would it be an organic, ever-growing body of knowledge collected from all parts of society?

The Internet community made a choice. The World Wide Web was going to be open and free, with anyone and everyone able to participate. It would allow users to navigate from node to node on a whim. There was a low barrier to entry. For the cost of a domain name registration, anyone could create a website and have it accessible to every other person with Internet access across the globe. This was a new type of freedom. This was a new type of equality.

The Internet would be an environment for experimentation and innovation. It was going to be the equivalent of a bustling market town full of independent shops, open to all and welcoming to new participants. In the beginning there were many search providers that gave users access to the Web and helped us navigate it. We liked this choice. We felt in control. Society embraced the freedom of the Internet. The open Web flourished for a number of years. But the foundations for a different type of Internet were being laid.

While we still believed in a free and independent Internet, it was slowly becoming something else: more controlled, more coordinated. It was becoming the virtual equivalent of the shopping mall. Everyone was welcome, but their experience was being controlled from behind the scenes. A handful of platform companies were coming to dominate, becoming the coordinating force behind our Internet experience.

To understand this more, we need to understand how a platform company works.

So what is a platform company and why is it like a mall?

For our purposes, a platform is an online business that facilitates interaction between users, whether that be communication (like Snapchat), transactions (like TradeMe) or services (like SoundCloud). These companies don’t produce anything themselves, but rather coordinate access. They deal in code and data rather than any tangible assets. AirBnB owns no property. Uber has no cars. Facebook creates no content. A mall owner owns no shops. These companies are all platforms for other people's businesses.

The difference from a bricks-and-mortar mall is scale and audience. A platform is a mall with infinite floors, where anyone can set up shop.

These platforms are valuable as they bring this scale and audience to one place. In today's world, what a company owns matters less than what it can connect. (1)

Two or three social media platforms have become so popular that people now use them as one-stop shops. Platform companies allow users to enjoy third-party content without ever leaving their site, so people get more and more of their information fed through a single site. Although it looks a lot like free and independent voices, what you see is curated behind the scenes to maximise revenue streams.

Curating an Internet experience

These large and widespread technology companies have business models that, because their services are free for users, gather revenue elsewhere. They rent out prime space, just as a mall owner would. They do this by selling advertising on their sites and through their searches. Over 85% of Google’s and 97% of Facebook’s total revenue comes from advertising.(2) Results from paying websites are prioritised and returned first. What you see is not a free and unbiased search.

To augment this, the companies also collect data about you: where you go, what you view, who your friends are, how long you spend on their site. Your data feeds the algorithms that feed social media. These algorithms select what content makes it to your news feed and what is hidden from you.(3) This means that when they return your search, it is much more targeted and marketed just for you. We may search for the same thing, but it is likely my results will be vastly different to yours.
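To make this concrete, here is a minimal, purely illustrative sketch (in Python) of how an engagement-driven feed might score and order posts. It is not a description of any platform's actual system; the signals, weights and profile fields below are invented for the example.

    # Illustrative only: a toy engagement-based feed ranker.
    # The signals and weights are invented; real platforms use far richer data.
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        topic: str
        likes: int
        shares: int

    # A crude profile built from the data collected about one user
    user_profile = {
        "followed_authors": {"friend_a", "news_site_b"},
        "topics_engaged_with": {"fishing", "politics"},
    }

    def score(post: Post) -> float:
        """Higher score = more likely to appear at the top of the feed."""
        s = post.likes * 1.0 + post.shares * 2.0              # raw engagement
        if post.author in user_profile["followed_authors"]:
            s *= 1.5                                          # boost people you follow
        if post.topic in user_profile["topics_engaged_with"]:
            s *= 2.0                                          # boost topics you linger on
        return s

    posts = [
        Post("friend_a", "fishing", likes=12, shares=1),
        Post("stranger", "gardening", likes=200, shares=5),
        Post("news_site_b", "politics", likes=80, shares=40),
    ]
    for post in sorted(posts, key=score, reverse=True):       # what you see first
        print(post.author, post.topic, round(score(post), 1))

Two people opening the same feed, or running the same search, would see different orderings simply because their profiles differ. That is the point: the ranking is built from what is known about you.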

The free and open Internet that we once envisioned is now being curated and designed by a few top companies. The platform companies are creating an environment to keep us entertained and interested, and to maximise their profits.

Let me be with the people

These platform companies are our community hubs: they are where we congregate and share our lives. The network effect means that the more people use something, the more beneficial it becomes. Having a phone is a bit pointless if no-one else has one. Being on Myspace was very lonely when everyone else headed to Facebook. With 2.27 billion monthly active users(4), Facebook has become the default platform of choice. Twitter has 336 million monthly active users worldwide.(5) Everyone flocks to where the people are. The excitement and energy of being part of something means the platform can attract and keep our attention.

And it’s not just us, as users, that are affected. If people are primarily spending their time on Twitter, Snapchat or Facebook, a lot of non-platform websites will see most of their traffic filtered through these social media sites. This means the platforms have a lot of power over these websites and how much coverage they get. This is how independent voices can get squashed.

You are the product

There’s an oft-quoted phrase: “if you’re not paying for it, you are the product.” Social media platforms are free for users, and a platform’s value is derived from the number of people on it, how engaged they are, and the quality and quantity of the data collected about them.

We seem to be becoming increasingly reliant on platforms. They are designed to be addictive, and there is an economic interest in ensuring you spend more and more time on the site. Tristan Harris, a former Google design ethicist, has revealed that every time you open an app there are 1,000 engineers behind the scenes trying to keep you using it.(6) Some of the top jobs in today’s world are not about curing cancer but about getting people to like things. Liking things allows the platforms to build up sophisticated models of who you are as a person, allowing more and more targeting. You are the product.

We normally think of the information being collected on us as being used to sell us things. But the same nudge effect used to sell us stuff can be used to promote wider agendas, including the power to flip an election. In 2010, Facebook ran a trial during the United States midterm elections. They sent a voting reminder to 61 million US users, a quarter of the US voting population. They also provided an “I Voted” banner showing users which of their friends had clicked the banner. The trial resulted in an extra 340,000 people turning out to vote. Facebook was able to nudge people to take action in an election. If Facebook had shown the button to every US voter, more than a million voters could have been mobilised.(7)
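A rough back-of-the-envelope scaling, using only the figures quoted above, shows where the “more than a million” estimate comes from.

    # Back-of-the-envelope scaling of the 2010 trial, using only the figures above.
    reached = 61_000_000        # users shown the reminder (about a quarter of voters)
    extra_voters = 340_000      # additional turnout attributed to the trial
    total_voters = reached * 4  # 61 million was roughly a quarter of the voting population

    rate = extra_voters / reached          # about 0.56% of those reached
    projected = rate * total_voters        # about 1.36 million if everyone had seen it
    print(f"{rate:.2%} of those reached; roughly {projected / 1e6:.1f} million if shown to all")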

Knowing this power, political campaigns are using microtargeting more and more. Microtargeting means identifying individuals and crafting a personal campaign for each person, using predictive analytics built from their demographic, psychographic and other data.

In the 2016 US election, the Donald Trump campaign used microtargeting to run 5.9 million different Facebook advertisements, tailoring messages to specific audiences and seeing what worked. The Clinton campaign, by contrast, ran only 66,000 different advertisements.(8) A political party can identify the individual voters it is most likely to convince, and match the message to the specific interests or vulnerabilities of those voters.(9) While this may encourage more people to vote, the information they are using to inform their decisions may be skewed.
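As a purely illustrative sketch of the kind of predictive model that sits behind microtargeting, the toy example below scores voters by how likely they are to respond to a message. Every feature, number and label is invented for the example; real campaigns work with far richer demographic, psychographic and behavioural data.

    # Illustrative only: a toy "which voters are most persuadable?" model.
    # All features, numbers and labels are made up for the example.
    from sklearn.linear_model import LogisticRegression

    # Each row is a voter: [age, hours on social media per day, liked_page_a, liked_page_b]
    voters = [
        [22, 4.0, 1, 0],
        [35, 1.5, 0, 1],
        [58, 0.5, 0, 0],
        [41, 3.0, 1, 1],
        [29, 2.5, 1, 0],
        [63, 1.0, 0, 1],
    ]
    responded = [1, 0, 0, 1, 1, 0]  # 1 = engaged with a previous test advertisement

    model = LogisticRegression(max_iter=1000).fit(voters, responded)

    # Score new voters and aim the advertisement at the most persuadable ones
    new_voters = [[31, 3.5, 1, 0], [55, 0.8, 0, 1]]
    for voter, score in zip(new_voters, model.predict_proba(new_voters)[:, 1]):
        print(f"voter {voter}: estimated persuadability {score:.0%}")

The campaign then crafts a different message for each high-scoring group and measures what works, which is what running millions of advertisement variants makes possible.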

But anything for an easy life

Platforms are useful. And easy. There are many reasons we have gravitated towards where our friends and heroes are. It’s so much easier having our news sources on one coordinated site that is familiar and easy to navigate. It’s the ease of having one place instead of 30 different sites: one feed to rule them all.

And the site is built for you. A 2015 study showed that computer modelling is already better at predicting your personality from your Facebook likes than a friend or spouse is. This means your digital footprint can give advertisers insights into what you didn’t even know you wanted, and lets platforms feed you information they know you want to hear.(10)

And it’s good. It means you don’t need to go out searching for things. You can have what you didn’t even know you needed delivered directly to your home. This is very similar to a mall: the convenience of all the shops in one place, in a climate-controlled environment. While your partner or friend tries on an outfit, you can grab a coffee and wait. It’s easy.

However, there is much literature about the psychology of the mall. Malls are designed specifically to confuse people, to make them lose track of time and of their original intentions, and to make them more susceptible to impulse buys. You are manipulated as soon as you step in the door.

And as we grow wary of the psychology of the mall, we are also growing increasingly aware of the vulnerable situation we have put ourselves in online. Participating in these platforms means the companies can easily influence us, because they know more about us than we do.

Choice versus addiction

As with the mall, we choose the convenience of having things in one place. Choice theory dictates that, as rational beings, we will weigh up the pros and cons and make a choice that ultimately benefits us. But for this, we need to be rational and to have perfect information. That is, we need to understand both the benefits to us and the disadvantages. But how much do we really know about what’s going on behind the scenes?

Over the last few years there have been a multitude of studies and stories in the media about how excessive social media use can have negative impacts on our health. Recent research indicates that 210 million users worldwide are affected by Internet addiction.(11) This means these people are no longer choosing to participate, but are compelled to.

But not all of us are addicted. However, there is still a neurological response. The founding president of Facebook, Sean Parker, recently admitted they exploited a “vulnerability in human psychology” in developing the site. Whenever someone likes or comments on a post, you receive a little dopamine hit.(12) The anticipation of this reward enables the behaviour to become a habit. In a recent paper, Daria Kuss and Mark Griffiths outline that few people are genuinely addicted to social media.(13) They point out that, although it may not be addiction, many people’s social media use is habitual, spilling over into other areas of their lives, and can be problematic and dangerous. Habits are non-reflective, repetitive behaviour, not choice. And part of this habit is fed by the hit of dopamine the developers planned on you receiving.

But why am I talking about this, when the point of this discussion is to understand misinformation, disinformation and how the social media platforms feed the beast? Well, as we have established, social media platforms rely on a business model that incentivises people’s engagement. Nothing spurs on engagement better than outrage.

The use of platforms, and our reliance on them has created the perfect environment for false information to spread. An environment built to entertain you and to keep you watching. An environment with a huge audience. An environment where you are encouraged to share. An environment feeding you a dopamine hit with every thumbs up. An environment where outrage creates excitement.

How does this let disinformation fester and grow?

The Internet did not create disinformation. It did not create divisions between groups of people who are different, or have different ideologies about the world. It did, however, create new ways to spread this information, and much faster than we have ever seen before.

Illustration: Two people on a beach taking a photo of a fish with a third plastic eye, fishing rod in the background.

Everyone is a creator

Social media platforms rely on a user-as-information-producer model for content. So everyone is their own reporter, author or artist. The barrier to sharing your opinion, idea or creation with the world has never been lower.

This is beneficial to platforms, which rely on people creating content. It is beneficial to people, as we can voice our opinions and show our creativity without censorship. This is the freedom we love. A soapbox, with the world as our audience. Content can be relayed among users with no significant third-party filtering, fact-checking or editorial judgment. And because of this, it’s so much faster than traditional news sources could ever be. This puts pressure on the business models of traditional news.

Traditional media is now trying to compete with blogs and social media platforms. Where once traditional news sources had robust fact-checking resources and editorial oversight, they now prioritise speed and agility. The time and money to resource fact-checking and quality control have been eroded. And yet it was this fact-checking that meant we could place a high level of trust in our news.

Illustration: An original social post being replied to, followed by lots of social shares, then more social shares.

It’s becoming a rumour mill

So the line between well-researched journalism and vanity publishing has blurred to the point where the distinction barely matters. Each is given the same weight in the world of social media.

The way information is received by users has been blurred by the Internet. Under the traditional media model, public opinion relied on ‘one-to-many’ broadcast: a newspaper, a journal, the 6pm news. But now social media acts as an intermediary, distancing the user from the primary source or those analysing the story. And news is shared in various ways. A news article is shared on Facebook by the news organisation you follow. A friend tweets a comment on a news article. A news article comes to you via paid advertising. A meme is created about the news article and shared on Instagram.

Social media companies know that users are more likely to take on information from sources they trust, like close friends or celebrities they admire. A lot of effort is put into creating an illusion of trust and community.

The social networks’ sophisticated targeting systems already know which users are more likely to accept a particular view, and those users can be targeted directly.(14) Once they hit share, the next person who sees the post in their social feed is more likely to trust it and go on to share it themselves. The speed at which information travels through these trusted peer-to-peer networks is remarkable.(15)

It’s often difficult to trace a piece of information back to its original source. You only have the information in front of you. It’s frequently unclear where the information comes from, who the author is, what lens has been put on it, or whether AI has generated the story entirely. As a result, it’s not easy to make up your own mind about its validity.

And sometimes this is followed up with video manipulation. A recent example was a video of Jim Acosta and a White House intern. The video appeared to show Mr Acosta pushing an intern away so he could keep the microphone. This video was reasonably easy to dispute, as the story seemed so far removed from what had happened in the press room that day. But just as people were calling out the disinformation being spread about Mr Acosta, a new story emerged that it was simply a technical glitch: the differences could have come from “video compression” when the video was turned into a GIF.(16)

In the blink of an eye, it can flip. It’s becoming really hard to keep up. Am I being played? Am I not? What am I to believe? I believe what I know: I believe my network.

And each day it gets more complex. The latest issue people are worried about in the spread of disinformation is the creation of ‘deepfakes’. A deepfake is a video or image in which AI has been used to combine and superimpose existing images and videos onto source material.(17) Each day AI is learning more and becoming more sophisticated. Seeing can no longer be believing.

Illustration: A person at a desk about to share fake news on a tablet, with the caption “NZ freakfish LOL!”

A new way of thinking about fake news

No one agrees on what fake news is. It was once a useful term, describing either satirical news or, more recently, networks of websites that act as a mechanism for spreading misinformation. In a recent blog(18) we identified a schema for understanding fake news in a more useful way. We also called for people to stop using ‘fake news’ as a phrase and start talking about misinformation and disinformation in a more productive way.

When looking at false information, we need to consider the motivation of the person who posted it. Usually, incorrect information is posted without malicious intent. We call this misinformation. On the Internet this is clickbait, errors in journalism, or the misinterpretation of parody or satire as fact. But misinformation can change quickly, as people use it for their own ends. We released a comic(19) about #freakfish to show just how quickly a story can change from an innocent joke to a strategic influence campaign aimed at misleading you to achieve a political outcome.

Illustration: The Prime Minister at a podium in front of the Beehive.

There are laws – how do they get away with it?

There is a question of how the platforms get away with this. How can they have fake, and often harmful, material on their site and not have to answer for it? It comes down to the definition of what a platform is. Platforms are online businesses that facilitate interactions between users. They are not publishers or creators of content. As a result, most countries have established what are called “safe harbour” provisions. Safe harbours protect platform companies from being held responsible under a particular piece of legislation for what a person posts on their site, because the platform company is not the publisher of the material; it just hosts it. These provisions are what have allowed platforms to exist.(20)

This doesn’t mean that the platform companies are doing nothing. The same safe harbour that means the platforms cannot be held accountable for everything put on their site also allows them to enforce content guidelines and remove posts that violate them. Platform companies and governments are trying to crack down on the spread of misinformation. Google launched Fact Check(21) to help readers determine whether a news story is accurate, Facebook is expanding the scope and capabilities of its fact-checking programme(22), the UK has recently completed an inquiry into fake news(23), and the European Commission set up a high-level group of experts to advise on policy initiatives to counter disinformation spread online.(24)

In New Zealand, if the information is harmful and breaches certain principles (for example, if it makes a false allegation or denigrates a person’s colour, race, ethnic or national origins, religion, gender, sexual orientation or disability), the Harmful Digital Communications Act can help you address the issue.(25) This is useful when the content turns into cyberbullying. But given the scale of the problem, we all need to work together to address this ever-growing issue.

The current state of social media is a symptom of wider global issues. While we might not be able to get to the underlying issues, we need to look at what levers we can use to make the social media platforms a better place to communicate.

Illustration: The front page of a newspaper with the headline “NZ fish price plummets!”

But none of this is new.

So we come to the end of this tale. Anyone can tell you that misinformation is not new. Propaganda has been used since the beginning of humankind. But we have never seen the speed or ease with which it can now be shared.

But hope should not be lost. If we return to the mall at the beginning of this story, it is not necessarily the mall of the future. We are starting to see malls change, reinventing themselves to become what their creators envisioned: the town squares of suburbia, places where people can come together and build communities.(26) Can we do the same with social media? Can we claim back our experimentation, innovation and free Internet? It will take a united effort, and some grown-up conversations. But there is hope.


