Our ever-increasing consumption of social media is damaging our ability to think critically and engage in self-reflection, argues Professor Shannon Vallor.

It’s a fundamental question for our times: what are the major impacts of the enormous consumption of emerging technologies on human beings and their character?

Professor Shannon Vallor, Baillie Gifford Chair of Data and AI Ethics at the University of Edinburgh, says those impacts have been diverse – but damaging overall.

“Positive impacts include the ability of digital connections to foster bonds of empathy and understanding that reach much further than in the pre-digital era,” she explains.

“Social media draws people into emotional responses to stories and narratives that in the past they would never have seen. This can encourage virtues of empathy, compassion and care.

“People can share stories around the world instantaneously in multimedia formats which don’t necessarily require high skill levels to access or consume. Creativity is a virtue which digital tools have absolutely strengthened. You have a multitude of ways to create things for other people that are far more open.”

However, Professor Vallor says greater consumption of technology also creates overwhelming challenges: “It’s very easy to overlook the positives because the negative is so overpowering, substantial and threatening to the systems of life that support all the good things.

“It’s hard to be creative or compassionate if your world is burning around you, and if your health, your well-being, your rights, even your life, are under attack. You probably then don’t have the energy, will or optimism to use technological tools for anything other than survival or self-defence, so the positives are lost.

“For the positive potential to blossom and be sustainable, we have to address these systemic harms – including disinformation on a huge scale, which challenges the virtue of honesty and the habit of respecting the truth.

“Truth comes in many forms but the distinctions between truth and lies – or story and conspiracy – are hard to maintain online. Ethical norms that used to reinforce respect for the truth are deeply damaged – and as respect for the truth becomes harder to maintain in digital environments, trust too is naturally eroded.”

When we cannot tell truth from falsehood and think nothing is truly trustworthy, we switch off our faculties of critical thinking and fall back on intuitive, emotional ways of reacting, Professor Vallor believes.

And allied to this is the tendency of digital environments to discourage self-reflection.

“We have not given people the right online tools to question themselves or their own assumptions,” Professor Vallor says.

“The internet is a tool for endorsing – or for questioning and challenging – other people’s claims, their experiences, their stories. Yet there is very little to promote self-reflection.

“If your environment decreases the opportunities to look at your own assumptions and picture of reality, other people have to challenge them for you and it becomes adversarial. So I come to think that anyone questioning my assumptions is just trying to show me up, or is a bad person.

“If you are only challenged in that context, you learn to avoid having your beliefs challenged. Donald Trump epitomises that mindset, but it’s contagious in many online spaces.”

What does Professor Vallor see as the key tipping points in this online journey?

“It’s been a continual, gradual process, but there are points of significance: the Cambridge Analytica Facebook data-harvesting scandal of early 2018 and, before that, the political campaigns of 2016 – the UK Brexit referendum and the US presidential election.

“These episodes revealed that it was not just about political tribalism or bias; political events were being directly manipulated by hard-to-see forces operating behind the scenes.

“Russia’s Internet Research Agency and operations like it showed that certain nations were diverting resources, in quasi-military fashion, to armies deployed in disinformation campaigns.”

This led to a realisation that the challenge would be harder to solve than anyone thought.

“Previously there was an idea we could tackle misinformation by teaching children digital literacy,” says Professor Vallor. “They would be taught to exercise judgment and grow up to be responsible digital consumers. We had a naive view that it was about the individual.

“After 2016, that changed. We recognised these were systems through which power flows, and that this information really was power. You can leverage the power of disinformation in some ways more easily than you can by giving people information. If you deprive them of certain information or flood them with falsehoods, that’s tremendous power.”

While President Trump’s influence has also been significant, Professor Vallor stresses that he did not arrive in an empty space.

“These things were already happening. He amplified this dynamic – he was the fuel poured on a fire already burning that has now exploded into an inferno.

“There are many ways to look at his influence, but one of the most troubling has been his pattern of behaviour on Twitter, which has, on one level, undermined any notion of the higher duties or standards expected of holders of public office.

“There is no real difference in tone or content between what Trump tweets at 2am and what a drunken teenager might tweet at 2am. That undermines our trust in the structures of authority and leadership.”

What does Professor Vallor think of the intervention by Twitter in May to add fact-checking tags to two of Trump’s tweets?

“Social media giants said until very recently that the normal rules – whether inciting violence or abusing each other online – do not apply to Trump because of his position. That undermines the belief that lines cannot be crossed. For a community to function, we have to be able to hold all its members to common standards.

“The Twitter response was symbolic; it was the first time one of the platforms said there is a line here and even the President cannot cross it. There are different views on how effective the intervention was. I think Twitter is learning; I want it to mature as a platform, to sustain a digital community that isn’t dangerous.”

Professor Vallor thinks Facebook is in a very different place: “I think its position is untenable. When your own employees go on virtual strike or quit, or come out and say this is unacceptable, I think as a leader of the company, you must have the humility to say ‘Maybe I’m not getting it right’. Is Mark Zuckerberg able to listen and to turn the lens on himself and be capable of real self-reflection?”

Read part one of our interview with Professor Vallor here.

Or listen to our podcast with Professor Vallor here, part of our virtual engagement series #DDIDiscussions.
