'Digital revolution' excluding the most vulnerable, world leaders told

    Thursday, 14 November 2019 10:29 GMT

ARCHIVE PHOTO: A woman surfs the web at an Internet cafe in Bangkok, Thailand, September 29, 2010. REUTERS/Sukree Sukplang

The 'digitisation of information' impacts every sector in society but not everyone benefits equally, says a leading human rights researcher

By Zoe Tabary

LONDON - From tackling diseases to improving transport, technologies such as data analytics and artificial intelligence have unleashed a wave of opportunities, but those opportunities still exclude society's most vulnerable citizens, according to a leading human rights researcher.

The "digitisation of information" impacts every sector in society but not everyone benefits equally, said Carly Kind, head of the Ada Lovelace Institute, a British research body named after the mathematician and computing pioneer.

"We see huge power imbalances in terms of who governs, hoards and uses data, and in what ways," said Kind, who previously led a European Commission-funded project on data governance and privacy regulation.

Tech giants, once seen as engines of economic growth and a source of innovation, have come under fire on both sides of the Atlantic for allegedly misusing their power and for failing to protect their users' privacy.

Glen Weyl, a principal researcher at the research arm of U.S. tech giant Microsoft, said that "tech companies make up five of the six largest companies in the world and they have a business model driven effectively by surveillance."

"We need a society that treats people as agents of their own privacy rather than passive subjects in a surveillance state," he said at the Thomson Reuters Foundation's annual Trust Conference in London on Thursday.

Kind cited the criminal justice system as one area where marginalised communities have been discriminated against by the use of facial recognition and algorithms.

Computers have become adept at identifying people in recent years, unlocking a myriad of applications for facial recognition, but critics have voiced concerns that the technology is still prone to errors.

"Research shows that policing technologies predicting where crime might occur can be informed by biased datasets," said Kind.

"That could lead them to wrongly identify black people and people of colour as more likely to offend, and create over-policing in certain areas."

She likened new technologies to climate change, saying that those who have the least say are often the most affected.

Kind said the best way to ensure technology was a "force for good" and used in an ethical manner was to involve the public in debating such issues.

"Companies need to be more transparent, and communicate to people how their data is being used," said Kind, who took up her post in July.

"But the biggest onus is on the state: one of the lessons from Brexit is that people feel disconnected from policymaking."

Kind called on governments to take a "precautionary approach" to adopting new technologies.

"It's not about banning things or strictly regulating what we don't understand, but through best practice taking a slow and steady approach and figuring out what will bring everyone along on the ride," she said. 
