Humans Are Much More Than Data

In my last article I posed the question: “Is Data and AI the New Plastic?” – are we, today, blindly creating a new form of data or information pollution that could have unforeseen and negative consequences for future generations?

Foreseeing the unforeseeable is a perplexing problem, and one deftly addressed by Margaret Heffernan during yesterday’s ‘A Point of View’ on BBC Radio 4.

In the segment, entitled “The Myth of Inevitability”, Heffernan, a leading UK technologist, CEO and writer, questioned the idea that the future is inevitable; that technology will fix everything; and that the march of technological progress is unstoppable, leaving us to either get on board or get left behind.

During the radio segment, Heffernan described this “tone of inevitability” as bullying and ugly. She also critiqued the certainty of the statistics used by the purveyors of this new world view. Driverless cars, Artificial Intelligence and the like, she argued, are sold with “the same brute force” even though there are huge problems with these technologies. They could fall, she suggests, into the ranks of all the other predictions that never came true, such as the paperless office, ebooks replacing physical books, or streaming killing off the vinyl record and cinema. All of these “predictions of inevitability”, Heffernan argues, are driven by agendas. Presented as fact, their speakers imply that they know the future and that any questions we might ask simply show how little we understand. “But their goal”, she says, “isn’t participation, but submission.”

For Heffernan, the narrative of inevitability is simple, easy to mass communicate and all too easily consumed. It allows us to forget how, in the past, people were as confused and uncertain as we are today. “Questions are what the rhetoric of inevitability seeks to silence,” she says. We can’t know what will happen in the future because it will be shaped by new knowledge we don’t have yet. So anyone claiming to know the future with confidence is, according to Heffernan, really trying to own it. This also includes the hype and propaganda that surround Big Data and AI. Heffernan points to the fact that Big Data is patchy and selective, and that several Big Data failures have demonstrated that you can’t capture the complexities of human life just by crunching data. She also warns that applying historical patterns to people’s data is dangerous. “Human beings are much more than their data. There is so much in life that can’t be quantified. Just because it can’t be counted, doesn’t mean it doesn’t count,” she says.

Like Heffernan, I am also a technologist, and I am not a Luddite. I agree with her when she argues that our choices must be informed by values that go beyond those offered by science and engineering, economics and finance. I also agree that such choices place an obligation on technologists and scientists to communicate with everyone about the consequences and ramifications of their work. “The language of inevitability has to be called out for the propaganda that it is,” she says. Heffernan points out that not knowing what tomorrow brings gives us the opportunity to shape it. “No one group, voice or discipline owns the future,” she adds.

From data crunching to information security and privacy; from robotics to Artificial Intelligence; from social media platforms to smartphones, I also believe the future of these technologies is as uncertain as any other ‘advance’. Nor do they have a monopoly on the future. We can’t foresee the unforeseeable, but we have brains and the ability to envision what the future could look like. And, yes, we also have the tools and knowledge that enable us to consider, test and control, as best we can, the possible outcomes of our actions today for the generations of tomorrow. That, of course, assumes we are willing to look beyond what we are doing today. I am pessimistic. Too often, I see science and technology developed and used for all the wrong reasons. It’s a sad indictment in a world where, to avert environmental catastrophe, there has never been so much need for positive human intervention. I don’t believe a solution will come, but if and when it does, if it is to work, it has to be one that embraces every section of society, not just the scientists, engineers and technologists.

Philip Adams SVP

#data #artificialintelligence #dataprivacy #robotics #margaretheffernan #economics #technology #science #cybersecurity #siliconvalley