How can behavioural science, data science and technology best serve our financial lives? These were the questions put to a distinguished panel of economists, professors and opinion formers at the Summer Seminar: Man, Machine, Money, held recently in Amsterdam.
It has come at a breathtaking rate, but we are now firmly in a world in which huge amounts of data are vacuumed up by artificial intelligence (AI) programs to persuade us what to buy, wear, eat and invest in. The question is: how can people benefit from this Cyber Society, and what role, if any, do we have in making sure it all works properly?
Such was the issue before a panel of experts who met on July 9 under the auspices of ING's Think Forward Initiative and the Amsterdam Innovation District.
Too much data?
The meshing of AI with so-called Big Data has raised a number of problems, most of them to do not with the technology itself but with how it is used. Paul Ormerod, an economist and visiting professor at UCL, showed a) how there was sometimes too much data to be useful and b) how easily consumers could be influenced.
In the first case, Ormerod noted that simply typing “mobile phone” into a Google search returned about 155 million results, or data points. In the second, he cited earlier studies showing that knowing what others had chosen made “a phenomenal difference” to the music downloaded by a test group of students.
But he also showed how useful huge swathes of data could be, tracking the mood of Londoners based on an analysis of positive and negative tweets on Twitter. AI, Ormerod concluded, was good at picking up patterns, but pretty much useless at everything else.
In a similar vein, Professor Gerd Gigerenzer, Director of the Harding Center for Risk Literacy at Berlin’s Max Planck Institute for Human Development, drew a distinction between risk and uncertainty: under risk, the possible outcomes are essentially known; under uncertainty, they are not. AI performs best in the first case, but humans have evolved to cope with the second and machines haven’t. “We need to have a good machine-human collaboration,” he said.
Professor Gina Neff, a sociologist at the Oxford Internet Institute at the University of Oxford, focused on some of the misuses of Big Data by AI programs – or at least misuses by the people seeking to interpret what AI gives them. In one infamous case, New York City police tried to find a suspect who resembled Woody Harrelson by running the actor’s picture through a face-recognition program, unaware that such programs rely on highly specific measurement variables, not on whether one face “looks like” another.
What banks can learn
The public, however, is somewhat confused about it all, to judge from findings presented by Tony Smith, global head of financial services at UK polling firm Ipsos. For example, 75% of respondents in an Ipsos survey said they would like access to data on how they spend their money. But only 40% said they were comfortable providing the information that would make this possible.
Smith said Ipsos had found that people became more positive about developments such as open banking – sharing data with others to get more opportunities – if they could be persuaded that they would benefit. “Utility overcomes concern,” he said.
By journalist Jeremy Gaunt