Bethan Turner explores the newest trends in research, fresh from the latest IIeX conference in Atlanta.
I’ve been spending some time reflecting on my trip to the IIeX conference in Atlanta earlier this month, and one joke that kept making the rounds was that this was the first year not dubbed “year of the mobile”. This year was all about Artificial Intelligence, Data Science, Blockchain, and Automation. Given that my talk was on bridging the gap between market research and data science, I was loving these sorts of comments, feeling like I was on trend for possibly the first time in my life (I’ve never been one for fashion).
However, the automation side of things scared me somewhat. Yes, I understand the need for automation. Everyone (including your clients, whether internal or external) wants everything faster, better, and cheaper. They don’t want to wait for specialists to get the job done. They don’t want to wait for humans to do something a computer could do much faster. And, in a lot of instances, I agree with them. There are a lot of jobs computers can do just as well as, if not better than, humans. But do research and analytics fall into that camp?
There’s an argument for both sides here, which, in my mind, often means the optimum solution lies somewhere in the middle. Yes, a lot of research and analytics tasks are similar regardless of the sort of data you’re working with, and a lot of them are open to human error and subjectivity. But the subjectivity is part of these tasks for a reason: having a human eye oversee your research makes a big difference.
Humans can easily spot those anomalies, those outliers that don’t fit with common sense. Humans can interpret your data and your outputs. Humans can evaluate the commercial relevance of your insights. Computers and automated processes will struggle to do this.
The best way forward, in my mind, is a half-way house. Build processes and programs that can help you with your job, but that don’t claim to be able to do it for you. Set rules in place that ensure things are not taken out of context. Offer a helpline or something similar so that you (and your clients) can peer inside the black box if and when you want to understand things more.
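To make that half-way house concrete, here is a minimal sketch in Python of a process that flags outliers for a human to review rather than quietly deleting them. The two-sigma threshold and the sample data are illustrative assumptions of mine, not anything presented at IIeX:

```python
# A sketch of a "help, don't replace" process: the machine flags outliers,
# and a human decides what to do with them. Threshold and data are illustrative.
import statistics

def flag_outliers(values, threshold=2.0):
    """Return (index, value) pairs more than `threshold` standard deviations
    from the mean, flagged for review rather than deleted automatically."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [(i, v) for i, v in enumerate(values)
            if abs(v - mean) > threshold * stdev]

# The analyst reviews what the machine merely flags.
ages = [34, 29, 41, 38, 250, 33]  # 250 is almost certainly a data-entry error
for idx, value in flag_outliers(ages):
    print(f"Row {idx}: value {value} looks anomalous; review before excluding.")
```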
The British Election Study has done this remarkably well by creating a “Data Playground” that allows users to build visualisations and conduct analysis on its data automatically. However, it does not allow any analysis that might be misconstrued, or where base sizes are too low.
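A base-size rule like that is straightforward to automate. Below is a minimal sketch assuming survey responses sit in a pandas DataFrame; the threshold of 50 and the column names are my own illustrative assumptions, not the British Election Study’s actual rules:

```python
# A sketch of a base-size guard for automated crosstabs: subgroups with too
# few respondents are suppressed rather than published as shaky percentages.
import pandas as pd

MIN_BASE_SIZE = 50  # hypothetical minimum number of respondents per subgroup

def crosstab_with_guard(df: pd.DataFrame, question: str, breakdown: str) -> pd.DataFrame:
    """Crosstab `question` by `breakdown`, dropping any subgroup whose base
    size falls below MIN_BASE_SIZE."""
    counts = df.groupby(breakdown)[question].count()
    too_small = counts[counts < MIN_BASE_SIZE].index
    if len(too_small):
        # Surface the suppression, so a human can peer inside the black box.
        print(f"Suppressed (base < {MIN_BASE_SIZE}): {sorted(too_small)}")
    safe = df[~df[breakdown].isin(too_small)]
    return pd.crosstab(safe[breakdown], safe[question], normalize="index")
```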
As speakers at IIeX were saying: these are exciting times, and we should be experimenting and leading the way with new technologies and methodologies. As long as we realise we are experimenting. The golden rule in my mind? Do not try to automate the things that you need to think about. Quite often, if you can’t draw a conclusion you trust yourself, you shouldn’t expect a computer to either.
P.S. Want to have a nosey at what we were saying at IIeX? Get in touch and we can have a chat, or we can send the slides over to you.