
The nudge and artificial intelligence

Francisco Marco-Serrano

Last week, the U.K. Information Commissioner’s Office (ICO) released a proposal that would ban the use of ‘nudging’ tactics to influence the behavior of children. It proposes a set of standards for age-appropriate design aimed at safeguarding children’s personal data, advising: “Do not use nudge techniques to lead or encourage children to provide unnecessary personal data, weaken or turn off their privacy protections, or extend their use... Reward loops or positive reinforcement techniques (such as likes and streaks) can also nudge or encourage users to stay actively engaged with a service, allowing the online service to collect more personal data.”

“Nudge Theory,” a concept in behavioral economics, gained prominence with the 2008 book “Nudge: Improving Decisions about Health, Wealth, and Happiness,” co-authored by Nobel Prize-winning economist Richard Thaler. The idea later gave its nickname to a team set up by the British government in 2010, officially called the Behavioural Insights Team (BIT) but better known as the “Nudge Unit.” The team was tasked with understanding how interventions in decision-making could be used to encourage desired behaviors, which for the BIT included paying taxes, donating to charity, and registering as an organ donor.

The Nudge Unit used A/B testing to determine, for example, that “when letters to non-payers of car tax included a picture of the offending vehicle, payment rates rose from 40 to 49%.” A/B testing has long been used in advertising and is often the first step in building a learning agenda for a marketing organization. Its limitations, however, become clear when it is compared with analyses using artificial intelligence and machine learning. An A/B test measures the causal impact of a handful of pre-specified interventions, and because it is a human-designed experiment, it is unlikely to uncover insights beyond the effects it was set up to measure. Using tools like Google’s Custom Algorithm or Facebook’s Ads Manager for programmatic campaigns, by contrast, advertisers can automate the collection, analysis, and digestion of terabytes of data. The computing power to do this is cheap compared with the humans required to attempt similar real-time pattern recognition and optimization. Best of all, the holistic nature of an AI-aided approach could reveal relationships beyond the imagination of the marketer.
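
To make the contrast concrete, here is a minimal sketch, in Python, of the kind of two-proportion comparison behind an A/B result like the car-tax letters. The 40% and 49% payment rates come from the example above; the sample sizes, and the test itself, are illustrative assumptions rather than the Nudge Unit’s actual analysis.

```python
# Two-proportion z-test for an A/B result like the car-tax letters.
# The 40% and 49% payment rates come from the article; the sample sizes
# below are hypothetical, chosen purely for illustration.
from math import sqrt
from statistics import NormalDist

n_control, n_variant = 10_000, 10_000         # hypothetical number of letters per arm
paid_control = int(0.40 * n_control)          # standard letter: 40% paid
paid_variant = int(0.49 * n_variant)          # letter with a photo of the vehicle: 49% paid

p1 = paid_control / n_control
p2 = paid_variant / n_variant
p_pooled = (paid_control + paid_variant) / (n_control + n_variant)
se = sqrt(p_pooled * (1 - p_pooled) * (1 / n_control + 1 / n_variant))

z = (p2 - p1) / se                            # two-proportion z-statistic
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value

print(f"lift: {p2 - p1:.1%}, z = {z:.2f}, p-value = {p_value:.2g}")
```

At volumes like these, a nine-point lift is far beyond chance, but the test answers only the single question it was designed to ask.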

Therein lies the rub. While the Nudge Unit conducted controlled experiments testing behavioral nudges for tax compliance, the algorithms at work in many organizations operate as black boxes, offering little to no “explainability” of their results and opening the door to “glitches” in which the machine learns to discriminate by combining known variables to infer unknown ones (e.g., race from education and postcode, or income from job title and education).
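
To see how such a “glitch” can arise, consider the following toy sketch on entirely synthetic data: a protected attribute is never given to the model, yet a simple classifier recovers it from two innocuous-looking variables. The variable names, the data, and the model choice are all illustrative assumptions, not a description of any real system.

```python
# Toy illustration of a model inferring a protected attribute it was never given.
# All data is synthetic; the feature names are stand-ins for the article's examples.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

protected = rng.integers(0, 2, n)                    # hidden attribute, never an input feature
postcode = 0.8 * protected + rng.normal(0, 0.5, n)   # area indicator skewed by the attribute
education = 0.6 * protected + rng.normal(0, 0.7, n)  # years of schooling, also skewed

X = np.column_stack([postcode, education])           # only the "innocent" variables

# Train a classifier to predict the hidden attribute from the innocent variables alone.
clf = LogisticRegression().fit(X, protected)
print(f"protected attribute recovered from postcode + education: "
      f"{clf.score(X, protected):.0%} accuracy")
```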

Relying on AI and algorithms can therefore have unintended results. Sometimes those results are deemed beneficial, such as using computer vision to better diagnose certain cancers. Sometimes they are ominous: one algorithm, trained on alternating images of poisonous and edible plants, never learned to recognize poisonous plants from the images themselves; it simply learned that an edible plant is followed by a poisonous one. Giving AI free rein might lead to successful new nudges, but without the proper incentives and rules designed into it, the algorithm might exploit techniques a human would consider unethical.

There is no simple solution. Designing incentives for AI that prevent or limit “gaming” is difficult. We can set up rules to prevent explicit targeting by age, gender, or race, but given all the available data, how do we prevent the system from targeting by other data points that achieve the same end?
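
One partial answer is to audit outcomes rather than inputs. The toy sketch below, again on synthetic data, trains a targeting model that never sees the protected attribute and then measures how differently it treats the two groups; everything in it is an illustrative assumption rather than a description of any real ad platform.

```python
# Toy outcome audit: a targeting model trained without the protected attribute
# can still select the two groups at very different rates via proxy features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5_000

group = rng.integers(0, 2, n)                   # protected attribute, excluded from training
postcode = 0.8 * group + rng.normal(0, 0.5, n)  # proxy feature correlated with the group
education = 0.6 * group + rng.normal(0, 0.7, n) # another correlated proxy
# Historic response data is itself skewed through the proxies.
clicked = (0.5 * postcode + 0.5 * education + rng.normal(0, 1, n)) > 0.6

X = np.column_stack([postcode, education])
model = LogisticRegression().fit(X, clicked)    # "compliant" model: never sees the group label

targeted = model.predict(X)                     # who the campaign would reach
rate_a = targeted[group == 0].mean()
rate_b = targeted[group == 1].mean()
print(f"targeting rate, group A: {rate_a:.0%}, group B: {rate_b:.0%}, "
      f"gap: {abs(rate_b - rate_a):.0%}")
```

An audit of this kind treats the model as a black box, which is exactly what makes it usable even when explainability is limited.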

The ICO is just one example of a regulatory body attempting to ensure that AI and the next generation of nudging are used in ways that are not harmful to humans, society, or the environment. The opportunity, and some of the onus, is now ours: to determine the principles of ethical advertising and nudge the industry toward transparency and explainability.