Rules for Robots – regulatory enforcement trends?

Published 21 August 2017, Kevin O’Leary, VP Product Management, Corlytics

Every job is at risk of being replaced by a robot – even in financial services, or so it seems. From back-office applications like algo-trading and fraud detection, robotic process automation is now moving steadily into the front office. Robo-advisors offer lower-cost automated investment options to clients, and on the trading floor, new artificial-intelligence-powered software supports human traders. The objectives make sense, of course: reduced cost, fewer errors and, hopefully, happier customers.

Regulators are keen not to get in the way of this innovation. The FCA Sandbox in the UK and the Innovation Testing Licence provided by the DFSA in Dubai show how regulators are facilitating invention and creativity – especially for smaller firms exploring new business models. However, invention and risk remain constant companions. As intelligent automation penetrates the industry further, what can we expect?

Corlytics assesses regulatory risks across the globe. In the last 18 months, Corlytics has seen a rise in enforcement cases that involve IT failure. Since the start of 2016, more than $400m in enforcement penalties have been issued that involve a technology element. Recurring enforcement themes include failures around governance, testing and record-keeping of automated systems. Robots are not invulnerable. Even simple coding errors can wreak havoc.

Risks are magnified when some badly-behaved software gets in the way.
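To see how small a "simple coding error" can be, consider a hypothetical sketch (the function names and figures below are illustrative, not drawn from any actual enforcement case): an automated fee calculation that confuses basis points with percentage points overcharges every client a hundredfold.

```python
def advisory_fee(balance_cents: int, fee_bps: int) -> int:
    """Annual advisory fee in cents, for a fee quoted in basis points."""
    # Correct: 1 basis point = 1/10,000 of the balance.
    return balance_cents * fee_bps // 10_000

def advisory_fee_buggy(balance_cents: int, fee_bps: int) -> int:
    """The same calculation with a one-character unit bug."""
    # Bug: divides by 100, silently treating basis points as percent,
    # so every client is charged 100x the intended fee.
    return balance_cents * fee_bps // 100

balance = 5_000_000  # a $50,000 account, held in cents

print(advisory_fee(balance, 75))        # 75 bps -> 37500 cents ($375)
print(advisory_fee_buggy(balance, 75))  # -> 3750000 cents ($37,500)
```

Run across thousands of accounts by an unsupervised system, a bug this small becomes exactly the kind of conduct failure – overcharging customers at scale – that regulators penalise regardless of whether a human or a machine caused it.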

Not surprisingly, regulators are not forgiving when core principles of market conduct and treating customers fairly are not followed – whether by human or machine. This creates an interesting organisational challenge. The technological expertise required to build robotic systems and the business expertise to supervise their behaviour are not the same. Handing over the “build-the-robot” challenge to specialist firms brings with it additional oversight and governance problems.

As the use of robotics expands further into the domain of client relationships, we expect regulators to focus not just on issues like cybersecurity and data protection, but on “robot conduct” issues too.
