
In VAR As In Football, A Human Touch Is Much Needed


This article was written by Mike Ryder, Associate Lecturer in Literature and Philosophy at Lancaster University. It was originally published in The Conversation, an independent source of news and views from the academic and research community.


Over the past year or so, the Video Assistant Referee system, known as VAR, has been gradually rolled out to the world of football with the aim of improving the accuracy and consistency of refereeing decisions. However, its introduction has not been without controversy. The latest edition of the Women’s World Cup has been beset by issues, and there can be no escaping the public outcry at decisions that are deemed to be harsh, unfair or, in some cases, just plain wrong.

But while VAR is taking all the blame, the problem lies not so much with the system, or even the way it is applied, but with the rules of the game themselves – and with the way those rules are applied by human decision-makers at the other end of the line.

As many fans have come to realise, most decisions in football are not cut-and-dried. They are subjective decisions made in real time by a human referee.

By adding cameras, slow-motion replays and a strict reading of the “letter of the law”, VAR is exposing the problems with the rules themselves, and the fact that no rule can ever account for every eventuality on the field of play. In improving the “accuracy” and “consistency” of decision-making, VAR turns what was once a fluid, analogue-style judgement into a strictly digital, robotic question of yes-or-no, in-or-out.

Cameroon players react with dismay after a goal is disallowed during their Women’s World Cup clash with England, June 2019.

Decisions in football simply aren’t clear-cut. Take handball, for example. According to the laws of the game, a penalty should be awarded for a deliberate handball in the penalty area. And yet what the VAR controversy has shown us is that it is very hard to determine what we mean by a “deliberate” action. If a player is competing for the ball, they are, by definition, deliberately trying to win it back. So any handball (accidental or otherwise) is deliberate in the broadest sense – even if the intent is not to commit a foul.

This grey area in the rules has been exacerbated by the introduction of VAR, which has served to draw attention to “errors” on the field of play, and has stripped nuance from the decision-making process. Either a handball has been committed, or it has not. With VAR, there can be no in-between.
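To see just how stark that binary is, consider what it looks like to write such a rule down as code. The sketch below is purely hypothetical – the function, its inputs and the suggestion that any VAR system works this way are my own illustration, not a real implementation. Once the rule is codified, the inputs must be reduced to clean true-or-false values, and the output can only ever be yes or no.

    # A hypothetical sketch of a handball rule reduced to code.
    # Treating "contact" and "deliberate" as clean booleans is exactly
    # the simplification at issue: in reality both are matters of degree.

    def is_handball_offence(contact_with_arm: bool,
                            deliberate: bool,
                            in_penalty_area: bool) -> bool:
        """Return True if a penalty should be awarded, False otherwise.

        There is no third answer here: the nuance a human referee brings
        (arm position, distance, reaction time) has nowhere to live.
        """
        return contact_with_arm and deliberate and in_penalty_area

    # The machine can only ever say yes or no:
    print(is_handball_offence(contact_with_arm=True,
                              deliberate=True,        # but what counts as deliberate?
                              in_penalty_area=True))  # prints: True

The interesting question – what counts as “deliberate”? – has to be answered before the function is ever called, and that answer is then fixed for every case that follows.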

No room for error

The VAR controversy reveals an issue that strikes at the very heart of debates around robots and robot ethics. Once an ethical “decision” is coded into a machine, there can be no nuance, and no scope to err. Take killer robots, for example.

Were we to send a host of such robots into battle, we would need to program them with a series of protocols defining who to kill and who to avoid. While wars may once have been fought between two (or more) clearly marked sides, it is no longer so easy to distinguish between friend and foe. Our enemies no longer wear uniforms marking them out as targets, and more often than not, they move among us. It therefore becomes difficult, if not impossible, to program a “killer robot” to decide between friend and foe based on uniform alone.

A ‘robot’ takes part in a campaign to ban Killer Robots, Berlin, March 2019. EPA-EFE/Alexander Becher

These issues become even thornier when we think about civilians and human rights. At what point does a civilian become a combatant? At what point do they become a legitimate target?

While many would (rightly) point out that killer robots can in some cases make better ethical decisions than humans – they do not succumb to stress, fatigue or disorientation – they still have to make their decisions based on a predetermined set of codes.

The issue here is not that the robots fail to carry out their instructions, but rather, as drone theorist Grégoire Chamayou suggests, that they will never disobey: they will continue to carry out their orders exactly as programmed. This exposes a paradox at the heart of human ethics, for any decision must, by definition, come at the sacrifice of all other decisions at all other times. By making a decision one way or the other, and codifying it in computer code, we make an ethical decision for all time – one the killer robot will follow regardless.

To shoot or not to shoot?

This brings us back to football and the question of VAR. As the controversy shows, there is no such thing as a simple rule. Rules (and indeed laws) are of course designed to be followed to the letter, in a strict, robotic fashion – but as soon as we do so, we expose the perversity of applying a universal, general rule to an infinity of individual cases.

Thus, even something as “simple” as handball in the penalty area is not quite so simple and clear-cut as it first appears. Such decisions become even more problematic in a military setting, because targeting criteria are often fluid – it is rarely as simple as telling a machine to target “anyone with a gun”. As the VAR controversy in football shows us, sometimes we need a human touch.
