While automation and technological innovation should be embraced rather than opposed, they are far from foolproof. Arresting criminal suspects based on the output of a computer program is not acceptable. A UK human rights group has raised concerns over these practices, arguing that such systems are “advisory” rather than “necessary”.
Police and Algorithm-based Tools
There are plenty of exciting developments taking place in the world of algorithms and artificial intelligence. Not all of this change is seen as positive, which is only to be expected at this stage. More specifically, when law enforcement agencies arrest and detain suspects based on information provided by a computer program, there is every reason to balk. That is especially true when such programs tell police that certain individuals “may be offenders” without credible evidence to back up those claims.
So far, various law enforcement agencies around the world have shown a keen interest in taking computer algorithms into account when making arrests. This is a fine line to walk, as no algorithm is 100% accurate. While it is commendable that police experiment with new technologies, such analyses always have to be taken with a grain of salt. Identifying potential suspects based on computer software alone is not the answer; that much is evident. When false positives occur, innocent people may not only be arrested but also locked up for an extended period.
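The false-positive concern can be made concrete with a quick back-of-the-envelope calculation. All the numbers below are hypothetical and do not describe any real policing tool; they simply illustrate how even a seemingly accurate screening algorithm can flag far more innocent people than actual offenders when offenders are rare:

```python
# Hypothetical illustration of the base-rate problem with predictive screening.
# Every figure here is invented for illustration purposes only.

population = 100_000        # people screened by the algorithm
offender_rate = 0.01        # 1% are actual offenders (assumed base rate)
sensitivity = 0.90          # flags 90% of true offenders
false_positive_rate = 0.05  # wrongly flags 5% of non-offenders

offenders = population * offender_rate
non_offenders = population - offenders

true_flags = offenders * sensitivity                # correctly flagged offenders
false_flags = non_offenders * false_positive_rate   # innocent people flagged

# Of everyone the tool flags, what fraction is actually an offender?
precision = true_flags / (true_flags + false_flags)

print(f"Innocent people flagged: {false_flags:.0f}")
print(f"Chance a flagged person is an offender: {precision:.1%}")
```

Under these assumed numbers, flagged innocents outnumber flagged offenders by more than five to one, even though the tool catches 90% of real offenders. That is the trap of acting on software output alone.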
In the United Kingdom, a human rights organization is pressing MPs over the potential implications of this new technology. The organization is deeply concerned about the discriminatory nature of these algorithms: profiling potential perpetrators based on their location, gender, and past behavior is not an exact science. With the number of such algorithms rising rapidly, it is only prudent that MPs take the time to evaluate their potential repercussions.
Police in Durham, England, are already using such an algorithm to help decide whether a given suspect should be kept in custody. It is a curious approach, considering a software tool can never have all the facts. The Harm Assessment Risk Tool uses historical data to classify suspects as low-, medium-, or high-risk individuals. While this novel approach is interesting, it is not necessarily the right way to go about things.
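To see what a three-band risk classification looks like in practice, here is a deliberately simplified sketch. The features, weights, and thresholds are all invented for illustration; the real Harm Assessment Risk Tool is reported to be a machine-learning model trained on historical custody records, not a hand-written score like this one:

```python
# Hypothetical sketch of a custody risk classifier, loosely in the spirit of
# tools like Durham's Harm Assessment Risk Tool. All features, weights, and
# thresholds below are invented for illustration only.

def risk_band(prior_offences: int, years_since_last_offence: float,
              age: int) -> str:
    """Classify a suspect as 'low', 'medium', or 'high' risk."""
    score = 0
    score += min(prior_offences, 10)                 # offending history dominates
    score += 3 if years_since_last_offence < 1 else 0  # recent activity
    score += 2 if age < 25 else 0                    # crude proxy, itself a bias risk
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

print(risk_band(prior_offences=0, years_since_last_offence=10.0, age=40))  # low
print(risk_band(prior_offences=6, years_since_last_offence=0.5, age=22))   # high
```

Even this toy version makes the article's point visible: the age term is a demographic proxy baked directly into the score, which is exactly the sort of profiling-by-attribute that critics worry about.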
For the time being, Durham officials claim the algorithm serves only an advisory role. Not everyone agrees with that framing, though, as it may not be easy for humans to go against guidance rendered by the algorithm. There will always be some form of bias to contend with, and that aspect should not be overlooked. Regardless of the statistics, there is always a chance of being wrong, as the information used to render a “verdict” may be incomplete, biased, or simply incorrect.
One main selling point of these algorithms is that they let users do more with less. They may indeed prove a valuable tool for law enforcement agencies in that they free officers up to do actual police work. Even so, this double-edged sword is not easy to wield. Initial tests of this particular tool have been fairly positive, but those results should always be put in the proper perspective.