How companies may be forced to put “back doors” in their encryption software

Tech companies are coming under increasing pressure to create “back doors” in their encrypted messaging software so that, with a warrant, the police and intelligence services can view the previously private conversations of suspects. End-to-end encryption scrambles messages in transit so that only the sender and the intended recipient, who hold the right keys, can unscramble them. This is the system WhatsApp uses by default to protect the privacy of messages sent on its app.
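
To make the idea concrete, here is a minimal sketch of that kind of scrambling, using the Fernet recipe from Python's cryptography library and assuming a single shared key for simplicity. Real messengers such as WhatsApp negotiate per-conversation keys with the Signal protocol, but the effect is the same: without the key, an intercepted message is unreadable.

from cryptography.fernet import Fernet

# Shared secret held only by the sender and the intended recipient
key = Fernet.generate_key()

# The sender scrambles the message before it leaves the device
ciphertext = Fernet(key).encrypt(b"Meet at the station at 6pm")

# Anyone intercepting the message in transit sees only scrambled bytes
print(ciphertext)

# Only a holder of the key can unscramble it back to the original text
print(Fernet(key).decrypt(ciphertext))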

Analysing the metadata can give clues about the sent messages, including when they were sent, how many people received them, and the location of the sender and recipients at the time a message was dispatched. But crucially, the metadata cannot reveal the actual content of the messages.
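
As an illustration, the sketch below (with hypothetical field names, again using Fernet from Python's cryptography library) shows a message envelope in which the metadata travels in the clear while the body is encrypted: an observer can see who sent it, to whom, and when, but not what it says.

import json, time
from cryptography.fernet import Fernet

key = Fernet.generate_key()
envelope = {
    # Metadata: visible to anyone who handles the message in transit
    "sender": "alice",
    "recipients": ["bob", "carol"],
    "sent_at": time.time(),
    # Content: recoverable only by holders of the key
    "body": Fernet(key).encrypt(b"the actual message").decode(),
}
print(json.dumps(envelope, indent=2))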

This kind of encryption can be used to protect all kinds of data, in many environments, for perfectly valid and acceptable reasons, but it also means that criminal activity, including financial information, can be hidden from the authorities.

But many companies say that there is no way to install a back door in their encryption that only the police and security services can use. It would mean opening up private information to all, the good and the bad. The companies say that, overall, encryption keeps us safer than we would be if back doors were installed, despite the potential for users to abuse the privacy it provides.

Any back door into encryption software is surely open to abuse, but in this case, does the end justify the means?
