What You Should Know About the Latest Congressional Effort to Make Encryption a Crime

A new bill in Congress would force tech companies to undermine or disable their own security and encryption features anytime law enforcement asks them to. Sound awful? It is. Here’s what the bill says and what you can do about it.

For those just catching up, Apple and the FBI recently fought a major legal battle over the iPhone belonging to Syed Rizwan Farook, one of the shooters in the San Bernardino mass shooting. The FBI demanded that Apple create a tool to bypass the phone’s PIN lock. Apple argued that this was an undue burden and would weaken the security of all iPhones. In the end, the FBI backed down and found a third-party firm to unlock the iPhone, although another phone is now in play across the country.

However, in response to the whole affair, Senators Dianne Feinstein (D-CA) and Richard Burr (R-NC) are working on a bill so law enforcement can get what it wants without having to beg. If the Feinstein-Burr bill passes, tech companies will be forced to comply with court orders demanding data, even if the data is encrypted or the company cannot access it. Last Friday, a draft version of the so-called Compliance with Court Orders Act of 2016 was released. This version is not necessarily final, but it is already pretty awful. Unless major changes are made, this law is dangerous for anyone who values their security.

What would this bill do?

According to the draft released on Friday, every time a technology company receives a court order to provide information, it must be able to comply, either by accessing the data itself or by helping the government find a way to access it. In other words, the company cannot simply say “this is impossible” and walk away. A tech company faced with such an order would have two options:

  • Hand over data directly. If a company holds data covered by a court order on its servers, it will have to turn that data over to law enforcement, and the data must be “in an intelligible format.” That means the company must be able to render encrypted data readable. Technology companies that offer encryption would have to either store the keys needed to decrypt their customers’ data, which makes that data more vulnerable, or worse, use only encryption the company itself can break, which makes the encryption virtually useless.
  • Help law enforcement gain access to information. If a company does not store the data itself, it will have to provide “technical assistance as needed” to help the government access it. In other words, tech companies can be conscripted into forensic work until the government decides the job is done. Notably, the bill does not limit how much effort the government can demand of a company, though there is a provision saying companies will be “reimbursed” for costs incurred in providing that technical assistance.

Using the San Bernardino case as an example: under this new law, Apple would have been required to provide access to Farook’s iPhone once it became the subject of a court order, no matter how much Apple believed doing so might harm its business or its customers’ safety. Confusingly, though, the bill deliberately does not say how Apple would have to do it. One section of the bill reads:

Nothing in this Act should be construed as authorizing any government official to require or prohibit the adoption of any particular design or operating system by any protected organization.

In other words, the FBI could not demand that Apple build a specific software feature to bypass the phone’s encryption (which is what it did in the San Bernardino case). Instead, the bill simply stipulates that Apple must get it done somehow, and that Apple’s job isn’t finished until the government decides it is.

The bill also applies to app stores. One section states that any company that “licenses products, services, applications, or software” must ensure that those products comply with the law. In other words, if Apple cannot guarantee that an app developer can hand over its customers’ data when ordered, Apple cannot legally distribute that app through the App Store. Again, the bill does not say how a company should ensure that every application it distributes can comply with a court order. At best, it legally requires a lengthy security audit of every communication app in the store. At worst, app store owners end up dictating which security features developers are allowed to use. Either way, it’s a bad sign.

Everything else that’s wrong with this bill

As it stands, this law would be disastrous for tech companies and consumers alike. One of the biggest problems is that what it requires may simply be impossible in many situations. For example, WhatsApp recently enabled end-to-end encryption for all messages. Google has long done the same for Gmail in transit. Neither company can access data sent through its service without physical access to the end device, and anything intercepted in transit is definitely not in an “intelligible format.” This law would require such companies to find a way to hand that data over in a form law enforcement can read and use, even though that is literally impossible. According to policy analyst Julian Sanchez, in some cases the law amounts to asking a company to do magic:

WhatsApp would have two options. It could turn off end-to-end encryption, making its product less secure and angering its customer base. Or it could embed a backdoor or maintain a database of its clients’ encryption keys, which undermines the security of the platform. It’s like requiring the person who built your house to keep a set of keys to it, plus a special door that only he can enter.

Both options weaken consumer security and open the door to attackers who might want to steal user data, messages, and anything else sent through the app. Under this law, strong security measures would effectively be illegal: if a product or service is so secure that neither the company nor the government can access or decrypt it, the company would have to weaken that security to comply.
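To see why an end-to-end-encrypted service has nothing readable to hand over, here is a toy sketch in Python. It uses a throwaway XOR one-time pad purely to illustrate the data flow (this is not real cryptography, and the function and variable names are hypothetical): the key exists only on the two users’ devices, so the server, and any court order served on it, can only ever yield ciphertext.

```python
import os

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR one-time pad: illustrative only, NOT production cryptography.
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
# Key is generated on the sender's device and shared only with the
# recipient's device; the service provider never sees it.
key = os.urandom(len(message))

ciphertext = encrypt(key, message)  # this is all the server ever stores
server_copy = ciphertext            # a court order can only produce this

# Only a device holding the key can recover the plaintext.
assert decrypt(key, server_copy) == message
```

Under the draft bill, the company would still be ordered to produce `message` in an intelligible format, even though its servers only ever held `server_copy`.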

Another major issue is that the law would require companies like Apple, Google, and Microsoft to police their app stores and remove noncompliant apps. These companies would have to not only weaken the security of their own products, but also ensure that every application in their stores is similarly weakened. Beyond effectively banning secure apps, this places an undue burden on both app developers and app store operators to verify that every app complies with the law.

The bill may not even be necessary. The All Writs Act (the law at issue in the Apple/FBI case) already allows a court to order a company to assist in an investigation in any way necessary, as long as compliance is not an unreasonable burden. In its fight against the FBI, Apple argued that creating the tool the FBI demanded was an unreasonable burden that would jeopardize the security of far more iPhones than the single one at issue. Even so, Apple has helped law enforcement extract data from other iPhones many times under various circumstances. The new law would compromise the security of every device and application simply to handle the few exceptional cases where the government cannot use existing laws or, as the FBI proved in the San Bernardino case, where outside security researchers and contractors can get it the information it needs.

Supporters of the bill argue that the “going dark” problem must be tackled. As security technology improves, the work of law enforcement becomes harder. In the past, strong encryption and security were available only to governments and highly organized criminals. Now, anyone with a recent smartphone can potentially thwart a federal investigation. That places a significant and growing burden on law enforcement dealing with technologically sophisticated criminals. But as our editor-in-chief Alan Henry explains, that is how it should be:

The FBI, NSA, and CIA shouldn’t have to come crawling to Silicon Valley to crack phones or encryption. They should already be able to do this themselves, and if they can’t, what are they waiting for, and where have they been for the last 20 years?

While “going dark” is a legitimate concern, conscripting the tech companies that make all of our gadgets and apps is a bad solution. Law enforcement should be equipped with the tools it needs to conduct investigations without compromising the safety of users who have done nothing wrong. Every US citizen shouldn’t have to keep a weak lock on the front door just in case the government ever needs to break in.

How to make your voice heard on this issue

Currently, the final version of this bill has not been officially introduced, so there is still time to speak up. You can find contact information for Senator Burr’s office here and Senator Feinstein’s here. You can also use 4USXUS to look up your senators and representatives and get their contact information, or use Democracy.io to contact them without hunting for it yourself. Use any of these methods to let your representatives know what you think of the bill and how you think they should vote if it reaches committee or, God forbid, the floor for a full vote. You can also contact the White House here to tell the president you support a veto if the bill passes. Gizmodo also has a rundown of the computer-security views of every presidential candidate currently running for office.

You can also use 4USXUS to track the bill itself once it has been officially introduced (for example, this is what CISA looks like). In the meantime, it is always a good time to reach out to your representatives in Congress so your voice is heard. Right now, passage may seem unlikely (the bill has drawn plenty of very negative attention), but worse bills have passed while no one was watching, so take the time to voice your opinion while you can.

