I think it's better to just admit that freedoms / tech will always be misused by criminal actors, and that's just a price we agree to pay for privacy, security, and liberty. I don't think that's a controversial statement, and we make such trade-offs all the time unconsciously. The United States has largely agreed to accept a certain amount of criminal gun violence in the name of personal gun ownership. We agree that a certain amount of money laundering will occur due to shell corporations and foreign ownership of assets. We agree that police have to let a certain amount of crime go unpunished in order to protect against unreasonable search and seizure. The only difference between those things and this is that no one has the balls to stand up and admit that a certain amount of child abuse is an acceptable price given the stakes at hand, even though it is true.
> I think it's better to just admit that freedoms / tech will always be misused by criminal actors, and that's just a price we agree to pay for privacy, security, and liberty.
It's possible for both things to be true at the same time.
If Signal exists and is secure, will criminals use it? Sure they will, criminals are people and people want private communications.
But if you ban honest citizens from using Signal, will criminals stop using secure communications? No, they have an unusually strong incentive to use them and will seek out alternatives. The percentage of criminals who switch to insecure communications will be lower than the percentage of honest people who do.
Which increases the total amount of crime, because the help you're giving law enforcement in catching criminals is smaller than the help you're giving criminals in exploiting victims. This is also compounded by the fact that there are more honest people than criminals.
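The arithmetic behind this argument can be sketched as a toy back-of-envelope model. Every number below is invented purely for illustration (nobody in the thread cites real figures); the point is only that if honest people comply with a ban at a higher rate than criminals do, the newly exposed population is overwhelmingly honest:

```python
# Toy model of the "ban encryption" trade-off described above.
# All numbers are made up for illustration only.

honest_users = 1_000_000
criminals = 10_000

# Assumed compliance rates with a hypothetical encryption ban:
# honest people largely follow the law, criminals largely don't.
honest_comply_rate = 0.90
criminal_comply_rate = 0.10

# Populations newly moved onto insecure channels by the ban.
exposed_honest = honest_users * honest_comply_rate      # potential new victims
exposed_criminals = criminals * criminal_comply_rate    # easier to catch

print(f"Honest people newly exposed: {exposed_honest:,.0f}")
print(f"Criminals newly exposed:     {exposed_criminals:,.0f}")
print(f"Ratio: roughly {exposed_honest / exposed_criminals:,.0f} to 1")
```

Under these (invented) assumptions the ban exposes hundreds of honest people for every criminal it exposes, which is the commenter's point about the two percentages.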
There is a theory of bureaucracy ("an institution will attempt to preserve the problem to which it is a solution") that says law enforcement agencies will ask for this even when they know full well that it will increase the overall amount of crime, because more crime is good for them since it means more law enforcement.
I agree that criminals will use secure communications regardless of the law. I don't understand what you mean when you say it will increase crime though.
Regardless, I feel like there's a deeper motive from governments/law enforcement. It would allow them to claim that anyone using secure comms must have something to hide and is thus a criminal. Combine that with mass surveillance and anyone you see sending encrypted traffic can automatically be assumed to be a criminal. I'm not saying this is right, it's certainly not right. But I'm sure that's the argument that will be used by those trying to push it.
The only way to fix this is secure-by-default comms, such that all traffic looks the same and you cannot make any claims of criminality based on that alone.
> I don't understand what you mean when you say it will increase crime though.
Suppose you're a criminal organization or a foreign government. You break into AT&T or Amazon or whomever and get access to a bunch of data streams. If they're all E2EE, you have a bunch of inscrutable ciphertext. If they're not, you have everybody's passwords, trade secrets, credit card numbers, information useful for blackmail etc. Lack of strong encryption enables crime -- that's why honest people use strong encryption.
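As a toy illustration of that point (not any particular product's actual protocol), here is what such an attacker sees with and without encryption. A simple one-time-pad XOR stands in for a real scheme like AES/TLS, just to show that ciphertext without the key is noise:

```python
import secrets

message = b"card=4111111111111111;pass=hunter2"

# Without E2EE: the intercepted stream IS the secret.
print("without encryption:", message.decode())

# With encryption (toy one-time pad; real systems use AES/TLS):
key = secrets.token_bytes(len(message))
ciphertext = bytes(m ^ k for m, k in zip(message, key))

# An attacker who breaks into the carrier sees only this:
print("with encryption:   ", ciphertext.hex())

# Only the key holder can recover the message.
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == message
```

The card number above is the standard "4111..." test number, not real data; the sketch only demonstrates that the same intercepted bytes are either everything or nothing depending on whether the stream was encrypted.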
This is also a good reason to use a VPN and tools like Tor, even when you have nothing to hide. The more normal it becomes the less likely it can be used as presumption of guilt or probable cause.
> But if you ban honest citizens from using Signal, will criminals stop using secure communications? No, they have an unusually strong incentive to use them and will seek out alternatives.
This has been a 2nd Amendment argument for ages: "If we outlaw guns, only outlaws will have guns."
It has been a 2nd Amendment argument for ages because it's tautologically true.
It correctly identifies that proponents need to weigh the cost of substantially all law-abiding citizens following the law against the benefit of only the law-abiding criminals following it.
And say what you will about the benefits of law-abiding citizens carrying firearms, but if you want to seriously dispute the benefits of law-abiding citizens using encryption, try convincing a credit card company to let you accept credit cards on your website without encrypting the traffic.
It sounds like you accept the bill's authors' claim that EARN-IT is about protecting children.
I'd be very interested in hearing from child abuse investigators how the controls in the bill line up with how tech is used in abusing children. My expectation is that there is very little alignment, because "for the children" is most often the rallying cry of politicians who want something that is not in the best interests of the people they are supposed to represent.
> It sounds like you accept the bill's authors' claim that EARN-IT is about protecting children.
No, you're putting words in their mouth.
You have your head in the sand if you don't think people use perfectly legitimate encryption services to discuss illegal activity. But that is not a reason to ban encryption. The entire US constitution is built on the premise that people have rights.
But it has always been true that some people use their rights to avoid having their criminal activity detected. That doesn't make our rights any less important.
>“Our goal is to do this in a balanced way that doesn’t overly inhibit innovation, but forcibly deals with child exploitation,” US Senator Lindsey Graham (R-South Carolina) said last month in announcing the legislation.
> The entire US constitution is built on the premise that people have rights.
As much as I'm near-absolutist on civil liberties, I think it's also valuable to recognize that the intrinsic good of individual rights are only one part of the story; the other is the balance of power between government and the governed.
I recently heard Sam Harris opine that from a utilitarian perspective, an absolutist right to privacy pales in comparison to allowing harm to come to children, and so the tech community needs to flex a little on the privacy question, and meet law enforcement halfway. Through that reductionist lens, it's hard to find fault in the argument.
The problem isn't limited to privacy, though. Unbreakable digital locks exist, and they aren't going anywhere. [0] And there is power in the ability to keep secrets. You can bet the Feds have little interest in a Panopticon, where they too are obstructed from keeping digital secrets, as "meeting us halfway" for some greater good. Rather, they want to hoard that asymmetric power as their exclusive purview. No matter how well-intentioned, that asymmetry of raw power is something We The People have a vested interest in taking seriously, far beyond some abstract notion of "I want to Google ${CONSENTING_ADULT_SEXUAL_ACTIVITY} without worrying the neighbors will find out".
I don't know about the US, but in the EU electronic passports and electronic IDs are becoming mandatory. So everyone will carry an RFID device with them all the time. And let's not forget mobile phones, which can be localized with high accuracy even without GPS, usually because the device can be seen by more than 3 base stations at a time.
The Chinese made mass surveillance even simpler: they have lots of cameras and face detection.
Just because it's possible to use something as a source of information, it doesn't mean it is used as part of a massive dragnet. Yes, it's possible to track phones, but most countries don't have a dragnet implemented based on this information, as far as I'm aware. It's not a lost battle and we still need to push back to ensure it is not.
There is never any logical reason to suppose that the right solution lies in between two extremes. If the question is "what is 2 + 2?", the answer isn't halfway between 0 and 9000.
Secondly, when a party consistently pushes for an extreme position, if you meet them halfway as a matter of policy you will shortly find yourself within spitting distance of it. The only productive position is extreme obstinacy.
> I recently heard Sam Harris opine that from a utilitarian perspective, an absolutist right to privacy pales in comparison to allowing harm to come to children, and so the tech community needs to flex a little on the privacy question, and meet law enforcement halfway. Through that reductionist lens, it's hard to find fault in the argument.
I'd say it's pretty easy. For utilitarianism to make sense, it has to take the future into account. And what looks like an absolutist right to privacy might be a utilitarian argument of the type that if you grant a monopoly of power (private or public) the right to make use of your private information, then it could well use that private information against you later.
An integral utilitarian might then say "it's worth some harm to children today to ensure there won't be great harm tomorrow". That ability to trade off different scenarios of harm without regard to absolute principle is pretty much what characterizes (act) utilitarianism.
I don't believe that. I'm simply saying that if the stated logic for this bill is that we need to regulate encryption because there is an unacceptable risk of misuse, then my response is that I actually accept the current level of misuse risk given the current level of regulation. Instituting further controls in the form of regulation would cost us more than the perceived reduction of risk that it affords.
Obviously this bill is about more than that, but I think that statement pretty much torpedoes their main public argument.
It's a difficult question to answer because most of what HN complains about is speculation based on assuming bad faith, and doesn't seem to line up with what is actually in the bill (from what I can tell).
Just because an Ethernet cable can be used to strangle someone doesn't mean that failing to stand in opposition to network wiring is to accept a certain amount of murder by strangulation. Don't focus on the tool being used for the crime but on the tool committing the crime.
I think this depends on the tool. Certainly we could see the tool being a problem if it was a mini nuke or Anthrax (I don't for the record think encryption rises to this level).
I'm very concerned that technology will put something devastating (at scale) in people's pockets and then we're kind of screwed (do we choose big brother and all that entails, or indescribable mass destruction?). I don't have a solution but it keeps me up some nights.
There are degrees to which tools are useful for committing crimes, and it's naive to pretend otherwise. Encryption is obviously an incredibly useful tool for committing a number of crimes, and I think it's better to argue that it's worth it than to act like there's no connection.
The government wants to expand surveillance so that potentially disruptive social movements can be monitored and disrupted. Activists use Signal too.
In case you hadn't noticed, the government is currently on its back foot and disruptive social policy reforms are back on the table. They want to make sure that corporations get everything and the people get nothing.
The encryption fight has been going on for decades, but at root their complaints about terrorists and child trafficking are covers for expanding a lazy version of COINTELPRO. Lazy meaning that they can just sit in an office and see everything. Let's not forget the FBI's role in trying to get MLK to commit suicide. These shadowy agencies are not in any way the good guys.
A crowbar is also an incredibly useful tool for committing a number of crimes, and yet I don't see any legislators pushing to ban Home Depot from selling them, or to ban me from buying them.
Cars used to be a good example, but this is quickly changing. Modern cars relay OBD-II (unofficially OBD-III, not entirely ratified) data over cellular networks. Most electric cars, and especially self-driving cars, are sending and receiving telemetry data and software updates all the time. Some people are even voluntarily adding OBD-II cellular dongles to their cars to get lower insurance rates. This data includes real-time GPS coordinates and speed. Some regions are already considering making this a requirement for cars sold after {n} date (date to be determined) so they can see your smog emission data in real time. This almost happened in California, but car manufacturers were not ready and successfully pushed back, for now. I would suggest that within a decade or so, a majority of cars will be wiretap devices.
Truly, this is a stance we have to have for everything.
If we want criminal justice reform, too, for example, we have to agree that some criminals will come out of prison after their shorter sentences and they will get into positions and jobs where they will cause harm.
Any lightening of sentences will come with bad people getting through and hurting others. But, this is an acceptable price to pay to allow the other felons redemption in this world.
> I think it's better to just admit that freedoms / tech will always be misused by criminal actors, and that's just a price we agree to pay for privacy, security, and liberty.
Yes! Also, one sure way to know that we have "privacy, security, and liberty" is that criminals are abusing them. And, as an added benefit, efforts to identify and apprehend criminals help identify weaknesses and OPSEC failures.
The EARN IT law enables warrants for digital privacy. The problem is that the choice is between "warrants are impossible due to encryption" and "warrants can be skipped by misbehaving actors".
Well, this is not truly de facto - if it's less than six months old, sure. There's some ancient history that complicates it. In practice I'm pretty sure Google and other providers will fight for a warrant (citing US v. Warshak), but technically speaking anything older than six months could be obtained with an administrative subpoena.
Of course, there's a whole 4th Amendment discussion there. And IANAL, so feel free to fact check whatever.
Third-party doctrine. It is awful but well-established.
If you want a good grounding in the legal precedents - both laws and decisions - that have gotten us here, read Habeas Data. Great book laying out all the terrible implications.
> ...and that's just a price we agree to pay for privacy, security, and liberty.
I think this is fine here, but I am compelled to point out and remind, given the amount of concurrence in the thread:
In a more rigorous discussion, I think this is a particularly dangerous line of thinking for the Stallman-level advocate and their campaign down the line.
Edit (oops, chopped off a longer version of this paragraph when I edited down the post): privacy, security, and liberty are maintained by the advocate to be natural rights, yet under this framing they become the price paid for justice.
This isn't aimed at those in agreement here, or at myself (and it's not limited to said advocate), but at anyone who uses such framing, because of the risk of it becoming massively normalized, even if I find it an artistically made point.
You listed two things that easily and obviously line up with a Bill of Rights amendment... not sure there is one of those for encryption. Unless I’m just blanking...
Maybe flame wars. It would be nice if people believed in abstract principles that strongly; or rather, believing almost that strongly would be perfect. Empirically, wars are fought over which groups get to control resources.
It depends on your definition of resources. In some cases it may be a mountain or river near my hometown, while in others it might be my house or my husband.