
It takes 10 minutes to put an old HTTP server behind a reverse proxied nginx instance with auto-renewing letsencrypt certs

Hi. I've been online since 1982, when I built my own modem from parts sold out of the back of a magazine. What you just wrote might as well have been instructions for assembling a DIY fusion reactor, for all I know.

That's what the author means when he suggests that something is being lost. I now have no realistic choice but to go through a centralized gatekeeper to post content on the Web. That sucks. It wasn't supposed to be that way.



> no realistic choice but to go through a centralized gatekeeper

This is the critical part. Even though Let's Encrypt is not-for-profit, it is still a single entity. Until someone can start their own CA in those 10 minutes and have it work across the world, we're moving away from, not towards, what HTTP was.


The A in CA is always going to stand for authority. The basis of the CA system is that CAs (supposedly) can be trusted. If you remove the hard parts of being a CA, then you remove the basis for that trust (however questionable it was to begin with). You don’t have an authority any more, you just have an ephemeral group of self-attesting certificate issuers, and why would you even need the issuers in that case?

These are just the general (and unsolved, or perhaps even unsolvable) problems associated with PKI. I think there’s a solid argument to be made about separating the encryption of communication from the authentication of server identity. That would solve one problem, but it would also just take an existing bad problem and (arguably) make it a lot worse. Given all of its horrendous shortcomings, the CA system is still the most successful PKI ever created, by quite a large margin.


The usual alternative proposal is some form of decentralised web of trust. This has the significant advantage that it includes neither the letters CA nor SPOF. But of course it has difficulties of its own, and unfortunately it's hard to see how we could move safely and reasonably quickly in that direction given the entrenched powers that now dominate the CA and browser markets.

It is worrying that Let's Encrypt is currently the only major provider of free, relatively easily arranged certs, though. Its success has been impressive, but it's probably also one of the biggest single points of failure in modern Web infrastructure now.


The problem with a decentralized web of trust isn't that CAs are too entrenched to be replaced. It's that there's no such thing as an alternative decentralized web of trust solution, and no reasonable idea about how one could be created. Amazon reviews work on a decentralized web-of-trust model, and any PKI attempting to implement one is going to run into exactly the same issues those reviews have. The trust has to be put somewhere. It's not realistic to expect users to validate trust themselves; if you establish an authority to trust, then it can abuse its power; and if anonymous participants can abuse your model, then anonymous participants will abuse your model.

All decentralized web of trust models lean towards one of those problem categories. They can be too onerous to be usable (like the original PGP proposal), easily abused, or simply create even more dubious shadow authorities. There is no known PKI model that solves all of those problems at once, and not really any basis for even presuming it’s possible.


Just to be clear, the problem I think we're trying to solve with a web of trust is the authentication one, not the encryption one. Several existing systems such as GPG demonstrate the viability of a web of trust solution in principle. What is currently missing is a practical method of scaling up.

Yes, any such method would inevitably require a different process from how things work today, most likely including some sort of administrative action to get a new domain recognised when setting up a new site. Still, if you'd suggested seven or even six years ago that anyone would be able to set up effective HTTPS hosting with relatively little effort and without paying for an expensive certificate from some big-name CA, you'd probably have been laughed out of the building. Five years ago, Let's Encrypt was starting its rapid rise to dominance, and as the various limitations in the original scheme have been removed, that rise has shown no sign of slowing down.


The basis of the PGP/GPG approach is in-person key signing. It can only work if you're willing to authenticate somebody's identity with them face-to-face (keep in mind that publishing your public key online is currently only possible because of CAs). The web of trust model is an improvement on this, because it allows you to choose people you trust to also perform this ritual properly. But to say all you need to do is figure out how to scale this up is a flawed premise.

The PKI you choose for the internet needs to work for all internet users. The web of trust model only works for highly motivated and technically competent users, and even then, such a user cannot possibly authenticate all of the identities they need to for typical internet use. They end up with a small group of people they can communicate with more securely. They aren't heading off to Google HQ to sign their TLS keys, and if they discover a new web service they want to use, they're not going to hold off on using it until a proper key-signing ceremony can take place.

Such a system is not fit for the purpose of securing internet communication, and it's not fit for use amongst the majority of web users. Your options for improving those shortcomings are either to create a system that can easily be gamed, by requiring users to trust some sort of community consensus, or to establish trusted authorities (which is just reinventing CAs, probably in a way that's worse than the current CA system).

The web of trust model simply cannot work for this purpose; there's currently no plausible approach for improving it, and it's not obvious that such an approach is even possible.


> The basis of the PGP/GPG approach is in-person key signing. It can only work if you're willing to authenticate somebody's identity with them face-to-face (keep in mind that publishing your public key online is currently only possible because of CAs).

That depends on what you're trying to prove. At the moment, CAs typically accept that someone attempting to create a certificate for a certain domain is the legitimate owner if that person can demonstrate immediate control over something like the content of the website or the DNS. Attempts to enforce stronger verification, such as EV certificates, have largely fallen flat. An equivalent standard would be no worse as the basis for a web of trust system than it is as a basis for major CAs issuing certificates today.
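That control check is mechanically simple. Here's a rough sketch of the HTTP-01-style flow, with in-memory stand-ins for the web server and for the CA's HTTP fetch (the domains are placeholders, and real ACME uses signed key authorizations rather than a bare token):

```python
import secrets

# --- CA side: issue a random challenge token for the claimed domain ---
def issue_challenge() -> str:
    return secrets.token_urlsafe(32)

# --- Site side: the operator publishes the token at a well-known path ---
# (this dict is a stand-in for a real web server's document root)
site_files: dict[str, str] = {}

def publish(domain: str, token: str) -> None:
    site_files[f"http://{domain}/.well-known/acme-challenge/{token}"] = token

# --- CA side: fetch the path over plain HTTP and compare ---
def fetch(url: str):
    return site_files.get(url)  # stand-in for an HTTP GET

def validate(domain: str, token: str) -> bool:
    url = f"http://{domain}/.well-known/acme-challenge/{token}"
    return fetch(url) == token

token = issue_challenge()
publish("example.com", token)
print(validate("example.com", token))  # True: control of the site demonstrated
print(validate("example.org", token))  # False: nothing was published there
```

The point is that the bar being cleared is "can serve this file right now", nothing more, which is exactly the standard a web-of-trust attestation could also adopt.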

The biggest problem with today's CA system isn't that there are trusted authorities; it's that for any given cert there is one and only one ultimate trusted authority, and if there are further certs involved, the dependency chain is strictly linear. This creates both gatekeepers and single points of failure. A system where multiple reputable sources could indicate their trust in a certain identity, and where authentication was based on looking for multiple trusted paths to verify a claimed identity, would address several of the practical weaknesses of our current CA-based infrastructure.


Having your certificate signed one time or three times or a hundred times doesn’t really change the dynamics of the problem at all. Without an authority you put the burden of authentication on the user (no matter what ceremony they use). This fact alone makes it a generally useless approach to PKI. The only thing the web of trust does is allow you to share that burden with a select number of people that you actually know and trust. If you try to extend the perimeter of the web beyond that, it just becomes a web of horribly misplaced trust. Reframing it as a web of reputation doesn’t change anything either.


> Having your certificate signed one time or three times or a hundred times doesn’t really change the dynamics of the problem at all.

Just to clarify, I'm not necessarily talking about anything as specific as signing a certificate. I'm talking about the problem of authentication in general, whatever mechanism it might use.

Maybe we're talking at cross-purposes as I'm not sure exactly what problem you're trying to solve here, but my argument is that the main problem with existing CAs is that they represent single points of failure in several respects.

Authenticating against multiple sources from a potentially larger pool does then change the dynamics in two very important ways: neither the site host nor the site visitor is reliant on a single CA, and thus on a single potential point of failure, any longer. This isn't a complete solution to all possible problems with the existing CA system, but it does offer a conceivable way to mitigate some known weaknesses in that system.

Getting that far doesn't even need a full web of trust system, just a mechanism for seeking confirmation from N of M ultimate authorities, where N is chosen such that 1 < N < M with whatever safety margin on either end you deem appropriate. But a more comprehensive web of trust would potentially allow for a fully decentralised system, in the sense that anyone could operate their own final authority; how much it was trusted would depend on the properties of the web (as would how much any default authorities shipped with browsers were, or were not, to be trusted).
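The N-of-M idea can be sketched in a few lines (the authority and domain names here are hypothetical, and a real system would verify signatures rather than consult a lookup table):

```python
# Hypothetical N-of-M check: accept an identity only if at least
# `threshold` of the recognised authorities independently vouch for it.
def trusted(identity: str,
            vouchers: dict[str, set[str]],  # authority -> identities it vouches for
            recognised: set[str],           # the M authorities this client accepts
            threshold: int) -> bool:
    votes = sum(1 for ca in recognised if identity in vouchers.get(ca, set()))
    return votes >= threshold

vouchers = {
    "ca-a": {"example.com", "example.org"},
    "ca-b": {"example.com"},
    "ca-c": {"example.org"},
}
recognised = {"ca-a", "ca-b", "ca-c"}  # M = 3

print(trusted("example.com", vouchers, recognised, threshold=2))  # True: 2 of 3 vouch
print(trusted("example.org", vouchers, recognised, threshold=3))  # False: only 2 of 3
```

Nothing here stops any single authority misbehaving; it just means one compromised or vanished authority no longer decides the outcome on its own.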


What you’re describing is not a known weakness in the CA system, and doesn’t even really touch on the actual known issues with implementing PKI. The CA is not a point of failure. If one CA fails to sign your certificate, you just get one from a different CA. If a CA that had already signed your certificate vanishes, nothing fails because your existing certificates do not suddenly become unsigned.

The core issues with PKI are establishing trust. The only way to do that for most applications requires an authority. You’re not improving the system by requiring multiple authorities to sign every certificate. You’re not making things less authority-reliant or more resilient. You’re just making it more complicated, and leaving the real underlying issues around authentication unchanged.


Encryption without authentication is meaningless, and it's no accident that HTTPS has them so tightly linked. If you can't tell whether you're securely communicating with the intended party or with a MITM, what's the point?


Sorry, I'm not sure what point you're trying to make here. Authentication without encryption is not meaningless, and if you can authenticate then there are multiple possibilities for setting up encryption for the subsequent communication. Tightly linking the two is not necessary, and not necessarily desirable.


> Until someone can go and start their own CA in that 10 minutes that works across the world, we're moving away from and not towards what HTTP was.

Would you trust the authenticity of content that originated from a site signed with such a CA? If so, why?

PKI is just a technology for verifying trust over untrusted channels. But that trust is created in and remains rooted in the real world, and is itself built on societal chains of trust.

No technology, centralized or decentralized, will replace real world trust in human societies any time soon. Trustless tech only works if it's based on a society with trust. Otherwise the tech is meaningless.


I'm a modern web developer who's fairly up to date on those things. I've set up what he described just last week. I definitely couldn't do it in 10 minutes. An hour maybe.

The person above forgot that those things require a lot of skill and experience just to understand what you're told to do. We're a long way from building websites with FrontPage and uploading them over FTP.


After 15 minutes of googling, I believe you'll have a good grasp on that stuff. I've been there.

So there is new stuff you need to know. On the other hand, these days you'll find a blog post that walks you through installing nginx and certbot, prepackaged with dependencies, on your $5/month VPS; it just works out of the box in very short order. From what I recall of the late 90s, setting up a plain old-fashioned HTTP server felt a lot more difficult then.
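For what it's worth, the recipe in those blog posts usually boils down to an nginx server block along these lines, after which certbot's nginx plugin (`sudo certbot --nginx -d example.com`) rewrites it to add the TLS configuration and sets up automatic renewal. The domain and backend port are placeholders:

```nginx
server {
    listen 80;
    server_name example.com;  # placeholder domain

    location / {
        # the old plain-HTTP server, assumed to listen on a local port
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```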


So the "solution" is paying a monthly fee?


I've got a 30 Mbps upload, a static IPv4 address, and an IPv6 /48 subnet.

That's my home internet connection.

I've got a laptop in my cupboard that is connected to my router.

That's my server for a lot of simple things.


Uh, like paying for hardware and a beefy connection, or renting a server? I don't think that was ever any cheaper.


In the late 90s you mentioned above, ISPs routinely provided some web space as part of the package, at least in my country. You just FTPed your files up to them, and they ran all the necessary software for you.


> I now have no realistic choice but to go through a centralized gatekeeper to post content on the Web.

I don't know where you got that. You'd install an nginx (or Caddy, my preference) reverse proxy on exactly the same hardware you were running your HTTP site on, or if that machine is too foreign to run one, you can buy an old laptop, put the reverse proxy on it, and forward to your site. There is nothing centralized about that.
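As an illustration, the Caddy version of that setup is a Caddyfile of a few lines (domain and upstream port are placeholders). Caddy obtains and renews certificates automatically, though by default it still gets them from a public CA such as Let's Encrypt or ZeroSSL:

```
example.com {
    reverse_proxy 127.0.0.1:8080
}
```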


And where do you get your certificate from?



