Wow. Public beta for 2.5 months and already 700,000 certificates issued, more than a third of the largest competitor's number, about 10% of the entire secure Web.
There certainly seems to have been pent-up demand.
For us, the free part doesn't matter because certificates are cheap. It's the byzantine and insecure process of obtaining a cert (sending us our private cert in a zip attached to a plain-text email. I mean, really?) that makes LE so great.
Edit: Total brainfart, apologies; The company sending us private info as a zip was a different thing.
Your CA should not be in a position to send you your private key. They don't need it in order to sign your certificate. Pretty much every CA I'm aware of allows you to provide your own CSR (which only includes your public key).
Sending the certificate (as opposed to the private key) via email is fine, since that only includes your public key, which is visible to every site visitor anyway.
(I agree that an automated process based on an open, standardized specification is preferable.)
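To make the CSR flow concrete: you can generate the key pair and the CSR yourself with stock openssl, so the CA only ever sees public material. A minimal sketch (file names and the domain are placeholders):

```shell
# Generate a private key locally; the CA never needs to see this file.
openssl genrsa -out example.com.key 2048

# Build a CSR from it; this contains only the public key and subject info.
openssl req -new -key example.com.key -out example.com.csr \
  -subj "/CN=example.com"

# Sanity-check the CSR before handing it to the CA.
openssl req -in example.com.csr -noout -subject
```

The CA signs the CSR and returns a certificate; the private key never leaves your machine.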
Agreed a thousand percent, but there are services that offer easy installs onto cloud providers that do know your private key - that's how they get it onto your ELB or Heroku.
If you can restrict the service to a subdomain, there are alternatives like the SAN extension that allow those third parties to avoid handling your private key at a small extra cost.
I'm having trouble understanding your comment. SANs are mandatory (current browsers don't even use CNs), so how would SANs specifically prevent this? The endpoint where you're terminating your traffic obviously must have the private key to decrypt it.
You need one cert (or at least only a handful of certs) - SAN entries do not need to be subdomains of the CN. Greatly reduces the headache of SSL-terminating for e.g. client domains.
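As a sketch of that (all names are placeholders; `-addext` needs OpenSSL 1.1.1+): a single CSR can carry SAN entries for completely unrelated names, so one terminating cert can cover several client domains:

```shell
# One key, one CSR, several unrelated names as SAN entries.
openssl req -new -newkey rsa:2048 -nodes \
  -keyout multi.key -out multi.csr \
  -subj "/CN=example.com" \
  -addext "subjectAltName=DNS:example.com,DNS:client-one.example,DNS:client-two.example"

# Confirm the SAN list made it into the request.
openssl req -in multi.csr -noout -text | grep -A1 "Subject Alternative Name"
```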
No there were not. Free options meant untrusted certificates or short-lived certificates that could not be renewed.
LE made SSL free, trusted and long-term. You could have made it twice as hard to do the initial setup and people would have jumped at the opportunity regardless.
StartSSL and WoSign have been offering free, publicly trusted certificates with one year lifetimes (and the ability to renew for free) for quite some time.
The former doesn't allow commercial usage, while the latter operates in China. That's probably why it wasn't an option for a lot of people. (That, and the terrible UX at least in StartSSL's case.)
StartSSL generally don't mention their terms, so a lot of the use of 'StartSSL Free' is commercial. People don't like to hear they've been misled (and shoot the messenger).
They actually started policing this and refuse to renew a certificate if they decide it's a commercial use. Happened to me and rather than argue with them (it wasn't), I bought a $5 one at ssls.com...
While I agree LetsEncrypt is awesome, the certificates are actually shorter-lived (90 days), I believe, than the former CAs that issued certs for free (e.g. StartSSL) that offered a year of validity. The rationale is replacing the keys sooner since you can easily do it automatically with LE.
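The 90-day lifetime is meant to be invisible once renewal is automated. A minimal sketch would be a cron entry along these lines (the path and schedule are assumptions, and the exact renew subcommand depends on your client version; the client only renews certs that are close to expiry):

```shell
# Hypothetical crontab entry: check on the 1st and 15th at 03:00.
0 3 1,15 * * /usr/local/bin/letsencrypt renew --quiet
```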
I used to use WoSign and StartSSL before for lots of little personal projects. I recently switched everything over to LE. I am looking forward to switching other projects to it too, such as ones for clients where we had to pay for wildcard certs in the past. Not having to mark the calendar to renew certs every year is going to be really nice.
Incidentally, does anyone have a good way to integrate LE with EC2's load balancers?
Might be interesting to look in a year's time - I don't think people will replace commercial certs before their expiration with a free offer in beta. Once those expire, it becomes interesting.
Why is that interesting? I have quite a few sites where I'd rather keep my commercial cert (which expires every 12 months), as opposed to a beta cert that expires every 3 months.
Hypothetical: if I run a website for a small-town public library that only serves information, i.e. no user accounts and no logins on our domain, is there a valid reason for me to go through the process of HTTPS and certificates? Plus, I'm on a shared host. I briefly looked at the install doc on letsencrypt, and while it's clearly easier than it used to be, I am uncertain my shell access will give me the necessary permissions to even accomplish it without upgrading to something like a VPS or administering my own VM with a cloud provider. Which is honestly something I am less than interested in doing if I can help it. Keeping a web server secure seems brutal to me.
> If I run a website for a small-town public library that only serves information, i.e. no user accounts and no logins on our domain, is there a valid reason for me to go through the process of HTTPS and certificates?
Absolutely. The privacy of library searches has long been viewed as one of the archetypal examples of why privacy matters. A user's ISP shouldn't be able to learn what sort of books, movies, and music a library patron is interested in. And that's before we get into matters like injection of malicious content by local miscreants at your favorite cafe, or ads by mobile networks. Fundamentally, it's an issue of your users being the ones who decide what they do and do not care about keeping secure, and of security being the default (imagine the emotional hurdle someone with a real need for privacy would face if they had to explicitly ask for it when using the library website). The Library Freedom Project, which has been in the news a lot as of late, was in part started to push the use of SSL in all libraries, even if you "don't need it."
Encrypting sensitive user data is only one advantage of TLS. Without TLS, every single network hop between you and your client could modify your site. This could mean altering the content, adding ads, or even injecting malware. Some large ISPs have done so in the past. Furthermore, browser vendors have decided that HTTPS is the way forward and will gradually mark HTTP-only sites as unsafe. New browser APIs might only be available in secure contexts.
You might be interested in this list of web hosts that support Let's Encrypt[1]. Generally speaking, your host should be able to provide a one-click interface for obtaining and installing a certificate for you, and odds are most hosts will eventually do so free of charge once HTTPS becomes mandatory.
Historically library usage has been surveilled, so I would say that making passive surveillance of your users more difficult is a 'valid reason' for going through the trouble.
You don't need logins/user accounts for your users to be identified. IP is sufficient in many cases, and browser fingerprint pretty much covers the other cases.
Also, if the website ever has need to become more complex, it will be easier and less error prone not to have to throw 'figure out how to implement TLS' on the pile of tasks.
Also, if permissions are the primary concern, might I recommend moving to a host that does SSL/TLS for you? Webhosts have come a long way in the last few years. Moving to a nicer one may actually save you effort in the long run.
Yes, there are many reasons to deploy TLS everywhere, and everyone should be working towards it for these reasons:
- Increased resistance to surveillance. Instead of seeing the pages/information that a client downloads from your server, state actors, ISPs, local attackers, and anyone else listening only learn that the client downloaded some bytes from your server.
- Mitigation of man-in-the-middle and man-on-the-side attacks against your website. These can be as simple as someone attacking a local open Wi-Fi access point to sophisticated attacks like the Chinese DDoS against GitHub and the NSA's QUANTUM INSERT. Potential attacks range from replacing/rewriting information to attacking client machines with browser exploits.
- Better SEO rankings: Google weighs HTTPS sites higher than their insecure plaintext equivalents.
If you need a shared host that supports HTTPS, take a look at DreamHost, they have a free one-click Let's Encrypt integration.
Sorry to hear that. HTTPS (via SNI and user-uploaded certificate) doesn't cost the hosting provider anything to support, so it seems like yours is just behind the times.
Absolutely. Not only is privacy important for your users (a list of what books someone has read being available to a user's ISP or a government is a violation of human rights), but TLS also secures your users from local network attacks (injecting malware into pages, or otherwise changing what information they see).
In the case of a library, you would want to protect users' privacy when they search for books. I do not think it is possible to switch to HTTPS on shared hosting without explicit action from the provider. For minimal administration, a cloud provider will work best. Check out AWS, since they also offer free certificates.
While I agree with the importance of TLS/SSL for everyone, I recommend buying an SSL certificate outright (about $8 a year) rather than going through the command-line setup unless you are comfortable with it. Most hosting services (especially the cPanel variety) just let you upload a certificate that you have bought.
It is probably more important that you deploy TLS/SSL at all than that you avoid it because you are uncomfortable with the Let's Encrypt setup.
If the library site is served over plain HTTP, it is fairly easy to snoop on visitors' activity, so one could find out which kinds of books certain visitors were looking for. That could easily reveal secrets about those visitors.
If we are not just being hypothetical, a simple solution for you could be to register with CloudFlare and run your site behind it. They will give you a free SSL certificate. This isn't as secure as running your own, since the connection between CloudFlare and your server is in plain HTTP, but it's a lot better than nothing. As an added bonus, you get a free caching layer in front of your site, which might be a good thing if you're on a small shared host.
tldr: Let's Encrypt is achieving its original mission: help people use TLS on domains that were previously insecure, and make the internet a safer place.
> In the first quarter of its operation then, Let's Encrypt has far and away been used more to secure previously-unsecured (or at least untrusted) websites than simply as a cost-savings measure.
I'll plug my own client [1], which is also completely manual for those who don't want to go all in on the magic automation just yet. It also has DNS verification, which is probably more convenient in most cases.
Sorry for the delay. It's turned out that parsing is hard, especially in the absence of a formal grammar for configuration file formats!
If you don't need the certs automatically installed, nginx users are generally doing well with the webroot plugin (which automatically creates files to perform the ACME challenge, regardless of what webserver is serving those files). This will also work for renewal, as long as you're able to do the initial configuration of your nginx to work with the certs you get.
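The mechanics behind the webroot plugin are simple enough to sketch; the following just mimics what it writes (a local ./webroot directory and a dummy token stand in for the real paths and values):

```shell
# The ACME client writes a challenge file under the webroot...
WEBROOT=./webroot
mkdir -p "$WEBROOT/.well-known/acme-challenge"
printf 'token.key-authorization' > "$WEBROOT/.well-known/acme-challenge/token"

# ...and Let's Encrypt's validation server then fetches
#   http://example.com/.well-known/acme-challenge/token
# through whatever webserver (nginx, Apache, ...) serves that directory.
ls "$WEBROOT/.well-known/acme-challenge"
```

This is why the plugin is webserver-agnostic: it only needs write access to a directory the server already serves over HTTP.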
Just the other day, we finished a "dry run" deployment of our new app for small businesses on a Digital Ocean droplet running Debian and Apache (our app is Ember and Django). Let's Encrypt was the final step.
We followed the instructions provided by DO[0], and aside from our mistake of leaving a previous attempt as a Virtualhost on port 443, the client just works out-of-the-box.
It automatically detects which file has the Virtualhost for port 80, asks you if you want to force redirect to https, copies your script to a new file with a Virtualhost on port 443 (adding SSL and telling Apache where to find the certs), and enables the site for you. Needless to say, my pair programmer and I were thoroughly impressed.
The apache plugin for the Let's Encrypt Python client can edit apache configuration files (one of the most complex and hard-to-get-right but also one of the most convenient features of the client). There is also an nginx plugin which is significantly more experimental and which also edits nginx configuration files.
We had our sites configured in sites-enabled (ember.conf and django.conf). The LE client copied everything over to new files with encryption enabled, and had the old port 80 Virtualhosts redirect to the port 443 Virtualhosts.
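The generated configuration ends up looking roughly like this (paths and names are illustrative, not what the client literally emits):

```apache
<VirtualHost *:80>
    ServerName example.com
    # Old plain-HTTP vhost now just forwards to the secure one.
    Redirect permanent / https://example.com/
</VirtualHost>

<VirtualHost *:443>
    ServerName example.com
    SSLEngine on
    SSLCertificateFile    /etc/letsencrypt/live/example.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
</VirtualHost>
```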
So one thing that would be interesting to take a look at from this dataset is wildcard vs non-wildcard. My employer has two wildcard certs for public sites (purchased from your standard vendors) and that's all that is required (for a lot of places). However one of my personal domains I play with a lot of technologies, all on their own subdomains. So for that single domain I probably have 10 LE certs (and yes, none of these were secured before).
So maybe it's not wildcard vs non-wildcard, it's limit the datasets to root domain names?
I've used them a bunch. We host a lot of internal-facing utilities that are low-profile, but occasionally hosting sensitive data. In the past I couldn't convince managers to spend money on certs even if the cost of someone stumbling on these sites could be very high and certs are cheap. Now I don't even have to ask.
As far as I can tell, the website can not be used to browse all certificates directly; when the number was at 500k, the site would consistently time out when trying to fetch the oldest ~250k.
Is the raw data available somewhere?
On a completely different and off-topic note, as someone who would normally just handle my certificate needs by piping together 10 openssl commands, your ACME client is super handy!
The raw data is publicly available on Certificate Transparency log servers[1]. There are various clients for CT log servers out there, but they all require you to essentially download the entire log to query it (there's no query API).
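For reference, the RFC 6962 HTTP API is plain JSON over GET, so paging through a log is just repeated range fetches (the log URL here is only an example; pick any public log):

```shell
# Signed tree head: reports the current size of the log.
curl -s https://ct.googleapis.com/pilot/ct/v1/get-sth

# Entries can only be fetched in ranges; there is no search endpoint,
# which is why monitors end up mirroring the whole log locally.
curl -s "https://ct.googleapis.com/pilot/ct/v1/get-entries?start=0&end=9"
```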
One way of putting it is that we want to get caught as quickly as possible if we ever misissue a certificate. We don't think it's a benefit to us or the certificate-using (relying party) public if we have the ability to secretly issue certs that are erroneous.
I started using Let's Encrypt for a new app backend that I am building. I find its free service very useful, as it gives developers the opportunity to run secure servers, protecting users' privacy and the server's security at no cost.
MariaDB probably because the data he had was already in mysql/maria format or he had a meaty mariadb server set up. Go: it's easy to hack stuff together. Generally speaking, statistical analysis of stuff boils down to whatever you're comfortable with and what your goals are.
4th largest, counting certificates that are either on Certificate Transparency log servers or in Censys' data set.
CT logs are populated by CAs sending their certificates to log servers. Only a few CAs (including Let's Encrypt) do this consistently at the moment, since it's only mandatory for EV certs (for now), and CAs generally move rather slowly. Certificates encountered by Googlebot during web crawling also get pushed to their log servers. Censys probably does something similar.
It's possible that there's a large number of certificates issued only for internal systems that would not end up on CT log servers and that are not accessible by public crawlers, so the numbers are probably not painting a full picture. It is, however, as close as you can get to the full picture unless every CA is willing to release their internal numbers.
It doesn't support browsers on Windows XP, and that is the one annoying thing that forced me to roll back Let's Encrypt. I know it's 2016, but complaining clients are not what you want anyway.
I still have no idea whether they will be able to fix this in the future.
I installed Windows XP a few days ago; Let's Encrypt certificates are working fine on Firefox 43. It just doesn't work if you are using SNI and IE8. Haven't tried Chrome, but it must work.
Firefox uses its own certificate store, that's why it works on Windows XP. Chrome and Internet Explorer will likely not work, because they use Windows XP's certificates, which don't include trust for Let's Encrypt.
Their root certificate is currently not trusted by any browser vendor. They have a cross-sign from IdenTrust, which is trusted even by XP. However, their intermediate certificate uses a name constraint in a way that causes schannel (Microsoft's TLS stack) on XP to think they're not allowed to issue certificates for any domain name. Chrome and IE use schannel, while Firefox ships its own TLS stack. This bug was fixed in newer versions of Windows. This might get fixed if someone finds a way to generate an intermediate cert that doesn't trigger this bug (while still including the *.mil constraint).
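You can reproduce the shape of that constraint locally to see what schannel chokes on (a throwaway self-signed cert; `-addext` needs OpenSSL 1.1.1+):

```shell
# Self-signed cert carrying an excluded-subtree name constraint,
# like the ".mil" exclusion on Let's Encrypt's cross-signed intermediate.
openssl req -x509 -newkey rsa:2048 -nodes -days 30 \
  -keyout nc.key -out nc.pem \
  -subj "/CN=Example Intermediate" \
  -addext "nameConstraints=critical,excluded;DNS:.mil"

# XP's schannel misreads an excluded-only constraint list as
# "may not issue for anything"; newer Windows parses it correctly.
openssl x509 -in nc.pem -noout -text | grep -A3 "Name Constraints"
```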
Does anyone know where one can get a free wildcard certificate? I need it for development, and foo/bar/baz/biff.example.com change names regularly (they include the hash of the code commit), so I would like to get a *.dev.example.com wildcard cert (one that won't give warnings that scare the business types who are testing the code and won't understand what self-signed means).
To anyone wondering, this is also what the "big boys" do, so don't feel like this is a hack. Most big companies have their own company root CA and install that cert on their company computers. They then have all internal apps use a cert signed with that root CA (or a derivative thereof).
And that's how the CA system is actually supposed to work. You add to the trust store those entities you trust rather than those that are trusted by the browser makers...
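A minimal sketch of that setup with stock openssl (all names are placeholders; real deployments add key protection, proper serials, and CA constraints):

```shell
# 1. Create the internal root CA (this is what gets installed in the
#    trust store on company machines).
openssl req -x509 -newkey rsa:2048 -nodes -days 3650 \
  -keyout ca.key -out ca.pem -subj "/CN=Example Internal Root CA"

# 2. Key and CSR for the wildcard dev cert.
openssl req -new -newkey rsa:2048 -nodes \
  -keyout dev.key -out dev.csr -subj "/CN=*.dev.example.com"

# 3. Sign the CSR with the internal root.
openssl x509 -req -in dev.csr -CA ca.pem -CAkey ca.key \
  -CAcreateserial -days 365 -out dev.pem

# 4. Verify the chain the way a trusting client would.
openssl verify -CAfile ca.pem dev.pem
```

Any machine that trusts ca.pem will then accept dev.pem without warnings.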
CloudFlare does that. You could run a self-signed certificate on your server, relying on the wildcard certificate CloudFlare generated to do its proxying of your domain.
Other than Amazon Certificate Manager as moatra mentions (which I don't think lets you export the certificate), I don't think there is currently an option for free wildcard certificates.
As an alternative, you could incorporate provisioning of a Let's Encrypt certificate for the new subdomain into your deployment process, since the process is designed to be automated.
... if you need a different certificate for each subdomain. You are limited to 5 certificates per domain per week, each of which can be valid for many subdomains. Bad if you want to add them dynamically every time a new name comes up, but fine if it is a static set...
You will run into rate limits and other kinds of limits if you issue many names for a single TLD+1 name. I constantly ran into this while developing a plugin for cPanel.
You can get sub-€100/year wildcard certs on gandi.net (free the first year for their own domains I think?), which shouldn't be a problem for a business expense.
I used StartSSL and WoSign certificates for all the websites I had to set up, but I welcome this initiative, and my next website will certainly use an LE certificate. While it was technically possible to issue free certificates before, LE looks like a much safer option. After all, StartSSL and WoSign are both commercial entities, and they can do what they want.
Slightly OT: It would be great if CT logs were available as part of Amazon's or Google's public data sets. Being able to access that data via BigQuery (or similar) would make generating something like this way easier. It would also be immensely useful when implementing CT log monitors.
I've got to say, AWS Certificate Manager is a game changer -- it took me 5 minutes to secure two domains. Last time I did it, buying a certificate and converting it to work with AWS took about four hours.