Protection against bad SSL certs

Again, Twitter is a bad place to express ideas: 140 characters is too short and sentences end up broken apart.

Let's start with this tweet from Bryan Ford. It links to an article explaining how a band of attackers gained full control of a Brazilian bank's site by altering its DNS records. They created a copy of the pages and obtained new SSL certs. (I suspect the article is wrong about those certificates from Let's Encrypt being six months old; that doesn't make sense, since Let's Encrypt certificates are valid for only 90 days and the attackers could have created them within seconds of taking over the DNS.)

So losing control of your DNS is a really big problem: even after they realized what was happening, they had to "fight" with NIC.br to recover control of their account and restore the proper DNS records.

So what are the possible solutions to this problem?
I think that something along the lines of HPKP (HTTP Public Key Pinning) is part of the answer. If everything had worked correctly, the browsers would have noticed that the cert was wrong and refused to load the page, so visitors wouldn't have entered their credentials.
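For context, HPKP works by having the site send a response header listing hashes of the public keys that browsers should require on future visits. A sketch (with placeholder values, not real hashes) looks like this:

```
Public-Key-Pins: pin-sha256="<base64 hash of the site key>";
                 pin-sha256="<base64 hash of a backup key>";
                 max-age=5184000; includeSubDomains
```

The spec requires at least one backup pin, because if the pinned keys are lost the site locks its visitors out for the whole max-age period; that risk is part of why the mechanism saw so little adoption.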

Bryan replied that HPKP has several problems and, as you can read, it's hardly used.
So maybe the answer is not HPKP as it is today, but something developed to take these new attacks into account.

Nowadays getting an HTTPS cert is finally easy thanks to Let's Encrypt, but there are still other kinds of certs, like EV SSL, that verify the company that runs the website. They aren't cheap and they require time and effort to obtain, so maybe they are the starting point for extra protection beyond just showing a green URL bar.

Let's say that all EV certs are logged to a central repository (or multiple redundant copies), and that repository becomes the basis for a new kind of HPKP, so it can't be abused by someone pinning a free SSL cert obtained the moment they took control of your server or your DNS. This new pinning would protect those special sites that have worked and paid for a cert that provides greater security to their users, and the browsers would help reach that goal.
A second way to use that central repository would be for every CA to check it before issuing a new cert. If a company already has an EV cert, why would it now want a free SSL cert? Has it gone bankrupt? Or maybe the company isn't the one requesting the new cert? This could close the hole that allows any CA to issue a cert for an attacked domain.

Carlos Ferreira replied mentioning CAA records, but I fail to see how they are useful at all in the long run:
  1. The attacker doesn't control your server or your DNS. Then CAA will prevent them from getting an SSL cert, but I doubt they could get a cert from any CA in that situation anyway; maybe I'm wrong.
  2. The attacker controls your server and is able to request new SSL certs. Why would they? If they are on your server, they can just use your existing cert; they don't need to add or create new ones.
  3. The attacker controls your DNS. Then they can change the CAA records as they please, and there's no protection at all.
That leads to his other reply about watching DNS records with a tool like DNS Spy. Yes, that can be useful to notice an attack, but I guess that by the time the notification mail about the modified DNS arrives (to a different domain, of course), the admin of the attacked domain might have already noticed problems. In any case, it's damage control, not the protection you would have had if the attackers had been unable to get certs for their fake servers. So watching DNS is useful, but it's not the solution.
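For reference, a CAA record is just a DNS entry that whitelists which CAs may issue certificates for a domain. In zone-file form (using letsencrypt.org as an example CA) it looks like:

```
example.com.  IN  CAA  0 issue "letsencrypt.org"
```

Since the record lives in the DNS itself, whoever controls the DNS can change it at will, which is exactly the weakness of point 3.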

There are many technologies around web security; some are old and trusted, others are proposals that never gained momentum for whatever reason. I'm an outsider, so I can't provide a full list of ways to get proper protection, but I feel there are ways to get the better security promised by HPKP without putting basic sites at risk.


Getting a Google Maps API Key

On June 22nd Google announced that, from that day on, every new implementation of the Maps API requires an API key.

This is very important for anyone who wants to use my Google Maps plugin for CKEditor, because now you must get a key in order to use it.
The basic usage of the API allows 25,000 free map loads per day, and you can have one key for each domain where you want to use it. Beyond that you'll have to get a paid license. This is more or less the same as before; they have adjusted the way some things are counted, but the important part is that previously the free tier merely encouraged signing requests with an API key, and now it's a hard requirement.

Getting an API key isn't too hard because the process has been streamlined and you mostly have to agree to the Terms and Conditions. You can find Google's instructions here, but I'm going to provide some screenshots so you can see how easy it is.

Step by step guide

First, go to https://console.developers.google.com/flows/enableapi?apiid=maps_backend&keyType=CLIENT_SIDE&reusekey=true

You'll get a screen like this:
As this is probably a new project, you just have to click Continue.
Now you'll see some notifications and progress, and you'll end up with a screen similar to this one.
This doesn't look right: the "You don't have permission to create an API key" message is strange, but the "Create" button is in fact enabled and it works, so you can define the allowed referrer sites, or leave that blank for now and adjust it later.
Click "Create" and you'll get your API key.

Click the Copy icon at its right side and you're almost done. If you already have the Google Maps plugin, open the CKEditor configuration file, add a new entry "googleMaps_ApiKey", and assign it the value that you got:
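As a sketch (replace the placeholder with your own key), the entry in your config.js would look like this:

```javascript
CKEDITOR.editorConfig = function (config) {
    // Key copied from the Google API Console
    config.googleMaps_ApiKey = 'YOUR_API_KEY';
};
```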
Now if you load CKEditor with the Google Maps plugin, the Maps dialog should work, but the static images will fail. This is because we have enabled the Google Maps API, but you also need to enable the Static Maps API for this key (and also the Street View API if you want to allow your users to use a Street View image as the preview).

So open https://console.developers.google.com/apis/api/static_maps_backend?project=_ and this time, instead of creating a new project, we will use the one created previously:
Then click the Enable button.
And this step is done; just repeat it for the Street View API in this link:

Then enable the Geocoding API so that searches also work in the dialog:

And we're done!


Please keep in mind that these steps are current as of July 2016. Google might change things, and depending on some settings of your account you might see different screens, but the end goal is to get a Google Maps API key, enable that key for the Static Maps API as well, and then put it in a googleMaps_ApiKey entry in the configuration of your CKEditor instance.

Additional notes

I think that the first time you try to get an API key you'll see this screen:
and from then on, when you try to get another key for a new domain, the dialog used is the one I showed first.

Also, at the top of the screen you might see a banner to sign up for Google Cloud, but if you plan to stay within the free plan limits you don't need it.


How to generate unique file names with SimpleUploads

If for any reason you can't change the server back-end that saves your file uploads in CKEditor, and you want to prevent new files with the same names from overwriting existing ones, you can add this code to your page to generate a unique filename for each upload (adjust it to your taste):

CKEDITOR.on('instanceReady', function (e) {
 e.editor.on('simpleuploads.startUpload', function (ev) {
  var filename = ev.data.name;
  // Prefix the original name with a timestamp-based id so names never collide
  var newName = CKEDITOR.plugins.simpleuploads.getTimeStampId() + '_' + filename;
  ev.data.name = newName;
 });
});
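That puts the unique id in front of the whole name. If you'd rather keep the original base name readable and append the id before the extension instead, a hypothetical helper (names made up for this sketch) could look like this:

```javascript
// Variant sketch: photo.jpg -> photo_123.jpg; files without an extension
// simply get the id appended at the end.
function makeUniqueName(filename, id) {
  var match = filename.match(/\.\w+$/);
  var extension = match ? match[0] : '';
  var base = extension ? filename.slice(0, -extension.length) : filename;
  return base + '_' + id + extension;
}
```

Inside the startUpload handler you would then assign ev.data.name = makeUniqueName(filename, CKEDITOR.plugins.simpleuploads.getTimeStampId()).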


Debugging client-side and server-side

At Google I/O 2016, one of the sessions was dedicated to what's coming to the Chrome Dev Tools.
Besides the improved features in the tools themselves, they also announced that they plan to enable debugging of Node.js from Chrome; once that pull request is accepted, Node developers will be able to use Chrome to debug both the client side and the server side.

On the other side, Microsoft announced in February the ability to debug Chrome from VSCode. Using the Chrome Debugger protocol, they have created a VSCode extension that connects to Chrome and lets you debug your script from your editor.

This reminds me of the VS feature that integrated with IE11 and below: when you started debugging a project, besides debugging the server side, the JavaScript debugger in IE itself was disabled and any error was surfaced in VS.

I must confess that I always hated that behavior. When I'm debugging a web page I'm not looking only at the JavaScript: I must check the DOM to verify that elements exist, check their attributes, see how the page reacts to changes, etc., so a debugger that only lets me look at the JavaScript is a bad option. When I had to debug IE, I launched a new instance that wasn't hooked to VS so I could use IE's F12 tools.

I guess this must be useful for some people or they wouldn't have spent the time to make it work with Chrome, but I really can't see how using VSCode for JavaScript is any better than using the Chrome Dev Tools alone, as they are constantly updated and improved, and I wouldn't say they are missing anything important for debugging JS. To debug client-side code I certainly prefer the client-side tools that keep all the context; I'm not looking only at a JS file.

So going back to debugging Node from Chrome: I guess it might depend on the quality of your editor (I would say that some people use very bad editors). Until Chrome becomes a full IDE, you're still using another program to write your JS, one that includes plugins, is adjusted to your taste, and is integrated with other tools. If it's able to debug Node itself, then I think I would prefer to do that there instead of using Chrome, but obviously this depends on the quality of that debugging experience. The context provided in this situation can be similar in the editor and in Chrome, although your editor might provide better context while Chrome might have better debugging tools.

My only fear is that people are focusing too much on Chrome, and we might soon see every other browser die because web developers don't test in them: users find problems and are told to use Chrome instead, then the statistics say that people only use Chrome, more developers focus their testing only on Chrome, and we end up in IE6 land: a browser monoculture.


New plugin for CKEditor: ImageToolbar

Recently I've been working on a new plugin for CKEditor for Uritec; it was needed due to the way the Style system works in CKEditor.

First, the Styles dropdown shows the styles that can be applied to the current element as well as to its container(s). Look at the official demo and select the image, then click on the Styles selector and you'll see "Object Styles", "Paragraph Styles" and "Character Styles"; that's not very user-friendly for someone who just wants to modify the image. The normal user wants simple things.

The second main problem is that even if users focus only on the Object Styles, you (as page author/designer) can't restrict how the styles interact. So if you want one class to apply a 1px gray border and a different class to create a slight rotation and shadow, the user can't apply both at the same time (unless you use inline styles, which we all know are bound to fail).

So the goal was to allow applying groups of classes that can be mixed: one group of classes to define different types of borders, another to apply color filters, and so on. Also, keep in mind that center alignment has been disabled in CKEditor for those who prefer the classic Image plugin, although it's easily done using display:block and margin: 0 auto, so one obvious group is alignment.
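As an illustration (these class names are invented for this sketch; you define your own), the groups could be backed by plain CSS like:

```css
/* one group: borders */
img.border-thin  { border: 1px solid #999; }
/* another group: effects, combinable with any border */
img.effect-tilt  { transform: rotate(-2deg); box-shadow: 3px 3px 6px rgba(0, 0, 0, 0.4); }
/* the alignment group, including the center alignment mentioned above */
img.align-center { display: block; margin: 0 auto; }
```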

Putting all those buttons permanently in the toolbar was considered a bad idea because most of the time they won't be used; providing instead a "contextual" toolbar that shows up only when an image is selected was considered the best option.

Taking all of that into consideration, ImageToolbar was born, designed to work in both the classic (framed) editor and the new inline mode, and compatible with both the normal Image plugin and the Enhanced Image plugin. You can play with some predefined styles at its demo, but you can define whatever you want as long as you can do it with CSS classes. Or if you feel lazy today, check the 1 min. demo on YouTube.

If you use the Enhanced Image plugin, you should already know that defining your CSS file requires some extra care; we have added some notes about it in a small guide about styling images in CKEditor. Due to its widget wrappers and upcast/downcast, you'll find that ImageToolbar is much more powerful when configured with the classic Image plugin, as you aren't restricted by the Enhanced Image limitations.

Lastly, remember that unlike other plugins that claim to be free but then require a monthly payment, this is a one-time payment that gives you every future update.


Deprecation plan of IE8 in SimpleUploads

Die IE8, Die.

Planned deprecation of support for IE8 in SimpleUploads

It's already 2016: Microsoft has stated that old versions of IE are no longer supported, and Windows XP hasn't been supported for some time either.

IE8 is too bad by any current standard: too limited, lacking too many features, with a bad security track record. It's time to put it to rest, and all of us have to work towards that goal. One obvious thing we can do is state that those old versions aren't supported in our products, so people realize they have to change to a browser that has all those missing features.

So by the end of February I'll publish a new version of SimpleUploads just to remove the little parts that tried to add some kind of support for IE8.

I'll remove right now any claim about IE8 support in its description, and I'll shortly update the demo with the new version, which won't support any browser that lacks the FormData feature.
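As a sketch of the kind of capability check involved (the function name is made up here, not SimpleUploads' actual code): FormData, available since IE10 and in all modern browsers, is the API that lets files be sent through XMLHttpRequest, so a browser that lacks it can be rejected up front.

```javascript
// `win` stands for the browser's `window` object; passing it in
// keeps the check easy to exercise outside a browser.
function supportsFileUploads(win) {
  return typeof win.FormData === 'function';
}
```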

If you have any concerns about this change, please comment here or send me an email, but remember that you can keep using your current version as long as you want if you prefer to support IE8 instead of embracing the future.


Die XP: Another benefit of Let's Encrypt

The Let's Encrypt project is a great initiative to move towards a more secure web, removing the costs to apply a secure certificate to a site and providing an automated client to take care of renewals.

This of course means huge changes across the whole web industry:

Other CAs will be forced to drop the price of their equivalent certificates to a bare minimum, or make them free as well, just to keep some people looking their way.

Hosting providers will have to let everyone use a one-click install of a free SSL certificate, or at least manually update their certificates, unless they want their clients to move to a friendlier host that allows setting up SSL without huge fees.

Website owners will now be able to avoid one of the obstacles (money) to installing SSL, and this will be especially important for small websites. Big companies have enough resources that the cost of a certificate is nothing to them, but for a small website every euro counts, so most people didn't even think about paying just to say that their site can now be used with https.

Hopefully this will mean the end of self-signed or expired certificates, so end users will better understand the difference between a secure site and a non-secure one. After a while, people will reject "old" sites that aren't using SSL, forcing those sites to install one and pushing their hosting companies to allow installing free SSL certificates.

Government spies will have a harder time trying to track what everyone does. And no, this won't be an improvement for terrorists: they are already able to use secure communications, yet in Paris it became clear that they used normal, non-secure methods.

But there's one more benefit: most of the small sites now installing SSL en masse use shared hosting, so they don't have a unique IP, which means they rely on SNI to enable https. It turns out that no version of IE on Windows XP (as well as old Android 2.x phones) supports SNI, so those who still use old IE8 will now face a new problem: constant security warnings whenever they visit all these new https sites.
And this is a good thing!!!

That people keep using that old IE, I mean. It's old, old, old: full of bugs, full of problems, a pain for all of us who try to create modern websites while having to keep supporting it, and now those users will feel a little of that pain (although I guess they are already suffering because of all of us who have left IE8 behind and no longer test it).

Time to ditch IE 8 and move to a modern browser.