Docker is great, but not for everything.

Docker is great and all (though I admit I still have plenty to learn about it), but I’ve noticed an issue that I don’t think many people really think about.

Again, I like Docker. I run a couple of containers at home, and they work great for what I need. I also know there are a lot of things out there you can use Docker for; you can even use it to run older versions of applications that no longer exist on the newer OS you’re running. But that is what got me thinking about the security aspect of Docker.

When building a Docker container, you will generally run the latest available version of everything. The Dockerfile starts from a base Linux image (for example) and pulls in the latest versions of the packages available. Now let’s say you’re running a simple LAMP stack solution like WordPress. When you build the container, it grabs a current release of some Linux, like Alpine or Ubuntu, and sets it up with Apache/NGINX and PHP. You get WordPress and your plugins and everything set up. Great. You’re up and running in no time!
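As a sketch of what I mean, a Dockerfile for that kind of setup might look something like this (the image tag, package list, and layout here are illustrative assumptions, not a recommendation):

```dockerfile
# Hypothetical example: an Ubuntu-based Apache/PHP image for WordPress.
FROM ubuntu:22.04

# This installs whatever package versions are *current at build time*;
# rebuild the image later and you may silently get different versions.
RUN apt-get update && \
    apt-get install -y apache2 php libapache2-mod-php php-mysql curl && \
    rm -rf /var/lib/apt/lists/*

# Fetch the latest WordPress release into the web root
RUN curl -fsSL https://wordpress.org/latest.tar.gz | \
    tar -xz -C /var/www/html --strip-components=1

EXPOSE 80
CMD ["apachectl", "-D", "FOREGROUND"]
```

The key point for what follows: every package version is frozen into the image at build time, so the image only gets newer software when somebody rebuilds it.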

Now I’m not going to talk about making sure WordPress, plugins, or themes are updated, or whether or not you have HTTPS set up. I’m going to talk about that little Linux that is running everything. What happens when an update for PHP comes out? Some nasty vulnerability was found and you need to update. Normally an admin would go in and run something like apt or yum to update PHP and be good to go, but with Docker it isn’t always that straightforward. You can’t just log in and update; you must rebuild your container. You have to do it right so you don’t clobber your WordPress installation, and then you have to check that things like PHP were actually updated to the latest release. Depending on who you got your image from, there is no guarantee you will be running the latest version.

Now, there are cases where this isn’t as big of a deal. If you run an application in Docker on something like AWS, where you can set up blue-green deployments and images are rebuilt regularly, then maybe, just maybe, everything will be okay and things will get patched each time a new image is built. But what about the guy with a home lab who just wants to set up a simple service? You will have to check that everything is up to date on a regular basis. That isn’t always easy, since Docker itself won’t hand you exact version numbers for the installed packages, so you end up rebuilding and hoping for the best.
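That said, if you want to see what is actually inside a running container, you can exec into it and ask the package manager directly. A rough sketch (the container name `wordpress` and image tag `my-wordpress` are made up, and the commands assume a Debian/Ubuntu base image):

```shell
# Ask the running container which PHP version it actually has
docker exec wordpress php -v

# List installed package versions via the image's own package manager
# (apt on Debian/Ubuntu; on Alpine you'd use `apk list -I` instead)
docker exec wordpress apt list --installed 2>/dev/null | grep php

# To actually pick up patched packages: rebuild without the cache,
# then recreate the container. Data should live in a volume so the
# WordPress install itself isn't clobbered by the swap.
docker build --no-cache -t my-wordpress .
docker stop wordpress && docker rm wordpress
docker run -d --name wordpress -v wp-data:/var/www/html my-wordpress
```

It works, but it is exactly the kind of manual checking a traditional VM never made you script up just to answer "what version of PHP am I running?"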

Don’t get me wrong, I think Docker can be a great solution, but I don’t feel it is the be-all, end-all solution that people are touting these days. Docker has its place, just like a full Linux VM (or running on bare metal) has its place. The goal is to find what works best for your use case and your organization.