The team at dotCloud has been working on a semi-stealth project called Docker, a container-based approach to automating infrastructure. With Docker, you can deploy a fully functioning system to your laptop, your AWS environment, your bare-metal server, or anywhere else you choose, and it will run the same way everywhere because it is isolated at the process level. Here is a quote from the Docker web site that says it best:
Docker is a great building block for automating distributed systems: large-scale web deployments, database clusters, continuous deployment systems, private PaaS, service-oriented architectures, etc.
I first heard about Docker back in January of this year when I toured the dotCloud office with its CEO, Solomon Hykes. We were sharing war stories about building startups when I said, “If I could do it all over again, I’d implement continuous delivery from the start.” Solomon’s ears perked up and he said he had something to show me, but he needed a couple more weeks to work on it. We agreed that the next time I was in San Francisco he would give me a demo. A month later I traveled west again and the dotCloud team hosted their first demo day. About five people attended that first demo; it became a weekly Tuesday event, and the audience grew bigger each week. Solomon gave us a demo, and ideas about what we could do with this software started pouring out. A few serious sysadmin types there were talking way over my head, but as an applications guy I could see a real business case for leveraging Docker.
The use case that was relevant to me, the applications guy, is using Docker to streamline a continuous delivery process. In every place I have worked in my career, from the mainframe days, to the client-server days, to the cloud days, keeping the different environments in sync and successfully testing applications has been a nightmare. When code moved from Dev to QA to Stage to Prod, no matter how good or bad our processes were, these environments were NEVER the same. The end result was always a hit to the quality of a production release. “It worked in test” became the most shrugged-off phrase since “the check is in the mail”.
With Continuous Delivery (CD), the entire environment moves with the code from Dev to QA to Stage to Prod. No more configuration issues, no more mismatched systems, no more excuses. With CD, if it would have failed in Prod, it already failed in Test. With Docker, I can see writing scripts to automate the CD process, and I can see gains in speed to market because of how quickly new environments can be created without dealing with all of the usual setup and configuration issues.
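To make that concrete, here is a minimal sketch of what such an environment definition might look like. This is a hypothetical example, not anything from dotCloud: the base image, package names, and application path are all assumptions. The point is that one image definition is built once and that exact artifact is promoted through Dev, QA, Stage, and Prod.

```dockerfile
# Hypothetical Dockerfile: one image definition shared by every environment.
# The base image, packages, and paths below are illustrative assumptions.
FROM ubuntu:22.04

# Bake the runtime dependencies into the image itself,
# so no environment has to be configured by hand.
RUN apt-get update && apt-get install -y python3

# Copy the application into the image.
COPY app/ /opt/app/

# The same command runs identically in Dev, QA, Stage, and Prod.
CMD ["python3", "/opt/app/main.py"]
```

A promotion script then reduces to a couple of commands: `docker build -t myapp:1.0 .` builds the image once, and `docker run myapp:1.0` starts it unchanged in each environment. Because the image carries its own dependencies, the environment can no longer drift between Test and Prod.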
The other exciting thing about Docker is that it is now open source. I have been on the forums for a couple of months, and the ideas coming from the hardcore admins are very intriguing. I believe the open source community will quickly take Docker to the next level.
I already see companies like Frozen Ridge starting to use it, and it hadn’t even been released to the public yet. What I am most impressed with is that even though Docker is in its infancy, it is moving quickly and with much support and enthusiasm from the community. If you want to learn more about Docker, visit the Docker web site, check it out on GitHub, and watch Solomon’s 5-minute pitch at PyCon.