This week in Badger Academy, we were joined by Alexander Savin, a senior engineer of many talents. Under his guidance, we assessed the current state of our DevOps, including our decision to use Docker.
Having finalised last week's architecture choices, we promptly set about laying the foundations for the road ahead.
There really was a lot of Googling, and not much Stack Overflow!
Having decided on a one-command workflow for any compatible Unix system, we proceeded to create the mammoth script.
Bash Shell Script
Iteratively tweaking it (Agile!) eventually allowed us to do the following:
- Pull down the preinstalled images
- Add our config files to them, specifically our Nginx config and SSL certificates
- Mount our Badger-Time code into its respective destinations
- Install Node and Rails dependencies, then create and migrate the databases
- Run all the linked containers, with persisted daemons and their services, in hierarchical order
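The original script is much longer, but the flow above can be sketched roughly as follows. The image names, ports and the exact Rails commands here are assumptions, not the real Badger-Time script, and `DRY_RUN=1` (the default) just prints each step instead of executing it:

```shell
#!/usr/bin/env bash
# Sketch of the one-command bootstrap. Image names, ports and commands
# are hypothetical. DRY_RUN=1 (default) prints each step instead of
# executing it, so the flow is visible without a Docker daemon.
set -euo pipefail

DRY_RUN="${DRY_RUN:-1}"
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# 1. Pull preinstalled images down
run docker pull badgertime/nginx
run docker pull badgertime/app

# 2/3. Start linked containers in dependency order, mounting in the
# code and the Nginx/SSL config
run docker run -d --name db postgres
run docker run -d --name app --link db:db -v "$PWD:/app" badgertime/app
run docker run -d --name web --link app:app -p 443:443 \
    -v "$PWD/nginx.conf:/etc/nginx/nginx.conf" badgertime/nginx

# 4. Install dependencies, then create and migrate the databases
run docker exec app npm install
run docker exec app bundle exec rake db:create db:migrate
```

Running it with `DRY_RUN=0` would execute the steps for real; the default simply prints the commands in order.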
The result: Badger-Time code up and running on any potential Unix system in less than 15 minutes, without any further interaction.
It sounds like a lot, but it is only possible thanks to the high internet speed in the office!
The advantages we discovered in this approach, compared to Badger-Time's previous Vagrant + Ansible setup, were numerous!
First of all, the all-in-one up command: we have an extra intern joining us in a week's time, and getting her laptop up to the current version will require little to no effort.
(Yes, we've already tested it on her office preview day.)
- No makefile building? Yes please!
- Faster tests
- Reduced memory footprints
- Same environment from development to our build server to our deployment server
- Local dev dotfiles and configs isolated from the application
- 12-factor app coherence!
There are also disadvantages, as you would imagine with any new technology:
- Initial volume mount mapping configuration
- Networking between containers is difficult to comprehend
(dynamic hosts files generated by linked containers, exposed ports, Vagrant)
- Developer productivity affected by added configuration complexity
- Double-layer virtualisation, since Docker only has native support on Linux
- The lack of a structured, documented DevOps approach to Docker online leaves a lot of decisions to the implementer
Admittedly, we're still continuously learning, and we will grow into the software architect's hat over time.
Luckily we have constant surveillance and access to the senior engineers over Slack! #badgerbants
Scaffolding the frontend
With the majority of the DevOps for the developer environment out of the way, we discussed with Alex potential ways to scaffold the frontend tests.
This meant learning a lot of Gulp with him in order to further customise our frontend workflow.
Our gulpfile ended up handling the following tasks:
- Pull down npm and bower dependencies
- Build LiveScript React.js components, Index.Jade, Less files, Semantic Grid system
- Browserify, Concatenate, Uglify
- Build the LiveScript tests for compatibility with CucumberJS
- Start the PhantomJS service from within the Docker container before running the CucumberJS tests
- Watch for source code file changes and compile
Letting Gulp handle all this means we commit and push less code to GitHub, plus we gain developer productivity!
The tasks above are just abstractions, so there's less context switching!
Food for thought
One problem we had to overcome was the choice of running the frontend tests from within the container or outside it.
We have to keep in mind that the tests will inevitably be run in a build-server environment before being deployed.
This poses a question: since Nginx serves the static files from inside a container,
should we reroute the WebDriver to examine things from the outside in for tests?
We were a bit stumped at first, so could someone please document a best-practices guide for Docker networking + Docker frontend testing?
It may be the case that someone at Red Badger will have to!
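For what it's worth, the outside-in option we considered sketches out like this: publish the Nginx container's port, then point the test suite at it from the host. The port, container name and `BASE_URL` variable here are all assumptions, and `DRY_RUN=1` (the default) only prints the steps:

```shell
#!/usr/bin/env bash
# Outside-in sketch: run the suite from the host against the container's
# published port. Port, names and BASE_URL are assumptions, not a
# documented best practice. DRY_RUN=1 (default) prints the steps only.
set -euo pipefail

DRY_RUN="${DRY_RUN:-1}"
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run docker run -d --name web -p 8443:443 badgertime/nginx
run curl -k https://localhost:8443/              # smoke-check Nginx is up
BASE_URL="https://localhost:8443" run node_modules/.bin/cucumber-js
```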
Next week's tasks!
Next week, Tiago and I will ponder what kinds of tests should be written.
BDD is a major cornerstone of quality in our projects; we'll have to assess how to implement it with a split frontend and backend!
Let alone learn API design!