A fine judge of the necessary minimum
Rediscovering the command line, or Your Data isn’t that Big
This is a little rant about my rediscovery of just how much I can get done with the Unix shell and a few tools. The command line is an underexploited resource that too many developers have forgotten. We’re too easily seduced by hipster frameworks designed for much bigger problems, and we’ve become too comfortable in our full-screen IDEs. These days, I find it easier and more effective to return to old-style tools that I can compose together and interact with quickly. You have nothing to lose but your tool chains!
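The kind of composition the abstract has in mind might look like the classic word-frequency one-liner below — a sketch, not anything from the talk itself; the file name and data are invented for illustration.

```shell
# Make a tiny sample file so the pipeline is self-contained.
printf 'apple banana apple\ncherry apple banana\n' > /tmp/words.txt

# Split into one word per line, count occurrences, show the most frequent:
# each small tool does one job, and the pipe composes them.
tr -cs 'A-Za-z' '\n' < /tmp/words.txt | sort | uniq -c | sort -rn | head -3
```

For data that fits on one machine — which is most data — a pipeline like this is interactive, inspectable at every stage, and needs no framework at all.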
A meta-pipeline for generating continuous delivery pipelines for microservices
Our development group is adopting microservices. We found that having developers set up build pipelines by hand for each service was time-consuming and led to inconsistencies as our environment changed, so we automated the process. We developed a meta-pipeline that generates a continuous delivery pipeline for any of our repositories that follows a set of conventions – our Meta-Pipeline Protocol. Standardising our pipeline definition has greatly reduced our programmers’ effort and allowed us to safely evolve our build environment. We’re starting to exploit this metadata to improve our overall environment, for example: to visualise relationships, to manage the running of client contract tests, and to profile build timings. In this experience report we will describe the structure of the system, our struggles to develop it, and some of the design decisions we made along the way.
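The abstract doesn’t spell out the Meta-Pipeline Protocol, but the idea — check a repository against a set of conventions, then emit a standard pipeline definition — can be sketched as below. The convention files (`build.sh`, `test.sh`) and the generated output are assumptions for illustration, not the team’s actual protocol.

```shell
# Hypothetical sketch of a convention-driven pipeline generator.
# A conforming repo is assumed to supply build.sh and test.sh;
# the generator refuses repos that break the convention.
generate_pipeline() {
  repo="$1"
  for required in build.sh test.sh; do
    if [ ! -f "$repo/$required" ]; then
      echo "$repo does not follow the protocol: missing $required" >&2
      return 1
    fi
  done
  # Emit one standard pipeline definition, driven purely by the convention.
  cat > "$repo/pipeline.generated.yml" <<'EOF'
stages:
  - run: ./build.sh
  - run: ./test.sh
EOF
}
```

Because every repository gets the same generated definition, evolving the build environment means changing the generator once rather than editing every hand-written pipeline.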