You will benefit most from this blog if you are already familiar with Crossplane. Check out my other blog, Infrastructure as Code: the next big shift is here, and the Crossplane docs to learn more.
For a practical walkthrough, try this self-paced Killercoda scenario. If you prefer visual content, check out the companion video.
What and why to validate
Testing and validation in software are critical, and this quote summarizes the mindset well:
“I don’t want to believe. I want to know.”
Carl Sagan
We want to be sure that the software we create and maintain will work as intended without bugs and errors. This is even more critical for software that powers infrastructure such as Crossplane’s compositions. When everything else depends on the correctness of this layer, the stakes are high!
YAML is a data serialization language
Compositions are written in YAML, not in Go, C#, or any other programming language. While YAML is great for declaratively expressing what we need our system to do, it is hard to create and maintain without the facilities of a full programming language.
The ultimate goal is for compositions to be generated, with YAML treated as a data serialization medium, similar to how JSON serves REST APIs. There are already a few mechanisms to do this today; check out this repository by Chris or Crossplane cdk8s.
Learn more about cdk8s in my blog about creating Kubernetes YAML files.
Levels of Validation
There are three levels of testing and validation; here is how the Datree docs explain them, with my comments below.
- The dev-test loop helps uncover errors early and must be very fast; bonus points if it integrates with IDEs or other development tools.
- CI/CD pipelines perform the same validation on every check-in.
- Runtime validation uses a Kubernetes admission controller webhook to validate YAML against a set of policies before it is admitted by the kube-apiserver.
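To make the runtime level concrete, here is a minimal sketch of how a validating webhook is registered with the kube-apiserver. All names, the namespace, and the path are hypothetical placeholders; Datree's own webhook is installed via its Helm chart and is configured differently.

```yaml
# Sketch only: registers a hypothetical validator for Crossplane Compositions.
apiVersion: admissionregistration.k8s.io/v1
kind: ValidatingWebhookConfiguration
metadata:
  name: policy-validation-webhook      # hypothetical name
webhooks:
  - name: policy.example.com           # hypothetical webhook identifier
    admissionReviewVersions: ["v1"]
    sideEffects: None
    failurePolicy: Fail                # reject resources if the webhook is down
    clientConfig:
      service:
        name: policy-webhook           # hypothetical Service fronting the validator
        namespace: policy-system       # hypothetical namespace
        path: /validate
    rules:
      - apiGroups: ["apiextensions.crossplane.io"]
        apiVersions: ["v1"]
        operations: ["CREATE", "UPDATE"]
        resources: ["compositions"]
```

With a configuration like this in place, every Composition create or update is sent to the webhook for a policy verdict before it reaches etcd.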
Types of validation
There are two types of validation we are interested in when it comes to compositions.
- schema validation against the XRD
- custom policy validation <- what we are focusing on now
Custom policy validation tests whether a composition complies with an arbitrary set of custom rules, such as naming conventions, required fields not otherwise specified in the schema, or cascading dependencies.
For schema validation and testing composition YAML correctness against built-in rules, you can use a VS Code plugin (it works with vim/nvim as an LSP server as well).
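Independent of that plugin, a general-purpose option for editor-time schema validation is the yaml-language-server modeline, which points the language server at a JSON Schema. The schema path and the resource shown below are hypothetical examples, not part of the scenario:

```yaml
# The modeline tells yaml-language-server which schema to validate against,
# so violations are flagged as you type. The path is illustrative.
# yaml-language-server: $schema=./schemas/xpostgresqlinstance.schema.json
apiVersion: example.org/v1alpha1     # hypothetical composite resource group
kind: XPostgreSQLInstance            # hypothetical composite resource kind
metadata:
  name: my-db
spec:
  parameters:
    storageGB: 20
```

This gives you the dev-test-loop level of validation for free in any LSP-capable editor.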
Demo Setup
An interactive demo is available at https://killercoda.com/decoder/course/crossplane/crossplane-datree-validation, here are a few highlights so as not to repeat the whole setup.
Custom policies in `Datree` are expressed as JSON Schema-based documents. Since every JSON document is also valid YAML, the same schemas can be written directly in YAML, and that is what we will do.
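To give a feel for the shape of such a policy, here is an illustrative sketch (not the scenario's actual policy) based on Datree's custom rules format; the rule identifier, names, and the required label are all made up for this example:

```yaml
# Illustrative Datree policies file: a custom rule written as JSON Schema
# in YAML, wired into a policy. All identifiers and messages are hypothetical.
apiVersion: v1
customRules:
  - identifier: CUSTOM_COMPOSITION_TEAM_LABEL    # hypothetical rule id
    name: Ensure compositions carry a team label
    defaultMessageOnFailure: Add a metadata.labels.team label
    schema:                                      # plain JSON Schema, in YAML
      properties:
        metadata:
          properties:
            labels:
              required:
                - team
policies:
  - name: Default
    isDefault: true
    rules:
      - identifier: CUSTOM_COMPOSITION_TEAM_LABEL
        messageOnFailure: Compositions must declare an owning team
```

The `schema` block is standard JSON Schema, so anything expressible there (regex patterns for naming conventions, required fields, enums) can become a rule.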
Here is the custom policy you will use in the interactive scenario.
Conclusion
The DevOps movement, and more specifically "shifting left", brought infrastructure under the responsibility of development teams, but it also brought software practices into the realm of operations.
Automated testing and validation are important practices, deeply rooted in the software development world, and something platform teams and DevOps practitioners are now adopting.