
Announcing Spinnaker Evaluate Artifacts Stage

Apr 9, 2021 by Stephen Atwell

Armory’s new Evaluate Artifacts stage allows you both to create an artifact from within a pipeline and to inject Spinnaker parameters into any artifact. Only certain deployment stages, such as deploying a Kubernetes manifest, support using Spring Expression Language (SpEL) to reference parameters out of the box. Other stages, such as the Terraform Integration stage, lack this stage-specific SpEL support. This blog explores how the new ‘Evaluate Artifacts’ stage can be leveraged to inject Spinnaker parameters into your Terraform deployment pipeline.

The pipeline we’re going to configure

The Terraform Script

Before creating our pipeline in Spinnaker, we need to define the Terraform script that we want to deploy. In this case, it is a simple script that deploys an NGINX container. The script takes three input variables: the namespace, the deployment name, and the number of replicas. We will define matching Spinnaker parameters and use the new Evaluate Artifacts stage to pass their values into Terraform. Here is the script I’m deploying:

variable "namespace" {
  type = string
}
variable "deployName" {
  type = string
}
variable "replicas" {
  type = number
}
resource "kubernetes_namespace" "test" {
  metadata {
    name = var.namespace
  }
}
resource "kubernetes_deployment" "test" {
  metadata {
    name      = var.deployName
    namespace = kubernetes_namespace.test.metadata.0.name
  }
  spec {
    replicas = var.replicas
    selector {
      match_labels = {
        app = "MyTestApp"
      }
    }
    template {
      metadata {
        labels = {
          app = "MyTestApp"
        }
      }
      spec {
        container {
          image = "nginx"
          name  = "nginx-container"
          port {
            container_port = 80
          }
        }
      }
    }
  }
}

Configuring the Spinnaker Pipeline

Our Spinnaker pipeline is going to take some parameters from the user. It leverages Armory’s Evaluate Artifacts stage to encode these parameters into Terraform artifacts. Then, it uses Armory’s Terraform stage to create an execution plan and deploy the infrastructure using this plan.
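Under the hood, a Spinnaker pipeline is stored as JSON. A minimal sketch of the pipeline we are building might look like the following. Note that the stage `type` strings are illustrative approximations (the real values come from Armory’s plugins), so treat this as a shape rather than an exact config:

```json
{
  "name": "terraform-evaluate-artifacts-demo",
  "stages": [
    { "refId": "1", "requisiteStageRefIds": [],    "type": "evaluateArtifacts", "name": "Evaluate Artifacts" },
    { "refId": "2", "requisiteStageRefIds": ["1"], "type": "terraform",         "name": "Plan",  "action": "plan" },
    { "refId": "3", "requisiteStageRefIds": ["2"], "type": "manualJudgment",    "name": "Manual Judgment" },
    { "refId": "4", "requisiteStageRefIds": ["3"], "type": "terraform",         "name": "Apply", "action": "apply" }
  ]
}
```

The `refId`/`requisiteStageRefIds` wiring is how Spinnaker expresses stage ordering; each stage here depends on the one before it.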

Parameters

To create my Spinnaker pipeline, I start by adding two parameters. The first is called ‘nameAndSpace’, and it is intended to receive JSON as input. This is useful when you are triggering a pipeline from an external system and want to simplify passing parameters from that system. The second parameter is called ‘replicas’ and is passed as a single value, which is more convenient when humans are manually invoking the pipeline and entering parameters.
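In the pipeline JSON, these parameters live under `parameterConfig`. A rough sketch of what the two definitions could look like; the descriptions and default are my own placeholders, not values from the original pipeline:

```json
"parameterConfig": [
  {
    "name": "nameAndSpace",
    "label": "nameAndSpace",
    "description": "JSON object with 'name' and 'space' keys",
    "required": true
  },
  {
    "name": "replicas",
    "description": "Number of NGINX replicas to run",
    "default": "1",
    "required": true
  }
]
```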

Evaluate Artifacts

In my pipeline, I add an ‘Evaluate Artifacts’ stage. I’m going to use this stage to create a Terraform variable file. In the stage, I add a new artifact with the following payload and name it ‘testvariables.tfvars’.

namespace="${#readJson(parameters['nameAndSpace'])['space']}"
deployName="${#readJson(parameters['nameAndSpace'])['name']}"
replicas=${parameters.replicas}

This payload creates an artifact in the format Terraform expects for a variables file. It defines three variables: the first two are read from the JSON passed in the ‘nameAndSpace’ parameter, and the third comes directly from the ‘replicas’ parameter. Because replicas is a number, its value is not quoted. Each value is a SpEL expression that Spinnaker evaluates at runtime.
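To make the substitution concrete: if ‘nameAndSpace’ were set to {"name":"test-deployment","space":"test-space-param"} and ‘replicas’ to 2 (the values we use later when running the pipeline), the evaluated artifact would contain:

```hcl
namespace="test-space-param"
deployName="test-deployment"
replicas=2
```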

The configured Evaluate Artifacts stage, which uses SpEL to read Spinnaker parameters and store them in an artifact.

Now, we’ll add a second artifact that contains the Terraform script provided earlier as its payload. We’ll name this artifact ‘main.tf’.

We could combine both of these artifacts into a single artifact here by directly using SpEL within this second artifact instead of having a separate variables file. However, many companies store their Terraform scripts in source control systems like git. With two files, this second artifact can be seamlessly moved to git while keeping only the Terraform variables within Spinnaker, if so desired.

Terraform Plan

The Terraform plan stage takes the Terraform script, combines it with the Terraform variables file, and builds a Terraform execution plan. This execution plan gets stored in another artifact for later use.

The configuration of the Terraform 'Plan' stage

To configure the stage, add a new ‘Terraform’ stage with an action of ‘Plan’. Select the ‘main.tf’ artifact as the ‘Main Terraform artifact’ and also specify the ‘testvariables.tfvars’ artifact as a ‘Terraform Artifact’. Under ‘Produces Artifacts’, add a new artifact named ‘planfile’ of type ‘embedded artifact’. This is where the stage stores the Terraform execution plan.
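As pipeline JSON, the Plan stage could be sketched roughly as follows. The exact schema is defined by Armory’s Terraform Integration, so the field names here are an approximation intended only to show the relationship between the inputs and the produced ‘planfile’ artifact:

```json
{
  "type": "terraform",
  "name": "Plan",
  "action": "plan",
  "artifacts": ["main.tf", "testvariables.tfvars"],
  "expectedArtifacts": [
    {
      "displayName": "planfile",
      "matchArtifact": { "type": "embedded/base64", "name": "planfile" }
    }
  ]
}
```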

Manual Judgment

One of Terraform’s strengths is that you can review its execution plan before applying it. The Manual Judgment stage gives users a chance to review the output of Terraform plan and decide whether or not they wish to apply it. Many companies perform a manual review of execution plans before deploying updates to production environments. This stage needs no extra configuration.

Terraform Apply

This stage applies the Terraform execution plan. The stage is of type ‘Terraform’, and its action is ‘Apply’. For the main Terraform artifact we specify ‘main.tf’. We also pass the planfile in as an extra Terraform artifact so that the stage uses our execution plan.
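Conceptually, the Plan and Apply stages perform the same two steps you would run with the Terraform CLI, with the plan file carried between stages as a Spinnaker artifact instead of a local file:

```shell
# Build an execution plan from main.tf plus the variables file,
# and save it so the apply step runs exactly what was reviewed.
terraform plan -var-file=testvariables.tfvars -out=planfile

# Apply the saved plan; Terraform does not re-plan, so what was
# approved in Manual Judgment is what gets deployed.
terraform apply planfile
```

Passing the saved plan to apply is what makes the Manual Judgment review meaningful: the approved plan, not a fresh one, is executed.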

The configuration of the Terraform Apply Stage.

Running the Pipeline

To run the pipeline, we specify the following parameters:

nameAndSpace: {"name":"test-deployment","space":"test-space-param"}

replicas: 2

When the pipeline runs, the Evaluate Artifacts stage creates our two artifacts. It converts the JSON of the ‘nameAndSpace’ parameter into two Terraform variables and writes them, along with the replica count, into a single tfvars file. Terraform then builds an execution plan using these variables. Finally, if I approve the execution plan during the Manual Judgment stage, Armory Enterprise deploys this configuration using Terraform.

The computed Terraform execution plan

I hope this post helped you understand the power of the new Armory Evaluate Artifacts stage and how you can use it to leverage SpEL syntax to reference parameters for any artifact type.

Evaluate Artifacts is available as a plugin for Armory Enterprise for Spinnaker 2.24.x or later and OSS Spinnaker 1.24.x or later. You can find more information in Armory’s Evaluate Artifacts plugin documentation or by reaching out to your CS Representative.
