Automated Canary Analysis Local Setup

Canary deployment is a release strategy in which we gradually route a portion of production traffic to a new version, called the canary, which usually handles a low percentage of traffic, in order to reduce the risk of deploying the new version to production. The analysis of the canary, however, is usually manual: the team has to check the effect of the changes on the defined metrics by hand.

We can automate that manual analysis and get faster, lower-risk deployments, since the analysis is essentially a comparison of the same metrics across two versions running in production over the same time window. Recognizing the need for this automation, Netflix and Google created Kayenta, a tool that automates this manual analysis. By applying statistical analysis and machine learning techniques, Kayenta can quickly and accurately determine whether the canary is performing as expected or whether there are anomalies or regressions.

There are certain benefits to using automated canary analysis (ACA):

  • It speeds up the deployment pipeline, allowing for faster iterations and reducing the time to market.

  • It also improves the overall reliability of canary deployments by removing the potential for human bias and oversight.

  • It also empowers development teams to make data-driven decisions and confidently deploy new versions to production with reduced risk and increased efficiency.

Kayenta

Kayenta is a platform for automating canary analysis when releasing a new version to production. As input, it receives metric data points over a particular period of time for both the new and old versions. To have something to compare the canary against, we route a second segment of traffic, with the same volume as the canary, to a fresh copy of the old version; this segment is referred to as the baseline.
To complete the canary analysis, Kayenta performs the following steps:

  • Retrieves the key metrics from the baseline and canary clusters.

  • Validates and cleans the metric data.

  • Judges each metric by comparing baseline against canary.

  • Combines the metric comparisons into a final result.

To run Kayenta locally, we can clone it from its Git repository:

git clone https://github.com/spinnaker/kayenta.git

The Kayenta service is a Spring Boot project built with Gradle, so make sure a JVM and Gradle are present on the machine. We can see the target Java version in the gradle.properties file, and that is the version we have to use in order to run the service. At present, the required Java version is 11, with Gradle version 6.9.
Before running the service, we need Redis running locally on port 6379, because the kayenta.yml file used to configure the project requires a Redis connection on that port.

To run Redis locally using Docker, run this command:

docker run -d -p 6379:6379 --name redis redis
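Before moving on, it can be worth a quick sanity check that Redis is actually answering. This assumes the container above was named redis, as in the command:

```shell
# Ask Redis to respond before starting Kayenta; expect "PONG" on success.
docker exec redis redis-cli ping || echo "Redis container is not running"
```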

After this, we need to create an init.gradle file somewhere with the code below added:

initscript {
  repositories {
    maven {
      url "https://plugins.gradle.org/m2/"
    }
  }

  dependencies {
    classpath("org.springframework.boot:spring-boot-gradle-plugin:2.2.0.RELEASE")
  }
}

allprojects {
  apply plugin: org.springframework.boot.gradle.plugin.SpringBootPlugin
}

Then we need to run the following command to build the artifact (an executable jar):

./gradlew --init-script /path/to/init.gradle kayenta-web:bootJar

After the build, the jar will be in the kayenta-web/build/libs folder, and we can run it to start the application on port 8090:

java -jar kayenta-web/build/libs/kayenta-web.jar
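Once the jar is running, a quick smoke test can confirm the service is reachable. The /health path here is an assumption based on common Spring Boot conventions; if it returns a 404, check which endpoints your build actually exposes:

```shell
# Smoke test: check that Kayenta is answering on port 8090.
curl -s http://localhost:8090/health || echo "Kayenta is not reachable on port 8090"
```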

Alternatively, we can run the service directly instead of building the jar:

./gradlew clean kayenta-web:bootRun
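With the service up, a canary analysis can be triggered over its REST API. The sketch below is purely illustrative: the /canary endpoint path and the canary-execution-request.json payload file are assumptions, so confirm the exact paths and request shape in the Swagger UI that Kayenta serves once it is running:

```shell
# Hypothetical sketch of kicking off an analysis; the endpoint path and
# payload file are assumptions -- verify them against Kayenta's Swagger UI.
curl -s -X POST "http://localhost:8090/canary" \
  -H "Content-Type: application/json" \
  -d @canary-execution-request.json \
  || echo "request failed; is Kayenta running on port 8090?"
```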

Referee

Referee is a tool created by Nike that integrates with Kayenta and handles the UI side: it is where we provide the input and view the results of the metric analysis calls.

Next, we will set up Referee alongside Kayenta. Both tools are open source, so we can clone the repos from GitHub.

First, we will run the Referee service; we can clone it from this Git repo:

git clone https://github.com/Nike-Inc/referee.git

The Referee service is built with React and Node (TypeScript). To find which Node version to run, check the .nvmrc file, which contains the target Node version; a higher version may also work. If you have a different Node version installed, you can use nvm to set up the desired one.
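For example, with nvm installed, the pinned Node version can be picked up straight from .nvmrc when the commands are run inside the cloned referee directory (the fallback messages are just a guard for shells where nvm is not loaded):

```shell
# nvm reads the version from .nvmrc when no version argument is given.
nvm install 2>/dev/null || echo "nvm is not available in this shell"
nvm use 2>/dev/null || echo "nvm is not available in this shell"
```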

The next command we need to run, which bootstraps the project, is:

yarn && yarn bootstrap

And to start the application, we can run the following command:

yarn start

This will start the project on port 3000.

That is all for this article. In the next blog, we will see how to connect these two tools to a service that uses Datadog for monitoring, so they can judge a deployment, give us an analysis of the new code change during rollout, and let us take quick action.