I recently built a new Jenkins server hosted using Kubernetes on EKS. My main use of the Jenkins server is to automate application and infrastructure tests, deployments, and miscellaneous tasks. I get email notifications if these jobs fail, so I know when there is an issue with my software.
Many of my Jenkins jobs work with the AWS CLI and Terraform to interact with and manipulate my cloud infrastructure. I believe these jobs may be useful as templates for others wishing to achieve similar results.
One issue I often faced with my AWS account was inadvertently keeping infrastructure running, wasting energy and costing me money. To help safeguard against this, I decided to create a Jenkins job that runs on a daily schedule, checking the costs incurred on my account. If the daily costs are under a certain amount, the job passes. If the costs hit a certain threshold, the job throws a warning, and if they climb well beyond that, the job fails.
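The pass/warn/fail logic can be sketched in a few lines. This is a minimal illustration with hypothetical threshold values; the real boundaries live in the cost-detection Jenkinsfile.

```python
# Hypothetical daily cost boundaries (USD) - the real values are
# defined in the cost-detection Jenkinsfile, not here.
WARNING_THRESHOLD = 3.00
FAILURE_THRESHOLD = 6.00


def build_result(daily_cost: float) -> str:
    """Map an average daily cost to a Jenkins build result."""
    if daily_cost >= FAILURE_THRESHOLD:
        return "FAILURE"
    if daily_cost >= WARNING_THRESHOLD:
        return "UNSTABLE"  # Jenkins' result for a build that passed with warnings
    return "SUCCESS"


print(build_result(1.50))  # SUCCESS
print(build_result(4.25))  # UNSTABLE
print(build_result(7.00))  # FAILURE
```

Setting the build result to UNSTABLE is how a Jenkins pipeline represents a "warning" outcome without failing outright.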
The Jenkins pipeline is named cost-detection. The full Jenkinsfile is listed below, with the code also available on GitHub.
The first thing to notice is that the Jenkins job imports a shared library named global-jenkins-library. Functions from this library are used throughout the Jenkinsfile, such as git.basicClone(), infrastructuresteps.setupEnvironment(), and genericsteps.postScript(). The code for these functions is in my global-jenkins-library repository.
The job runs on my master Jenkins agent (the Jenkins server's container) and is triggered every morning sometime between 7 and 8 AM UTC. The job has three stages - checkoutRepo, setupEnvironment, and detectAWSCosts. These stages check out the repository, create a Python virtual environment from a Pipfile, and run a Python script that performs the AWS account cost detection, respectively. The detectAWSCosts stage also shows the cost ranges and their respective build results.
Once all the stages are completed, the postScript function is called. This function cleans the Jenkins workspace and sends me an email notification with the job results.
The most important part of the Jenkins job is the Python script, which calculates the average cost of my AWS infrastructure over the past three days. The script uses boto3, the AWS SDK for Python. Specifically, it uses the Cost Explorer API to get cost and usage statistics.
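A minimal sketch of such a query follows. The helper names are mine, not the author's script, and the boto3 call requires AWS credentials at runtime; `get_cost_and_usage` with `DAILY` granularity and the `UnblendedCost` metric is the real Cost Explorer API this kind of script relies on.

```python
from datetime import date, timedelta


def cost_explorer_params(days: int = 3) -> dict:
    """Build a get_cost_and_usage request covering the past `days` days."""
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "TimePeriod": {"Start": start.isoformat(), "End": end.isoformat()},
        "Granularity": "DAILY",
        "Metrics": ["UnblendedCost"],
    }


def average_daily_cost(response: dict) -> float:
    """Average the daily UnblendedCost amounts in a Cost Explorer response."""
    amounts = [
        float(day["Total"]["UnblendedCost"]["Amount"])
        for day in response["ResultsByTime"]
    ]
    return sum(amounts) / len(amounts)


def fetch_average_cost(days: int = 3) -> float:
    """Query the Cost Explorer API (needs AWS credentials at runtime)."""
    import boto3

    ce = boto3.client("ce")  # 'ce' is the Cost Explorer service name
    return average_daily_cost(ce.get_cost_and_usage(**cost_explorer_params(days)))
```

The averaged value can then be compared against the job's cost thresholds to decide the build result.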
My SaintsXCTF application uses an Amazon RDS database, running MySQL, to hold application data. The application has a production environment and a development environment. Both environments have their own RDS database instance. When my development environment is running, I try to cut costs by shutting down its RDS database at night. I created a Jenkins job called scheduling-dev-database. The full Jenkinsfile is listed below, with the code also available on GitHub.
The job takes in a parameter named action, which determines whether the RDS database is stopped or started. It also runs on a schedule, utilizing the Parameterized Scheduler plugin. This plugin allows the database to be stopped every night and started up again every morning.
The AWS CLI is utilized to start or stop the database, depending on the action specified. Just like the cost detection job, the results are emailed to me after all the stages are completed.
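The AWS CLI commands behind this are `aws rds stop-db-instance` and `aws rds start-db-instance`. As a sketch, the action parameter might map to a command like so (the instance identifier shown is hypothetical; the real job invokes the CLI from a Groovy sh step):

```python
def rds_command(action: str, instance_id: str) -> list:
    """Build the AWS CLI command for the requested start/stop action."""
    if action not in ("start", "stop"):
        raise ValueError(f"unsupported action: {action}")
    return [
        "aws", "rds", f"{action}-db-instance",
        "--db-instance-identifier", instance_id,
    ]


# Hypothetical usage - the instance identifier is made up for illustration:
#   subprocess.run(rds_command("stop", "saints-xctf-mysql-dev"), check=True)
```

Guarding the action value up front means a typo in the job parameter fails fast instead of running an unintended AWS command.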
Most of my AWS infrastructure is written as code using Terraform. I decided to write Jenkins jobs for all my Terraform modules which create and destroy infrastructure. This has two benefits. The first is that the creation and deletion of infrastructure is automated, so I don't need to manually type out Terraform CLI commands. The second is that the Jenkins job code is a form of documentation for how to build certain infrastructure modules, just as a Dockerfile is documentation for how to host an application on a server. With the Jenkins jobs in place, I can refer to their Jenkinsfiles in case I forget the steps for building infrastructure in the future.
Let's go over an example. I created a Jenkins job called create-database to build an application's RDS infrastructure and a Jenkins job called destroy-database to tear down an application's RDS infrastructure. The full Jenkinsfile for create-database is listed below, with the code also available on GitHub.
The Jenkins job takes two parameters - autoApply and environment. If autoApply is false, manual intervention is needed to approve the Terraform plan which builds AWS infrastructure. The environment parameter provides a choice of environments in which to create the RDS database. I then have a series of stages which check out the repository containing the RDS Terraform scripts and attempt to apply them. The Terraform module that is checked out comes from my saints-xctf-infrastructure repository.
The Terraform stages utilize some reusable functions I've created. These functions, which initialize a Terraform module, validate it, generate a plan for the changes, and apply the changes, are listed below.
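The underlying command sequence is straightforward. Here is a sketch of it, assuming Terraform 0.14+'s `-chdir` flag; the module path is hypothetical, and the real reusable functions are Groovy steps in the shared library rather than this Python helper.

```python
def terraform_stage_commands(directory: str) -> list:
    """The init -> validate -> plan -> apply sequence the reusable
    Terraform functions run, one command per pipeline stage."""
    return [
        ["terraform", f"-chdir={directory}", "init"],
        ["terraform", f"-chdir={directory}", "validate"],
        ["terraform", f"-chdir={directory}", "plan", "-out=terraform.tfplan"],
        ["terraform", f"-chdir={directory}", "apply", "terraform.tfplan"],
    ]


# Hypothetical usage - the module path is made up for illustration:
#   for cmd in terraform_stage_commands("database/env/dev"):
#       subprocess.run(cmd, check=True)
```

Applying a saved plan file guarantees that exactly the reviewed changes are applied; when autoApply is false, the pipeline pauses for approval between the plan and apply stages.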
Once the database infrastructure is created with Terraform, the destroy-database Jenkins job can be run to tear it down. The Jenkinsfile is listed below and is also available on GitHub.
Just like the create-database job, destroy-database utilizes reusable Groovy scripts for destroying Terraform infrastructure.
The last Jenkins job to discuss takes a Dockerfile, creates an image, and pushes it to an AWS ECR repository. Specifically, this image is for one of my prototype applications which uses GraphQL. The full Jenkinsfile is listed below, with the code also available on GitHub.
The Jenkins job takes two parameters which are used for labeling the Docker image. label defines the numbered version of the image, and isLatest determines whether this image should also be labeled latest. The stages of the pipeline check out the repository containing the Dockerfile, build the image, and push it to an ECR repository. The pipeline also performs some cleanup work, such as deleting the Docker image after it's pushed. Finally, just like my other Jenkins jobs, it sends me an email with the results.
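The tagging step can be sketched as follows. The repository URI shown is hypothetical, and the real pipeline drives docker and the AWS CLI from Groovy sh steps rather than Python.

```python
def image_tags(repo_uri: str, label: str, is_latest: bool) -> list:
    """Tags to apply to the built image, mirroring the label/isLatest parameters."""
    tags = [f"{repo_uri}:{label}"]
    if is_latest:
        tags.append(f"{repo_uri}:latest")
    return tags


# Hypothetical ECR repository URI, made up for illustration:
repo = "123456789012.dkr.ecr.us-east-1.amazonaws.com/graphql-prototype"
print(image_tags(repo, "1.0.0", True))

# The pipeline then roughly runs, per tag:
#   docker build -t <tag> .
#   docker push <tag>
#   docker image rm <tag>   # cleanup after the push
```

Pushing both the versioned tag and latest lets other environments pin a specific release while ad-hoc deployments can simply pull latest.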
Jenkins jobs and other CI/CD scripts are great ways to automate deployments, testing, and infrastructure. I use Jenkins extensively to help my AWS cloud workloads. You can view more of my Jenkins jobs in the global-jenkins-jobs repository and my reusable Jenkins function library in the global-jenkins-library repository.