CI/CD with AWS CodePipeline Using GitHub CLI & AWS CLI

Chinelo Osuji
13 min read · Aug 14, 2023


What is CI/CD?

CI/CD, or Continuous Integration/Continuous Delivery (Deployment), is a software development strategy that automates the process of software delivery and infrastructure changes. This is facilitated through “pipelines”, which streamline the phases of building, testing, and deploying software. These pipelines ensure consistent and reliable iterations by enforcing code quality checks, automated tests, and staged deployments, ultimately leading to enhanced developer productivity and a more resilient production environment.

Continuous Integration involves frequently merging code changes back into the main branch of a codebase, after which automated builds and tests are run to ensure quality. This helps you detect errors earlier with immediate feedback, which can save you money: addressing issues at a later stage is usually more expensive than addressing them when they are introduced.

Continuous Delivery (Deployment) involves maintaining software in a constantly releasable state and automating the release of code changes from repository to production. This reduces the time between writing code and that code running in production, allowing you to deploy multiple times a day and helping updates reach users faster.

What is AWS CodePipeline?

A fully managed CI/CD service from AWS that automates the build, test, and deploy phases of the release process. AWS CodePipeline can integrate with other AWS services such as CodeBuild, CodeDeploy, and Lambda, as well as third-party tools like GitHub, Jenkins, and more.

Scenario

Let’s say a software development team has been tasked with a new website project, and your manager has asked you to create a way to automate the deployment of the website. Currently, the developers have to go through the deployment process manually to test each new update to their code. This manual process has led to infrequent releases and occasional oversights during deployment, which sometimes carry defects into the production environment. By implementing a CI/CD pipeline, you can streamline the deployment process, ensure higher code quality, and provide developers with immediate feedback on their changes. This will lead to faster, more reliable releases and a more efficient development lifecycle.

So here’s what we will accomplish:

Using GitHub CLI (Command Line Interface) we will:
Create a new repository in GitHub and load the HTML file of the static website.

Using AWS CLI we will:
Create and configure an S3 bucket to host the static website.
Create a CI/CD pipeline using AWS CodePipeline.
Set the repository as the Source Stage of the CodePipeline, triggered whenever an update is made.
Select the S3 bucket for the Deploy Stage.
Deploy the pipeline and verify that we can reach the static website.

And then, using GitHub CLI again, we will:
Make a simple update to the code in GitHub to verify that the CodePipeline is triggered.

First, let’s create the HTML file for the website.

Copy and paste the code below into your Notepad and save the file with the extension “.html” in a separate folder.
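Any simple page will do. Here’s a minimal example you can use; the title, heading, and text are placeholders you can change:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>My Static Website</title>
</head>
<body>
  <h1>Hello from my CI/CD pipeline!</h1>
  <p>This page is deployed to S3 automatically by AWS CodePipeline.</p>
</body>
</html>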

Now go to github.com and create an account. Then go to Settings.
Scroll down and click Developer settings.
On the left side, select Personal access tokens, then click Tokens (classic).
Click Generate new token (classic).

Right now we are creating a token to authenticate and use GitHub CLI with our account.

Enter a name for your token in the Notes field and select the scopes for this token. Scopes are basically the permissions your token will have. For this demonstration, I selected all scopes for my token.
At the bottom of the page, click Generate token.
On the next page, copy your Personal Access Token and save it somewhere safe. You won’t be able to see it again.

Now let’s install Git and the GitHub CLI.
Since I’m using Windows, I went to git-scm.com/download/win and downloaded the latest version of Git. The GitHub CLI (gh) is a separate install, available from cli.github.com.
Run each installer and follow all of the steps. I decided to go with all of the default settings.

Now go to your computer’s terminal. I’m using Windows PowerShell.
Run gh auth login to sign into your GitHub account from the terminal.
Select GitHub.com and press Enter.

Select HTTPS and press Enter.

Type Y and press Enter to authenticate Git with your GitHub credentials.

Select Paste an authentication token and press Enter.

Copy the Personal Access Token you saved earlier, paste it, and press Enter.

Now that we’re signed in to our GitHub account, run gh repo create <YOUR_REPOSITORY_NAME> --public to create a new repository that is publicly visible.
A repository, commonly called a “repo”, is a storage space for the folders and files of your project.
Public repositories can be forked by others, meaning they can create a copy and contribute back to the original.

I ran the cd command to switch to the folder containing the HTML file.
Run git init to initialize a new repository in the current folder.
This turns the folder into a Git repository, enabling version control for its content.
It’s typically one of the first commands you’d use when starting a new project that you intend to manage with Git.

Run git remote add origin https://github.com/<USERNAME>/<REPO-NAME>.git to add a reference to a new remote or online repository with the default name “origin”. Behind the scenes, Git will create entries in the .git/config file of the local repository storing the remote name and its URL. This allows you to fetch from, push to, or interact with the remote repository.

Then run git add <YOUR-HTML-FILE> to stage changes to the HTML file. By using this command, we’re telling Git to prepare the HTML file’s changes we want to include in the next commit.

Run git commit -m "<YOUR-MESSAGE>" to save or “commit” your changes to the local repository, with a message describing the changes you’ve made.

Run git push -u origin master to transfer (or “push”) the commit to the master branch of the online repository. (If your local default branch is named main instead of master, push to main.)
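Putting the Git steps together, the whole local sequence looks like this; the folder, username, repo, and file names are placeholders:

cd <YOUR-FOLDER>
git init
git remote add origin https://github.com/<USERNAME>/<REPO-NAME>.git
git add <YOUR-HTML-FILE>
git commit -m "Add static website page"
git push -u origin master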

Now if you go to your repository in your GitHub account online, you will see the HTML file stored there.

Now, using the same terminal window, let’s authenticate to our AWS account to use the AWS CLI.

If you need assistance with this, please refer to my previous article on setting up the AWS CLI.

Once authenticated, run aws s3api create-bucket --bucket <BUCKET-NAME> --region <REGION> to create an S3 bucket within a specific region. (For any region other than us-east-1, you also need to add --create-bucket-configuration LocationConstraint=<REGION>.)

And run aws s3 website s3://<BUCKET-NAME>/ --index-document <HTML-FILE> to configure the S3 bucket for static website hosting and set the HTML file as the default document to serve when visitors access the site’s root URL.

Then run aws s3api get-public-access-block --bucket <BUCKET-NAME> to view the Public Access Block configuration for the S3 bucket. The configuration values are set to true, which means public access is blocked.
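On a bucket with the block in place, the output looks like this:

{
    "PublicAccessBlockConfiguration": {
        "BlockPublicAcls": true,
        "IgnorePublicAcls": true,
        "BlockPublicPolicy": true,
        "RestrictPublicBuckets": true
    }
}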

And run aws s3api delete-public-access-block --bucket <BUCKET-NAME> to remove the Public Access Block configuration from the S3 bucket.
As a result, the S3 bucket can be made publicly accessible.
Without these settings, the bucket relies on ACLs and bucket policies to control public access, which should be verified to avoid unintended public exposure.

Now let’s create a policy for access to the S3 bucket.
Copy and paste the code below into your Notepad and save the file with the extension “.json”.
This policy grants permissions to retrieve and upload objects to the S3 bucket, as well as list its contents.
Keep in mind, this policy is overly permissive and should not be used in cases where security is a concern.
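Here’s a sketch that matches that description; <BUCKET-NAME> is a placeholder for your bucket’s name:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::<BUCKET-NAME>/*"
        },
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::<BUCKET-NAME>"
        }
    ]
}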

Make sure that you’re in the same directory as the bucket policy file.

Run aws s3api put-bucket-policy --bucket <BUCKET-NAME> --policy file://<BUCKET-POLICY-FILE> to assign the bucket policy to the S3 bucket.

Now make sure that you’re in the same directory as the HTML file.

And run aws s3 cp <HTML-FILE> s3://<BUCKET-NAME>/ to copy the HTML file to the S3 bucket.

Now if you go to your web browser and enter the static website URL, which will be in this format: <BUCKET-NAME>.s3-website-<REGION>.amazonaws.com, you’ll see the HTML file that’s served.

Run aws s3api create-bucket --bucket <ARTIFACT-BUCKET-NAME> --region <REGION> to create another S3 bucket as an Artifact Store.

CI/CD involves building code into artifacts like .zip files, and these artifacts need to be stored between the build and deploy phases.
In this case, we’re going to store these artifacts in a separate S3 bucket from the bucket used to host the static website.
It’s a best practice to separate different types of data.

And run aws s3api put-bucket-versioning --bucket <ARTIFACT-BUCKET-NAME> --versioning-configuration Status=Enabled to enable versioning for the Artifact store bucket.
When this feature is enabled, S3 will keep track of all changes to files in the bucket.
So, if you update or delete a file, you can still access older versions of that file.
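You can confirm the setting took effect with:

aws s3api get-bucket-versioning --bucket <ARTIFACT-BUCKET-NAME>

This should return a Status of Enabled.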

Now let’s define an IAM Trust Relationship Policy.

AWS services often need to interact with one another.
For example, CodePipeline might need access to S3.
This policy below facilitates secure cross-service actions by letting CodePipeline assume a role with specific permissions.
In IAM, when you’re working with roles, the trust relationship is important.
It defines which AWS services or accounts can assume a role.
And when you’re integrating AWS with external services like GitHub, trust relationships ensure that permissions are granted securely.

Copy and paste the code below into your Notepad and save the file with the extension “.json”.
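A minimal trustrelationship.json that lets the CodePipeline service assume the role looks like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "codepipeline.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}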

And then run aws iam create-role --role-name CodePipelineRole --assume-role-policy-document file://trustrelationship.json to create a new IAM role named “CodePipelineRole”, with the trust relationship from the trustrelationship.json file attached to this role.

Now let’s define an IAM policy for S3 and CodePipeline.

This policy below provides full read and write access to the S3 bucket and all of its contents and full access to all AWS CodePipeline actions and resources.

Copy and paste the code below into your Notepad and save the file with the extension “.json”.
Keep in mind, this policy is overly permissive and should not be used in cases where security is a concern.
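Here’s a sketch of codepipelinepolicy.json matching that description. It assumes the pipeline needs access to both the website bucket and the artifact bucket, so both are listed; the bucket names are placeholders:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::<BUCKET-NAME>",
                "arn:aws:s3:::<BUCKET-NAME>/*",
                "arn:aws:s3:::<ARTIFACT-BUCKET-NAME>",
                "arn:aws:s3:::<ARTIFACT-BUCKET-NAME>/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": "codepipeline:*",
            "Resource": "*"
        }
    ]
}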

Then run aws iam create-policy --policy-name CodePipelineS3Policy --policy-document file://codepipelinepolicy.json to create a new IAM policy called “CodePipelineS3Policy” based on the permissions defined in the codepipelinepolicy.json file.

And run aws iam attach-role-policy --role-name CodePipelineRole --policy-arn arn:aws:iam::<ACCOUNT-ID>:policy/CodePipelineS3Policy to attach the IAM policy “CodePipelineS3Policy” to the IAM role named “CodePipelineRole.”
Replace <ACCOUNT-ID> with your AWS account ID.

Before we create the pipeline, let’s define the configurations for it.

The code below defines an AWS CodePipeline with 2 Stages:
“Source”, which is set as the GitHub repo.
And “Deploy”, which is set as the S3 bucket that’s hosting the static website.
This automates the process of fetching code from the GitHub repo, and deploying it to the S3 bucket.

Copy and paste the code below into your Notepad and save the file with the extension “.json”.
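Here’s a sketch of what pipeline.json can look like, using the classic version-1 GitHub source action (authenticated with your personal access token) and an S3 deploy action. The pipeline name, account ID, repo details, branch, and bucket names are placeholders:

{
    "pipeline": {
        "name": "<PIPELINE-NAME>",
        "roleArn": "arn:aws:iam::<ACCOUNT-ID>:role/CodePipelineRole",
        "artifactStore": {
            "type": "S3",
            "location": "<ARTIFACT-BUCKET-NAME>"
        },
        "stages": [
            {
                "name": "Source",
                "actions": [
                    {
                        "name": "SourceAction",
                        "actionTypeId": {
                            "category": "Source",
                            "owner": "ThirdParty",
                            "provider": "GitHub",
                            "version": "1"
                        },
                        "outputArtifacts": [
                            {
                                "name": "SourceOutput"
                            }
                        ],
                        "configuration": {
                            "Owner": "<USERNAME>",
                            "Repo": "<REPO-NAME>",
                            "Branch": "master",
                            "OAuthToken": "<YOUR-PERSONAL-ACCESS-TOKEN>"
                        }
                    }
                ]
            },
            {
                "name": "Deploy",
                "actions": [
                    {
                        "name": "DeployAction",
                        "actionTypeId": {
                            "category": "Deploy",
                            "owner": "AWS",
                            "provider": "S3",
                            "version": "1"
                        },
                        "inputArtifacts": [
                            {
                                "name": "SourceOutput"
                            }
                        ],
                        "configuration": {
                            "BucketName": "<BUCKET-NAME>",
                            "Extract": "true"
                        }
                    }
                ]
            }
        ]
    }
}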

Then run aws codepipeline create-pipeline --cli-input-json file://pipeline.json to automate the setup of the CodePipeline in AWS using the configurations contained in the .json file.

And run aws codepipeline start-pipeline-execution --name <PIPELINE-NAME> to trigger the execution of the CodePipeline.

Make sure that you’re in the same directory as the HTML file.

Run echo "<!-- YOUR-MESSAGE -->" >> <HTML-FILE> to add a comment to the HTML file.

Run git add <HTML-FILE> to stage or prepare changes made to the HTML file to be included in the next commit.

Run git commit -m "<YOUR-MESSAGE>" to save or “commit” your changes to the local repository, with a message describing the changes you’ve made.

Run git push to send the committed changes from the local repository to the remote repository hosted on GitHub.

You can run aws codepipeline list-pipeline-executions --pipeline-name <PIPELINE-NAME> --max-items 3 to retrieve a list of the most recent executions of the CodePipeline.
This can be used to track the success or failure of each execution.

You can also go to CodePipeline in the AWS Console to see the pipeline we created, and the execution triggered by the change we made.

Now let’s say the website has become very popular around the world, but users in some regions are complaining about slow load times. You have been asked to add CloudFront as a Content Delivery Network (CDN) for the static website. CloudFront should cache your static webpage and only allow HTTPS traffic to your site.

Let’s run aws cloudfront create-distribution --origin-domain-name <BUCKET-NAME>.s3.amazonaws.com --default-root-object <HTML-FILE> to create a CloudFront distribution that uses the S3 bucket as its origin and the HTML file as the Default Root Object.
CloudFront will cache and distribute the content from the S3 bucket to edge locations around the world, reducing latency and improving the speed for users that access the static website.
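The command’s output is long; the fields you’ll need later look roughly like this (the IDs and domain below are made-up examples):

{
    "ETag": "E2EXAMPLEETAG",
    "Distribution": {
        "Id": "E1EXAMPLEDISTID",
        "DomainName": "d1234abcd5678e.cloudfront.net",
        ...
    }
}

Note the Id (used in the commands that follow), the ETag (used with --if-match), and the DomainName (used to browse to the site later).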

We need to ensure that CloudFront only allows HTTPS traffic to the site.

Run aws cloudfront get-distribution-config --id <DISTRIBUTION-ID> > current-config.json to retrieve and save the configuration of the CloudFront distribution to a .json file called current-config.json. Note that the output wraps the settings in a DistributionConfig object alongside a fresh ETag: copy the ETag for the update step, and trim the file so it contains only the contents of DistributionConfig before passing it back to AWS.

Now locate the current-config.json file on your computer and open it.

For ViewerProtocolPolicy, change “allow-all” to “https-only” and save the file.

Your current-config.json file should look similar to what’s below:

{
    "CallerReference": "cli-1691885883-826781",
    "Aliases": {
        "Quantity": 0
    },
    "DefaultRootObject": "simplewebpage.html",
    "Origins": {
        "Quantity": 1,
        "Items": [
            {
                "Id": "chinelocicdbucket.s3.amazonaws.com-1691885883-760186",
                "DomainName": "chinelocicdbucket.s3.amazonaws.com",
                "OriginPath": "",
                "CustomHeaders": {
                    "Quantity": 0
                },
                "S3OriginConfig": {
                    "OriginAccessIdentity": ""
                },
                "ConnectionAttempts": 3,
                "ConnectionTimeout": 10,
                "OriginShield": {
                    "Enabled": false
                },
                "OriginAccessControlId": ""
            }
        ]
    },
    "OriginGroups": {
        "Quantity": 0
    },
    "DefaultCacheBehavior": {
        "TargetOriginId": "chinelocicdbucket.s3.amazonaws.com-1691885883-760186",
        "TrustedSigners": {
            "Enabled": false,
            "Quantity": 0
        },
        "TrustedKeyGroups": {
            "Enabled": false,
            "Quantity": 0
        },
        "ViewerProtocolPolicy": "https-only",
        "AllowedMethods": {
            "Quantity": 2,
            "Items": [
                "HEAD",
                "GET"
            ],
            "CachedMethods": {
                "Quantity": 2,
                "Items": [
                    "HEAD",
                    "GET"
                ]
            }
        },
        "SmoothStreaming": false,
        "Compress": false,
        "LambdaFunctionAssociations": {
            "Quantity": 0
        },
        "FunctionAssociations": {
            "Quantity": 0
        },
        "FieldLevelEncryptionId": "",
        "ForwardedValues": {
            "QueryString": false,
            "Cookies": {
                "Forward": "none"
            },
            "Headers": {
                "Quantity": 0
            },
            "QueryStringCacheKeys": {
                "Quantity": 0
            }
        },
        "MinTTL": 0,
        "DefaultTTL": 86400,
        "MaxTTL": 31536000
    },
    "CacheBehaviors": {
        "Quantity": 0
    },
    "CustomErrorResponses": {
        "Quantity": 0
    },
    "Comment": "",
    "Logging": {
        "Enabled": false,
        "IncludeCookies": false,
        "Bucket": "",
        "Prefix": ""
    },
    "PriceClass": "PriceClass_All",
    "Enabled": true,
    "ViewerCertificate": {
        "CloudFrontDefaultCertificate": true,
        "SSLSupportMethod": "vip",
        "MinimumProtocolVersion": "TLSv1",
        "CertificateSource": "cloudfront"
    },
    "Restrictions": {
        "GeoRestriction": {
            "RestrictionType": "none",
            "Quantity": 0
        }
    },
    "WebACLId": "",
    "HttpVersion": "http2",
    "IsIPV6Enabled": true,
    "ContinuousDeploymentPolicyId": "",
    "Staging": false
}

Now run aws cloudfront update-distribution --id <DISTRIBUTION-ID> --if-match <ETAG> --distribution-config file://current-config.json to update the CloudFront distribution with the new configuration provided in the current-config.json file.

Paste your CloudFront distribution’s Domain Name (it looks like dxxxxxxxxxxxxx.cloudfront.net and appears in the output of create-distribution) into your web browser and add the Default Root Object at the end.

You will see HTTPS and not HTTP in the URL.

With ViewerProtocolPolicy set to https-only, CloudFront refuses plain HTTP requests to the distribution outright; if you would rather have HTTP requests automatically redirected to HTTPS, use redirect-to-https instead.

And that’s it. Thank you for making it this far and taking the time out to read this article. It’s greatly appreciated.
