Deploying AWS CloudFormation Templates Through AWS CodePipeline

A robust and efficient CI/CD pipeline can do wonders for your development cycles. In implementing a CI/CD pipeline, you have to think about how new code can be processed and deployed with as much automation as possible. This is where services like AWS CodePipeline come in handy.

AWS CodePipeline automates the release of every iteration by tapping directly into your repository. It works well with GitHub and can be used to trigger additional actions within the AWS ecosystem, including the provisioning of resources using CloudFormation.

Good Source Code Management as a Start

Before we get into the details of how you can provision cloud resources, automate the deployment of CloudFormation templates, and trigger everything with a simple commit to GitHub, we need to first discuss how good source code management is the key to this streamlined process. You basically need two branches: the source branch and the release branch.

The CI/CD pipeline automatically pulls committed code from GitHub. Once the code is tested and built, it is merged into the release branch and flagged for deployment. The process between these two steps can be highly customized; you can, for instance, use CodeBuild to integrate TaskCat and run tests on the code automatically.
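As an illustration, TaskCat reads its test definitions from a config file kept in the repository. A minimal sketch of such a file, with placeholder project and template names, might look like this:

```yaml
project:
  name: sample-pipeline-project   # placeholder project name
  regions:
    - us-east-1                   # region(s) TaskCat launches test stacks in
tests:
  default:
    # Path to the CloudFormation template under test (placeholder path)
    template: templates/app.template.yaml
```

With a file like this in place, a CodeBuild step can invoke TaskCat to launch the template in the listed regions and report pass/fail before the code is merged.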

Once the code is merged, however, the process is a lot simpler. CodePipeline can be told to automate the deployment process completely. As long as you have a valid GitHub key, the possibilities are endless. Additional services like AWS Lambda can also be integrated to run custom code and automate tasks further.

Selecting a CloudFormation Template

There are ready-made templates that you can use out of the box, but every app has specific needs and customizing your cloud architecture is always the best way to go. You can use the AWS CloudFormation stack setup wizard to craft a suitable architecture for your app. There are several parameters that need to be customized, and they are:

  • OutputBucketName for identifying the S3 bucket that is used for your zipped code
  • AllowedIps for identifying IP CIDR blocks to work with Git IP authentication
  • GitToken for adding your personal access token
  • ApiSecret for when you need to bypass the IP authentication and use webhooks
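To sketch how these might appear inside the template itself, a Parameters section along these lines could declare them (the types and descriptions here are illustrative assumptions, not the exact definitions from the wizard's template):

```yaml
Parameters:
  OutputBucketName:
    Type: String
    Description: Name of the S3 bucket that stores the zipped source code
  AllowedIps:
    Type: String
    Description: CIDR blocks permitted for Git IP authentication
  GitToken:
    Type: String
    NoEcho: true   # hide the token in console and API output
    Description: Personal access token for the Git provider
  ApiSecret:
    Type: String
    NoEcho: true   # hide the secret in console and API output
    Description: Secret used by webhooks to bypass IP authentication
```

Marking the sensitive parameters `NoEcho` keeps their values out of the console and describe-stack output.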

You also select a particular AWS region to work with at the beginning of the process, but everything else is configured automatically. You can then start the stack creation and wait until the process completes. Note down the values in the Outputs tab for the steps that follow.

The next part is configuring the repository to use, and that means configuring webhooks or git pull so that committed code is stored as zip files in the designated S3 bucket. The zip file gets updated every time new code is committed to the repository. This is what triggers the CodePipeline operations.

So, What Is AWS CodePipeline?

AWS CodePipeline is a continuous delivery service that helps you automate your pipelines for fast and reliable application and infrastructure updates. When new commits happen, AWS CodePipeline automates the build, test, and deployment stages of your new release. It integrates your AWS cloud environments with third-party services such as GitHub, or with your own custom plugin.

Setting up AWS CodePipeline is relatively easy. We outline the steps in full below.

Steps to Deploy a CloudFormation Template Through AWS CodePipeline

Now that we have a CloudFormation template, we need to deploy it through AWS CodePipeline. In this case we will be using GitHub, so make sure to place your template in GitHub before you start.
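If you do not have a template handy, a minimal one is enough to follow along. The sketch below (the resource name is arbitrary) simply creates a versioned S3 bucket, which is a cheap way to verify the pipeline end to end:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal example stack deployed through AWS CodePipeline
Resources:
  ExampleBucket:
    Type: AWS::S3::Bucket   # a single, inexpensive resource to prove the pipeline works
    Properties:
      VersioningConfiguration:
        Status: Enabled
```

Commit this file to the repository and branch you plan to point the pipeline at.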

Your first step is to create two roles in your IAM console: one custom and one default. These roles are created specifically so that CodePipeline can access your other AWS services.

Keep in mind that there is no ready-made CodePipeline role provided by AWS, but we can select EC2 as the trusted entity and customize the role for this case.

Our next step is to create permissions; here you can add any tags you may need before creating your role.

Once this role is created, we can attach our custom inline policy, which is given below.

The inline policy:

{
    "Statement": [
        {
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:GetBucketVersioning"
            ],
            "Resource": "*",
            "Effect": "Allow"
        },
        {
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::codepipeline*",
                "arn:aws:s3:::elasticbeanstalk*"
            ],
            "Effect": "Allow"
        },
        {
            "Action": [
                "codecommit:CancelUploadArchive",
                "codecommit:GetBranch",
                "codecommit:GetCommit",
                "codecommit:GetUploadArchiveStatus",
                "codecommit:UploadArchive"
            ],
            "Resource": "*",
            "Effect": "Allow"
        },
        {
            "Action": [
                "codedeploy:CreateDeployment",
                "codedeploy:GetApplicationRevision",
                "codedeploy:GetDeployment",
                "codedeploy:GetDeploymentConfig",
                "codedeploy:RegisterApplicationRevision"
            ],
            "Resource": "*",
            "Effect": "Allow"
        },
        {
            "Action": [
                "elasticbeanstalk:*",
                "ec2:*",
                "elasticloadbalancing:*",
                "autoscaling:*",
                "s3:*",
                "cloudformation:*",
                "iam:PassRole"
            ],
            "Resource": "*",
            "Effect": "Allow"
        },
        {
            "Action": [
                "lambda:InvokeFunction",
                "lambda:ListFunctions"
            ],
            "Resource": "*",
            "Effect": "Allow"
        },
        {
            "Action": [
                "opsworks:CreateDeployment",
                "opsworks:DescribeApps",
                "opsworks:DescribeCommands",
                "opsworks:DescribeDeployments",
                "opsworks:DescribeInstances",
                "opsworks:DescribeStacks",
                "opsworks:UpdateApp",
                "opsworks:UpdateStack"
            ],
            "Resource": "*",
            "Effect": "Allow"
        },
        {
            "Action": [
                "cloudformation:CreateStack",
                "cloudformation:DeleteStack",
                "cloudformation:DescribeStacks",
                "cloudformation:UpdateStack",
                "cloudformation:CreateChangeSet",
                "cloudformation:DeleteChangeSet",
                "cloudformation:DescribeChangeSet",
                "cloudformation:ExecuteChangeSet",
                "cloudformation:SetStackPolicy",
                "cloudformation:ValidateTemplate",
                "iam:PassRole"
            ],
            "Resource": "*",
            "Effect": "Allow"
        },
        {
            "Action": [
                "codebuild:BatchGetBuilds",
                "codebuild:StartBuild"
            ],
            "Resource": "*",
            "Effect": "Allow"
        }
    ],
    "Version": "2012-10-17"
}

Ensure you edit the role's trust relationship, as it is not created automatically in this case: open the role's Trust relationships tab, edit the policy document so that CodePipeline is allowed to assume the role, and click Update Trust Policy.
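As an illustration, a standard trust policy that lets the CodePipeline service assume the role looks like this:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "codepipeline.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
```

Without this trust policy, CodePipeline cannot use the role even though the inline permissions policy is attached.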

As mentioned above, we need to create two roles. The next role is specifically for CloudFormation to access your services. In this case, you'll be giving CloudFormation full access to your S3 service and, as before, adding any tags you may need before creating the role.

Under the ‘Services’ tab, you can edit your pipeline settings. First, create a pipeline, specify its name, and attach the service role we created first.

Our next step is to select a source stage. In this example we'll be using GitHub for our code, but Amazon S3, AWS CodeCommit, and Amazon ECR are also available.

If this is your first time connecting to GitHub via CodePipeline, you will need to authorize the application first. You will be prompted for your GitHub credentials, followed by the repository and branch of your code.

The build stage is optional if you plan on going directly to a deployment stage. Choose CloudFormation as your deploy provider and select the region for your deployment. For this example, our action mode will be ‘Create or update a stack’.
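Behind the console form, these choices map onto the CloudFormation deploy action's configuration. A sketch of that configuration, with placeholder stack, artifact, and role names, looks like:

```json
{
    "ActionMode": "CREATE_UPDATE",
    "StackName": "my-app-stack",
    "TemplatePath": "SourceArtifact::template.yaml",
    "RoleArn": "arn:aws:iam::123456789012:role/CloudFormationServiceRole",
    "Capabilities": "CAPABILITY_IAM"
}
```

`CREATE_UPDATE` is the programmatic name for ‘Create or update a stack’, and `TemplatePath` combines the artifact name with the template file name inside it.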

Finally, specify your stack name, artifact name, and the file name of the template in your GitHub source. After choosing Next, the option to create your pipeline will be available.

Open CloudFormation in your AWS Console to confirm that your stack has been created.

And that’s all there is to it! You now have your template deployed and your stack created for your CI/CD pipeline.

Benefits of Using CodePipeline

Automating your CI/CD pipeline with AWS CodePipeline has clear advantages. For starters, the release process becomes fully automated; there is no need to manually build, test, and deploy code. Safeguards also prevent new code from bringing down the entire application.

You also get plenty of room to manage code quality. Additional tools can be integrated, testing can be done on the fly, and further checks—including checks against known attack vectors and coding standards—can be added to the automated pipeline.

Since you also have access to services like Lambda and CodeBuild, the flexibility of CodePipeline becomes a huge advantage. You can have a standard pipeline or a fully customized one without jumping through hoops.

A consistent release process is a real advantage for developers. With the entire process standardized and automated, developers can focus on what really matters: maintaining the highest quality standard with every commit pushed to GitHub.


Ibexlabs is an experienced DevOps & Managed Services provider and an AWS consulting partner. Our AWS Certified DevOps consultancy team evaluates your infrastructure and makes recommendations based on your individual business or personal requirements. Contact us today and set up a free consultation to discuss a custom-built solution tailored just for you.

Prem Sai Majeti
