AWS CodePipeline Create Pipeline For .NET Core

Introduction

In this guide, we'll use AWS CodePipeline to orchestrate a deployment from AWS CodeCommit to AWS Elastic Beanstalk. We'll cover two cases:

  1. A single-container Docker-based .NET Core project
  2. A multi-container Docker-based .NET Core project with an Admin panel, an API, and an Nginx container to route requests

We'll use the following AWS services:

  1. CodePipeline - Manages the whole process
  2. CodeCommit - Stores the code repository
  3. CodeBuild - Builds the Docker image(s)
  4. CodeDeploy - Triggers the Elastic Beanstalk deployment once Docker image(s) are created
  5. Elastic Beanstalk (EBS) - Hosts the resulting site using EC2 instances
  6. S3 - Stores the code between CodeCommit and CodeBuild
  7. Elastic Container Registry (ECR) - Stores the Docker image(s)

IMPORTANT: Create all resources in the same AWS region (e.g. ca-central-1)

Prerequisites

  • Docker-based .NET Core project
  • AWS CodeCommit repository with your code project

Find Your Account ID

  1. Log into the AWS console.
  2. Click your profile dropdown in the top right.
  3. Next to Account, you should see your Account ID. Record this for later.
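
If you have the AWS CLI installed and configured for this account, the same value can be retrieved from a terminal:

# Print the 12-digit account ID for the configured credentials
aws sts get-caller-identity --query Account --output text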

Create an S3 Bucket

  1. Log into the AWS console with the root user or an admin user with the ability to manage S3.
  2. Navigate to the S3 page.
  3. Click the + Create Bucket button.
  4. Under Bucket Name, enter a name. For example, <project name>-deploy-s3.
  5. Make sure Region shows your desired region; if it doesn't, your console is probably set to the wrong region.
  6. Under Bucket settings for Block Public Access, uncheck Block All Public Access (we will change this back later).
  7. Click Create Bucket.
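
If you prefer the AWS CLI, a sketch of the equivalent commands is below (the bucket name is a placeholder; adjust the region to match yours):

# Create the bucket in the target region
aws s3api create-bucket --bucket <project name>-deploy-s3 --region ca-central-1 \
  --create-bucket-configuration LocationConstraint=ca-central-1

# Temporarily disable Block Public Access (re-enable it later, as step 6 notes)
aws s3api put-public-access-block --bucket <project name>-deploy-s3 \
  --public-access-block-configuration BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false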

Create the ECR

  1. Log into the AWS console with the root user or an admin user with the ability to manage ECR.
  2. Navigate to the ECR page.
  3. Select Repositories on the left-hand nav.
  4. Click the Create Repository button at the top right.
  5. Enter a name in the Repository Name field. This could just be the name of your project.
  6. Click the Create Repository button.
  7. Record the full repository URI from the Repositories list that appears; you'll need it shortly.
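
The repository can also be created and its URI looked up with the AWS CLI (the repository name and region are placeholders):

# Create the repository
aws ecr create-repository --repository-name <repository name> --region ca-central-1

# Print the full repository URI to record for later
aws ecr describe-repositories --repository-names <repository name> \
  --query 'repositories[0].repositoryUri' --output text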

Create the EBS Environment

  1. Create the environment. We won't go into the details here as it's very project-specific. The important thing to remember is to choose the correct Docker environment type (single or multi-container).
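
If you use the EB CLI, a minimal sketch of creating the application and environment looks like the following (the application, environment, and platform names are placeholders; the multi-container Docker platform name differs from the single-container one):

# Initialize the EB application in this region with a Docker platform
eb init <application name> --platform docker --region ca-central-1

# Create the environment that the pipeline will deploy to
eb create <environment name>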

Add Project Files

We need to add a few files to our project that AWS will use to perform the build and deployment operations.

buildspec Files

Single Container Project

  1. Add a buildspec.yml file at the root of your project. It should look a lot like the following:
version: 0.2

phases:
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com
  build:
    commands:
      - echo Build started on `date`
      - echo Building the Docker image...          
      - docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG -f ./Dockerfile .
      - docker tag $IMAGE_REPO_NAME:$IMAGE_TAG $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG      
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Pushing the Docker image...
      - docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
artifacts:
  files:
    - Dockerrun.aws.json
  • The "pre_build" phase just connects to ECR.
  • The "build" phase actually builds and tags your Docker image. You may need to change the path of your Dockerfile to match your project structure.
  • The "post_build" phase pushes the Docker image to ECR.
  • The "artifacts" section is very important as EBS will need the Dockerrun.aws.json file to actually deploy your Docker images. We'll create this file next.
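
Before committing the buildspec, it can help to run the same Docker commands locally to confirm the Dockerfile path and tag are correct (the image name here is just a local placeholder):

# Build the image exactly as the buildspec's build phase will
docker build -t <project name>:latest -f ./Dockerfile .

# Optionally run it locally to sanity-check the container
docker run --rm -p 8080:80 <project name>:latest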

Multi Container Project

  1. Perform the above Single Container Project steps for each of your containers. Name the files buildspec.<project identifier>.yml, where <project identifier> can be anything (e.g. admin, api, or nginx). The location of your Dockerfiles will also need to be adjusted to point to the correct files.

Dockerrun.aws.json File

Single Container Project

Add a Dockerrun.aws.json file to the root of your project. Here's an example:

{
  "AWSEBDockerrunVersion": "1",
  "Image": {
    "Name": "<ECR URI recorded earlier>:latest",
    "Update": "true"
  },
  "Ports": [
    {
      "ContainerPort": "80",
      "HostPort": "80"
    }
  ]
}

Multi Container Project

Add a Dockerrun.aws.json file to the root of your project. Here's an example that has 3 containers (admin, api, nginx):

{
  "AWSEBDockerrunVersion": 2,
  "containerDefinitions": [
    {
      "name": "webapi",
      "image": "<ECR URI recorded earlier>:api",
      "hostname": "webapi",
      "essential": true,
      "memory": 876
    },
    {
      "name": "admin",
      "image": "<ECR URI recorded earlier>:admin",
      "hostname": "admin",
      "essential": true,
      "memory": 674
    },
    {
      "name": "nginx",
      "image": "<ECR URI recorded earlier>:nginx",
      "hostname": "nginx",
      "essential": true,
      "portMappings": [
        {
          "hostPort": 80,
          "containerPort": 80
        }
      ],
      "links": [ "admin", "webapi" ],
      "memory": 250
    }
  ]
}

An important item to note is the "links" section in the nginx entry. This tells EBS that the nginx container can reach the admin and webapi containers, which is necessary for routing.

Give EBS Access to ECR

  1. Log into the AWS console with the root user or an admin user with the ability to manage IAM.
  2. Navigate to the IAM page.
  3. Click Roles in the left nav.
  4. Search for aws-elasticbeanstalk-ec2-role.
  5. Click on the aws-elasticbeanstalk-ec2-role role.
  6. Click Attach Policies.
  7. Search for AmazonEC2ContainerRegistryReadOnly.
  8. Add a check next to it.
  9. Click Attach Policy.
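
The same attachment can be made with the AWS CLI:

# Attach the read-only ECR policy to the Elastic Beanstalk EC2 instance role
aws iam attach-role-policy --role-name aws-elasticbeanstalk-ec2-role \
  --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryReadOnly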

Create the CodePipeline

  1. Log into the AWS console with the root user or an admin user with the ability to manage CodePipeline.
  2. Navigate to the CodePipeline page.
  3. Click the Create Pipeline button.
  4. Enter a Pipeline Name (e.g. <project-name>-pipeline).
  5. Ensure New Service Role is selected under Service Role. Role name should auto populate.
  6. Expand Advanced Settings.
  7. Ensure "Default Location" and "Default AWS Managed Key" are selected.
  8. Click Next.

Add Source Stage

  1. Under Source Provider, select AWS CodeCommit.
  2. Under Repository Name, select your CodeCommit repository.
  3. Under Branch Name, select the CodeCommit branch to use. For example, if this pipeline is for staging, the branch might be called "development".
  4. Leave Change Detection Options set to Amazon CloudWatch Events.
  5. Click Next.
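
If you want to confirm the repository and branch names before selecting them, the AWS CLI can list them (the repository name is a placeholder):

# List branches in the CodeCommit repository
aws codecommit list-branches --repository-name <repository name>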

Add Build Stage

NOTE: If creating a multi-container pipeline, follow these steps for a single one of the projects and we'll add the others later.

  1. Under Build Provider, select AWS CodeBuild.
  2. Under Region, ensure it's set to the correct region. If not, you're probably creating your pipeline in the wrong region; start over with the correct region set.
  3. Under Project Name, click Create Project. A popup will open.
  4. Enter a Project Name (e.g. <project-name>-build for single container or <project-name>-<project-identifier>-build for multi-container).
  5. Under Environment Image, ensure Managed Image is selected.
  6. Under Operating System, choose Ubuntu.
  7. Under Runtime, choose Standard.
  8. Under Image, choose aws/codebuild/standard:4.0.
  9. Under Privileged, ensure the checkbox is selected. This is critical or the Docker containers won't build.
  10. Under Service Role, ensure New Service Role is selected.
  11. Under Role Name, add a name (e.g. <project-name>-build for single container or <project-name>-<project-identifier>-build for multi-container).
  12. Expand Additional Configuration.
  13. Leave all defaults until you reach Environment Variables.
  14. Create a variable called AWS_DEFAULT_REGION and set it to your region (e.g. ca-central-1).
  15. Create a variable called IMAGE_REPO_NAME and set it to the short name of your repository (e.g. the name of your project).
  16. Create a variable called IMAGE_TAG and set it according to:
    1. Single Container: latest
    2. Multi Container: <project-identifier>
  17. Create a variable called AWS_ACCOUNT_ID and set it to your account ID as recorded earlier.
  18. Under Build Specifications, ensure Use a Buildspec File is selected.
  19. Enter the name of your buildspec file. For single container projects this will just be buildspec.yml. For multi-container it will be buildspec.<project identifier>.yml.
  20. Click Continue to CodePipeline. The popup will close and you'll be back at the Pipeline wizard.
  21. Click Next.
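
Once the build project exists, you can double-check the privileged flag and environment variables from the CLI (the project name is whatever you entered above):

# Show the build environment, including privilegedMode and environmentVariables
aws codebuild batch-get-projects --names <project-name>-build \
  --query 'projects[0].environment'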

Add Deploy Stage

  1. Under Deploy Provider, choose AWS Elastic Beanstalk.
  2. Under Application Name, choose the EBS application you created earlier.
  3. Under Environment Name, choose the EBS environment you created earlier.
  4. Click Next.
  5. Click Create Pipeline. You'll be taken to the pipeline overview.
  6. Click Stop Execution (we're not quite done setting up, so the build would fail anyway). This will open a popup.
  7. Under Select Execution choose the execution that's running (should only be one option).
  8. Under Choose a Stop Mode for the Execution, choose Stop and Abandon.
  9. Click Stop. The popup will close and the pipeline will stop.
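
The execution can also be stopped from the CLI (the pipeline name and execution ID are placeholders; the ID comes from the first command):

# Find the ID of the in-progress execution
aws codepipeline list-pipeline-executions --pipeline-name <project-name>-pipeline

# Stop and abandon it
aws codepipeline stop-pipeline-execution --pipeline-name <project-name>-pipeline \
  --pipeline-execution-id <execution id> --abandon --reason "Setup not finished"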

Add Remaining Build Tasks for Multi Container Projects

Skip this step for single container projects.

  1. Click Edit at the top right.
  2. Click Edit Stage in the Edit: Build section.
  3. Perform the following for each remaining project.
    1. Click the + Add Action square.
    2. Under Action Name, enter Build<Capitalized project identifier>.
    3. Under Action Provider, choose AWS CodeBuild.
    4. Under Input Artifacts, choose SourceArtifact.
    5. Under Project Name, click Create Project. The same popup as before will open.
      1. Enter a Project Name (e.g. <project-name>-<project-identifier>-build).
      2. Under Environment Image, ensure Managed Image is selected.
      3. Under Operating System, choose Ubuntu.
      4. Under Runtime, choose Standard.
      5. Under Image, choose aws/codebuild/standard:4.0.
      6. Under Privileged, ensure the checkbox is selected. This is critical or the Docker containers won't build.
      7. Under Service Role, select Existing Service Role.
      8. Under Role ARN, select the service role you created in the initial build project setup. It should be named something like <project-name>-<project-identifier>-build.
      9. Expand Additional Configuration.
      10. Leave all defaults until you reach Environment Variables.
      11. Create a variable called AWS_DEFAULT_REGION and set it to your region (e.g. ca-central-1).
      12. Create a variable called IMAGE_REPO_NAME and set it to the short name of your repository (e.g. the name of your project).
      13. Create a variable called IMAGE_TAG and set it to this container's <project-identifier>.
      14. Create a variable called AWS_ACCOUNT_ID and set it to your account ID as recorded earlier.
      15. Under Build Specifications, ensure Use a Buildspec File is selected.
      16. Enter the name of the buildspec file for this container (buildspec.<project identifier>.yml).
      17. Click Continue to CodePipeline. The popup will close and you'll be back at the Add Action wizard.
    6. Click Done.

Final Permissions

  1. Log into the AWS console with the root user or an admin user with the ability to manage IAM.
  2. Navigate to the IAM page.
  3. Click Roles in the left nav.
  4. Search for codebuild.
  5. Click on the role for your build. It should be something like <project-name>-build (or <project-name>-<project-identifier>-build for multi-container projects).
  6. On the Permissions tab, click Attach Policies.

ECR Permissions

  1. Click Create Policy. This will open a new browser window.
  2. Click the JSON tab.
  3. Replace the existing content with the following (make sure to replace <region>, <account id>, and <repository name> with the correct values recorded earlier):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "ecr:CompleteLayerUpload",
                "ecr:UploadLayerPart",
                "ecr:InitiateLayerUpload",
                "ecr:BatchCheckLayerAvailability",
                "ecr:PutImage"
            ],
            "Resource": "arn:aws:ecr:<region>:<account id>:repository/<repository name>"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "ecr:GetAuthorizationToken",
            "Resource": "*"
        }
    ]
}
  4. Click Review Policy.
  5. Enter a name. For example, <project name>-ecr.
  6. Click Create Policy.
  7. Close the Create Policy browser window.

S3 Permissions

  1. Click Create Policy.
  2. Click the JSON tab.
  3. Replace the existing content with the following (make sure to replace <project name> with the name of your S3 bucket recorded earlier):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:GetBucketAcl",
                "s3:GetBucketLocation",
                "s3:GetObjectVersion"
            ],
            "Resource": [
                "arn:aws:s3:::<project name>/*",
                "arn:aws:s3:::<project name>"
            ]
        }
    ]
}
  4. Click Review Policy.
  5. Enter a name. For example, <project name>-s3.
  6. Click Create Policy.
  7. Close the Create Policy browser window.
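
Both policies can also be created from the CLI by saving the JSON above to local files first (the file names are placeholders):

# Create the ECR policy from a local JSON file
aws iam create-policy --policy-name <project name>-ecr --policy-document file://ecr-policy.json

# Create the S3 policy the same way
aws iam create-policy --policy-name <project name>-s3 --policy-document file://s3-policy.json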

Add the New Policies

  1. In the original IAM browser window, click the Refresh button at the top right of the policy list.
  2. Find each of the new policies and add a check beside them.
  3. Click the Attach Policy button.
  4. IMPORTANT: Return to the S3 page.
  5. Find your S3 bucket, edit its permissions, and re-select Block All Public Access.
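
The CLI equivalents for attaching the policies and re-enabling Block Public Access look like this (the role name, account ID, and bucket name are placeholders):

# Attach the customer-managed policies to the CodeBuild role
aws iam attach-role-policy --role-name <codebuild role name> \
  --policy-arn arn:aws:iam::<account id>:policy/<project name>-ecr
aws iam attach-role-policy --role-name <codebuild role name> \
  --policy-arn arn:aws:iam::<account id>:policy/<project name>-s3

# Re-enable Block Public Access on the deployment bucket
aws s3api put-public-access-block --bucket <project name>-deploy-s3 \
  --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true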

Test Your Pipeline

  1. Return to the CodePipeline page.
  2. Click on your new Pipeline.
  3. Click the Release Change button at the top right.
  4. Click Release.
  5. The pipeline should now run all the way through.
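
You can also trigger and watch the run from the CLI (the pipeline name is a placeholder):

# Start a new execution
aws codepipeline start-pipeline-execution --name <project-name>-pipeline

# Check the state of each stage
aws codepipeline get-pipeline-state --name <project-name>-pipeline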

Troubleshooting Tips

  • If a stage fails, a small Details link should appear on the failed action. Clicking that link will show the log for that action.
  • Sometimes the log doesn't show enough information. Switching the Details view to the Phase Details tab can show more.

Build Task Stops at "Waiting for DOWNLOAD_SOURCE"

This is likely caused by CodeBuild not having access to the S3 bucket containing the SourceArtifact. Check the S3 permissions from the Final Permissions section above.
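
Two CLI checks that can help narrow this down (the role name and build ID are placeholders):

# Confirm the S3 policy is actually attached to the CodeBuild role
aws iam list-attached-role-policies --role-name <codebuild role name>

# Inspect which phase the build stalled in
aws codebuild batch-get-builds --ids <build id> --query 'builds[0].phases'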
