Using Docker + AWS to build, deploy and scale your application

By Brandon Klimek

I recently worked to develop a software platform that relied on Spring Boot and Docker to prop up an API. Being the only developer on the project, I needed to find a way to quickly and efficiently deploy new releases. However, I found many solutions overwhelming to set up.

That was until I discovered AWS has tools that allow any developer to quickly build and deploy their application.

In this 30-minute tutorial, you will discover how to utilize the following technologies: AWS CodeCommit, CodeBuild, Elastic Beanstalk, and CodePipeline.

Once finished, you will have a Docker application running that automatically builds your software on commit and deploys it to Elastic Beanstalk, sitting behind a load balancer for scalability. This continuous integration pipeline will let you worry less about your deployments and get back to focusing on feature development within your application.

Here is the order in which to configure services:

  1. Git repository initialization using CodeCommit
  2. CodeBuild Setup
  3. Elastic Beanstalk Configuration
  4. CodePipeline Configuration
Background Knowledge

I am using Docker for this tutorial application. However, AWS Elastic Beanstalk supports a wide range of preconfigured environments: .NET, Java, Node.js, PHP, Python, and Ruby. Docker was chosen for this tutorial so that the reader can focus more on the build process and less on project setup. With that being said, I will not be diving deeply into Docker. If you wish to learn more about Docker, start by reading the introduction on the Docker website.

The Application

The example Spring Boot source code that will be used can be found at: https://github.com/sixthpoint/Docker-AWS-CodePipeline

The application is a Spring Boot project configured to run on port 5000 and has a REST controller with a single endpoint.
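The port setting is worth noting: in a Spring Boot project it is typically configured in src/main/resources/application.properties. The exact file in the repository may differ, but a minimal sketch looks like:

```properties
# Run the embedded server on port 5000 instead of Spring Boot's default 8080
server.port=5000
```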

The REST controller is very basic. It maps the /api/ path to a method that returns a list of strings in JSON format. This is the endpoint we will use to verify that our application has successfully built and deployed on Elastic Beanstalk.

ApiController.java
@RestController
@RequestMapping( value = "/api" )
public class ApiController {

    @RequestMapping( value = "/", method = RequestMethod.GET )
    public List<String> index() {
        List<String> s = new ArrayList<>();
        s.add("Docker + AWS CodePipline Tutorial");
        s.add("Learn more at: https://github.com/sixthpoint/Docker-AWS-CodePipeline");
        return s;
    }
}
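Given the controller above, a GET request to the /api/ endpoint should return the two strings serialized as a JSON array, roughly:

```json
[
  "Docker + AWS CodePipline Tutorial",
  "Learn more at: https://github.com/sixthpoint/Docker-AWS-CodePipeline"
]
```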

Building the application with Maven produces an example-1.0.0-SNAPSHOT.jar file. This file is important to reference in our Dockerfile.

Maven build:
mvn clean install

This produces target/example-1.0.0-SNAPSHOT.jar. The Dockerfile below uses an Alpine Linux flavor of the OpenJDK image to add, expose, and run the Spring Boot application.
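The jar name follows Maven's standard artifactId-version naming, so the project's pom.xml presumably declares coordinates along these lines (a sketch, not the verbatim file):

```xml
<!-- Coordinates that produce target/example-1.0.0-SNAPSHOT.jar -->
<artifactId>example</artifactId>
<version>1.0.0-SNAPSHOT</version>
```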

Dockerfile
FROM openjdk:8-jdk-alpine
VOLUME /tmp
ADD target/example-1.0.0-SNAPSHOT.jar app.jar
EXPOSE 5000
ENV JAVA_OPTS=""
ENTRYPOINT [ "sh", "-c", "java $JAVA_OPTS -Djava.security.egd=file:/dev/./urandom -jar /app.jar" ]
1. Git repository initialization using CodeCommit

First things first: we need a Git repository to build our code from. AWS CodeCommit is inexpensive, reliable, and secure; under the hood, repository data is stored durably in Amazon S3 and DynamoDB.

Begin by logging into your AWS console and creating a repository in CodeCommit. For the purposes of this tutorial, I have given the repository the same name as the Spring Boot application. Once created, you will be presented with the standard HTTPS and SSH URLs of the repository.

The above example generated the following repository location. Notice that if I try to clone the repository, access is denied:

git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/DockerCodePipleline

This is because the repository is private. AWS offers a flexible solution to authorize users to access a range of products in AWS using IAM.

1A. Configuring Identity and Access Management (IAM)

IAM or Identity and Access Management enables you to securely control access to AWS services and resources. To authorize a user to access our private git repository, navigate to the IAM services page. Begin by adding a user. I have named the user the same name as the project and git repository. Choose programmatic access which will allow for policies to be added.

In order to allow this new user to fully administer our new git repository, attach the AWSCodeCommitFullAccess policy. Once added, click through to finish creating your user.
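AWSCodeCommitFullAccess is the quickest option. If you prefer least privilege, a custom identity-based policy scoped to just this repository could look like the following sketch (the account ID is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["codecommit:GitPull", "codecommit:GitPush"],
      "Resource": "arn:aws:codecommit:us-east-1:123456789012:DockerCodePipleline"
    }
  ]
}
```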

Now that a user has been created with the correct policies, Git credentials are needed to work with the new CodeCommit repository. Navigate to the new user and look for “HTTPS Git credentials for AWS CodeCommit,” shown below. Generate a new username and password and download the credentials file when prompted. Inside that file is the information needed to access your repository.

Note: Only two keys are allowed per user at this time. If you lose your key, a new one will need to be generated to access the repository. For more in-depth information on setting up git credentials in AWS, check out the guide for setting up HTTPS users using Git credentials.
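As an alternative to static Git credentials, the AWS CLI ships a credential helper that signs Git requests with your IAM access keys. Enabling it in ~/.gitconfig looks like this (assumes the AWS CLI is installed and configured):

```ini
[credential]
    helper = !aws codecommit credential-helper $@
    UseHttpPath = true
```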

1B. Moving the code to the new CodeCommit repository

With the new repository created, clone the GitHub repository holding our sample Spring Boot application, change the remote to your new CodeCommit repository location, and finally push the master branch:

git clone https://github.com/sixthpoint/Docker-AWS-CodePipeline.git
git remote set-url origin https://git-codecommit.us-east-1.amazonaws.com/v1/repos/DockerCodePipleline
git push origin master
2. CodeBuild Setup

Now that the CodeCommit repository holds our sample Spring Boot application, the code needs to be built for deployment. Navigate to CodeBuild. CodeBuild is a fully managed build service that compiles your source code and is billed on demand.

Start by creating a new build project and point the source to the AWS CodeCommit repository that was created in Step 1. You can see I have pointed this new build project to the AWS CodeCommit source provider, and specified the DockerCodePipeline repository.

Next, it asks for environment information. The default system image is fine for this build process. The most important part is to tell CodeBuild to use the buildspec.yml. The buildspec contains the commands needed to generate the artifacts that will be deployed to Elastic Beanstalk.

Included in the sample Spring Boot application is a buildspec.yml. This file is used to tell CodeBuild what commands to run in each phase, and what files to bundle up and save in the artifacts.

Additional configuration options can be found at: http://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html.

Buildspec.yml
version: 0.2

phases:
  build:
    commands:
      - mvn clean
      - mvn install
artifacts:
  files:
    - 'Dockerfile'
    - 'target/example-1.0.0-SNAPSHOT.jar'

The final setup step for the build process is to specify the location where the artifacts defined in the buildspec.yml will be stored. In the example below, I put all artifacts in Amazon S3 under the name dockerAWSCodePipeline, in a bucket named irdb-builds. You can use any bucket of your choice, but you must create it in S3 before creating the build project.

The build project is now configured and ready to use. Builds can manually be run from the console creating artifacts stored in S3 as defined above.

3. Elastic Beanstalk Setup

Now that the code is in CodeCommit and the artifacts are built using CodeBuild, the final resource needed is a server to deploy the code. That is where Elastic Beanstalk comes in. Elastic Beanstalk is a service that automatically handles provisioning, load balancing, auto scaling, and more. It is a very powerful tool to help you manage and monitor your application's servers.


Let’s assume, for example, that my API needs four servers due to the number of requests it is receiving. Elastic Beanstalk makes scaling those servers simple through configuration options.

Begin by creating a new web server environment and giving it a name and domain name. This domain name is your AWS domain name; if you have a personal domain, you can point it at the load balancer being created here using Route 53.

The last step of creating your web server environment is to tell Elastic Beanstalk that we want to run Docker and to start from the AWS sample application. Later, our code from CodeBuild will replace the sample application.

The server and environment will take several minutes to start. Once complete, navigate to the configuration page of your new Elastic Beanstalk environment.

By default, the environment has a load balancer installed and auto scaling enabled. A scaling trigger can be set to adjust the number of running instances given certain conditions. For example, I could set my minimum instances to 1 and maximum to 4, and tell the trigger to start a new instance each time CPUUtilization exceeds 75%. The load balancer then spreads requests across the instances currently running.
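Instead of clicking through the console, the same scaling rules can be kept in source control with an .ebextensions config file. Here is a sketch matching the 1-to-4-instance, 75% CPU example above (the file name is hypothetical; the option namespaces come from the Elastic Beanstalk documentation, and the thresholds should be tuned to your workload):

```yaml
# .ebextensions/autoscaling.config (hypothetical file name)
option_settings:
  aws:autoscaling:asg:
    MinSize: 1
    MaxSize: 4
  aws:autoscaling:trigger:
    MeasureName: CPUUtilization
    Unit: Percent
    UpperThreshold: 75
    LowerThreshold: 25
```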

4. CodePipeline Configuration

This is the final piece of the puzzle, bringing steps 1-3 above together. You will notice that up until now we have had to run CodeBuild manually, then go to Elastic Beanstalk and manually specify the artifact for deployment. Wouldn't it be great if all of this could be done for us?

That is exactly what CodePipeline does. It fully automates the building and provisioning of the project. Once new code is checked in, the system magically takes care of the rest. Here is how to set it up.

Begin by creating a new pipeline in CodePipeline. At each step, select the repository, build project, and Elastic Beanstalk environment created in steps 1-3 above.

Once complete, CodePipeline will begin monitoring your repository for changes. When a change is detected, it will build the project and deploy it to the available servers in your Elastic Beanstalk application. You can monitor the pipeline in real time from its detail page.

A Final Word

When configured properly, CodePipeline is a handy tool for the developer who wants to code more and spend less time on DevOps.

This pipeline gives a developer an easy way to manage an application big or small. It doesn't take much time or money to set yourself up with a scalable application backed by a quick, efficient build and deployment process.

If you are in need of a solution to build, test, deploy, and scale your application, consider AWS CodePipeline as a great solution to get your project up and running quickly.

Comments (7)

  1. A great article, but don't you build an image and publish it to ECR or DockerHub? What do you think about no-downtime deployment, blue-green deployment for instance?

    1. Thanks for reading, Igor!

      Due to the length of the article, I excluded the ability to push the docker image to ECR. The code itself is rather simple.

      An updated buildspec would build the Docker image and push it to ECR. In CodeBuild, I supply the following env variables:

      AWS_DEFAULT_REGION
      IMAGE_REPO_NAME
      IMAGE_TAG
      AWS_ACCOUNT_ID

      Buildspec.yml
      version: 0.2

      phases:
        install:
          commands:
            - echo Entered the install phase...
            - apt-get update -y
            - apt-get install -y software-properties-common
            - add-apt-repository ppa:openjdk-r/ppa
            - apt-get update -y
            - apt-get install -y openjdk-8-jdk
            - apt-get install -y maven
        pre_build:
          commands:
            - echo Logging in to Amazon ECR...
            - $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION)
        build:
          commands:
            - mvn clean
            - mvn install
            - echo Building the Docker image...
            - docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .
            - docker tag $IMAGE_REPO_NAME:$IMAGE_TAG $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
        post_build:
          commands:
            - docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
      artifacts:
        files:
          - 'Dockerrun.aws.json'

      Finally, modify the Dockerrun.aws.json file to pull the built image from ECR. Note: proper permissions must be configured to authorize the instance to pull the image from ECR.

      Dockerrun.aws.json

      {
        "AWSEBDockerrunVersion": "1",
        "Image": {
          "Name": "02934802394.dkr.ecr.us-east-1.amazonaws.com/ECR_REPO_NAME:latest",
          "Update": "true"
        },
        "Ports": [
          {
            "ContainerPort": "5000"
          }
        ]
      }

      As far as blue-green deployments go, an ELB with multi-region instances could be configured to release the update as a rolling deployment, routing traffic around each server while it is down and updating. That would be a good future article.

  2. Hi Brandon,

    Nice article. But I am unable to make it work, maybe because I am totally new to AWS. For me the build stage itself is successful, but I am seeing the below in the build logs:

    [Container] 2018/08/31 05:52:48 Expanding target/example-1.0.0-SNAPSHOT.jar
    [Container] 2018/08/31 05:52:48 Skipping invalid artifact path target/example-1.0.0-SNAPSHOT.jar
    [Container] 2018/08/31 05:52:48 Found 1 file(s)
    [Container] 2018/08/31 05:52:48 Phase complete: UPLOAD_ARTIFACTS Success: true
    [Container] 2018/08/31 05:52:48 Phase context status code: Message:

    Maybe because of this, the Docker image is not getting deployed.

    Can you please help ?

    thanks,
    Gaurav

    1. Below is a snapshot of the log that helps debug the issue:

      /var/log/eb-activity.log

      [2018-08-31T06:16:55.781Z] INFO [32169] – [Application update code-pipeline-1535696020442-MyAppBuild-6196c1b1-bc56-4a76-a2b2-53dd10189ee1@9/AppDeployStage0/AppDeployPreHook/03build.sh] : Starting activity…
      [2018-08-31T06:17:11.155Z] INFO [32169] – [Application update code-pipeline-1535696020442-MyAppBuild-6196c1b1-bc56-4a76-a2b2-53dd10189ee1@9/AppDeployStage0/AppDeployPreHook/03build.sh] : Activity execution failed, because: cat: Dockerrun.aws.json: No such file or directory
      cat: Dockerrun.aws.json: No such file or directory
      cat: Dockerrun.aws.json: No such file or directory
      8-jdk-alpine: Pulling from library/openjdk
      8e3ba11ec2a2: Pulling fs layer
      311ad0da4533: Pulling fs layer
      df312c74ce16: Pulling fs layer
      311ad0da4533: Verifying Checksum
      311ad0da4533: Download complete
      8e3ba11ec2a2: Verifying Checksum
      8e3ba11ec2a2: Download complete
      8e3ba11ec2a2: Pull complete
      311ad0da4533: Pull complete
      df312c74ce16: Verifying Checksum
      df312c74ce16: Download complete
      df312c74ce16: Pull complete
      Digest: sha256:1fd5a77d82536c88486e526da26ae79b6cd8a14006eb3da3a25eb8d2d682ccd6
      Status: Downloaded newer image for openjdk:8-jdk-alpine
      Successfully pulled openjdk:8-jdk-alpine
      Sending build context to Docker daemon 2.048kB

      Step 1/6 : FROM openjdk:8-jdk-alpine
       ---> 5801f7d008e5
      Step 2/6 : VOLUME /tmp
       ---> Running in c109bd2f7cff
      Removing intermediate container c109bd2f7cff
       ---> 75d425832c6e
      Step 3/6 : ADD target/example-1.0.0-SNAPSHOT.jar app.jar
      ADD failed: stat /var/lib/docker/tmp/docker-builder561725111/target/example-1.0.0-SNAPSHOT.jar: no such file or directory
      Failed to build Docker image aws_beanstalk/staging-app, retrying...
      Sending build context to Docker daemon 2.048kB

      Step 1/6 : FROM openjdk:8-jdk-alpine
       ---> 5801f7d008e5
      Step 2/6 : VOLUME /tmp
       ---> Using cache
       ---> 75d425832c6e
      Step 3/6 : ADD target/example-1.0.0-SNAPSHOT.jar app.jar
      ADD failed: stat /var/lib/docker/tmp/docker-builder449491179/target/example-1.0.0-SNAPSHOT.jar: no such file or directory
      Failed to build Docker image aws_beanstalk/staging-app: ADD failed: stat /var/lib/docker/tmp/docker-builder449491179/target/example-1.0.0-SNAPSHOT.jar: no such file or directory. Check snapshot logs for details. (ElasticBeanstalk::ExternalInvocationError)
      caused by: cat: Dockerrun.aws.json: No such file or directory

        1. I was able to fix it by making sure that the JAR file name is the same in all 3 files (pom.xml, Dockerfile & buildspec.yml).
