Using Docker + AWS to Build, Deploy and Scale your App

Brandon Klimek | AWS, Cloud, DevOps, Docker, Spring, Spring Boot, Tutorial


I recently worked to develop a software platform that relied on Spring Boot and Docker to prop up an API. Being the only developer on the project, I needed to find a way to quickly and efficiently deploy new releases. However, I found many solutions overwhelming to set up.

That was until I discovered AWS has tools that allow any developer to quickly build and deploy their application.

In this 30-minute tutorial, you will discover how to utilize the following technologies:

  - AWS CodeCommit
  - AWS CodeBuild
  - AWS Elastic Beanstalk
  - AWS CodePipeline

Once finished, you will have a Docker application running that automatically builds your software on commit and deploys it to Elastic Beanstalk (EBS), sitting behind a load balancer for scalability. This continuous integration pipeline will allow you to worry less about your deployments and get back to focusing on feature development within your application.

Here is the order in which to configure services:

  1. Git repository initialization using CodeCommit
  2. CodeBuild Setup
  3. EBS Configuration
  4. CodePipeline Configuration
Background Knowledge

I am using Docker for this tutorial application. However, AWS supports a wide range of configurable environments in Elastic Beanstalk: .NET, Java, Node.js, PHP, Python, and Ruby. Docker was chosen for this tutorial so that the reader can focus more on the build process and less on the project setup. With that being said, I will not be diving deeply into Docker. If you wish to learn more about Docker, start by reading the introduction on the Docker website.

The Application

The example Spring Boot source code that will be used can be found at: https://github.com/sixthpoint/Docker-AWS-CodePipeline

The application is a Spring Boot project configured to run on port 5000 and has a REST controller with a single endpoint.

The API REST controller is very basic. It maps the /api/ path to a method that returns a list of strings in JSON format. This is the endpoint we will use to verify that our application has successfully built and deployed on AWS EBS (a quick curl check is shown after the controller code below).

ApiController.java
import java.util.ArrayList;
import java.util.List;

import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping( value = "/api" )
public class ApiController {

    @RequestMapping( value = "/", method = RequestMethod.GET )
    public List<String> index() {
        List<String> s = new ArrayList<>();
        s.add("Docker + AWS CodePipeline Tutorial");
        s.add("Learn more at: https://github.com/sixthpoint/Docker-AWS-CodePipeline");
        return s;
    }
}
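Once the environment from Step 3 is live, a quick curl against this endpoint is the simplest smoke test. A minimal sketch, assuming the placeholder Elastic Beanstalk hostname below is replaced with the one assigned to your environment:

# Hypothetical environment URL; substitute your own EBS hostname
curl http://dockercodepipeline-env.us-east-1.elasticbeanstalk.com/api/
# Expected: a JSON array containing the two strings returned by ApiController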

The application creates an example-1.0.0-SNAPSHOT.jar file when built using Maven. This file is important to reference in our Dockerfile.

Maven build:
mvn clean install

This produces target/example-1.0.0-SNAPSHOT.jar. The Dockerfile below uses a flavor of Alpine Linux to add, expose, and run the Spring Boot application.

Dockerfile
# Lightweight Alpine-based JDK 8 base image
FROM openjdk:8-jdk-alpine
VOLUME /tmp
# Copy the Maven-built jar into the image as app.jar
ADD target/example-1.0.0-SNAPSHOT.jar app.jar
# The Spring Boot application is configured to listen on port 5000
EXPOSE 5000
ENV JAVA_OPTS=""
ENTRYPOINT [ "sh", "-c", "java $JAVA_OPTS -Djava.security.egd=file:/dev/./urandom -jar /app.jar" ]
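Before involving any AWS services, you can sanity-check the image locally, assuming Docker and Maven are installed; the image tag used here is arbitrary:

# Build the jar, then the image, then run it and hit the endpoint
mvn clean install
docker build -t docker-aws-example .
docker run -p 5000:5000 docker-aws-example

# From another terminal:
curl http://localhost:5000/api/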
1. Git repository initialization using CodeCommit

First things first, we need a git repository to build our code from. AWS CodeCommit is cheap, reliable, and secure. It uses S3, which is a scalable storage solution subject to S3 storage pricing.

Begin by logging into your AWS console and creating a repository in CodeCommit. For the purpose of this tutorial, I have given the repository the same name as the Spring Boot application. Once created, you will be presented with the standard HTTPS and SSH URLs of the repository.
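If you prefer the AWS CLI to the console, the repository can also be created with a single command; the name and description below simply mirror this tutorial:

aws codecommit create-repository \
    --repository-name DockerCodePipeline \
    --repository-description "Spring Boot + Docker CodePipeline tutorial"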

Creating the repository generates the following repository location; notice that if I try to clone the repository, access is denied.

git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/DockerCodePipleline

This is because the repository is private. AWS offers a flexible solution to authorize users to access a range of products in AWS using IAM.

1A. Configuring Identity and Access Management (IAM)

IAM, or Identity and Access Management, enables you to securely control access to AWS services and resources. To authorize a user to access our private git repository, navigate to the IAM services page and begin by adding a user. I have given the user the same name as the project and git repository. Choose programmatic access, which will allow policies to be added.

In order to allow this new user to fully administer our new git repository, attach the AWSCodeCommitFullAccess policy. Once added, click through to finish creating your user.
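The same user creation and policy attachment can be scripted with the AWS CLI; the user name below is just the one chosen for this tutorial:

aws iam create-user --user-name DockerCodePipeline

aws iam attach-user-policy \
    --user-name DockerCodePipeline \
    --policy-arn arn:aws:iam::aws:policy/AWSCodeCommitFullAccess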

Now that a user has been created with the correct policies, Git credentials are needed to work with the new CodeCommit repository. Navigate to the new user and look for “HTTPS Git credentials for AWS CodeCommit,” shown below. Generate a new username and password and download the credentials file when prompted. Inside that file is the information needed to access your repository.

Note: Only two sets of Git credentials are allowed per user at this time. If you lose your credentials, a new set will need to be generated to access the repository. For more in-depth information on setting up Git credentials in AWS, check out the guide for setting up HTTPS users using Git credentials.
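As an alternative to static Git credentials, the AWS CLI ships with a credential helper that signs CodeCommit requests using your IAM access keys. It is optional and not required for the rest of this tutorial:

git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true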

1B. Moving the code to the new CodeCommit repository

With the new repository created, clone the GitHub repository holding our sample Spring Boot application. Change the remote to your new CodeCommit repository location, then push the master branch.

git clone https://github.com/sixthpoint/Docker-AWS-CodePipeline.git
git remote set-url origin https://git-codecommit.us-east-1.amazonaws.com/v1/repos/DockerCodePipleline
git push origin master
2. CodeBuild Setup

Now that the CodeCommit repository holds our sample Spring Boot application, the code needs to be built for deployment. Navigate to CodeBuild. CodeBuild is an on-demand build service that compiles your source code and packages it into deployable artifacts; you pay only for the build time you use.

Start by creating a new build project and point the source to the AWS CodeCommit repository that was created in Step 1. You can see I have pointed this new build project to the AWS CodeCommit source provider, and specified the DockerCodePipeline repository.

Next, it asks for environment information. The default system image is fine for this build process. The most important part is to tell CodeBuild to use the buildspec.yml. The buildspec contains the commands needed to generate the artifacts that will be deployed to the EBS.

Included in the sample Spring Boot application is a buildspec.yml. This file is used to tell CodeBuild what commands to run in each phase, and what files to bundle up and save in the artifacts.

Additional configuration options can be found at: http://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html.

buildspec.yml
version: 0.2

phases:
  build:
    commands:
      - mvn clean
      - mvn install
artifacts:
  files:
    - 'Dockerfile'
    - 'target/example-1.0.0-SNAPSHOT.jar'

The final setup step for the build process is to specify the location where the artifacts defined in the buildspec.yml will be stored. In the example below, I put all artifacts in Amazon S3 under the name dockerAWSCodePipeline, in a bucket named irdb-builds. The bucket can be any bucket of your choice. You must go into S3 and create this bucket prior to creating the build project.
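The artifact bucket can also be created from the command line; the bucket name below matches this example, but S3 bucket names are globally unique, so yours will differ:

aws s3 mb s3://irdb-builds --region us-east-1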

The build project is now configured and ready to use. Builds can be run manually from the console, creating artifacts that are stored in S3 as defined above.
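A build can also be kicked off from the CLI; the project name below is an assumption, so substitute whatever you named your CodeBuild project:

aws codebuild start-build --project-name DockerCodePipeline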

3. EBS Setup

Now that the code is in CodeCommit and the artifacts are built using CodeBuild, the final resource needed is a server to deploy the code. That is where Elastic Beanstalk (EBS) comes in useful. EBS is a service that automatically handles provisioning, load balancing, auto-scaling, and more. It is a very powerful tool to help you manage and monitor your application's servers.

Let’s assume, for example, that my API needs four servers due to the volume of requests it is receiving. EBS makes scaling those servers simple with configuration options.

Begin by creating a new web server environment and giving it a name and domain name. This domain name is your AWS domain name; if you have a personal domain name, you can point it to this load balancer using Route 53.

The last step of creating your web server environment is to tell EBS that we want to run Docker and to use the AWS sample application code for now. Later, our code from CodeBuild will replace the sample application.
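If you would rather script this step, the Elastic Beanstalk CLI (eb, installed separately from the AWS CLI) can create an equivalent Docker environment; the application and environment names below are only examples:

eb init DockerCodePipeline --platform docker --region us-east-1
eb create DockerCodePipeline-env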

The server and environment will take several minutes to start. Once complete, navigate to the configuration page of your new EBS environment.

By default, the environment has a load balancer installed and auto-scales. A scaling trigger can be set to adjust the number of running instances given certain requirements. For example, I could set my minimum instances to 1 and maximum to 4 and tell the trigger to start a new instance each time CPUUtilization exceeds 75%. The load balancer would then spread requests across the instances currently running.
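The same instance limits and CPU trigger can be applied with the AWS CLI; the environment name below is a placeholder, and the namespaces used are standard Elastic Beanstalk autoscaling option settings:

aws elasticbeanstalk update-environment \
    --environment-name DockerCodePipeline-env \
    --option-settings \
        Namespace=aws:autoscaling:asg,OptionName=MinSize,Value=1 \
        Namespace=aws:autoscaling:asg,OptionName=MaxSize,Value=4 \
        Namespace=aws:autoscaling:trigger,OptionName=MeasureName,Value=CPUUtilization \
        Namespace=aws:autoscaling:trigger,OptionName=Unit,Value=Percent \
        Namespace=aws:autoscaling:trigger,OptionName=UpperThreshold,Value=75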

4. CodePipeline Configuration

This is the final piece of the puzzle, bringing steps 1-3 above together. You will notice that up until now we have had to manually tell CodeBuild to run and then go to the EBS console to manually specify the artifact for deployment. Wouldn't it be great if all of this could be done for us?

That is exactly what CodePipeline does. It fully automates the building and provisioning of the project. Once new code is checked in, the system magically takes care of the rest. Here is how to set it up.

Begin by creating a new CodePipeline. In each step, select the repository, build project, and EBS environment created in steps 1-3 above.

Once complete, the CodePipeline will begin monitoring your repository for changes. When a change is detected, it will build the project and deploy it to the available servers in your EBS application. You can monitor the CodePipeline in real time from the pipeline's detail page.
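Pipeline progress can also be checked from the CLI; the pipeline name below is an example, so use whatever you named yours:

aws codepipeline get-pipeline-state --name DockerCodePipeline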

A Final Word

When configured properly, CodePipeline is a handy tool for the developer who wants to code more and spend less time on DevOps.

This pipeline gives a developer easy access to manage an application, big or small. It doesn't take a lot of time or money to set yourself up with a scalable application that utilizes a quick and efficient build and deployment process.

If you need to build, test, deploy, and scale your application, consider AWS CodePipeline as a great way to get your project up and running quickly.
