Taking on the Azure Developer Certification (70-532) Exam

Vince Pendergrass .NET, Azure, Opinion, Service Fabric

Many of the companies we work with use various cloud providers (such as Google, Amazon, and Microsoft) for IT service delivery. This has created a great need for those who assist these companies to possess the technical skills required to implement such services properly and effectively.

An excellent way to make yourself stand apart from the crowd in this space (and within your company, for that matter) is to obtain a developer/architect certification, such as the Microsoft Azure Developer Certification. Plus, if your company is focused on becoming a Microsoft partner, it may be necessary to have a few developers on your team spend some time working to become certified. Fortunately, my awesome company Keyhole Software presented me with this opportunity.

In this blog, I share what I did to prepare for the Azure developer certification, specifically the 70-532 Developing Microsoft Azure Solutions Certification exam. I’ll include a couple of prep tools that helped me significantly, as well as a few unexpected “gotchas” I encountered when taking the exam…

Azure Functions Breakfast Boost Scheduled

Keyhole Software .NET, Azure, Company News

We are excited to announce the next free public Keyhole Software educational event: Building Your Evil(?) Empire with Azure Functions.

This Breakfast Boost event is a live learning opportunity that is open to the public. The presentation is geared to benefit software developers who are interested in implementing Azure Functions or building Cloud Solutions using JavaScript and .NET technologies.

For this educational talk, Keyhole will bring in guest speaker Bryan Soltis of Kentico, a Microsoft Azure MVP and Technical Evangelist.

This free presentation will be held at the Keyhole Software office in Leawood, Kansas on Wednesday, January 31, 2018 from 8-10 a.m. Space is limited. To get more information and reserve your free tickets, please visit the Eventbrite page at https://azurefunctionswithkeyhole.eventbrite.com…

OpenShift Quick Start

David Pitt AWS, DevOps, Docker, Microservices, OpenShift

Our previous blog in the series introduced Red Hat's OpenShift, a solution that provides a way for enterprise teams to implement their own PaaS. Essentially, it sits atop Kubernetes and Docker to provide a ready-to-use DevOps platform.

This blog introduces two hands-on exercises (taken from our OpenShift Course) that walk you through the following tasks:

– Installing OpenShift locally
– Adding a Container with an API service to a Pod

Unfortunately, it will take more than this quick start blog to get OpenShift installed and enabled in an enterprise. That said, developers, system admins, and anyone else working on or responsible for the platform will benefit from understanding how to get OpenShift up and running on a local machine, as shown in this blog.
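
To give a flavor of what those exercises cover, here is a minimal sketch of both tasks using the oc client. It assumes Docker and the OpenShift Origin client tools are already installed; the project name and the sample openshift/hello-openshift image are placeholders for illustration, not the service used in the course exercises.

    # Start a local single-node OpenShift cluster (requires the oc client and Docker)
    oc cluster up

    # Log in with the default developer account and create a project for the exercise
    oc login -u developer -p developer
    oc new-project quickstart

    # Deploy a container that serves a simple API; OpenShift creates the pod and service
    oc new-app openshift/hello-openshift --name=hello-api

    # Expose the service through a route and confirm the pod is running
    oc expose svc/hello-api
    oc get pods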

Managing Docker Containers with OpenShift and Kubernetes

Casey Justus AWS, DevOps, Docker, Microservices, Technology Snapshot

For the last few years, Docker containers have been all the rage in the DevOps world. After all, what’s not to like? They allow you to strip out 99% of the stuff in your VM and deploy just your code.

Containers can save resources, speed deployment, scale well and offer more fault tolerance. But how do you manage them?

In my experience, the Docker Machine and Docker Swarm stack hasn’t lived up to my expectations. It has a limited API, no support for monitoring and logging, and scaling is much more manual. AWS’s EC2 Container Service scales well, but you’ll be locked into Amazon.

In my opinion, the best current stack for Docker containers includes Kubernetes and OpenShift. In this blog, I will give a brief introduction to Kubernetes + OpenShift with an eye toward what they do well…
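
As a taste of what that management looks like, here is a minimal, hypothetical sketch of day-to-day container management with kubectl alone; the deployment name and nginx image are illustrative placeholders, not examples from the full post.

    # Create a deployment from a container image; Kubernetes schedules the pods
    kubectl create deployment api-demo --image=nginx

    # Scale out to three replicas with a single command
    kubectl scale deployment api-demo --replicas=3

    # Roll out a new image version and watch the rollout complete
    kubectl set image deployment/api-demo nginx=nginx:1.15
    kubectl rollout status deployment/api-demo

    # Inspect pods and logs for monitoring and troubleshooting
    kubectl get pods
    kubectl logs deployment/api-demo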

Using Docker + AWS to build, deploy and scale your application

Brandon Klimek AWS, DevOps, Docker, Spring, Spring Boot, Tutorial

I recently worked to develop a software platform that relied on Spring Boot and Docker to stand up an API. Being the only developer on the project, I needed a way to quickly and efficiently deploy new releases. However, I found many solutions overwhelming to set up.

That was until I discovered AWS has tools that allow any developer to quickly build and deploy their application.

In this 30-minute tutorial, you will discover how to utilize the following technologies:
– AWS CodeCommit – source control (git)
– AWS CodeBuild – source code compiler and test runner
– AWS CodePipeline – builds, tests, and deploys code every time the repo changes
– AWS Elastic Beanstalk – service that manages the EC2 instances, handling deployments, provisioning, load balancing, and health monitoring
– Docker + Spring Boot – our containerized Spring Boot application for the demo

Once finished, you will have a continuous integration pipeline that automatically builds your Dockerized application on every commit and deploys it to Elastic Beanstalk behind a load balancer for scalability. This will allow you to worry less about your deployments and get back to focusing on feature development within your application.
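
As a quick preview of the local half of that workflow, here is a minimal sketch of building and running the containerized Spring Boot app before wiring it into the pipeline. It assumes a Maven build and a Dockerfile in the project root; the image name is a placeholder.

    # Build the executable Spring Boot jar with Maven
    mvn clean package

    # Build a Docker image from the project's Dockerfile and tag it (name is illustrative)
    docker build -t demo-app .

    # Run the container locally, mapping Spring Boot's default port
    docker run -p 8080:8080 demo-app

    # Elastic Beanstalk's single-container Docker platform builds from the same
    # Dockerfile in the repository, so what CodePipeline deploys matches this local run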