Let's face it: manually building, testing, and deploying applications is a drag. It's slow, prone to human error, and leads to the classic "but it works on my machine!" problem. Every time you want to release a new feature or a bug fix, you have to go through a checklist of tedious steps, and if you miss one, the whole thing can break.
This project tackles that problem head-on by creating a fully automated Continuous Integration and Continuous Deployment (CI/CD) pipeline using GitHub Actions. The goal is simple: once a developer pushes new code to the repository, this system automatically tests it, builds it, packages it into a Docker container, and deploys it to the cloud (AWS ECS Fargate) with zero manual intervention.
This means faster releases, more reliable deployments, and more time for developers to do what they do best: write code.
Our backend is a standard Spring Boot application. The pipeline ensures that every code change is validated and deployed safely. Here's what the completed workflow looks like in GitHub Actions:

This workflow is broken down into several dependent jobs. The `test` and `build` jobs run in parallel to save time. Only after both succeed do we move on to packaging the application and, finally, building and deploying the Docker image.
Let's look at the detailed flow for each job.
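The job structure described above can be sketched as a workflow file like the following. This is a minimal illustration, not the project's actual workflow: the file name, action versions, JDK distribution, and artifact name (`app-jar`) are assumptions.

```yaml
# .github/workflows/backend.yml (illustrative skeleton, names are assumptions)
name: Backend CI/CD

on:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: '21'
      - run: mvn test          # any failing test stops the pipeline here

  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: '21'
      - run: mvn compile       # syntax/compilation check only

  package-jar:
    needs: [test, build]       # runs only after BOTH jobs succeed
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: '21'
      - run: mvn package
      - uses: actions/upload-artifact@v4
        with:
          name: app-jar        # hypothetical artifact name
          path: target/*.jar

  build-docker-image:
    needs: package-jar
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: app-jar
      # ...Docker build, ECR push, and ECS update steps
```

The `needs` keyword is what encodes the dependency graph: `test` and `build` have no `needs`, so GitHub Actions schedules them concurrently.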

- Test & Build Jobs (run in parallel):
  - Checkout code: Grabs the latest version of the code from the repository.
  - Set up JDK 21: Prepares the Java environment needed to build the project.
  - Run Unit Tests / Compile project:
    - The `test` job runs all the unit tests using `mvn test`. If any test fails, the entire pipeline stops to prevent a buggy deployment.
    - The `build` job simply compiles the source code using `mvn compile` to ensure everything is syntactically correct.
- Package Jar Job (`package-jar`):
  - This job runs only after the `test` and `build` jobs have passed.
  - It takes the compiled code and packages it into an executable `.jar` file using `mvn package`.
  - This `.jar` file is then saved as an artifact, a file that can be passed to other jobs in the workflow.
- Build, Publish, and Deploy Job (`build-docker-image`):
  - This is the final and most critical stage.
  - Download built jar: It starts by downloading the `.jar` artifact created in the previous job.
  - Log in to Docker Hub: It securely logs into Docker Hub using credentials stored in GitHub Secrets.
  - Build Docker Image: A `Dockerfile` in the repository is used to create a container image that includes our Java application.
  - Configure AWS credentials & Log in to ECR: Using secrets again, it configures access to our AWS account and logs into the Amazon Elastic Container Registry (ECR).
  - Tag and push to ECR: The newly built Docker image is tagged with a unique identifier and pushed to our private ECR repository.
  - Update ECS service: The final step is to tell our Amazon ECS service to pull the new image from ECR and deploy it. ECS Fargate handles the underlying infrastructure, seamlessly replacing the old containers with the new ones, resulting in a zero-downtime update.
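The publish-and-deploy steps above could look roughly like this. It is a sketch under several assumptions: the artifact name (`app-jar`), the cluster and service names (`my-cluster`, `my-backend-service`), and the convention that the ECS task definition references the mutable `:latest` tag are all hypothetical.

```yaml
# Steps of the build-docker-image job (illustrative; names are placeholders)
- name: Download built jar
  uses: actions/download-artifact@v4
  with:
    name: app-jar

- name: Log in to Docker Hub
  uses: docker/login-action@v3
  with:
    username: ${{ secrets.DOCKER_USERNAME }}
    password: ${{ secrets.DOCKER_PASSWORD }}

- name: Configure AWS credentials
  uses: aws-actions/configure-aws-credentials@v4
  with:
    aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    aws-region: ${{ secrets.AWS_REGION }}

- name: Log in to Amazon ECR
  id: ecr-login
  uses: aws-actions/amazon-ecr-login@v2

- name: Build, tag, and push image to ECR
  run: |
    REGISTRY="${{ steps.ecr-login.outputs.registry }}"
    REPO="${{ secrets.ECR_REPOSITORY }}"
    # Tag with the commit SHA as the unique identifier, plus :latest
    docker build -t "$REGISTRY/$REPO:${{ github.sha }}" \
                 -t "$REGISTRY/$REPO:latest" .
    docker push --all-tags "$REGISTRY/$REPO"

- name: Update ECS service
  # Assumes the service's task definition points at the :latest tag;
  # --force-new-deployment makes Fargate pull the image again and
  # roll the containers with no downtime.
  run: |
    aws ecs update-service \
      --cluster my-cluster \
      --service my-backend-service \
      --force-new-deployment
```

If you instead pin the task definition to the SHA-tagged image, you would register a new task definition revision per deploy (for example with the `aws-actions/amazon-ecs-deploy-task-definition` action) rather than forcing a redeployment.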
The frontend is a modern Vite-powered React application written in TypeScript. Its pipeline is a bit different, as we're dealing with static files (HTML, CSS, JS) that need to be served by a web server.
Here’s the high-level view from GitHub Actions:

This is a simpler two-stage process: first, we build the static files, and then we package them into a Docker image and deploy them.
Below is the detailed diagram of the steps involved.

- Build Job (`build-frontend`):
  - Checkout code: Gets the latest frontend code.
  - Setup Node.js: Prepares the required Node.js environment.
  - Install dependencies: Runs `npm install` to download all the necessary libraries.
  - Build frontend: Runs `npm run build` to compile the TypeScript/React code into optimized, static HTML, CSS, and JavaScript files.
  - Upload build artifact: The resulting `dist` folder, containing all the static files, is uploaded as an artifact.
- Deploy Job (`deploy-frontend`):
  - Download build artifact: The job begins by grabbing the `dist` folder from the previous step.
  - Build Docker Image: You might wonder why we need Docker for a frontend app. We use a multi-stage `Dockerfile` that takes a lightweight web server (like Nginx) and copies our static build files into it. This creates a self-contained, runnable server for our frontend.
  - Deployment to AWS: The rest of the steps mirror the backend deployment process exactly:
    - Log in to Docker Hub and Amazon ECR.
    - Tag and push the new Nginx-based image to ECR.
    - Update the corresponding ECS service to deploy the new frontend container.
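The two frontend jobs can be sketched like this. Again, this is an illustration rather than the project's real file: the Node version, artifact name (`frontend-dist`), and action versions are assumptions.

```yaml
# .github/workflows/frontend.yml (illustrative skeleton, names are assumptions)
jobs:
  build-frontend:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm install       # download dependencies
      - run: npm run build     # emit static files into dist/
      - uses: actions/upload-artifact@v4
        with:
          name: frontend-dist
          path: dist

  deploy-frontend:
    needs: build-frontend
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: frontend-dist
          path: dist
      # ...Docker build of the Nginx-based image, ECR push, and ECS
      # service update, mirroring the backend deploy steps
```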
Security is key. We never hardcode sensitive information like passwords or access keys directly in the code. All credentials are stored securely in GitHub Actions Secrets. The workflows reference these secrets when needed.
The necessary secrets for this project are:
- `AWS_ACCESS_KEY_ID`: Your AWS access key.
- `AWS_SECRET_ACCESS_KEY`: Your AWS secret key.
- `AWS_REGION`: The AWS region for your ECR and ECS services (e.g., `us-east-1`).
- `ECR_REPOSITORY`: The name of your repository in ECR.
- `DOCKER_USERNAME`: Your Docker Hub username.
- `DOCKER_PASSWORD`: Your Docker Hub password or access token.