AWS S3 + GitHub Actions (CI/CD Workflow)


I’m studying for the AWS Solutions Architect Associate exam, so I’m learning about the different services in detail. My studies today took me deeper into the world of S3 buckets, and I thought it would be good to get some hands-on experience at the same time. So I purchased a domain (one I had owned years ago but let lapse), set up an S3 bucket, and followed the steps to create a static website from that bucket. I won’t walk through all of those steps, because I literally followed AWS’s own tutorial, which is excellent, so there’s no need to reinvent the wheel. What I would like to share is how I then, as part of the Cloud Resume Challenge, took the additional steps of obtaining a certificate for the site and hooking it up to the CloudFront CDN. The other thing I’d been itching to do was learn how to use GitHub Actions to automate the process of pushing code revisions (once pushed to the GitHub repo) directly to the S3 bucket. It’s another tool I’m just starting to learn, but I find myself excited about the possibilities!
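I did the certificate and CDN setup through the console, but to sketch the idea in AWS CLI terms (the domain here is a placeholder, and this is an outline rather than the exact steps I ran):

# Request a TLS certificate for the domain; CloudFront requires
# certificates to live in us-east-1
aws acm request-certificate \
  --domain-name example.com \
  --validation-method DNS \
  --region us-east-1

# After validating the certificate via a DNS record, create a
# CloudFront distribution pointing at the bucket's website endpoint
# and attach the certificate (I did this part in the console, since
# the CLI version takes a long JSON config)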

Let me start at the end. Once I had the GitHub Actions workflow configured correctly (which I’ll describe more below), any change I made in the code, committed, and pushed to my repo would trigger the Action, which pushed those edits to the S3 bucket. Then I just needed to refresh my browser to see the changes. Cool stuff.
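In practice, the day-to-day loop now looks like this (the file name and commit message are just examples):

# Edit the site locally, then:
git add index.html
git commit -m "Update resume content"
git push origin master   # this push is what triggers the Action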

Prior to that, my process was manual. I would make edits to the code, then switch from my editor to the terminal to commit and push the code to the repo, which was really just to back up my work. The project files still lived on my host (local) computer, so I would go to the AWS console, navigate to the S3 bucket, click the ‘add file’ button, and upload the newly edited file directly. With that file uploaded, I could then refresh the browser to view the changes.
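As an aside, that console upload has a one-line AWS CLI equivalent (the bucket name here is a placeholder):

# Copy a single edited file straight into the bucket
aws s3 cp index.html s3://my-resume-bucket/index.html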

That was a lot of steps, and it was good for me to do them: I better understand how things work now, and I like having that understanding before I move up to any level of abstraction. But it’s also laborious, and once that learning curve has been climbed and the experience gained, it’s time to automate. That’s what these tools (Terraform, GitHub Actions) have so pleasantly shown me.

One thing I found a little tricky from reading online tutorials was how to configure the repo so that it could authenticate with AWS services. What worked for me (this tutorial was very helpful) was to first set up a new IAM user with S3 bucket permissions, create a new access key that I downloaded to my computer, and then save those values (the bucket name, the access key ID, and the secret access key) as Actions secrets under the GitHub settings for that particular repo. I had to create a secret for each value, and each secret is referenced by the name I gave it. So, for example, one secret was AWS_S3_BUCKET, the next AWS_ACCESS_KEY_ID, and so on. The values themselves live in GitHub, and I reference just the variable names within the workflow script itself.
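For reference, here is roughly what that IAM setup looks like from the CLI. The user name and bucket are placeholders, and the policy is my guess at a minimal set of permissions for a sync that can also delete files:

# Create a dedicated deploy user
aws iam create-user --user-name github-actions-deploy

# Attach a policy scoped to just the website bucket
aws iam put-user-policy \
  --user-name github-actions-deploy \
  --policy-name s3-deploy \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      { "Effect": "Allow",
        "Action": ["s3:ListBucket"],
        "Resource": "arn:aws:s3:::my-resume-bucket" },
      { "Effect": "Allow",
        "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
        "Resource": "arn:aws:s3:::my-resume-bucket/*" }
    ]
  }'

# Generate the access key; the output contains the AccessKeyId and
# SecretAccessKey values that go into the repo's Actions secrets
aws iam create-access-key --user-name github-actions-deploy

With the user created and the three secrets saved in GitHub, the workflow script itself looks like this: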

name: Upload Website

# Run the workflow on every push to the master branch
on:
  push:
    branches:
    - master

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
    # Check out the repo so its files are available to the job
    - uses: actions/checkout@master
    # Sync the checked-out files to the S3 bucket
    - uses: jakejarvis/s3-sync-action@master
      with:
        args: --follow-symlinks --delete --exclude '.git*/*'
      env:
        SOURCE_DIR: ./        # sync the whole repo root
        AWS_REGION: us-east-1
        AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

I added the necessary values, and now every push to master syncs my changes to the S3 bucket. A really nice workflow!
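As I understand it, the sync action is essentially a wrapper around the AWS CLI’s sync command, so the step above amounts to something like this (bucket name is again a placeholder):

# Roughly what runs on each push: mirror the repo into the bucket,
# deleting anything in the bucket that no longer exists in the repo
aws s3 sync ./ s3://my-resume-bucket \
  --delete \
  --follow-symlinks \
  --exclude '.git*/*'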

This was my first taste of GitHub Actions, but I’m definitely interested in learning more!
