Modernizing This Site II
If you’re returning here, you may notice that all the previous links are broken and the site looks pretty different. That’s because I’ve switched from my old SSG, Pelican, to Hugo. I’m using the etch theme, which I like a lot.
If this were a professional project, I would have made sure all the URLs tracked by Google and friends still resolved to the correct places, but since this is just my personal blog I don’t really care.
I’m still hosting the site on S3 with CloudFront as the CDN, and at some point I also put Cloudflare in front of CloudFront. This can cause some goofy, difficult-to-debug double-caching weirdness, but overall it has been working well. It has also cost me exactly $0 to host over the last three years.
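When the double caching does get confusing, the quickest sanity check is to look at the cache headers each layer adds to a response. This is just a debugging sketch; the hostname and path are placeholders:

```bash
# Cloudflare reports cf-cache-status (HIT/MISS/DYNAMIC/EXPIRED),
# while CloudFront reports x-cache ("Hit from cloudfront" / "Miss from cloudfront").
curl -sI https://example.com/css/style.css | grep -iE 'cf-cache-status|x-cache|age'
```

If one layer says HIT while the other says Miss, that usually tells you which cache needs purging.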
Before this most recent round of upgrades, I was deploying to S3 using the `sync` command from my local machine. This was not ideal for a whole bunch of reasons, including the need to remember how to use makefiles.
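For context, the old flow boiled down to a Makefile target wrapping something like the command below. This is a sketch from memory; the output directory and bucket name are placeholders, not my actual values:

```bash
# Old "deploy from my laptop" step: push the locally built output directory to S3.
# Everything here is a placeholder sketch, not the real config.
s3cmd sync --delete-removed public/* s3://my-site-bucket/
```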
So a big priority was to use GitHub Actions for my CI/CD pipeline. I’ll include the pipeline here for reference; there’s a pretty cool option to the S3 command which will automatically invalidate the CloudFront cache.
```yaml
name: Build Hugo site and sync to S3

on:
  # Runs on pushes targeting the default branch
  push:
    branches: ["master"]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# Sets permissions of the GITHUB_TOKEN (carried over from the GitHub Pages starter
# workflow; only contents: read is actually used here)
permissions:
  contents: read
  pages: write
  id-token: write

# Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued.
# However, do NOT cancel in-progress runs as we want to allow these production deployments to complete.
concurrency:
  group: "deploy"
  cancel-in-progress: false

# Default to bash
defaults:
  run:
    shell: bash

jobs:
  # Build and deploy job
  build_deploy:
    environment: production
    runs-on: ubuntu-latest
    env:
      HUGO_VERSION: 0.120.4
    steps:
      - name: Install Hugo CLI
        run: |
          wget -O ${{ runner.temp }}/hugo.deb https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/hugo_extended_${HUGO_VERSION}_linux-amd64.deb \
          && sudo dpkg -i ${{ runner.temp }}/hugo.deb
      - name: Checkout
        uses: actions/checkout@v4
        with:
          submodules: recursive
          lfs: 'true'
      - name: Build with Hugo
        env:
          # For maximum backward compatibility with Hugo modules
          HUGO_ENVIRONMENT: production
          HUGO_ENV: production
        run: |
          hugo
      - name: Set up S3cmd cli tool
        uses: s3-actions/s3cmd@{version here}
        with:
          provider: aws
          region: 'us-east-1'
          access_key: ${{ secrets.S3_ACCESS_KEY }}
          secret_key: ${{ secrets.S3_SECRET_KEY }}
      - name: Sync site to S3
        run: |
          s3cmd sync --reduced-redundancy --delete-removed --cf-invalidate --cf-invalidate-default-index public/* {bucket url here}
```
This new CI/CD pipeline is pretty great - I can just add a new .md file to my list of posts, commit it, and push it, and my pipeline will automatically run and deploy the result.
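Day to day, publishing a post now looks roughly like this; the content path depends on your archetype setup, so treat it as an example:

```bash
# Draft a new post, preview it, then let the Actions pipeline deploy it on push.
hugo new posts/modernizing-this-site-ii.md
hugo server -D        # preview locally, including drafts
git add content/posts/modernizing-this-site-ii.md
git commit -m "Add new post"
git push origin master
```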
If you’re going to do this yourself, make sure to create an IAM user specifically for this GitHub Action that has the least privileges possible. In this case, I only gave it permissions to modify S3 and CloudFront.
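I won’t claim this is exactly my policy, but a least-privilege policy for that user can be scoped down to roughly the calls the pipeline makes; the bucket name, account ID, and distribution ID below are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListSiteBucket",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-site-bucket"
    },
    {
      "Sid": "WriteSiteObjects",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-site-bucket/*"
    },
    {
      "Sid": "InvalidateCache",
      "Effect": "Allow",
      "Action": ["cloudfront:CreateInvalidation"],
      "Resource": "arn:aws:cloudfront::123456789012:distribution/EXAMPLEID"
    }
  ]
}
```

Depending on how s3cmd resolves the distribution for `--cf-invalidate`, you may also need a CloudFront read permission such as `cloudfront:GetDistribution` or `cloudfront:ListDistributions`.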
One weird issue I ran into was that the theme was not being applied, even though it worked when testing locally. I opened up dev tools and saw that the stylesheet was being referenced correctly in the HTML files, and I double-checked in the Network tab that the stylesheet was actually being downloaded. This really stumped me for a while, until I noticed the `Content-Type` header on the response was set to `text/plain`. I dug into the S3 bucket settings and found that each object can have a custom `Content-Type` set, and my stylesheet had `text/plain` set there somehow, probably by the `sync` command. Using the GUI, I manually set the content type to `text/css`, and this corrected the issue and seemed to persist after subsequent deployments. It’s pretty interesting that the browser will only apply the CSS you’re asking it to apply if the `Content-Type` is set properly on the server side!
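If you hit the same thing, you can check and fix the object metadata from the command line instead of the console. The bucket and path here are placeholders, and I believe s3cmd’s `modify` command covers this, though I fixed mine through the GUI:

```bash
# See what Content-Type the site is actually serving for the stylesheet.
curl -sI https://example.com/css/style.css | grep -i content-type

# Inspect the object's metadata in S3, then rewrite its MIME type if it's wrong.
s3cmd info s3://my-site-bucket/css/style.css
s3cmd modify --mime-type=text/css s3://my-site-bucket/css/style.css
```

s3cmd’s `sync` and `put` commands also take `--mime-type` and `--guess-mime-type` options, which control what gets set in the first place.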
Anyways, this seems to be a much nicer experience overall, so there’s a good chance I’ll be posting here a bit more often.