Now that I have a Hugo instance up and running, I can add posts, edit posts etc. However, to release changes to my site I have to commit, push, build, and scp. I could script that, but that’s also what Continuous Deployment is for, and BitBucket has a Pipelines feature that lets me automate it.
What I really want to happen is that any time I make a change and push it up to my BitBucket repository, it gets deployed to production. Then I just live in git commands, which I can mostly run natively in VS Code.
I’ve Googled around and looked at setting up the pipeline. There are other ways of doing this, but my existing web hosting is a fairly retro affair, so I want to do a clean Hugo build and scp the output onto my server.
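For context, the manual routine this replaces looks roughly like this (the user, server and file names are placeholders, not the real ones):

```bash
# The commit, push, build, scp dance (names and paths are illustrative)
git add content/posts/new-post.md
git commit -m "Add new post"
git push
hugo --minify                    # build the site into public/
scp -r public/* user@server:/home/user/michael.jervis.co.uk/
```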
In the settings for my Git repository on BitBucket, I went in and enabled Pipelines. Once that’s done, I can just add a `bitbucket-pipelines.yml` file to my repo, and when code is pushed, the pipeline executes.
YAML is the de facto standard configuration language in the infrastructure world, so I have to grit my teeth and get on with it.
The Hugo docs have an article on hosting with a specific platform, Aerobatic, which covers the basics. I also saw this one deploying to S3.
With a bit of cut-and-shut I have this for the build:
```yaml
image: atlassian/default-image:2
options:
  max-time: 5
pipelines:
  default:
    - step:
        name: Build Hugo
        script:
          - apt-get update -y && apt-get install wget
          - apt-get -y install git
          - wget https://github.com/gohugoio/hugo/releases/download/v0.88.1/hugo_0.88.1_Linux-64bit.deb
          - dpkg -i hugo*.deb
          - git config --global url."https://".insteadOf git://
          - git submodule update --init --remote
          - hugo --minify
        artifacts:
          - public/**
```
I’ve upgraded the version of Hugo installed to the version that I’m using locally. One key thing here was to remember to include the `git submodule update --init --remote` to bring in the theme. If you have added the theme directly into your git repo, you won’t need to do this; if you’re not using a submodule for your theme, you may need another mechanism (there’s a sketch of the submodule approach below). I’ve also had to add `git config --global url."https://".insteadOf git://` to force git to fetch the submodule over https rather than the git protocol. It used to work over the git protocol, but I’ve recently found that BitBucket Pipelines was failing to connect to GitHub with the following error:

```
fatal: unable to connect to github.com:
github.com[0: 192.30.255.113]: errno=Connection timed out
```
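For illustration, this is roughly how a theme ends up as a submodule in the first place (the theme here is just an example, not necessarily the one this site uses), and why the rewrite matters:

```bash
# Hypothetical example: adding a Hugo theme as a git submodule over https
git submodule add https://github.com/theNewDynamic/gohugo-theme-ananke.git themes/ananke

# That records an entry in .gitmodules along these lines:
#   [submodule "themes/ananke"]
#       path = themes/ananke
#       url = https://github.com/theNewDynamic/gohugo-theme-ananke.git
#
# If that url (or one inside the theme itself) were git://github.com/..., the
# url."https://".insteadOf git:// rewrite makes git fetch it over https instead.
```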
That gives me a working build with the artifacts on my BitBucket pipeline; I just then need to deploy them to my production server.
Atlassian provide a base model for doing an SCP in a pipeline, and it’s very nicely documented. So I created a couple of encrypted variables, generated a new key, installed the public key on my server, validated my server fingerprint (the rough shape of those steps is sketched after the snippet), then stuck this in:
```yaml
- step:
    name: Deploy artifacts using SCP to PROD
    deployment: production
    script:
      - pipe: atlassian/scp-deploy:0.3.3
        variables:
          USER: $USER
          SERVER: $SERVER
          REMOTE_PATH: '/home/$USER/michael.jervis.co.uk/'
          LOCAL_PATH: 'public/**'
```
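For reference, the key and known-host setup was roughly along these lines (file names are illustrative, and the key pair can also be generated directly in the repository’s Pipelines SSH keys settings):

```bash
# Generate a key pair for the pipeline to use (or generate/upload one under
# Repository settings > Pipelines > SSH keys in BitBucket)
ssh-keygen -t ed25519 -f pipelines_deploy_key -N ""

# Install the public key on the production server so the pipe can log in
ssh-copy-id -i pipelines_deploy_key.pub "$USER@$SERVER"

# Print the server's host key so the fingerprint BitBucket shows for the
# known host can be checked against what the server actually presents
ssh-keyscan "$SERVER"
```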
So this means any push to any branch will result in a build and deploy to production, which in the future I may not want. In that case I’ll need to update my `bitbucket-pipelines.yml` to only trigger the deployment when I’m on my production branch, so something like:
```yaml
image: atlassian/default-image:2
options:
  max-time: 5
pipelines:
  branches:
    production:
      - step:
          name: Build Hugo
          script:
            - apt-get update -y && apt-get install wget
            - apt-get -y install git
            - wget https://github.com/gohugoio/hugo/releases/download/v0.88.1/hugo_0.88.1_Linux-64bit.deb
            - dpkg -i hugo*.deb
            - git config --global url."https://".insteadOf git://
            - git submodule update --init --remote
            - hugo --minify
          artifacts:
            - public/**
      - step:
          name: Deploy artifacts using SCP to PROD
          deployment: production
          script:
            - pipe: atlassian/scp-deploy:0.3.3
              variables:
                USER: $USER
                SERVER: $SERVER
                REMOTE_PATH: '/home/$USER/michael.jervis.co.uk/'
                LOCAL_PATH: 'public/**'
```
Super 😄
I’m still not happy about the state of minification, so that needs a bit of looking into.