It took me a few hours, but I’m now able to push changes to this repo and automatically deploy a new site to Google Cloud Run. Previously, I did the build locally on my Mac and logged into the Google Cloud console to deploy the new version of the site. This worked fine because I had all of the dependencies installed locally. But to deploy automatically by pushing to GitLab, I had to create a multi-stage Dockerfile so the entire build runs within Docker.

This Dockerfile starts with a Node image, installs the required packages, then runs the build script. Next, the Jekyll stage takes the built output and generates the static HTML. Finally, the output from Jekyll is copied into an nginx image, which serves the website. After I got this working, I realized the Jekyll image already includes Node, so I can probably remove the first stage.


# Stage 1: install npm dependencies and run the build script
FROM node AS node_build
COPY . /src
WORKDIR /src
RUN npm install
RUN npm run build

# Stage 2: run Jekyll over the built output to generate the static HTML
FROM jekyll/jekyll AS jekyll
COPY --from=node_build /src /src/jekyll
RUN mkdir /src/jekyll/public
RUN chown -R jekyll /src/jekyll/public
WORKDIR /src/jekyll
RUN jekyll build

# Stage 3: serve the generated site with nginx
FROM nginx:alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=jekyll /src/jekyll/public /usr/share/nginx/html/
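The nginx.conf copied in above isn’t shown here. A minimal sketch of what that default.conf might look like for a static site on Cloud Run (Cloud Run sends requests to the port in $PORT, which defaults to 8080, so the server listens there instead of on 80 — the exact contents of my config are not reproduced):

```nginx
server {
    # Cloud Run's default request port is 8080, not 80.
    listen 8080;

    # Serve the static HTML that the Jekyll stage produced.
    root  /usr/share/nginx/html;
    index index.html;
}
```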

I had a bit of trouble at first because I forgot I had webpack installed globally. After I added it as a dependency in my package.json, the first phase of the build worked fine.
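For reference, the relevant part of package.json might look something like this after the fix — the version numbers are illustrative, not the ones I actually pinned, and the `build` script invoking webpack is an assumption based on the `npm run build` step above (webpack 4 also needs webpack-cli to run from a script):

```json
{
  "scripts": {
    "build": "webpack"
  },
  "devDependencies": {
    "webpack": "^4.29.0",
    "webpack-cli": "^3.2.0"
  }
}
```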

My next source of trouble was permissions. Strangely, the Jekyll container requires that the output directory be owned by the jekyll user. After that was fixed, builds worked OK. Originally I ran chown -R jekyll /src/jekyll, and that command took the longest of any step in my pipeline – I’m running on a slow NAS disk. After narrowing it to RUN chown -R jekyll /src/jekyll/public, speed improved dramatically.

Next, I needed to create a .gitlab-ci.yml file and pass in the credentials needed to build and upload the container using GitLab CI environment variables.


stages:
  - build
  - deploy

build:
  stage: build
  tags:
    - docker
  before_script:
    - docker info
  script: "./"

deploy:
  stage: deploy
  tags:
    - docker
  only:
    - master
  script:
    - echo $GCP_SERVICE_KEY > /tmp/$CI_PIPELINE_ID.json
    - gcloud auth activate-service-account --key-file /tmp/$CI_PIPELINE_ID.json
    - gcloud --project $GCP_PROJECT_ID config set run/region us-central1
    - gcloud auth configure-docker
    - docker push gcr.io/$GCP_PROJECT_ID/nginx-static
    - gcloud --project $GCP_PROJECT_ID beta run deploy nginx-static --image gcr.io/$GCP_PROJECT_ID/nginx-static --platform managed


# Tag each build with the current git hash as well as "latest".
# Assumes IMAGE is set to the image's registry path,
# e.g. gcr.io/$GCP_PROJECT_ID/nginx-static.
GIT_HASH=$(git log --pretty=format:'%h' -n 1)

docker build -t ${IMAGE}:${GIT_HASH} .
docker tag ${IMAGE}:${GIT_HASH} ${IMAGE}:latest
docker tag nginx-static:latest ${IMAGE}

GitLab runner

Finally, I had to dust off my old Ubuntu 16.04 GitLab runner, install Docker on it, and add the gitlab-runner user to the docker group so it has access:

sudo usermod -aG docker gitlab-runner

Then verify the runner is able to talk to Docker:

sudo -u gitlab-runner -H docker info

Next, I installed gcloud on the runner so it can invoke the commands in the deploy stage of the GitLab CI script above.

After all of that, I can deploy a new version of this blog with this incantation:

git push

And if all goes well, this blog post is automatically deployed!