AWS - GitLab

Upload GitLab CI artifacts to S3

Cover photo: Emma Döbken, 2018.

With GitLab CI it is incredibly easy to build a Hugo website (like mine); you can even host it there. In my case, however, I use AWS S3 and CloudFront because they are cheap and easy to set up. The CI pipeline that builds and uploads the static website is equally straightforward with the following .gitlab-ci.yml:

variables:
  GIT_SUBMODULE_STRATEGY: recursive

stages:
  - build
  - upload

build:
  stage: build
  image: monachus/hugo
  script:
    - hugo version
    - hugo
  only:
    - master
  artifacts:
    paths:
      - ./public

upload:
  stage: upload
  dependencies:
    - build
  image: dobdata/primo-triumvirato:v0.1.7
  script:
    - aws --version
    - aws configure set region $AWS_DEFAULT_REGION
    - aws s3 sync --delete ./public s3://$S3_BUCKET
  only:
    - master

The build stage generates the static website and shares it with subsequent stages as an artifact. The upload stage uses my primo-triumvirato image, but any image with the AWS CLI installed will do. The sync --delete ... command recursively copies new and updated files from the source directory to the destination bucket and deletes files that exist in the destination but not in the source.
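Because --delete removes objects from the bucket, it can be worth previewing a sync before letting the pipeline run it. The AWS CLI supports a --dryrun flag for this; a minimal sketch, where ./public and the bucket name are placeholders for your own paths:

```shell
# Preview what `sync --delete` would do, without touching the bucket.
# Prints one "(dryrun) upload: ..." or "(dryrun) delete: ..." line per
# planned operation; nothing is actually transferred or removed.
aws s3 sync --delete --dryrun ./public s3://my-bucket
```

Running this locally with your own credentials is an easy sanity check that the paths and bucket are what you expect before pushing to master.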

Make sure you add the following CI/CD variables to your GitLab project (the AWS credentials are best marked as masked and protected):

  • S3_BUCKET: the name of the S3 bucket
  • AWS_ACCESS_KEY_ID: provided by AWS
  • AWS_SECRET_ACCESS_KEY: provided by AWS
  • AWS_DEFAULT_REGION: the bucket region
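The access keys used here do not need full S3 access. A minimal IAM policy sketch for this sync job, under my own assumption of least privilege (the bucket name is a placeholder; the original post does not specify a policy):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListBucketForSync",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Sid": "WriteAndDeleteObjects",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```

s3:ListBucket lets sync compare the local tree against the bucket, while s3:PutObject and s3:DeleteObject cover the uploads and the --delete removals; scoping the keys this tightly limits the damage if they ever leak from CI.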