Compressing Static Webpages With GitLab Pages
As great as GitLab Pages is, you have to do a lot of things yourself. One such thing is minification and compression, which cut down the bandwidth used when accessing my site. While I can tell Hugo to minify its output, there’s no such option for compressing the site. Thanks to the power of bash, I was able to compress the entire site’s output using the find command.
- hugo -v --minify
- find public/ -regex ".*\.\(html\|css\|js\|xml\|svg\|pdf\)" -exec gzip -k9 {} +
It searches the public/ directory (where Hugo outputs sites by default) using a regex to find all the files with the given extensions. I could have just run gzip on every file in the directory, but images don’t compress well with gzip1. The -exec action then runs gzip on all the matched files at the highest compression level (-9). The best part is that the original files are kept (that’s the -k flag) in case a client doesn’t support gzip.
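For context, those two lines sit in the script section of the pages job in .gitlab-ci.yml. A minimal sketch of the whole job, assuming GitLab’s registry.gitlab.com/pages/hugo image (any image that ships Hugo and gzip will do), looks roughly like this:

pages:
  image: registry.gitlab.com/pages/hugo:latest
  script:
    # Build the minified site into public/
    - hugo -v --minify
    # Gzip text-based assets at maximum compression, keeping the originals (-k)
    - find public/ -regex ".*\.\(html\|css\|js\|xml\|svg\|pdf\)" -exec gzip -k9 {} +
  artifacts:
    paths:
      - public

GitLab Pages serves whatever ends up in the public artifact, and when a .gz copy sits next to the original it can hand that to clients that send Accept-Encoding: gzip.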
Gzip reduced the webpages by about 25%. There’s a better compression scheme out there called Brotli, made by Google, which I wanted to use. The issue is that it’s a bit slower, which makes it most useful for fonts (which I don’t use), and GitLab Pages can’t reply to requests with Brotli anyway.
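For what it’s worth, if GitLab Pages ever starts serving Brotli, swapping it in would just mean changing the compression command in that same script line, assuming the brotli CLI is installed in the build image (the stock Hugo one doesn’t include it):

- find public/ -regex ".*\.\(html\|css\|js\|xml\|svg\|pdf\)" -exec brotli -k -q 11 {} +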
Enabling gzip is part of my broader strategy to make my website faster, and it’s probably the most significant change I’ve made to improve my site’s performance.
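A quick way to check that the precompressed copies are actually being served is to request a page with gzip allowed and look at the response headers (the URL here is a placeholder, swap in your own):

curl -sI -H "Accept-Encoding: gzip" https://example.gitlab.io/index.html | grep -i content-encoding

If everything is wired up correctly, that should print a content-encoding: gzip line.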
The same is true for binary file formats in general. I include PDFs here because gzipping actually made the one PDF on my site smaller. ↩︎