So, it has come to this again, huh?
Last year, in 2022, I moved my blog from Hexo, an old static site generator written in Node.js, to Grav, a self-proclaimed minimalist flat-file CMS. At the time, I had tinkered with Grav a bit and become confident that I could make something cool out of it.
For a year or so, I did. However, as time went on, I realized two (well, three) things about my actual workflow when writing blog posts.
Number 1: I do not actually use a browser that often
I started using Grav on the premise that I wanted to move away from scattered repositories on multiple computers, each holding a different version of my draft posts. I wanted a tool I could just open on any computer and start writing with.
The problem with that premise is that:
- I actually use the same couple of machines for writing, and I almost always configure them with credentials for my git forge and the right GPG keys to sign commits with
- Opening a browser is a hassle at times
- Assets still end up scattered (for my SNCF blog post, for example, I repeatedly had to wait until I got home to fetch screenshots)
- Now that I am in a much more stable living situation, I have solid and resilient infrastructure to synchronize code (my forge, Nextcloud), meaning that out-of-sync code is not a problem
Number 2: Grav is actually quite a lot of work
Right off the bat, I had to do a not-insignificant amount of testing just to get my posts in the order I wanted. At first I ignored it, and after a while things simply stayed in the right order, but it was a first taste of things to come.
A lot of things thankfully work out of the box, and I think Grav would best suit someone who wants a... simpler, fancier solution.
I want high levels of customization and control, and I increasingly found myself having to dig into raw Markdown files on Grav just to achieve things as simple as excluding draft posts from feeds.
In the end, one last point pushed me back to the old philosophy that had me use Hexo in the first place...
Number 3: Static sites are resilient
About a month ago, I read a post on my timeline about a weird phenomenon in web preservation. The late Aaron Swartz (assassinated by the copyright industry; may he rest in peace) had written on his blog about the unfortunate passing of Gene Kan, one of the pioneering developers of the Gnutella file-sharing protocol. Aaron noted at the time that Gene's blog was barely preserved, and only static pages archived earlier could still be accessed. Yet the reason we can read Aaron's own words to this day is that he used a static site generator, something that produced pure static files, unlikely to break even if nobody ever deployed a new version again. His blog is still online.
I have known my fair share of websites, old forums from the 2000s and 2010s, that are now completely broken and inaccessible. I even helped rescue one of them, for a school club, as it was absolutely falling apart. Nothing is forever, but non-static websites are shifting sand, and you take a notable risk by depending on them.
Now, of course, I am not some grey muzzle contemplating the looming threat of death, but I am about the age Aaron was when he died, and I cannot help but think that, perhaps, the simplest solutions to my problems are also the ones most likely to age well.
A newly adopted static site generator: Zola
I dabbled a bit with static site generators, first Hugo, then Zola. My affinity for Rust (and unstable projects!) made me pick the latter.
It's janky in a way that I appreciate, in that it forces me to write templates myself (I based my template on the Tilde template, but heavily customized just about every page). That fits my workflow of putting time into things I can fully customize and dissect, to the point where I have now written most of the raw HTML in this very page's template and can debug it if needed.
Zola, as a static site generator, is cool in that I can now rely on my infrastructure to automatically deploy a new version whenever a commit lands in my repository... which is neat! Automation is good, and it actually fits into my workflow as well.
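For the curious, the whole deployment boils down to something like the following. This is a minimal sketch rather than my exact setup: the web root path is a placeholder, and in practice my forge's CI runs this on every push.

```sh
#!/bin/sh
# Build the site and publish the result.
# `zola build` writes the generated static files to ./public by default.
set -eu

zola build

# /var/www/blog is a placeholder for wherever the reverse proxy serves files from.
rsync -av --delete public/ /var/www/blog/
```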
So, we will see how things go. I will just keep on doing my silly little things, and you can keep on reading them.
oh but actually
Using Zola introduced a couple of changes:
- As for breaking changes, the only thing I could not fully fix was the RSS feed URLs. I added a redirection rule in my reverse proxy to map the old paths (see the sketch after this list), but I encourage you, potential reader of this feed, to update your URLs now
- I have an about me page now. It's quite barren, but I think it gets the important points across.
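For illustration, that redirection rule looks roughly like this, assuming an nginx reverse proxy; the old Grav-era feed path and the new one are stand-ins for my actual URLs (Zola names its feed atom.xml by default).

```nginx
# Permanent redirect from the old Grav feed path to the new Zola feed.
# Both paths are placeholders; adjust to the real old and new URLs.
location = /blog.rss {
    return 301 /atom.xml;
}
```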
That's it, see you around the web!