I recently realized that this blog is one of the longest-running software projects, if not the longest, that I currently maintain. While some of the work I created in the late 2000s is still out there, my impact on those projects wasn’t as large, and I certainly don’t contribute to any of them any more.
So to celebrate its 10th year, I took a stroll down memory lane to talk about some of the most prominent attributes of this site.
Blogging history
While this is my longest software project, my blogging history itself is much longer, as I started in the early 2000s. Even though that was some time ago (almost a quarter of a century, actually), times were surprisingly similar. Blogs, or rather personal websites, as the term wasn’t really that popular yet, were either self-hosted or published on one of the existing blogging platforms. At the time, the leading software provider was Movable Type (which, TIL, is still alive today), and it would be a few more years before WordPress was created.
And as the blogosphere flourished, so did my writing. Over the years I created, hosted, and wrote literally dozens of blogs, mixing both personal and professional topics. But I always had this site, my personal site. It evolved multiple times as I was learning the craft of web development:

- Starting out under a free domain, hosted on my personal computer at home, its contents consisted mostly of movie reviews and some content about me
- Then renamed to lebkowski.info, with 3 or 4 major redesigns along the way, it was mostly a blog about me, my travels & adventures, web development, and interacting with the blogosphere in general
- And some time between late 2013 and early 2014 it was replaced by a static one-pager, lebkowski.name, which was a precursor to this site
Switching from PHP engine to static site
I can’t remember the reasoning behind that decision, but I decided to scrap all the existing content on my site and replace it with a simple static page. There was no blog or articles, so it was fitting to publish it as a static HTML site. I remember being inspired by another site when creating the layout, and using some kind of fancy editor that could automatically compile assets, synchronize multiple devices with hot reloading (it used an early version of Browsersync or a similar solution), and automatically deploy the built website over FTP and the like.
At the time, we were using Less as a CSS preprocessor at Docplanner, so that was my technology of choice for personal projects as well.
When it comes to deployment, I can’t remember how this static HTML was delivered to production (it might have been a manual process), but 2014 was the early days of Docker. I was fascinated by this new way of thinking and the possibilities it opened for PaaS solutions. Soon I adopted Dokku to build and host all my projects, thrilled by the simplicity of the build process it introduced (similar to Heroku).
Not long after, I decided to bring articles back, but I wanted the site to remain static, so I moved the engine to Sculpin (a PHP static site generator) — I wrote content in Markdown, Dokku built and released, DigitalOcean hosted. This was in mid-2014, and this same skeleton, in the same git repository, lives on and powers the site to this day. But there sure have been changes since then!
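For a sense of how simple the authoring side is, a Sculpin post is just a Markdown file with YAML front matter dropped into the source tree. The fields below are illustrative rather than copied from my repository, but they follow Sculpin’s Jekyll-like conventions:

```markdown
---
title: Hello, world
tags: [meta]
---

The body is plain Markdown; Sculpin turns it into static HTML at build time.
```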
Feeds
RSS, contrary to popular belief, is not dead, so it was a founding block of every blog I built. This site was no exception. Moreover, cool URIs don’t change. This is why if someone subscribed to my site’s feed around 2005, their subscription would still work today, nearly 20 years later.
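Keeping those old addresses alive mostly comes down to never deleting them and redirecting when the structure does change. A hypothetical nginx rule (the paths are made up, not my actual configuration) is all it takes:

```nginx
# Hypothetical example: permanently redirect a legacy feed URL to the current one.
location = /blog/rss.xml {
    return 301 https://$host/atom.xml;
}
```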
Do I miss out on analyzing visitor traffic by allowing my content to be consumed on different platforms this way? Technically, yes. But also: I removed visitor tracking altogether when Google tried to force a migration to Analytics v4, and I’ve been living happily without knowing the numbers ever since.
Using media in my content
At some point I hit an obstacle: how to embed rich content, like YouTube videos, in content written in Markdown. While Markdown technically allows mixing in HTML, I did not want that and opted for a simpler option: just link to the content.
You know, back in 2008, on the wave of the Web 2.0 hype, a developer named Leah Culver proposed a standard protocol for sharing and embedding rich web content: oEmbed. This allowed me to just write a paragraph with a URL, and with some Embedly magic it was automatically turned into a rich embed. It is based on open standards and supports any data provider (and, with Embedly’s help, even some that do not support it natively).
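The mechanics are pleasantly simple: the provider (or a proxy like Embedly) exposes an endpoint that returns JSON describing the embed, including a ready-to-use html snippet. The VanillaJS sketch below shows the bare protocol using YouTube’s public oEmbed endpoint; in practice Embedly’s script did the heavy lifting for me, not least because not every provider allows cross-origin requests from the browser.

```javascript
// Minimal oEmbed flow: ask the provider for embed metadata for a given URL,
// then swap the plain link for the returned HTML snippet.
async function embed(link) {
  const endpoint =
    'https://www.youtube.com/oembed?format=json&url=' +
    encodeURIComponent(link.href);

  const response = await fetch(endpoint);
  if (!response.ok) return; // leave the plain link in place on failure

  const data = await response.json(); // { type, title, html, ... }
  const wrapper = document.createElement('div');
  wrapper.innerHTML = data.html; // provider-supplied <iframe> markup
  link.replaceWith(wrapper);
}

// Hypothetical usage: upgrade every paragraph that is just a YouTube link.
document.querySelectorAll('p > a[href*="youtube.com/watch"]').forEach(embed);
```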
Search
At one point I integrated with Algolia to provide a search feature. I was using it heavily for commercial purposes, and it seemed fitting for this site as well. I pushed the index at build time and used the JS SDK to provide the UI on the site. Unfortunately, there was little adoption from users, so ultimately I dropped it — and haven’t thought of it since.
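For the curious, the frontend half was not much more complicated than the sketch below (the credentials and index name are placeholders, the real UI had a bit more polish, and it assumes the algoliasearch client script is already loaded):

```javascript
// Rough sketch of the search wiring, in the style of the Algolia JS client v4.
// 'APP_ID', 'SEARCH_ONLY_KEY' and 'blog_posts' are placeholders.
const client = algoliasearch('APP_ID', 'SEARCH_ONLY_KEY');
const index = client.initIndex('blog_posts');

document.querySelector('#search').addEventListener('input', async (event) => {
  const { hits } = await index.search(event.target.value);
  document.querySelector('#results').innerHTML = hits
    .map((hit) => `<li><a href="${hit.url}">${hit.title}</a></li>`)
    .join('');
});
```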
The frontend revolution
I mentioned opting for Less in the beginning. Unfortunately, that decision did not age well, as it was ultimately Sass which won the preprocessor wars. I was late to the party and only got around to switching in 2020. Along with a redesign, I introduced two major improvements:
- I rewrote the styles in SCSS (and added Browsersync to the stack)
- And I made the site mobile-first (sketched below). The site was responsive from the start, but it was built desktop-first. It took me long enough to make this change that, by the time I did, people were already starting to question whether we should always start with mobile
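To make the second point concrete, mobile-first simply means the small-screen styles are the default and larger screens opt in via min-width media queries. An illustrative (not actual) SCSS snippet:

```scss
// Illustrative mobile-first pattern: small screens get the base styles,
// larger screens are an opt-in via min-width media queries.
.post {
  padding: 1rem;
  font-size: 1rem;

  @media (min-width: 48em) {
    // tablet and up
    padding: 2rem 4rem;
    font-size: 1.125rem;
  }
}
```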
So the frontend stack was modernized about 4 years ago and it holds up remarkably well to this day (I even have a component library built, in case I ever want to go through a redesign once more). Part of the reason is that there is almost no JavaScript used, and the little there is was written in VanillaJS, so no webpack/babel is necessary.
Speaking of JavaScript: as a heavy Reddit user at the time (hello, RES) I relied a lot on keyboard navigation. And I thought it would be an obscure but useful feature for my site as well: did you know that you can jump between content sections by pressing the K and J keys (not mobile friendly, I’m afraid)? You can try it now.
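The whole feature is a handful of lines of VanillaJS along these lines (a simplified sketch; the .post selector for content sections is an assumption, not my exact markup):

```javascript
// Simplified sketch of J/K navigation between content sections.
document.addEventListener('keydown', (event) => {
  if (event.key !== 'j' && event.key !== 'k') return;
  if (/INPUT|TEXTAREA/.test(document.activeElement.tagName)) return; // don't hijack typing

  // '.post' as the section selector is an assumption for illustration.
  const sections = [...document.querySelectorAll('.post')];
  const current = sections.findIndex(
    (section) => section.getBoundingClientRect().top >= 0
  );
  const next = event.key === 'j' ? current + 1 : current - 1;

  sections[Math.max(0, Math.min(next, sections.length - 1))]
    ?.scrollIntoView({ behavior: 'smooth' });
});
```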
Accelerated mobile pages
Quite soon after its initial release I jumped on the AMP bandwagon. I thought it was an interesting standard. Fortunately it didn’t take me long to see it for what it really was — an attack on the open web — and I removed it a few months later. It took Google about 5 years before they ultimately backed down too and stopped pushing this agenda.
I never needed AMP. It wasn’t magic. It just cut the fat from multi-megabyte websites. Mine’s lean and fast without any help.
Secure by default
I don’t use an infrastructure-as-code approach here, so I can’t track exactly when it happened, but at some point I decided to switch fully to HTTPS. I think I must’ve had some paid certificates earlier, but by early 2015 I had certainly switched to Let’s Encrypt and automated the whole ordeal. This was before Caddy or Traefik made it automatic, so I remember scripting it all together to work with my dokku’s nginx.
At that time I was already using SSL for local development, so switching production was a no-brainer. Eventually I was able to upgrade my ancient version of dokku and use the letsencrypt plugin, which works out of the box. I was also able to switch from the HTTP to the DNS challenge, which has proven to be much more reliable in my case.
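From memory, and simplified, the whole ordeal now boils down to a couple of commands (exact subcommands differ between plugin versions, and the app name is a placeholder):

```bash
# Hypothetical, simplified setup with the dokku-letsencrypt plugin.
sudo dokku plugin:install https://github.com/dokku/dokku-letsencrypt.git
dokku letsencrypt:set my-blog email me@example.com   # newer plugin versions
dokku letsencrypt:enable my-blog                     # obtain and install the certificate
dokku letsencrypt:cron-job --add                     # schedule automatic renewals
```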
Indie web
Some of the most recent additions are the indie web improvements. I think I have always followed the spirit of that movement, although not necessarily in any formal way.
For example, in the mid-2000s, a protocol named OpenID was introduced and widely adopted. This allowed me to turn my site into my identity provider. Before „login with facebook” or „login with google” links, you could „sign in with your URL”, and I took advantage of this. Unfortunately, the adoption withered and died, so I no longer use it. But I keep it in the back of my head, and whenever a similar solution surfaces, I will be ready to switch.
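For those who never saw it in action: delegation was just a couple of link elements in the page head, pointing at whichever OpenID provider you trusted (the provider URLs below are placeholders):

```html
<!-- Hypothetical OpenID 2.0 delegation: my URL is the identity,
     the actual authentication is delegated to a provider. -->
<link rel="openid2.provider" href="https://openid.example-provider.com/server">
<link rel="openid2.local_id" href="https://lebkowski.name/">
<!-- Older OpenID 1.x consumers looked for these instead. -->
<link rel="openid.server" href="https://openid.example-provider.com/server">
<link rel="openid.delegate" href="https://lebkowski.name/">
```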
Other examples of indie web elements are the use of the Semantic Web in the form of JSON-LD (a successor to the once-popular RDF), microformats, and even such unnoticeable details as using the <time> element to mark up dates. This makes the site’s content richer for any kind of automated tool and allows seamless integration in other places. I originally did this so that links shared on Slack or social media had a more pleasant form.
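A tiny example of how little it takes, using the microformats2 h-entry vocabulary (the markup and date are illustrative, not lifted from my templates):

```html
<!-- Illustrative h-entry markup: machine-readable without changing how it looks. -->
<article class="h-entry">
  <h1 class="p-name">Ten years of this website</h1>
  <p>
    Published on
    <time class="dt-published" datetime="2024-06-01">1 June 2024</time>
  </p>
  <div class="e-content">…</div>
</article>
```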
Then there are webmentions. I’m on the fence about the whole liking / commenting / pinging thing. I don’t engage with the community as much these days, so these features are mostly dormant, but they are there.
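Wiring webmentions up is mostly a matter of advertising an endpoint in the page head; the endpoint below is a placeholder, not necessarily the service I use:

```html
<!-- Hypothetical webmention endpoint advertisement, per the W3C Webmention spec. -->
<link rel="webmention" href="https://webmention.example.com/lebkowski.name/webmention">
```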
GitHub Actions
And finally, after upgrading dokku last year and replacing my legacy DigitalOcean droplet with a brand new $5 one, I decided to switch the build process completely. The buildpack approach was interesting, but caused a lot of maintenance headaches — the buildpacks became outdated or went missing, and it felt like I didn’t have the process under control. I didn’t have the confidence that I would be able to recreate it easily using a more modern and open toolset.
So the first step was to switch to Dockerfile builds. They still relied on dokku, but used Dockerfiles — a standard I knew and could trust, and one that was not proprietary to the dokku ecosystem. And from there it was just one step to extract the build process out of dokku entirely.
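The Dockerfile itself is nothing exotic. A simplified sketch of the multi-stage idea (image tags and paths are illustrative, not copied from the real file): generate the site with PHP and Sculpin, then ship only the output in a small nginx image.

```dockerfile
# Illustrative multi-stage build: generate the static site, then serve it.
FROM composer:2 AS build
WORKDIR /app
COPY . .
RUN composer install --no-dev --prefer-dist \
 && vendor/bin/sculpin generate --env=prod

FROM nginx:alpine
COPY --from=build /app/output_prod/ /usr/share/nginx/html/
```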
Over the weekend I moved it to GitHub Actions. It still uses the same Dockerfiles, but now it just pushes an image to the container registry and triggers dokku to rebuild. As a side effect, I can now automatically deploy any branch to a staging environment, which is automatically provisioned (with SSL from Let’s Encrypt) and decommissioned after I delete the branch.
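Roughly, the workflow looks like the sketch below; the registry, host, and app names are placeholders, and the real one has a few more steps (loading the SSH key, for instance):

```yaml
# Simplified sketch of the deploy workflow: build the image, push it,
# then ask dokku to deploy from that image.
name: deploy
on:
  push:
    branches: [main]

permissions:
  contents: read
  packages: write

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - uses: docker/build-push-action@v6
        with:
          context: .
          push: true
          tags: ghcr.io/${{ github.repository }}:${{ github.sha }}

      # Hypothetical deploy step: SSH key setup is omitted for brevity.
      - name: Deploy via dokku
        run: |
          ssh dokku@my-droplet.example.com \
            git:from-image my-blog ghcr.io/${{ github.repository }}:${{ github.sha }}
```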
Most elements of this process are replaceable:
- Can I switch Sculpin to any other static site generator? It won’t be easy to maintain the existing structure, but I still can — I just need to update the Dockerfile build instructions afterwards
- I can replace GitHub Actions with any other CI server to build the artifact (a Docker image)
- And finally, I can host that image on any platform that supports Docker, which is virtually any platform today
I feel that while the stack is understandably more complex than it was a couple of years ago, it is also more robust and resilient. Let’s hope for another ten years together.
Back to blogging
That final push had a strong reason behind it. I wanted to return to more short-form blogging. Currently, I write most of my content in a dedicated Markdown editor and then commit it to the site’s git repository (and push to release). This requires me to be at a laptop.
I wanted to be able to write more freely: use my note-taking app or Prose on any device I choose. But since my site is still static and has no content management system, I needed a way of publishing notes from those places. I opted to save them to Dropbox, which in turn uses a webhook to trigger the GitHub Actions build workflow — and there, a simple automation fetches the notes from storage before Sculpin builds the site.
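One way to express that wiring, and roughly the shape of what I ended up with, is a repository_dispatch trigger on the existing workflow plus a small step that pulls the notes down before the build. The event name, secret, and sync script below are placeholders:

```yaml
# Sketch: let an external webhook kick off the same build. Something small
# (for example a serverless function receiving Dropbox's webhook) has to call
# GitHub's API and send this repository_dispatch event.
on:
  push:
    branches: [main]
  repository_dispatch:
    types: [notes-updated]   # hypothetical event name

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Hypothetical step: pull the latest notes into the content tree
      # before Sculpin generates the site.
      - name: Fetch notes from storage
        run: ./bin/fetch-notes.sh
        env:
          DROPBOX_TOKEN: ${{ secrets.DROPBOX_TOKEN }}

      # ...then the usual build and deploy steps from the earlier sketch.
```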
This separation allows me to do just that, and it is now working and live. What remains is the hope that I find the motivation to write more often 🤞