Weeknote 12: In the weeds

Last time I thought I had a trick in my pocket to ensure that I get some writing done:

“The solution I’m going to try is free writing, similar to morning pages: set a timer for ten minutes and just write whatever is on your mind.”

It did not quite work the way I hoped. Free writing is great, but last week my work was deep in the weeds, and I can’t write about it here in detail.


Instead, I’m going to tell you about my Mastodon data exporter.

Before Mastodon had search, I wanted to be able to search my own toots¹. I made some scripts to download the data locally so I could search it in my text editor.

I’m using the “git scraping” technique described by Simon Willison. I made a Python script that fetches all my statuses via Mastodon’s API and stores them in a local JSON file. Then I made a private GitHub repo and a scheduled GitHub Actions job that runs the script and commits the result to the repo.
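A minimal sketch of the fetching side, assuming the requests library; the instance URL, account ID, and file name are placeholders rather than my actual setup, and you’d need an access token for anything non-public:

```python
# fetch_statuses.py: hypothetical sketch of a Mastodon status exporter.
# INSTANCE, ACCOUNT_ID, and OUT_FILE are placeholders.
import json

import requests

INSTANCE = "https://mastodon.example"  # your instance URL
ACCOUNT_ID = "123456"                  # your numeric account ID
OUT_FILE = "statuses.json"


def fetch_all_statuses():
    statuses = []
    max_id = None
    while True:
        params = {"limit": 40}  # 40 is the API's per-page maximum
        if max_id:
            params["max_id"] = max_id  # page backwards through older posts
        resp = requests.get(
            f"{INSTANCE}/api/v1/accounts/{ACCOUNT_ID}/statuses",
            params=params,
            timeout=30,
        )
        resp.raise_for_status()
        page = resp.json()
        if not page:
            break
        statuses.extend(page)
        max_id = page[-1]["id"]  # oldest status on this page
    return statuses


if __name__ == "__main__":
    with open(OUT_FILE, "w", encoding="utf-8") as f:
        json.dump(fetch_all_statuses(), f, ensure_ascii=False, indent=2)
```

The git-scraping part is then just running this on a schedule and committing statuses.json whenever it changes.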

Whenever I want to search my posts, I pull the repo. I’ve got another script that formats the data into a Markdown file, and I’ve also imported the data into a SQLite file. The SQLite file is useful when I want to find, for example, my most popular post.
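The SQLite side can be sketched with just the standard library; the file names here are illustrative, but favourites_count and reblogs_count are real fields in Mastodon’s status JSON:

```python
# to_sqlite.py: sketch of importing the exported JSON into SQLite.
# File names are placeholders.
import json
import sqlite3

with open("statuses.json", encoding="utf-8") as f:
    statuses = json.load(f)

conn = sqlite3.connect("statuses.db")
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS statuses (
        id TEXT PRIMARY KEY,
        created_at TEXT,
        content TEXT,
        favourites_count INTEGER,
        reblogs_count INTEGER
    )
    """
)
conn.executemany(
    "INSERT OR REPLACE INTO statuses VALUES (?, ?, ?, ?, ?)",
    [
        (s["id"], s["created_at"], s["content"],
         s["favourites_count"], s["reblogs_count"])
        for s in statuses
    ],
)
conn.commit()

# "Most popular post" then becomes a one-liner:
row = conn.execute(
    "SELECT content, favourites_count FROM statuses "
    "ORDER BY favourites_count DESC LIMIT 1"
).fetchone()
print(row)
```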

I haven’t shared the scripts because the scripts and the data are all mixed up in one repo. Maybe one day. In any case, it’s nice that GitHub provides free infrastructure you can use to automate tasks like this.

Photo: Empty plastic water jugs lying in high grass, illuminated by autumn sun.


  1. Maybe it was already possible to search your own toots in Mastodon back then; in any case, I learned about it only after I had built the exporter. ↩︎


Comments or questions? Send me an e-mail.

