Having fun isn't hard, when you've got a library card

I’ve recently found myself guilty of doomscrolling. When I have a spare moment, I’ll pick up my phone and mindlessly scroll through Twitter, or a news website, and read or re-read about the most horrible things that are happening in the world today. I caught myself doing it, and I wondered how much of my time I’m wasting on things other people want me to read, rather than choosing to read things I’m actually interested in.

Library

When I was a kid, I used to subscribe to magazines. One in particular was NGC, an unofficial Nintendo magazine from the Gamecube era. Remembering their website, gamesradar.com, I went to see what’s there now and found an article from Retro Gamer magazine about the making of TimeSplitters, a series of games I owned and loved. The experience of finding and reading something because I wanted to was a joyful and calming feeling.

I recently discovered an app called Libby, made by a company called Overdrive. Libby is a digital library catalogue which offers you eBooks, audiobooks and magazines for free through your local library. You just sign up with your library card. After reading that article, I searched for Retro Gamer in Libby and subscribed. I also found some other magazines in the technical, gaming and music categories that I’d bought before, and subscribed to them, all for free. I decided that instead of opening Twitter in my moments of downtime, I’d open a magazine.

Using Libby is an excellent experience. Magazines can be viewed as a digital version of the print layout, scrolled and zoomed, but you can also select the articles on each page to have them presented in a re-flowed “reader” view, like you see on most modern web browsers. As I’m mainly using Libby on my phone, this works very nicely.

Libby can be installed on your mobile device, but it’s also a web app. Going to libbyapp.com gives you 90% of the same experience, and it syncs between devices. I can’t recommend it highly enough.

Reading a publication designed for A4 sheets of paper on my phone, I of course started mulling over the possibility of a tablet or e-reader for this purpose. But the reason this works so well for me is that my phone is pocketable, so it’s always there in those moments of downtime.

The app aside, reading magazines again is a breath of fresh air. In the 15 years since I last had a subscription, the web has become an appalling place to consume content (the irony is not lost on me, dear reader). A million things are trying to take your attention, get your consent, harvest your email address, get you to go to this page instead of that one… with a magazine, it’s just you and an article.

Migrating password store to self-hosted Bitwarden

I’ve been using pass (password store) to manage my passwords for several years. I really like its simplicity: it’s just a shell script that relies on gpg and git to handle the encryption and synchronisation. However, the apps and browser extensions are all third-party, so they’re of varying quality. Also, using it across devices requires you to manage your gpg keys across those devices, something I’ve never managed with 100% success.

I’ve been hearing a lot about Bitwarden recently. It’s another open-source password manager with free, premium, hosted and self-hosted options; it doesn’t rely on gpg, and the apps are developed by the same project, so they’re far more consistent. I thought I’d give it a go!

Self-hosting Bitwarden is fairly involved; it’s not just “snap install bitwarden”. However, it’s a very well documented process and I didn’t have any issue getting it running after an evening’s tinkering. The result is a cluster of about a dozen docker containers, which provide a web UI and the API for the apps.

As with my other self-hosted services, I have this running behind an nginx reverse proxy to have everything running over HTTPS on standard ports.
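For anyone doing the same, a server block along these lines is all it takes. This is a sketch: the hostname, certificate paths and upstream port here are assumptions, so substitute your own.

```nginx
server {
    listen 443 ssl;
    server_name bitwarden.example.com;  # hypothetical hostname

    ssl_certificate     /etc/letsencrypt/live/bitwarden.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/bitwarden.example.com/privkey.pem;

    location / {
        # Forward to the container's published port (8080 is an assumption)
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

Because nginx terminates TLS, the container itself only needs to speak plain HTTP on a high port.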

This all worked well, but I’m running it on my HP Microserver, and it was feeling the strain a bit. Running 12 containers, including an MSSQL database, might make sense if you’re running a service that needs to scale up to serve a company’s users, but for a single user or a few family members it’s a bit excessive.

Luckily, there is an alternative. Bitwarden_rs is a third-party implementation of the Bitwarden API, so it’s compatible with the apps, but it runs as a single, much lighter container and uses SQLite by default. It’s also very well documented, and I set it up to run on boot using systemd-docker. I mapped the container to the same port as the original Bitwarden, so my nginx config pointed at the new service with no changes.
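A unit along these lines does the job. Again, a sketch: the binary location, data path and port mapping are assumptions, and the image tag is the one bitwarden_rs published at the time.

```ini
[Unit]
Description=bitwarden_rs
After=docker.service
Requires=docker.service

[Service]
# systemd-docker wraps `docker run` so systemd tracks the container itself
Type=notify
NotifyAccess=all
ExecStartPre=-/usr/bin/docker pull bitwardenrs/server:latest
ExecStart=/usr/local/bin/systemd-docker run --rm --name %n \
    -p 8080:80 -v /srv/bitwarden:/data bitwardenrs/server:latest
Restart=always

[Install]
WantedBy=multi-user.target
```

The `-v /srv/bitwarden:/data` mount is where SQLite keeps the vault, so that directory is the only thing you need to back up.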

To migrate from pass, I used pass2csv to output all of my passwords to a CSV flatfile. I then used this script from github to convert the file to be bitwarden-compatible, and imported it through the web interface. The script is written in JavaScript, so it runs with nodejs and requires the fast-csv package to be installed through npm.
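The gist of the conversion is reordering columns under Bitwarden’s generic CSV import header. As a rough sketch — assuming an intermediate file with columns name,username,password,url (check this against pass2csv’s actual output) and no embedded commas — an awk one-liner does something similar:

```shell
# Hypothetical intermediate file standing in for pass2csv output
# (the column order here is an assumption)
printf 'name,username,password,url\nexample,alice,s3cret,https://example.com\n' > passwords.csv

# Remap the columns under Bitwarden's generic CSV import header.
# Naive: assumes no embedded commas; the real script handles quoting via fast-csv.
awk -F, 'NR==1 {
    print "folder,favorite,type,name,notes,fields,login_uri,login_username,login_password,login_totp"
    next
}
{ printf ",,login,%s,,,%s,%s,%s,\n", $1, $4, $2, $3 }' passwords.csv > bitwarden.csv
```

The resulting bitwarden.csv imports through Tools → Import Data in the web vault.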

When switching from Bitwarden to bitwarden_rs, the new service doesn’t know about users and password stores imported to the original service. To handle this migration, I used the web interface to export and re-import in Bitwarden’s own JSON format.

Image credit: Safe by Rob Pongsajapan

Syncing Rocketbook to Nextcloud notes

In an effort to write more often and more legibly, I recently bought myself a Rocketbook Core, an erasable pen-and-not-quite-paper notebook with a companion app. The app will scan your notes and send them to a cloud service of your choice. For work, OneNote is supported, which suits me fine. However, at home I use the Nextcloud Notes app to keep my notes on my own server; it stores notes in .txt files and gives you a markdown-aware editor on web and mobile.

Rocketbook Core notepad and pen

There is a beta Rocketbook app with Nextcloud support, but it relies on WebDAV and I couldn’t get it working, so I decided to roll my own. The first step is to share my notes via email: I set the app to send to a dedicated address, with OCR transcription enabled.

Rocketbook settings screen

On my Nextcloud server, I run the following script:

#!/bin/bash
MAILDIR=$1
DESTINATION=$2

# Pull down new mail, then unpack each message in a scratch directory
offlineimap
mkdir -p /tmp/rocketbook
cd /tmp/rocketbook || exit 1
for i in "$MAILDIR"/*
do
    munpack "$i"
    rm "$i"
done

# Discard the .desc files (email bodies), keep the .pdf and .txt
rm -f ./*.desc
mv ./* "$DESTINATION"
rm -rf /tmp/rocketbook

# Sync the deletions back to the server
offlineimap

The heavy lifting is done by offlineimap and munpack.

offlineimap will synchronise your email account to a local directory; in my case, I configure it to sync a single folder. My offlineimap config looks like this:

[general]
accounts = Rocketbook

[Account Rocketbook]
localrepository = Local
remoterepository = Remote

[Repository Local]
type = Maildir
localfolders = ~/Rocketbook

[Repository Remote]
type = IMAP
remotehost = mail.xxx.net
remoteuser = xxx@barrenfrozenwasteland.com
remotepassfile = ~/.imappassword
sslcacertfile = /etc/ssl/certs/ca-certificates.crt

munpack will extract the parts of an email (such as attachments) into separate files. This gives me a .pdf containing the scanned image, a .txt file containing the transcribed text, and the email body in a .desc file, which I discard. At this point, you could use the .txt and .pdf files however you like. I move these files to Nextcloud’s Notes folder, delete the email, and re-run offlineimap to sync the deletion. The script runs on a cron job to check for new notes every 5 minutes.
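The crontab entry is just a single line. The script name and paths below are placeholders, and the exact Maildir subfolder (new, cur) depends on how offlineimap lays out your mail, so adjust to taste:

```
*/5 * * * * /home/me/bin/rocketbook-sync.sh /home/me/Rocketbook/INBOX/new /home/me/Nextcloud/Notes
```

The two arguments map to the script’s $MAILDIR and $DESTINATION respectively.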

How well does it work? I hand-wrote this blog post, so if you’re reading it, it worked pretty well!
