Dell Precision 5520 Ubuntu Review

Dell kindly sent me a Precision 5520 to review for the Ubuntu Podcast. This post goes into detail about some things we couldn’t fit in the show.

In the box

The box the laptop shipped in contained the laptop, the power supply (power brick and UK plug), and a Thunderbolt Ethernet dongle.  The laptop itself was smartly presented in its own smaller padded box.

Hardware

The laptop itself has an attractive metal body with carbon fiber or something similar on the surface around the keyboard.  A nice detail is that the underside of the body features a plate etched with the model number, which flips over to show the normal regulation compliance emblems hidden underneath.

Ports

On the left, the device has the charging port (straight barrel connector), HDMI, 1 USB-A 3.0 port, a 3.5mm combined headphone/microphone jack, and 1 USB-C/Thunderbolt port.  On the right, there is another USB-A 3.0 port and an SD card slot.  This side also features a Kensington lock slot and a battery indicator.  Both of the USB-A ports support power sharing.

I tried plugging a pair of stereo headphones into the 3.5mm jack.  While audio played, the volume was noticeably lower in the left ear.  As there’s only 2 USB-A ports, I plugged in a mouse, keyboard and Ethernet dongle using a bus-powered USB hub.  Unfortunately, devices kept dropping out on the hub.  I switched to using the supplied Ethernet dongle and plugging in the mouse and keyboard directly, which worked without an issue.

While there aren’t a huge number of ports available on the body of the device, there are docking options available.  A USB-C dock is available with extra USB ports, ethernet, audio output, and ports to drive 2 monitors.  There’s also a Thunderbolt dock available with similar ports, plus the ability to drive 3 monitors.  I didn’t receive a dock with my review unit.

Specification

My review unit featured the top options for most of the hardware.  This specification retails for around £2,618 on the Dell website, while the base specification retails for around £1,649.

  • Intel Xeon Quad-core 3GHz CPU (base Core i5, 2.5GHz)
  • 32GB DDR4 RAM (base 8GB)
  • UHD (3840×2160) 15.6″ touch screen (base FHD (1920×1080), non-touch)
  • Intel WiFi 802.11ac/Bluetooth (standard)
  • 512GB SSD (base 500GB HDD)
  • NVIDIA Quadro M1200 w/4GB GDDR5 GPU (standard)
  • 6-cell 97Wh battery (base 3-cell 56Wh)

Input/Output

I found the keyboard satisfying to type on.  The keys take a decent amount of force to get moving, and go down smoothly with no sponginess.  The Super key is printed with a Windows logo, rather than covered with a vinyl sticker like Entroware’s or replaced with a custom-printed keycap like Tuxedo’s.  The model I was sent had a US-layout keyboard, although the Dell UK site offers a UK keyboard as standard.  The keyboard lacks some keys such as Scroll Lock and Pause, which is an understandable omission but is slightly irksome for me as I use Pause as the summon key for Tilda.

The touchpad is big and soft to the touch, with just the right amount of friction.  It supports multi-touch, supports clicking down anywhere for the primary button, and clicking the bottom-right area for secondary click.  By default, it’s configured to use “natural” scrolling, which matches the scroll behaviour of the touchscreen.

The screen fills the space inside the lid with only about a 2cm bezel all the way around.  The light from the screen is incredibly bright, so much so that I found it uncomfortable to look at a white page on full brightness.  It can be adjusted to a range of levels, so it’s not hard to find a comfortable setting.  The screen is readable in direct sunlight, although it has a glossy finish, so it is quite reflective when there is a dark background.  The touch screen works well, although it only supports single touch, and it picks up fingerprints easily.

This was my first experience of a HiDPI screen.  The picture is very sharp and good quality, but doesn’t offer a particular advantage over an FHD screen for general use.  The place where it really makes a difference is with high-resolution photos and videos, which reveal a huge amount of detail not visible on lower DPI screens.

Software

First boot

The laptop came loaded with Ubuntu 16.04 LTS.  On first boot it launched into the OEM setup, which allows you to connect to WiFi, create a user, configure the locale, and create a recovery USB stick.  It finishes by installing updates, then reboots for you to log in to your new user.  This all went as expected, and the WiFi connected without a problem after reboot, something I’ve had issues with before.

In the interests of full disclosure, at this point I should mention that this was the second unit I had to review.  The first unit I was sent had a bug in the OEM setup which caused it to crash.  The error messages weren’t very friendly, and it left me unable to set up the machine.  Dell sent me a replacement unit, which worked without a problem.

Upon reboot, the LightDM login screen automatically scales to take the HiDPI screen into account.  However, after login, the Unity desktop wasn’t automatically scaled.  For usability, I adjusted the UI scale to 2x in the Screen Display settings, which gave me the equivalent screen real estate of an FHD screen.
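For anyone who prefers the terminal, the same setting lives (as far as I can tell) in the com.ubuntu.user-interface GSettings schema, which stores per-monitor scale in eighths, so a value of 16 means 2x.  The monitor name below is just an example, check xrandr for yours:

```shell
# Set Unity's UI scale to 2x (16/8) for the built-in panel.
# "eDP-1" is illustrative -- use the output name xrandr reports.
gsettings set com.ubuntu.user-interface scale-factor "{'eDP-1': 16}"
```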

One serious letdown about the initial setup is that the Super key is disabled by default, and no key has been configured to replace it for use with Unity.  This means that upon first boot, the list of Unity keyboard shortcuts is missing a lot of key features around the Dash, Launcher and window management.  There’s an article on Dell’s help website which explains how to re-enable the key, but as someone who makes heavy use of Unity’s keyboard shortcuts, I don’t feel this represents a good default Ubuntu experience.

Default software

The initial package selection differs from vanilla Ubuntu 16.04.  There’s a Dell-specific APT repository enabled that’s hosted by Canonical, and several hardware drivers installed, ensuring that all the hardware works as intended.

One thing about the default package selection that confuses me somewhat is the choice of web browsers.  Instead of Ubuntu’s default Firefox, you get Chrome and Chromium, which both have icons at the top of the Launcher.  As standard with Ubuntu 16.04, the QML Browser app is also installed and available from the Dash, and the Amazon icon in the Launcher uses this, which adds an additional inconsistency since this browser doesn’t pick up the UI scaling.

The rest of the installed apps are pretty much standard, with the addition of Dell utilities for installing drivers and creating recovery media.

Experience

Graphics

As mentioned above, the laptop features a dedicated Nvidia graphics card.  I ran the Superposition benchmark at various settings, although the only preset that provided a reasonably smooth experience was 720p Low, scoring 1930.  I also installed Tomb Raider through Steam (only usable in Big Picture mode due to the desktop UI not scaling) and ran the benchmark there. On 720p/Ultimate settings, it managed an average of 51 FPS.  On 1080p/Ultimate settings, it still managed a respectable 33 FPS.  Unfortunately, at native resolution, it only managed 9 FPS on average.

When the graphics are being exercised, the fan kicks into gear to an extent that is hard to ignore.  The noise isn’t at an irritating pitch, but it’s clearly audible over the sounds of the game.

Battery endurance

I tested the battery endurance with each GPU by setting the screen brightness to minimum (still perfectly usable), turning the keyboard light off, leaving the WiFi connected, and letting the laptop discharge untouched.

With the Nvidia GPU, the 97Wh battery lasted 6:30. With the Intel GPU, it lasted about 11 hours.
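As a back-of-the-envelope check, those runtimes imply the following average power draw (assuming the full 97Wh capacity was used in each run):

```python
# Rough average power draw implied by the discharge times above.
battery_wh = 97

nvidia_hours = 6.5   # 6:30 with the Nvidia GPU active
intel_hours = 11     # about 11 hours on the Intel GPU

print(f"Nvidia: ~{battery_wh / nvidia_hours:.1f}W")  # ~14.9W
print(f"Intel: ~{battery_wh / intel_hours:.1f}W")    # ~8.8W
```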

I also tested the standby, which went for 18 hours with 8% battery drain.  Notably, when recharging, the top-left quadrant of the laptop’s body gets very hot.

Verdict

The Precision 5520 is a great looking laptop on the outside, with a fantastic screen and innards to match.  It’s marketed as a “mobile workstation” and definitely fits into that role, offering an excellent upgrade option from a class of machine such as the MacBook Pro.  The price point is above what most would consider for a personal device, but offers a good range of options for a professional purchase.  Personally, I would stick with the FHD non-touch screen, as the UI scaling still has some gaps, and the single-point touch screen doesn’t offer much attraction given the very nice touchpad.
As an experienced Ubuntu user, I’d be very happy to have this device as my daily driver, as I’m confident that I could customise it to my preferences.  I would however have some reservations about recommending it to a first-time Ubuntu user, as I’d fear some of the changes from vanilla Ubuntu would create a confusing experience.

Marching with Unite For Europe

If a democracy cannot change its mind, it ceases to be a democracy. – David Davis

The flag of the European Union, flying in front of Westminster Palace.

Unite for Europe, Parliament Square, 25/03/2017

On Saturday I marched to Parliament Square, London with Unite for Europe, in protest against Brexit. If you asked me beforehand why I was marching, I’d have found it hard to articulate an answer, beyond “because I have to do something”. But now I’ve had time to digest the views and stories expressed on the day, I think I can explain it a bit better.

I was marching for the children, holding signs with slogans like “I want my future back”. They will have to live with the impact of Brexit for the longest, yet they had no say in the referendum. They’ve been brought up with a promise of the chance to study, to travel and to live anywhere in the EU. They probably go to school with friends whose parents are EU nationals. If the Government continues on its current course, by the time we leave the EU a lot more of them will be of voting age, but will not have had a say on the issue. Their future is being taken away, without their consent.

I was marching for the EU nationals, who have made their lives, their careers, and their families here. Many of them have fallen in love with UK nationals. Many have fallen in love with the UK itself. They work here, pay their taxes here, and are net contributors to our public services. They had no vote in the referendum, and now they’re being told they might not be welcome any more, treated like bargaining chips by the Government.

I was marching for the Remain voters. 16 million people who cast their vote against Brexit, who are now being ignored and told that “The Will Of The People” doesn’t include them. Who speak out against their country heading down the wrong track, and are called names by politicians and journalists alike, who wish to see them cowed and silenced. They are still here, and they aren’t going away.

I was marching for the Leave voters. People who cast their votes in good faith, on the promise that the UK could leave the EU but remain in the single market. On the promise of £350 million per week for the NHS. Who are now being used as pawns by a government desperately seeking to justify a wholesale withdrawal from the EU, EEA, Customs Union, Euratom and anything else they can while they’re at it. They didn’t vote for this, and they deserve better.

Democracy is not one question, asked once. 52% vs 48% is not a mandate for sweeping constitutional change. The Government and Parliament cannot outsource their accountability to a referendum. We must stand up, all of us, and hold the Government to account. If we voted Leave, we must make sure we get the Brexit we voted for, or reject it. If we voted Remain, we must ensure the Government and our parliamentary representatives cannot continue to ignore us. We must speak for those who stand with us, but are denied a voice.

The Government will make its Article 50 notification this week. This is the beginning of a 2-year process. Once we know the result of the negotiations, we must have a say in whether we accept it or not.

Blu-Ray playback with Kodi

I recently upgraded my HTPC’s optical drive to Blu-Ray (primarily for The Force Awakens). The DRM on Blu-Rays is problematic when you’ve built your own player – you can’t just stick the disc in and hit the play button like you can with DVDs. I’m using MakeMKV[1], which lets you rip Blu-Rays for encoding with Handbrake, but I don’t really have the storage to be ripping Blu-Rays on an ongoing basis. Fortunately, MakeMKV also allows a disc to be streamed over UPnP, which Kodi supports natively.

To make this a bit more usable, I’m using a script to launch the stream with MakeMKV’s CLI interface, wait until the UPnP share is ready, then switch Kodi to the share ready to select the title from the disc.
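The “wait until the UPnP share is ready” step in such a script can be a simple poll loop.  This is only a sketch: the actual readiness check against MakeMKV’s streaming port is an assumption, stood in for here by `true`:

```shell
# Poll a readiness check until it succeeds, or give up after 30 tries.
wait_for() {
  tries=0
  until "$@"; do
    tries=$((tries + 1))
    [ "$tries" -ge 30 ] && return 1
    sleep 1
  done
}

# In disc_trigger.sh the check might be something like
#   wait_for curl -sf http://localhost:51000/
# (port number is a hypothetical example). Demonstrated here
# with a check that always passes:
wait_for true && echo "share ready"
```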

I’m using the Advanced Launcher addon for Kodi to create a launcher for disc_trigger.sh. I’ve created a Favourite for the launcher and added it to the main menu. Now the process is:

  • Insert disc
  • Activate launcher
  • Wait until it’s ready
  • Select title and watch
  1. MakeMKV is proprietary, but free-as-in-beer as it’s in “perpetual beta”.  You do, however, need to keep updating the beta registration code, so I paid for a license as I want this to be “set up and forget”.

Why I’m voting “remain”

First up, I’m not an expert on the UK’s relationship with the EU. Let’s face it, who is? I got a B in A-level politics about 10 years ago, so I have a general understanding of the mechanism by which EU law affects us, but that’s about as far as it goes.

The question posed by the referendum isn’t one of whether we change in one direction or another. It’s whether we keep the status quo, or change from it. However, the situation we’d change to is entirely unclear. There may or may not be benefits to changing. In the short term things might be chaotic; in the long run they might be better, for some people, or they might not. The post-leave plan is so vague (we don’t even know who’ll be deciding or implementing it) that there’s no way to tell.

The uncertainty is the issue I have here. If we had some clear idea about what leaving the EU meant, it might be worth consideration. But since it’s so unclear, a vote to leave is essentially saying that the current situation is so terrible that literally anything would be better than what we’ve got. I don’t believe that’s the case, so that’s why I’m voting remain.

Playing WINE games with the Steam Controller

While I do the majority of my gaming through Steam on Ubuntu, there are always a few games around that I’d like to play that aren’t out for Linux, so I play them through WINE.  This morning I discovered that it’s possible to play WINE games with your Steam Controller.  As usual with WINE, your mileage may vary, but here’s what I did.

Firstly, you can’t just install the Windows version of Steam.  I don’t know if it would recognise the controller, but even if it would, Steam won’t run Big Picture Mode if it’s running in WINE, so you’re out of luck there.

The trick is that you can add launchers for non-Steam games (or indeed any application) to your Steam library.  These will run with the Steam Overlay, and therefore will support Steam Controller configurations.  To do this, you’ll need to create a .desktop launcher file for your game.  You can do this with a text editor, but I used Arronax.

Set a title and an icon if you like, then for the command, enter the command to launch your game using WINE.  If you’re using the default WINE prefix, then this can be as simple as:

wine ~/.wine/path/to/your/game.exe

However, I tend to install WINE games using PlayOnLinux, which installs games into separate prefixes, with appropriate versions of WINE.  This means the default command doesn’t cut it.  Instead, I created the following shell script, and set the .desktop file’s command to execute it.  Here’s an example that I used:

#!/bin/bash
export WINEPREFIX=/home/steam/.PlayOnLinux/wineprefix/DiabloIII
export WINELOADER=/home/steam/.PlayOnLinux/wine/linux-x86/1.7.15/bin/wine
export WINESERVER=/home/steam/.PlayOnLinux/wine/linux-x86/1.7.15/bin/wineserver
"$WINELOADER" "$WINEPREFIX/drive_c/Program Files/Diablo III/Diablo III Launcher.exe"

You can find the appropriate WINE version by looking at the configuration dialogue for the game in PlayOnLinux.  Note that PlayOnLinux can generate a .desktop file for a game, but these launch PlayOnLinux itself, and don’t appear to work when launched through Steam.
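For reference, a hand-written .desktop file wrapping such a script might look like this (all paths and names here are illustrative):

```ini
[Desktop Entry]
Type=Application
Name=Diablo III (WINE)
Comment=Launch Diablo III via its PlayOnLinux prefix
Exec=/home/steam/bin/diablo3.sh
Icon=/home/steam/icons/diablo3.png
Categories=Game;
```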

Once you’ve created your shell script and .desktop file, launch Steam in desktop mode, access your library and select “Add a game… > Add a Non-Steam Game…”.  This will show a list of applications in /usr/share/applications.  Click “Browse” and find the .desktop file you have created, then press “Add Selected Programs”.  The game will now appear in your library.

Launch Big Picture Mode, and select the game from your library.  You can now configure the controller as normal, and the Steam Overlay will launch with the game, giving you full support for the controller!

Steam Controller Review

It’s been about a year since I built my Steam Machine, and since then I’ve tried various solutions for playing PC games on my TV while sitting on my sofa.  These have included:

  • A Gamecube controller (my all-time favourite gamepad), which is great but doesn’t have as many buttons as games expect,
  • An XBox 360 controller which is the de facto standard controller for games with gamepad support, but doesn’t work with all games.
  • A Wii Remote, which works for games that need a mouse pointer, but is tiring to use for long periods and tricky to configure well.
  • When I’ve been really desperate, my Rii wireless mouse/keyboard, which is OK in a pinch but far from ideal.

The promise of the Steam Controller is that it “lets you play your entire collection of Steam games on your TV—even the ones designed without controller support in mind.”  When this was announced it seemed like the Holy Grail of input devices, the solution to all of my problems, so of course when I heard I could pre-order one this year for delivery a month before they went on general sale, I snapped one up.  I’ve now racked up a good few hours of gaming on games with various types of control systems, so does it fulfil its promise?

The hardware

A steam controller in its box.

The Steam Controller
The first analogue stick appeared on the N64 controller, then the PlayStation Dual Shock controller was released with 2 analogue sticks.  Since then, controllers have pretty much featured 2 analogue sticks, 4 shoulder buttons/triggers, a D-pad, 4 general purpose buttons, and a couple of “system” buttons.  Each has had its own USPs, and Valve have clearly taken inspiration from a lot of places.  Some similarities I’ve noticed are:

  • The Gamecube controller’s “cradling” grip.
  • The Gamecube’s 2-stage analogue+digital triggers.
  • Touch pads, seen on the PS4 and PS Vita.
  • A gyroscope (seen in some Gameboy games, the PS3 sixaxis and the Wii Remote Plus).

However, unlike any of these gamepads, the Steam Controller does away with the dual analogue stick/D-pad set up in favour of a single analogue stick and 2 clickable touch pads, 1 under each thumb.  This allows for similar accuracy to using a touchpad or trackball, and an immense amount of versatility.  The right pad is flat, while the left pad features a cross-shaped indentation, allowing it to easily emulate a D-pad.

As well as the standard shoulder buttons, triggers and general purpose A/B/X/Y buttons, the Steam Controller sports left and right “grip” buttons, which sit under your two smallest fingers on each hand.  Squeezing the controller activates these buttons, which can be used in a number of ways, as we’ll see below.

The N64 controller featured an optional “Rumble Pak”, which introduced what we might now call haptic feedback to console controllers.  All controllers since have featured something similar, merrily vibrating to let us know when we take damage or something near us explodes.  The Steam Controller doesn’t have a “rumble” feature in the same way, but it features a remarkable haptic feedback system for its analogue controls.  The analogue stick and triggers can be configured with an actuation threshold, which will provide you with a “clunk” when you cross it.  Similarly, the touch pads can vibrate slightly as your thumb moves over them, giving you a sensation similar to a mouse’s scroll wheel, or a trackball.

None of these features is in itself revolutionary, but the way they’re used together, and crucially the way the software allows you to configure them, creates a completely game-changing experience.

The software

The Steam Controller is not a gamepad.  It’s an input device that can be used to emulate a gamepad, a mouse, a keyboard, or bits of each at the same time.

A screenshot of the controller configuration interface from Steam Big Picture Mode.

Steam Controller Configuration interface.

This is all handled by an addition to Steam’s user interface.  When you select a game in Steam Big Picture Mode, you have the option to configure your controller for that game.  This provides you with an interface that lets you map each pad, stick, button and click to a control on one of the aforementioned devices.  I won’t go into detail, but spending half an hour exploring the configuration options available is mindblowing.

A screenshot showing the options available for configuring the controller's touch pads.

Touch pad configuration options

Configuring a gamepad to emulate a mouse or keyboard is certainly nothing new (I’ve used qjoypad and WiiCan to do similar things in the past), but nothing before has had the flexibility and ease of use that Steam has achieved.  I’ve certainly never seen a controller configuration tool that you can use with the controller you’re configuring.

Once you’ve configured your controller (either manually, using one of Steam’s templates, or using a configuration shared by the Steam community), you launch the game and Steam activates your configuration.  You can even tweak the configuration in-game or call up the innovative dual-pointer on-screen keyboard by pressing the Steam button on the controller and bringing up the Steam Overlay.

The games

I’ve played a number of games with different control systems to really test out the Steam Contoller’s ability to play any game.  Here’s a bit about how the controller can be set up to play each type of game:

First-Person Shooter (Borderlands 2).

Borderlands 2 has gamepad support, so the simplest way to play with the Steam controller is to use the “Gamepad” preset.  This uses the left touchpad as a D-pad, and the right touchpad as an analogue stick. However, to really get the best experience, you can use the “Gamepad with precision aim” preset, which uses the right pad as a trackball, giving you a mouse-look style experience.

The grip buttons are mapped as extra A and X buttons, which gives you the nice option of keeping your thumbs on the movement controls while using those functions.

Real-Time Strategy (Xenonauts)

RTS games rely on you having a mouse pointer to select units and issue commands.  For games like this, the right pad can be used to control the mouse pointer, while the triggers provide the mouse buttons.  Other buttons can be bound to keyboard keys used for selecting pre-defined groups of units, or switching to different views.

Third-Person Action/Adventure (Don’t Starve)

As with Borderlands 2, Don’t Starve has controller support, so using the Gamepad preset is all you need.  In a non-FPS game, the precision aim option isn’t so useful.

First-Person Adventure, not controller aware (Minecraft)

Many first-person PC games are controlled with the WASD keys for movement, and the mouse to control the camera.  The keys within easy reach of WASD are used for other functions like jumping and running.  Steam provides a “Keyboard (WASD) and mouse” preset with this configuration in mind – the rest of the buttons can be tweaked as required for the particular game.

Point-and-click adventure (Deponia)

As the name suggests, these games rely almost entirely on the mouse, with a few keys used for opening inventory and menus.  I found a community configuration for Deponia that had the mouse and just the few other functions required mapped to the buttons.

Isometric RPG (Pillars of Eternity)

Once again, these games tend to be mouse-driven, to tell your characters where to go and what attacks to use, as well as to manage inventory and equipment, and select dialogue options.  A community configuration provided a nice mouse set up, with the added bonus of the right touch pad being used to move the view around (usually this might be controlled by right-clicking the mouse or using arrow keys).  One slight drawback with this sort of game is that they often make use of a large number of user-defined hotkeys, for which the controller has a limited number of buttons.  However, it is possible to use the grip buttons as modifiers, so “A” could be bound to the “1” key, “Left Grip+A” to the “5” key, and “Right Grip+A” to the “9” key, allowing for more hotkeys to be used.

First-person shooter, Steam Controller Support (Portal 2)

A final bonus of the Steam Controller is that Valve are making an API available for developers to bake support right into the game.  In this case, rather than mapping to gamepad/mouse buttons or keyboard keys, the controller can be mapped straight to available functions in the game (e.g. “Orange Portal” instead of “Mouse 1”).  Also, the game can display appropriate hints like “Left Grip to duck” rather than “Button 15 to duck”.  I expect we’ll be seeing more of this.

The verdict

I’m yet to find a game that I’m not comfortable playing with the Steam Controller and the right configuration.  While certain aspects of the controller take some getting used to, Valve have excelled themselves, to the point where I wouldn’t even consider using another controller for my PC gaming. 10/10.

Setting up a Steam Controller on Ubuntu

I recently received my pre-ordered Steam Controller, which I’ve been itching to use with my home-built HTPC/Steam Machine.  I do all my gaming (and everything else) on Ubuntu, and discovered that, at the time the pre-release units shipped, there was a bit of tweaking to do to get the controller working.  Thanks go to this Ask Ubuntu question and this Steam Community thread.  If you are using Ubuntu 15.10 or later, steps 1-3 shouldn’t be necessary, and hopefully a fix will be in place by the time the controllers are on general sale.

  1. Before you plug in your controller, edit (or create) the file /lib/udev/rules.d/99-steam-controller-perms.rules.  This already existed on my system.  Edit it to contain at least these lines (it may also contain others):
    SUBSYSTEM=="usb", ATTRS{idVendor}=="28de", MODE="0666"
    KERNEL=="uinput", MODE="0660", GROUP="yourusernamehere", OPTIONS+="static_node=uinput"

    Make sure you change “yourusernamehere” to the user you run steam as, or if there are several, the name of a group containing them all.  This will ensure that the controller is correctly recognised and can emulate a gamepad.

  2. Run sudo apt-get install python3-autopilot.  This will install some packages which will add you to a group with write access to /dev/uinput.  This is necessary for the controller to work properly.
  3. Reboot
  4. Plug in your steam controller dongle.
  5. Press the Steam button. It should beep and light up (if not, check the batteries are installed).
  6. Launch Steam.  You should see a notification that your controller is detected.
  7. Launch Big Picture Mode (if you don’t, Steam will tell you to).
  8. Ensure the Steam overlay is enabled in Settings -> In-game.  If not, your controller configurations will simply not work.
  9. Select a game, pick a controller configuration, and play!

Configuring pass on Windows

In my last post, I concluded that pass wasn’t any good if you use Windows due to the lack of browser extension and flaky apps.  I’ve since discovered how to set up both the command-line pass client and the Firefox extension on Windows, so thought it was worth another post to explain what I did.  Note, this isn’t a straightforward process.  You’re probably only interested in this if you primarily use a Unix-like system, but need Windows support too.  pass isn’t currently an ideal solution if you primarily use Windows.  This also assumes you’re running 64-bit Windows 7, other versions may have slightly different paths in the commands.

Dependencies

To run pass on Windows, you need GPG, Cygwin and some Cygwin packages.

For GPG, download and install GPG4Win.

Next, download Cygwin and run the setup.  Install the following packages: git, make, automake, tree.  If you are using git to sync your password store, you may also want to install ssh.

Run the Cygwin shell and create an alias for gpg:

alias gpg='/cygdrive/c/Program\ Files\ \(x86\)/GNU/GnuPG/gpg2.exe'
alias gpg >> .bash_profile

Optional: You might find it easier to set your cygwin home directory to your Windows home directory with this command:

mkpasswd -l -p "$(cygpath -H)" > /etc/passwd

Read this page for more information about that command.

pass

To install pass in cygwin, download the zip and extract it.  In the cygwin shell, cd to the directory where you’ve extracted it, and run

make install

This will install pass to /usr/bin/pass within the cygwin environment.  You can now follow the regular instructions for setting up your password store. If you are using git you may need to set the PASSWORD_STORE_DIR environment variable:

echo "export PASSWORD_STORE_DIR=/cygdrive/c/Users/username/.password-store" >> ~/.bash_profile
. .bash_profile

Firefox extension

The Firefox extension isn’t available for Windows on addons.mozilla.org (as it is for Linux), but you can download it directly from the Github page.

Once you’ve added it to Firefox, click the P icon on your toolbar and select Preferences.  Set the following values (note the double slashes!):

  • User Home: C:\\Users\\yourusername
  • Pass command: C:\\cygwin64\\bin\\bash.exe
  • Pass command line arguments: --login /usr/bin/pass

Thanks to this discussion for that information.

And that’s it!

Migrating Lastpass to pass password store

I’ve been a Lastpass customer for several years, and it’s been pretty much the only service I’ve used which stores my data on someone else’s servers (albeit encrypted).  I’ve never been particularly happy with this, but haven’t found a solution that allows me to access my passwords easily from multiple devices across multiple platforms, so have stuck with it until now.

My Lastpass subscription is due for renewal this month, and this week Lastpass suffered a security breach.  This coincides with my discovery of pass, a unix password manager that stores your passwords locally in GPG-encrypted text files.  It also integrates with git to allow your password store to be easily shared between devices, and has clients for Android (which I need for my phone) and Windows (which I need for work).  I decided to have a go at migrating to see how I got on.

Linux

Setting up on Linux was straightforward.  I’m running Ubuntu 14.04, so I installed it with apt-get install pass.  I generated a key with gpg --gen-key and ran pass init to create a password store using the key.  I then ran pass git init to initialise the git repository.  Next, I exported my passwords from LastPass using their CSV export feature, and ran the file through this script to import them into pass.  Similar scripts are available for migration from other password stores.
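To illustrate what such an import script does: each CSV row becomes a multi-line entry fed to pass insert --multiline, with the password on the first line.  A minimal sketch of the transformation, with a fabricated sample row (the exact LastPass column names are from memory, so check your own export):

```python
import csv
import io

# A fabricated example row in LastPass's export format
# (column names are assumptions -- check your own export).
sample = (
    "url,username,password,extra,name,grouping,fav\n"
    "https://example.com,alice,s3cret,,Example,Web,0\n"
)

def to_pass_entries(csv_text):
    """Map each CSV row to an (entry-name, entry-body) pair for `pass insert -m`."""
    entries = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Use the LastPass folder ("grouping") as a subdirectory in the store.
        name = f"{row['grouping']}/{row['name']}" if row["grouping"] else row["name"]
        # pass convention: password on the first line, metadata after.
        body = f"{row['password']}\nusername: {row['username']}\nurl: {row['url']}\n"
        entries.append((name, body))
    return entries

for name, body in to_pass_entries(sample):
    print(name)   # Web/Example
    print(body)
```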

I installed the Firefox extension, and it works like a charm, matching the current site and filling in login forms for me.

Before I could install a client on another device, I needed to push the git password store to a server.  I logged into my server that’s accessible via the Internet, created a folder and ran git init --bare, since I don’t need to have the files checked out on the server.  I then ran pass git remote add to add the server, and pass git push to sync the passwords.

Android

For Android, there is a client called Password Store which can be found in F-Droid or the Play Store.  First, you need to install OpenKeychain (available from the same places), and import your GPG key.  I followed this guide to export my key, copied it to my phone and used the “Import from File” option to add it to OpenKeychain.
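For reference, exporting a GPG secret key for transfer to another device can be done as below.  The key ID is a placeholder, and you should keep the exported file safe and delete it once it’s been imported:

```shell
# Export the secret key in ASCII-armoured form
gpg --export-secret-keys --armor "YOUR-KEY-ID" > secret-key.asc

# Copy secret-key.asc to the phone, then use OpenKeychain's
# "Import from File" option to add it; delete the file afterwards
```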

In Password Store, I set up the Git repository and synced down my passwords.  I then set OpenKeychain as the OpenPGP provider, and I was set.  When unlocking a password, Password Store automatically copies it to the clipboard for a defined number of seconds, then clears it.  OpenKeychain allows you to cache your key’s passphrase for a defined number of minutes, so you don’t have to enter it repeatedly; it then forgets it automatically.

Windows

Update: I’ve since worked out how to set up pass properly on Windows, including the Firefox extension.  See this post for a full guide.

There are several solutions for Windows, but none of them is yet as complete as the Linux equivalents (for example, there is no Firefox plugin).  However, you can get a copy-to-clipboard-then-auto-clear workflow similar to the one on Android.

Firstly, you need to install Git and GPG.  I already had msysgit installed which includes gpg, but it’s an older version so I installed GPG4Win as well.  You then need to import your key into gpg.  I found this was easiest using the gpg CLI in git-bash (see the guide linked above again).

The “Windows Client” listed on the pass website is Pass4Win, but I found this to be buggy.  Instead, I went for the “Cross-platform GUI” listed on the site, QtPass.  This gives you the option to use native pass, or to use GPG and Git directly.  I went for the latter option (be sure to select the gpg2.exe executable installed by GPG4Win, not the older one provided by msysgit).

Running QtPass prompted me to create a password store – I selected the key I’d already added to GPG and it created the empty store.  To configure the git repository, I found it easiest to use the command line (QtPass didn’t prompt me for git details).  I went to the password store directory that had just been created, ran git init, added the remote details to .git/config with git remote add, and ran git pull.  Closing and re-opening QtPass found the git repository and I was good to go.
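Run from git-bash inside the store directory QtPass created, that command-line git setup looked roughly like this (the store path and remote address are placeholders):

```shell
# From inside the password store directory QtPass created
cd ~/.password-store

# Turn it into a git repository, point it at the server, and sync
git init
git remote add origin user@example.com:password-store.git
git pull origin master
```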

Conclusion

Lastpass has invested a lot in the usability of its solution.  The browser plugins and Android apps take care of identifying websites and filling in passwords for you.  pass is part way there, but still has a long way to go.  I’m willing to compromise on usability for the peace of mind of holding all my own data.  However, I wouldn’t recommend it to anyone who primarily uses Windows, and I wouldn’t want anyone who’s not familiar with GPG to try to set it up for themselves.  Once set up with the browser extension, it’s certainly a decent alternative to Lastpass on Linux, and a pretty good one on Android.

Diversity at OggCamp

Important note: This is a personal blog post on my personal blog. While I was largely responsible for the organisation of this year’s OggCamp, there is no formal organisation called “OggCamp”, and this post is intended to communicate my personal thoughts on these issues, not those of anyone else involved in past, present or future OggCamp events. At this point there are no plans regarding an OggCamp in 2015, as to where it will be, who will organise it, when or even whether it will happen.

OggCamp 14 took place this weekend in Oxford.  Shortly before the event, Twitter user @zenaynay mentioned that she would be keeping a tally of how many non-male and non-white attendees were at the event.  I was interested to see what she found, and today looked over her timeline from the weekend to find the comments posted below (with her permission), which I felt warranted a considered response.

Before I continue, I feel I should point out that I’m a middle-class white male living in the UK and working in the IT industry, which means I have no first-hand experience of what it’s like to be part of an under-represented minority in my everyday life. This means that when talking about these issues I fear that I may come across as patronising, insensitive, or otherwise offensive. However, to avoid discussing these issues on that basis would be to say that improving diversity is the sole responsibility of the under-represented, which won’t get us anywhere.

To summarise @zenaynay’s observations, she found that while there were a lot of white women (WW) at the event, there were almost no people of colour (POC) in general or women of colour (WOC) in particular, other than herself. In addition, the vast majority of the speakers at the event were men. As a result, she felt out of place, and as though she wasn’t part of the culture of the event.

This is a problem for me, as I want OggCamp to be an inclusive place for everyone. We have done a better job than other tech and open source-related conferences I’ve been to at attracting women and children, although we have made no specific effort to ensure this. To realise that we’re still excluding a group of potential attendees is disappointing, but I choose to take the criticism as an opportunity to make future events even better rather than a reason that this event was unsuccessful.

Personally, I’m more concerned with the content of the talks being diverse and interesting than with the people who give them, but I also understand that members of a diverse audience may feel out of place watching a homogeneous group of speakers to which they feel they don’t belong, and may therefore be put off attending the event in the first place.  This isn’t a situation I’m happy with.

One point of @zenaynay’s observations that I don’t agree with is the assertion that the organisers use the unconference model of the event to get us off the hook regarding speaker diversity. This isn’t the case. From my point of view, one reason why we use the unconference model is that it gives OggCamp the energy and dynamic atmosphere that makes the event unique. The second (and probably main) reason is that arranging a 3-track 2-day conference schedule is a serious amount of work, and we simply don’t have the resources to do it.

We do have a small number of scheduled speakers each year, which is usually made up of people who I can think to ask. This is, of course, limited by the people that I know about, and then further by those who respond to me.  I don’t think this has ever resulted in us having an all white-male schedule, but they have certainly been in the majority. If we had the capacity to manage the process, an open call for papers may be a useful device for getting a more diverse line-up of speakers.

As for diversity among unconference speakers, I’d like to hear from existing non-white-male attendees as to why they don’t tend to offer talks. It’s not necessary to indicate your gender, race, or age when submitting a talk to be voted on, so I can’t imagine that attendees use those metrics to decide which talks to watch.  However, there’s clearly something we’re missing here that’s putting people off.

Finally, we come to what I see as the most important issue, which underpins all of this: the diversity of attendees. More diverse attendees means a more diverse pool of speakers to draw on for the unconference, and a more diverse and inclusive culture to bring future attendees into, hopefully allowing them to feel more comfortable.
I don’t know for sure how people hear about and decide to come to OggCamp, but I suspect that it was initially members of the LugRadio community, plus listeners to the UUPC and Linux Outlaws podcasts, and then word of mouth spread from there. For whatever reason, this word of mouth didn’t spread to many people of colour.

Perhaps, therefore, what we need for OggCamp is more widespread marketing. The easiest way to market the event (and therefore the one I focused on this year) is to speak to previous attendees on social media, which is obviously never going to increase diversity. Knowing where and how to promote the event to make it visible to attendees who don’t necessarily fit the existing “mould” which we’ve apparently developed could be a big step in the right direction.

Another step in the right direction may be to adopt a formal code of conduct (CoC).  It’s not something we’ve ever felt the need to introduce before, but I was made aware this year of someone who was put off attending by the lack of a CoC.  Codifying and honouring our intention to make the event safe and welcoming for everyone may help encourage those who worry that they might not be welcome, to attend.

I’ve mentioned to several people this year that I’d like to increase the involvement of the community in the organisation of OggCamp by creating a permanent online discussion forum (web forum, mailing list or whatever). If we go ahead with this and you’re interested in helping OggCamp become more diverse, I’d encourage you to get involved in the discussion. Follow @OggCamp on Twitter and we’ll keep you posted as plans are developed.