The challenge of ‘user driven development’

 

It has been 18 months now since we officially started the project to build a new ONS website – the most consistent thing throughout our initial discovery, the alpha and now the latter days of the beta has been a commitment to the principle of being ‘user focused’.

Over the life of the project I have taken to [mis]using the term ‘user driven development’ for the approach we have endeavoured to take. At every stage of the project we have had at least one user researcher dedicated to our work, and we have regularly undertaken the full spectrum of research activities, from lab-based usability testing through to unmoderated online card sorts, via critical friend interviews and guerrilla testing.

From the start we followed the advice from GDS (nicely summed up in this blogpost from Leisa) about how to integrate user researchers, and their findings and priorities, with an agile development team and, despite a few hiccups, it worked well for us throughout the alpha. The whole concept of ‘user research is a team sport’ was something we really embraced, and I think it had a lot to do with the success of that phase.

As the project became considerably more complex during the beta – as we moved from producing a prototype to something production-ready, with all the additional requirements that brings – it became less straightforward to integrate the insights from the user research into the development cycle as quickly as we’d like.

Now we have never let these difficulties stop us doing what we believe is the right thing – and being user focused really is how we define ourselves as a team – but we are a small team with finite capacity. The juggling act between different priorities and compromises to ensure a constantly improving product (including the back-end) is getting to a ‘Britain’s Got Talent’ level. Pressing issues identified via user research still always get prioritised, but the backlog of nice-to-haves/fixes has grown (though if issues recur then of course they get re-evaluated).

I recognised a lot of our challenges in this article by Chris Gray, particularly the challenge of participant recruitment and getting support systems in place. We are lucky to have a really engaged audience – our online research activities in particular are extremely well supported – but participant recruitment for particular slots on particular days is often hard work and time-consuming (Alison in the team has been a trooper with this recently!), especially as the personas we need to recruit for are perhaps slightly more specialist than on some other projects.

After a year of managing our participant volunteer list via an increasingly complex spreadsheet (we are ONS – we do things in Excel!) we have just started using a proper contact management / CRM tool, which looks like it will make a real difference. The ability to add custom fields and tags and to segment entries is going to be a godsend in the long term.

The ‘Fieldwork Fridays’ approach mentioned in the previous article as being used at Google is probably closest to our model – we schedule research opportunities and then poor Jonathan (our user research lead) works out what the most pressing research need is at the time. We have started to supplement this with more online testing – including some Optimal Workshop products but also Usability Hub and Loop11 – as we try to broaden our reach in the run-up to the end of the project.

I also recently found this post about ‘Incremental UX’ interesting – again, I recognise a lot of the problems it is trying to address, and I think, without formally realising it, we have been working in quite a similar way for a while now. We start by providing the ‘minimal positive experience’ to our users as and when we change the UI or add features, but always stay aware of the longer-term aims and use the research to keep checking that those aims remain valid.

Like I said about working agile in a previous blogpost, ‘user driven development’ might be the right thing but it definitely isn’t the easy thing.

Beta breather..

Earlier this week we (very) quietly made an early version of the ONS Beta site public. It is a true Beta – in the sense that it remains experimental and subject to change without notice. To be honest it is still a little ‘under-proofed’, even for a Beta, but I was determined to get something out and work in the open again. I’ve mentioned Linus’s Law over on the work blog, but more than that I just believe our users deserve to see what we are up to – whether that is to validate things or reject them. We aren’t building this thing for ourselves after all.

I’m proud of what has been done so far while acknowledging there is much, much more to do. The team has been amazing – really going above and beyond the call of duty – and it has been hard at times.

I’m not going to lie, the whole thing to date has left me… and this is a technical term… knackered.

While the Alpha was high pressure but actually enormous fun, thanks to the experimental, prototype nature of the whole thing, the Beta has at times been a hard-fought slog with more pressure, expectations and onlookers than ever before. The fact our project/team has retroactively become some kind of internal exemplar for a more agile/DevOps/user-centric approach has also ramped things up – it feels like internal systems and processes are being rewritten around us.

I am mentally exhausted. I do wonder whether the importance the GDS approach puts on the Service/Product Manager role (they tend to talk about Service Managers now – I still see that role as Product, at least in the dev stage) creates a bit of a ‘fault line’ – where, for instance, after a four-and-a-half-hour Service Assessment (with follow-up questions) said Service Manager might want to curl up in a ball and not think about the web for a decade or so. I don’t know. Some of it (most of it?) is down to me and my all-or-nothing approach, I guess. When it works it works well, but every now and again it shatters me.

Anyway, I have not been at my best the last couple of weeks – the pressure got the better of me and I was snappy in meetings, or disengaged, or overly demanding. Basically I’ve been a bit of a dick.

So I’m off work for a couple of weeks – no (work) email, no Slack, no sneaking a look at Bugherd, no blogging on the work blog – but I do have some stuff for here I’ve just not had the head space to write.

So please have a look at the Beta and let us know what you think – I’ll get back to you in two weeks 🙂

My Ambitions for the Beta

The development team is in the building and the tip-tapping of code on custom keyboards fills the air, so it is about time I tried to articulate some personal ambitions for this project.

[These aren’t necessarily the corporate objectives – though there is some crossover, and clearly our commitment to ‘user driven development’ means they are subject to change based on research with real users.]

At the simplest level we are doing two things:

1) building a website
2) building a publishing platform to maintain that website

(I’m avoiding the entire content design, migration, archiving aspect on this occasion…)

I actually think my main ambition for the site is pretty much summed up by the ONS ‘vision’ from the strategy:

“To be widely respected for informing debate and improving decision making through high quality, easy to use statistics and analyses on the UK’s economy and society.”

I am particularly driven by the idea of:

“…informing debate and improving decision making”.

While we are not responsible for the ‘high quality, easy to use statistics and analyses’, we can ensure that statistics are easy to find and share, and we can provide visualisations and tools that make them easier to understand. In fact, we can provide a website that informs debate and improves decision making.

In order to make this happen we will continue to follow our ‘rules of engagement’ and the GDS Service Standard.

Beyond this, though, while GDS talk about ‘Simpler, clearer, faster’, we have our own focusing statement:

“Data intense. Design simple.”

[with thanks to Edward Tufte]

Following a long tradition that traces back to Florence Nightingale (more on her later), the site seeks to present statistics alongside clear commentary, charts and visualisations that add context and clarity to help convince. On this project it is ‘context that is king’.

The statistics are the focus – we surface the numbers (and the context that makes sense of them) on the web, rather than hiding them away in spreadsheets and documents.

Any website, and especially a Government website, has a certain amount of content that it must publish for legal reasons, some for sensible legacy reasons and some that meets user needs beyond the core focus of the site. The balancing act will be fulfilling this without distracting from those core user needs.

The publishing platform needs to be able to support these lofty goals, be flexible enough to move with the times and changing user requirements, work within the security model and, importantly, cope with 09.30 publishing. No pressure then.

The platform – which I have code-named Florence because, whether it is true or not, I like the tale of Florence Nightingale using statistics and charts to make her points to Parliament back in the Crimean War – has some unusual requirements and constraints, but it benefits from so much work done in the past. Rather than it being an off-the-shelf CMS, I talk about it being a:

“Bespoke configuration of established open source components.”

This gives us the flexibility to build something as lightweight as possible, with full test coverage and plenty of scope for incremental improvements. Not to mention it allows us to build on the lessons learned by the likes of GDS and the Guardian.

We are following the ‘minimum viable product’ approach for this aspect and will build just enough to support our initial needs. I’ve been known to refer to the likely outcome of this approach as a tool ‘so simple it is dangerous’, and I stand by that, but I think it is a necessary step to ensure we don’t over-engineer things (an issue most CMS projects face, I think).

There are things I am looking to add that are not unique but go beyond the standard web publishing tool. I’m still aiming to minimise (read: stop) the use of DIY charts and stock images on the site, so I will need a tool that lets publishers upload data and embed a chart (not dissimilar to Datawrapper, Charted or Chartbuilder) that meets ONS standards, is responsive and shareable, and allows the underlying data to be downloaded.
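To make that concrete, here is a minimal, purely illustrative sketch of the kind of embed I mean – it fetches a published CSV, draws a responsive SVG line chart and offers the data for download. The element ID, CSV URL, colours and rendering approach are all assumptions for the sake of illustration, not decisions about the actual tool:

```typescript
// Illustrative sketch only – not the actual ONS tool. A hypothetical embed
// that turns a published CSV into a responsive SVG line chart with a
// "download the data" link alongside it.

type Point = { label: string; value: number };

// Parse a simple two-column CSV ("label,value") into data points.
function parseCsv(text: string): Point[] {
  return text
    .trim()
    .split("\n")
    .slice(1) // skip the header row
    .map((line) => {
      const [label, value] = line.split(",");
      return { label, value: Number(value) };
    });
}

// Draw the chart into an element. Width comes from the element, so the
// chart reflows as the page does.
function renderChart(el: HTMLElement, points: Point[]): void {
  const width = el.clientWidth || 600;
  const height = Math.round(width / 2);
  const max = Math.max(1, ...points.map((p) => p.value));
  const step = width / Math.max(points.length - 1, 1);

  const path = points
    .map((p, i) => `${i === 0 ? "M" : "L"}${i * step},${height - (p.value / max) * height}`)
    .join(" ");

  el.innerHTML =
    `<svg viewBox="0 0 ${width} ${height}" width="100%" role="img">` +
    `<path d="${path}" fill="none" stroke="#206095" stroke-width="2"/>` +
    `</svg>`;
}

// Publishers would drop an empty element on the page and point it at the data.
async function embedChart(containerId: string, csvUrl: string): Promise<void> {
  const container = document.getElementById(containerId);
  if (!container) return;

  const response = await fetch(csvUrl);
  const points = parseCsv(await response.text());

  // The chart gets its own element so re-rendering never removes the link.
  const chartEl = document.createElement("div");
  container.appendChild(chartEl);
  renderChart(chartEl, points);

  // Always offer the underlying data, not just a picture of it.
  const link = document.createElement("a");
  link.href = csvUrl;
  link.textContent = "Download the data (CSV)";
  container.appendChild(link);

  // Re-render on resize so the chart stays responsive.
  window.addEventListener("resize", () => renderChart(chartEl, points));
}

// Hypothetical usage: embedChart("cpi-chart", "/data/cpi-annual-rate.csv");
```

The real thing would obviously need proper chart types, ONS styling, accessibility work and server-side handling of the uploaded data, but the shape of it – data in, responsive chart plus downloadable data out – is the point.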

I am also interested in the potential flexibility something like the NPR dailygraphics rig could provide to our Digital Content team.

The last thing I’ll mention is my desire to implement the GOV.UK /info/ pages in some manner – especially the metrics element of this. I think it is a game changer in terms of transparency.

So I don’t want much then.