Wednesday, 22 February 2017

BOOK REVIEW: Fifty quick ideas to improve your user stories by Gojko Adzic and David Evans

It has taken a while since my last book review to get the time to do much reading, what with moving house. But here is my round-up of "Fifty quick ideas to improve your user stories" by Gojko Adzic and David Evans.

Formats: Paperback, ePub, Mobi, PDF

Where can I get it? From Leanpub, Amazon or any good bookshop.
Who is it for? Anyone involved in a software development project working in an iterative manner. As long as they understand some of the basics around user stories, e.g. they know what INVEST stands for.
What's it about? As the title suggests "how to improve user stories", but it is a bit more than that. It covers the whole process including planning and iterative delivery activities.

What's the book like? Each double-page spread follows a similar style: it starts with an introduction to a new tip, often illustrated with an anecdote from the authors' experience. Next comes a description of the key benefits of the idea behind the tip. Finally, it finishes with some practical ideas on how to make it work. This takes the theory and presents it in ways you can apply to whatever you are working on.

The chapters follow the life cycle of a user story:
  • Creating stories
  • Planning with stories
  • Discussing stories
  • Splitting stories
  • Managing Iterative Delivery
The story-splitting chapter was my favourite section. These tips cover some of the really gnarly issues in non-trivial agile projects. For example, putting off implementing a reporting system until the quarterly report is due, and in the meantime developing the key functionality that generates the data as text files, which can then be imported into the new infrastructure once it is ready.

The tip introductions are really good at grounding you in the context in which Gojko and David developed their experience. This is important because it allows the reader to gauge which factors are similar in their own context. All forms of advice (otherwise known as "best practice") need to be adapted to the real-world context they will be used in.
To sum up: lots of practical tips to help get value delivered. For example, don't get stuck in a rut using stories where they aren't appropriate, such as for technical tasks. If you want to know more about the book's topics, check out this cool mind map of the book.

Sunday, 12 February 2017

MEETUP: "Collaborative product management" at ProductTank Brighton

Looking at my notes, I realised I forgot to blog about November's meetup, "From Startup to Corp: The differences, and how to adapt to the change", sponsored by 15below. There were three very good talks on the differences between very small companies and massive ones. Some highlights from the three talks:
  • Don't feel like you have to aim for moonshots if you aren't Google Ventures - you only get a limited number of bets compared to GV, so a 10% incremental improvement every year is nothing to be ashamed of!
  • Don't feel too embarrassed by your peers - if you started off at the same time as JustEat but have a much more niche market, it's OK to have more modest growth.
  • Just because you can see everyone doesn't mean they know what you are doing - large organisations know their communication is poor, so they actively do things to encourage it; in a small startup you have to make sure people are informed.
  • Large organisations can offer more support and flex around you - for example cover maternity leave or secondment to another business unit.
But anyway ... last week was another interesting edition of the ProductTank Brighton meetup hosted by Rakuten on "Collaborative product management".

Unfortunately, the first scheduled speaker - Dan Kidd - couldn't make it to deliver his talk "Now That's What I Call Continuous Delivery". I was really interested to hear how Madgex have used the improved feedback cycles. I am looking forward to the rescheduled talk!

Instead, the first talk was by Ed Vinicombe on "My three critical values of collaborative design" and how traditional ways of working are changing in knowledge work. He is really lucky to be guided almost entirely by user research (and not HiPPO!). It sounded like this gave him the freedom to collaborate around the requests and deliver more quickly.

Next up was Mike Rowlands talking about the development culture at LShift. Perhaps controversially for a Product Managers' meetup, he suggested that the role might not be needed! They had very deliberately set out to recruit senior polyglot developers, ensuring that they fit working as lead developers in an AgilePM environment. Both the team sizes and the projects undertaken are kept small. It is up to the lead dev to choose the appropriate tool for the job, with peer review used as a check and balance. They work closely with clients and do a lot of the communication without an intermediary. This collaboration between client and dev is what helps develop the product. I can see how it works in this specific environment: it is appropriate for consultancy-style "products", but I'm not sure it works as well for long-running commercial off-the-shelf software.

It was interesting hearing from an MD who had created the kind of company he wanted to work in and vigorously defended that culture as it expanded. It sounded like being careful about the type of work accepted played a key role in this. A final little tip: they aimed to get the work completed and handed over so that clients could support their own BaU operations. But in most cases the clients kept them on support contracts because of the transparent way they solved problems.

Monday, 6 February 2017

On data blind and data informed

This post is the story of a personal journey. It starts with a legacy product: both powerful and flexible, having grown to meet needs over time. But it is my view on steps to improve the process in my context - so your mileage may vary.

Phase 1: Transactional DB as source

We started by looking at the transactional data. Some things were obvious: one option in the UI is recorded as true/false in the DB. Some were more complicated and required more investigation. To back this up we conducted a user survey, followed by a workshop to find the most important user journey to concentrate on. This highlighted three related journeys. The key characteristic of activity in this phase was mining different structures of data, with the data coming from custom reports in varying formats.

Getting to the questions to ask was guided by a user survey sent to almost the whole user base, and a smaller workshop looking at the "job to be done" for the platform. Other than that, the skills needed were accessible to most development companies or teams - SQL and Excel.
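As a flavour of what this phase looked like, here is a minimal sketch of mining a transactional table with SQL. The schema, table, and column names are entirely illustrative (the post doesn't describe the real product's database) - the point is just that a true/false UI option stored in the DB can be summarised per client with a simple GROUP BY:

```python
import sqlite3

# Hypothetical schema - names are illustrative, not from the real product.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, express INTEGER);
    INSERT INTO orders (customer, express) VALUES
        ('acme', 1), ('acme', 0), ('globex', 1), ('globex', 1);
""")

# How often is the 'express' UI option (stored as 0/1) actually used, per customer?
rows = conn.execute("""
    SELECT customer,
           SUM(express) AS used,
           COUNT(*)     AS total
    FROM orders
    GROUP BY customer
    ORDER BY customer
""").fetchall()

for customer, used, total in rows:
    print(f"{customer}: {used}/{total} orders used the option")
```

Nothing here goes beyond the SQL-and-Excel skill level mentioned above; the results of queries like this are exactly the kind of tabular output that ends up in a spreadsheet for further slicing.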

Phase 2: Audit metrics

While developing these journeys we put in place metrics to track the different activities, across all clients in one place. Even though these were basic metrics, they did allow patterns of usage to emerge - for example, actions being triggered and then cancelled, repeatedly. We were able to follow up and check why this happened before it became a support or account management issue. This phase had standardised event data that was easily aggregated and sorted, but, like the previous phase, still tabular in nature. A key benefit over the transactional data was explicit time stamping of events and users.
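The trigger-then-cancel pattern mentioned above can be spotted with a very small amount of code once events are standardised. This is a sketch only - the event names, field layout, and threshold are assumptions, not the real metrics pipeline:

```python
from collections import Counter

# Illustrative event log: (user, event) pairs in time order.
# Event names and users are assumptions, not from the real system.
events = [
    ("alice", "export_triggered"), ("alice", "export_cancelled"),
    ("alice", "export_triggered"), ("alice", "export_cancelled"),
    ("bob",   "export_triggered"), ("bob",   "export_completed"),
]

# Count cancellations that immediately follow a trigger by the same user.
cancels = Counter()
last = {}
for user, event in events:
    if event == "export_cancelled" and last.get(user) == "export_triggered":
        cancels[user] += 1
    last[user] = event

# Users who repeatedly cancel straight after triggering may be confused by
# the feature - worth following up before it becomes a support issue.
repeat_cancellers = [u for u, n in cancels.items() if n >= 2]
print(repeat_cancellers)
```

Because each event carries a user and (in the real data) a timestamp, this kind of per-user aggregation is exactly what the explicit time stamping enables over the transactional snapshot of the previous phase.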

In this phase, we also started to use Google Analytics to make collecting browser usage statistics easier. One advantage we gained over web server log file analysis (other than not having to do log file analysis ;) was access to the browser window size and native screen size. This was useful for guiding the constraints on page design, and for tracking how this changed over time.

Phase 3: Users behaviour

The next phase came in adding more user journeys and variations on the three that already existed. The different combinations made it more difficult to see in tabular data what was happening and how the interrelated functions were used. So the next step was to use session recording. This also helps speed up usability testing by providing a way of recording sessions and playing them back. This is invaluable as the pace of development in the new framework speeds up: the feedback on how well it solves the problem also needs to scale and improve in quality.

This does bring interesting challenges for workflow. Previous phases ultimately ended up in some form of document - an Excel spreadsheet or a table in Word. So a factor in evaluating tools is "how can I make sure this interesting behaviour can be referenced?". Can I get a permalink to a point in the playback? Will it integrate with our chat tool to let people know something has been found? ... and will they then use it?

Phase 4+: Infinity and beyond...

What comes next very much depends on which process problem we look to solve. One thing that probably won't make sense in our context is A/B testing, as in our B2B environment consistency and predictability are valued. Some of the things it might include are:
  • ROI - Taking into account factors like whole-life costing of the new product development, revenue generation/protection and feature usage to get the most bang per buck.
  • Process mining - looking at user behaviour to find correlations between tasks to form new packages or products.
  • In-app feedback - fine-grained/easy feedback, like the MSDN "was this article useful" prompt, to flag issues.
  • Visualisation - how information flows through the system, maybe animated to really tell a story.
  • Heat maps - Another way to tell a story about feature usage.
  • Further development of customer types and tailoring products for them.

Further reading