Further developing an onboarding process for a green field product
This is part of a series about my side project Bashfully, which aims to give graduates and other new entrants to their careers a seasoned professional's way of expressing themselves through the superpower of storytelling, following the core principles of being discoverable, personalised and guiding in approach.
Following on from my post on Building an onboarding process for a green field product, this one is about building the experience. One of the lessons I pulled out previously was about launching something to get feedback, even if you don't feel ready. It's easy to know the theory, but hard to put yourself out there!
Photo by Etienne Boulanger on Unsplash
I'm really glad that we did, as it allowed us to gather feedback and shake out integration issues while we polished.
Background
One approach we have taken is to refactor the experience gradually as we add functionality to the edit screens. To start with, only a limited set of data was editable: basically the story elements. The "extras" like social links didn't make it into the first release. Now there is a unified experience, with a better underlying code base.
An interesting choice we had to make was the "minimum" amount of detail people had to enter before "completing" a profile during setup. In the end we decided to go with almost nothing, making sign-up as easy as possible.
First iteration
In the first iteration we provided no onscreen guidance and relied on the user going to "Manage profile" from the action menu and then finding what hadn't been completed. This got me interested in empty states. I have some experience of these with data-driven applications, where they are usually the sole focus of the screen, and I have tackled improving them by providing context: for example, a log-out screen that offers not only a log-in link but also directions to other brand touch points. The experience with Bashfully has given me a bit of a twist on the way I look at them. The missing data has three different uses, with a different impact on the user's goals when each is left incomplete.
Looking at how similar products do this, YouGov and LinkedIn give you a percentage completion figure. This can be useful for seeing how much more you need to complete, but it doesn't tell you anything about the impact of each missing piece, so you can't assess what to do next if you lack the time or motivation to do it all.
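For illustration, here is a minimal sketch of that percentage-style indicator, assuming a hypothetical profile shape (the field names are mine, not Bashfully's):

```typescript
// Hypothetical profile shape used only for this sketch.
interface Profile {
  profileUrl?: string;
  story?: string;
  skills: string[];
  socialLinks: string[];
}

// LinkedIn-style indicator: count filled-in fields and report a percentage.
function completionPercentage(profile: Profile): number {
  const checks = [
    Boolean(profile.profileUrl),
    Boolean(profile.story),
    profile.skills.length > 0,
    profile.socialLinks.length > 0,
  ];
  const done = checks.filter(Boolean).length;
  return Math.round((done / checks.length) * 100);
}
```

A figure like "75% complete" tells you how much is left, but nothing about which missing field matters most to your goals.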
Second iteration
In the second iteration we give feedback on what hasn't been completed and the impact of each gap: for example, not choosing a profile URL means the profile isn't discoverable. We also chose not to scatter the status messages over the profile screen, where the content should be, but to show them to the profile owner at the top of the profile text.

We've now pushed this live so that we can monitor what our users think is most important, starting with the top three features of the site. There are other features where we could add a status indicator, for example whether any skills have been tagged.
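As a rough sketch of that idea (again with made-up names, not Bashfully's actual code), each check can carry an impact message that only the profile owner sees:

```typescript
// Hypothetical profile shape, as in the earlier sketch.
interface Profile {
  profileUrl?: string;
  story?: string;
  skills: string[];
}

// Each check pairs a completion test with the impact of leaving it undone.
interface StatusCheck {
  label: string;
  isComplete: (profile: Profile) => boolean;
  impact: string;
}

const statusChecks: StatusCheck[] = [
  {
    label: "Profile URL",
    isComplete: (p) => Boolean(p.profileUrl),
    impact: "Without a profile URL your profile isn't discoverable.",
  },
  {
    label: "Story",
    isComplete: (p) => Boolean(p.story),
    impact: "Visitors can't see the story that sets you apart.",
  },
  {
    label: "Skills",
    isComplete: (p) => p.skills.length > 0,
    impact: "You won't show up when people look for tagged skills.",
  },
];

// Messages for the owner-only banner at the top of the profile.
function outstandingMessages(profile: Profile): string[] {
  return statusChecks
    .filter((check) => !check.isComplete(profile))
    .map((check) => `${check.label}: ${check.impact}`);
}
```

The difference from a single percentage is that each outstanding item explains its own consequence, which is what lets a time-poor user decide what to tackle first.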
Future improvements
Another avenue we need to explore is doing something a bit more helpful with data imports. At the moment we populate the core fields in experience, but we have some ideas about inferring helpful recommendations for new users completing their profile. Oh, and extending the "LinkedIn profile import" with other integrations like Facebook, Codepen, and Dribbble.

Overall it feels like running batches of experiments at the same time as we improve the code base is the way to go. Otherwise it's too easy to rush experiments with "temporary code" that becomes technical debt you need to worry about. The thoughtful way that Martyn is evolving the architecture deserves a lot of credit for enabling this.
One thing we really need to watch out for, though, is gold-plating the onboarding process. We need to keep an eye on whether the onboarding experience matches the level of functionality available once it's complete. One way to do that is to develop another mini feature experiment while collecting data on this one.