Venturing into the world of SEO

Looking at SEO this month has been a bit of a voyage into the unexpected. The context is my side project, Bashfully, which creates online profiles for people early in their careers and has three guiding principles. A profile should be:

  1. Discoverable - people need to be able to find the person based on their skills, experience, and aspirations.
  2. Personalised - the skills and experience must be tailorable for specific job applications.
  3. Guiding - given the above, provide enough structure for the profile builder to tell their story in the best way possible. A longer-term goal here is to provide feedback based on other profiles that match their aspirations.

The features we develop tend to rotate between these goals to keep the product balanced. We hadn't done much in the discoverable area, apart from setting the metadata required to create the cards used when sharing to Facebook or Twitter. Since discoverability came up in our user research, we needed to start improving it. So that's the why; now on to the how.
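For anyone unfamiliar with those cards, they are driven by Open Graph and Twitter meta tags in the page head. The snippet below is only an illustrative sketch with placeholder names and URLs, not Bashfully's actual markup:

```html
<!-- Illustrative Open Graph / Twitter card tags; all values are placeholders -->
<meta property="og:title" content="Jane Doe - Graduate Software Engineer" />
<meta property="og:description" content="Skills, experience and aspirations at a glance." />
<meta property="og:image" content="https://example.com/profiles/jane-doe/card.png" />
<meta property="og:url" content="https://example.com/profiles/jane-doe" />
<meta name="twitter:card" content="summary_large_image" />
```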


Google Search Console

The first step was looking in Google Analytics at the search terms used to reach the site. While hunting for custom reports to help dig into the data, I learned about Google Search Console. This required some extra setup, although linking the site to Analytics was enough to "prove ownership". Google Search Console appears to be going through a redesign at the moment, and the new design doesn't yet cover all of the functionality.
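As an aside, for a site without an existing Analytics link, the other common way I know of to prove ownership is a Google-issued verification tag in the page head, along these lines (the token below is a placeholder):

```html
<!-- Placeholder verification tag; Search Console issues the real token -->
<meta name="google-site-verification" content="your-verification-token" />
```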

Looking at the data there, it was a bit bare. Not many pages were indexed, so they were not showing up in results. Under the "Crawl" menu was a setting for sitemaps. I knew that Bashfully did not have a sitemap yet, so that was my first stop.


Sitemap

I wasn't entirely sure what format the sitemap needed to be in, so I went to free site map generator.com. It is not the most beautiful site I have ever encountered, but it did generate an XML sitemap for me. After a little bit of tidying up, I put it in the root of the site. One extra little step was to add the path to the sitemap in robots.txt. Both pieces are sketched below.
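The generated file follows the standard sitemaps.org XML format. The sketch below uses placeholder URLs and dates rather than Bashfully's real pages, but it shows the shape of the file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch; URLs and dates are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-03-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/profiles/jane-doe</loc>
  </url>
</urlset>
```

The robots.txt change is then a single extra line pointing crawlers at the sitemap (assuming it sits at the site root):

```text
# robots.txt sketch - the Sitemap line is the addition described above
User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml
```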




Digging into results

Once the new sitemap was up and running, I went to "Crawl" > "Sitemaps" in Search Console to submit it and get indexing started. Within a couple of days we started seeing clicks there that matched the hits showing up in Google Analytics. However, the whole experience isn't as intuitive as, say, Google Analytics, which is why SEO tools that build on Google Search Console data exist.

SanityCheck.io is one such tool. Once you give their user permission to access your search data, you get three options to guide you:

  1. ...get some new content ideas
  2. ...find striking distance keywords
  3. ...improve the CTR of a page




I found this really helpful, especially the "striking distance keywords" option, which helps find terms for which the site sits on the second page of results. I only got the chance to put one small text change through and view the results before the free trial ran out, but I did see a wider variety of search terms matching the site.

Our main approach here has been to make small changes where we can measure the results, then probe to learn more. We don't have any SEO experience between us, so making large changes would not be a great use of time ... and would be just as likely to make things worse.


Other considerations

The speed at which pages get served up is now part of Google's ranking, alongside the other changes nudging sites to be mobile friendly. SanityCheck also lets you set up speed checks on specific pages, and it caught a page slowing down in the desktop version - but strangely not in the mobile test. It is strange to think that websites wouldn't be responsive these days.

The appearance of the results was a bit more of a rabbit hole! Rather naively, I thought that Google might use the Open Graph metadata that Twitter and Facebook use for enriching links. But no, it uses structured data. Structured data is really useful in emails for events and the like, and, as it turns out, for giving hints on how to display search results.
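For Bashfully's profiles the relevant schema.org type is "Person", expressed as JSON-LD in the page head. The snippet below is a minimal sketch with made-up values rather than the markup we actually ship:

```html
<!-- Sketch of schema.org "Person" structured data as JSON-LD; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Jane Doe",
  "url": "https://example.com/profiles/jane-doe",
  "jobTitle": "Graduate Software Engineer",
  "alumniOf": "Example University",
  "sameAs": [
    "https://twitter.com/janedoe",
    "https://www.linkedin.com/in/janedoe"
  ]
}
</script>
```

All of these are standard schema.org Person properties, though, as noted below, Google only acts on a subset of them.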



As the example above shows, structured data does duplicate information that is already on the page, but it has some key features we are interested in. One of the most frequently requested problems to solve in our user research was being optimised for SEO, and structured data, with the "Person" type in particular, gives us the best way to give Google the hints it needs to put forward our users' stories in the best possible way.

One of the frustrating aspects of structured data is that Google only supports a subset of the full spec. So there was a bit of trial and error to see which properties, beyond the few mentioned explicitly in its intro site, Google actually picked up.
