Wednesday, 17 January 2018

MEETUP: Ethical Technology London

Last night I had a fun time in that there London town, for a meetup organised by Cennydd Bowles. I had become aware of the event after reading his post A techie’s rough guide to GDPR. This was also a fairly rare trip to the Silicon Roundabout for me, and I was struck by how much it has changed recently.

It was a low-key, informal event with no agenda, just interested (and interesting!) people talking about ethics and technology. Among the people I talked to were Anne, who is organising an ethics track at QCon, and Rachel, who ran a brilliant icebreaker around what we each thought would help keep technology ethical. Mark's answer, that fast scaling was the issue, was more convincing than my suggestion of "transparency". To paraphrase, he said that companies like Uber, AirBnB, and Facebook had probably scaled much more quickly than their corporate governance and leadership could, and that the ecosystems developing around these companies further diluted the ethical leadership.

That really resonated with me. Looking back over my career, processing and data storage get much, much less thought than they used to; they're just not something we really have to think about for most applications. Compare that to the extreme drive for data efficiency that led to the millennium bug, all to save two bytes per date!
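As a tiny illustration of my own (not something from the meetup), storing two-digit years does save those bytes, but it breaks simple date ordering the moment the century rolls over:

```python
# Two-digit years save two bytes per date, but "00" (2000) now sorts before
# "99" (1999) -- the heart of the millennium bug.
two_digit = ["99-12-31", "00-01-01"]
four_digit = ["1999-12-31", "2000-01-01"]
print(sorted(two_digit))   # ['00-01-01', '99-12-31'] -- 2000 lands before 1999
print(sorted(four_digit))  # ['1999-12-31', '2000-01-01'] -- chronological order
```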

Another really interesting find for me was Richard talking about a "Data sharing pattern catalogue" project at IF. It seemed like a really obvious thing once he mentioned it. Looking at the work they share, I think it's brilliant: it gives teams that may not be used to designing data-sharing approaches pointers to good patterns to follow. And it's open to outside contributors, so it has the potential to become a really useful living body of knowledge.

All in all, a good night learning about what others are up to. 

Sunday, 14 January 2018

Tools to help your start-up in starting up

Photo by Jo Szczepanska on Unsplash
Getting the correct tools in place for any initiative is important. For any product, concentrating on the core functionality is also key. Anything that isn't a core function should come off the shelf; very few circumstances are so specialised that you need to roll your own. 

The number one example of this for me is passwords. Not only do you not want to spend the time writing authentication code, you also don't want to spend the time needed to do it securely. Most people already have a Facebook, Google, LinkedIn, GitHub, or whatever log-in. Just use that. 
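As a rough sketch of what "just use that" looks like in practice (this is not the code we actually run, and the client ID, secret, and callback URL are placeholders from your own app registration), here is a GitHub log-in flow in Python using the requests-oauthlib library:

```python
# A rough sketch of social log-in with GitHub as the provider, via requests-oauthlib.
# CLIENT_ID / CLIENT_SECRET / REDIRECT_URI are placeholders for your own app registration.
from requests_oauthlib import OAuth2Session

CLIENT_ID = "your-client-id"                    # placeholder
CLIENT_SECRET = "your-client-secret"            # placeholder
REDIRECT_URI = "https://example.com/callback"   # placeholder

github = OAuth2Session(CLIENT_ID, redirect_uri=REDIRECT_URI, scope=["read:user"])

# 1. Send the user off to GitHub to approve your app.
authorization_url, state = github.authorization_url(
    "https://github.com/login/oauth/authorize"
)
print("Visit:", authorization_url)

# 2. GitHub redirects back to REDIRECT_URI; exchange that response for a token.
redirect_response = input("Paste the full callback URL here: ")
github.fetch_token(
    "https://github.com/login/oauth/access_token",
    client_secret=CLIENT_SECRET,
    authorization_response=redirect_response,
)

# 3. You now know who the user is without ever storing a password yourself.
print(github.get("https://api.github.com/user").json()["login"])
```

In a real web app the redirect handling lives in your framework's routes rather than an input() prompt, but the shape of the flow is the same.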

I have already shared the infrastructure choices we made for Bashfully. Had the project been different we might have made different choices. For example, Python and R have brilliant support for statistics and machine learning, while Java has great support for build, CI, and distributed-system tooling. In this post I am going to take a look at a couple of feedback tools, as feedback is important wherever you are in the product life-cycle.


Survey feedback

One easy and popular way of creating surveys is Google Forms. I find this works best in closed networks for getting quick feedback. It's a bit clunky to set up and shows only limited data, for example average completion times. But it does have good integration with Google Docs, getting your results straight into a spreadsheet. Of the survey tools I have used, Typeform is my favourite. I found it better for sharing, and I have made use of the data about the people completing the survey, how long it took them, and what kind of device they used. It has plenty of integrations through Zapier, as well as direct integrations with MailChimp and Google Docs.


Product feedback

Looking at more general feedback about your product, you want collecting it to be as quick and simple as possible. I have been trialling the ProdPad customer feedback portal, as it links feedback to ideas on your road map and is easy to combine with other sources of feedback. ProdPad again has lots of integrations through Zapier, as well as JIRA and Salesforce. It is a great tool that also allows you to expose ideas from the road map and get feedback linked directly to them.

For something a bit more feature-rich in analysing the feedback, I have been looking at WIYM. This has a great dashboard for fast feedback. The reason this is important is that we still have a short road map and are in discovery mode: we don't have a clearly defined market and customer base, so we are still experimenting to meet our initial vision.

Road map feedback

We don't have a tool that shares the road map directly at the moment, but we use Headway for release notes, which then publishes to Twitter. I found out about this tool after seeing it used on another start-up's site. So remember: always keep an eye out for who provides functionality you like! They also have a simple road map tool in development, which could be interesting.

Conclusion

There are a lot of tools out there for exchanging information with your users, and most of them have a free plan if you are starting out. One thing I hadn't quite expected, for example with FullStory, was how Slack would become the nexus for many tools. Tag some interesting behaviour or a bug occurring in FullStory and instantly share it. Get feedback from either of the tools we use and instantly ping it into Slack. During my day job we use HipChat, which is similar on the surface but doesn't have the same level of integration support or usability. Make use of trial periods and find the tools that work best for you. But don't get too bogged down; it helps if you spot one you like in the wild or read reviews on a site like Product Hunt or BetaList.
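To show how little glue that "Slack as nexus" idea needs, here is a minimal sketch of my own (the webhook URL is a placeholder, not one of ours): any tool that can make an HTTP POST can ping your team via a Slack incoming webhook.

```python
# Minimal sketch: push a note into a Slack channel via an incoming webhook.
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

def notify_slack(text):
    """Post a short note into whichever channel the incoming webhook targets."""
    response = requests.post(SLACK_WEBHOOK_URL, json={"text": text}, timeout=10)
    response.raise_for_status()

notify_slack("New feedback tagged in FullStory: checkout button unresponsive on mobile")
```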

Other tools we use:

  • MailerLite - mail automation, including our welcome and on-boarding emails 
  • Zoho - incoming mail hosting, including mail lists 
  • Cloudflare - DNS and CDN 
  • UptimeRobot - monitors our site availability and provides our status page 
  • Trevor - read-only tool for querying the DB, used to see what skill tags are being added and to build data sources for analysis in R 
  • MockFlow - great design tool, part of a really handy suite 




Sunday, 7 January 2018

A tale of two courses: Blockchain vs Ethics

As part of my continuing professional development I have taken two courses. They are similar in terms of who backs them and the effort required. Hopefully they will help me to prepare for 2018!

The first is an intro to ethics and law related to analytics and AI applications, provided by Microsoft on the edX platform. It followed a common format of:

  • a short video, 
  • linked content to read, 
  • labs to explore the subject, and finally 
  • quizzes to check progress. 

The content was still fresh, with the course first run in April 2017. This meant it was topical, covering GDPR as well as FCC rulings in the USA. If you are in IT and interested in ethics, then I can recommend learning from an ethicist. There is no real need to come up with your own moral framework, since there are over 2,000 years of open research to draw on. Having the two experts from different domains present their viewpoints and ways of working was a nice escape from the technologist bubble.


The second was actually a pair of IBM courses: Blockchain essentials and Blockchain foundation developer.

I am still finishing off the labs for foundation developer, but the two courses are very similar. The format is almost identical to the Microsoft course, hosted on IBM's developerWorks platform. The main difference is that the video segments were not as slick; they look like a developer livestream that has been re-purposed. The learning outcomes might be better, though, as the labs give hands-on experience of using a new tool, whereas the Ethics course labs assume most people have already used Excel.

This course threw shade on bitcoin, which is what most of us probably think of when we hear blockchain. It was great for bringing it back to what blockchain is ... essentially "just" a modern ledger. At the start they even made it clear where it wouldn't be the appropriate technology and a normal distributed DB should be used instead.
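To make that "just a ledger" point concrete, here is a minimal sketch of my own (not from the course, and nothing like production Hyperledger code) of the core idea: each entry carries the hash of the previous one, so history can't be quietly rewritten.

```python
# A minimal, illustrative hash-chained ledger (not production blockchain code):
# each block stores the hash of the previous block, so tampering with history
# is detectable because later hashes no longer match.
import hashlib
import json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transaction):
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"tx": transaction, "prev_hash": previous})

def chain_is_valid(chain):
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

ledger = []
append_block(ledger, {"from": "alice", "to": "bob", "amount": 5})
append_block(ledger, {"from": "bob", "to": "carol", "amount": 2})
print(chain_is_valid(ledger))      # True
ledger[0]["tx"]["amount"] = 500    # tamper with history...
print(chain_is_valid(ledger))      # ...and the chain no longer validates
```

The real systems add distribution, consensus, and smart contracts on top, but the ledger-with-tamper-evidence is the heart of it.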



Summary

So, a quick head-to-head if you are interested in either of these courses:


Microsoft: DAT249x Ethics and Law in Analytics and AI

Useful prior knowledge: Excel is used for data analysis, but any stats package could be used. This isn't a very technical course.

What I learned:
  • Foundational abilities in applying ethical and legal frameworks for the data profession 
  • Practical approaches to data and analytics problems, including Big Data, Data Science, and AI 
  • Applied data methods for ethical and legal work in Analytics and AI 

Certificate available: Yes, for a fee, hosted on the edX platform.

IBM: Blockchain essentials and foundation developer

Useful prior knowledge: Some familiarity with JavaScript and JSON is useful for essentials, but the focus is more on a business understanding. APIs, JavaScript, and Docker for foundation developer, as this is a more hands-on technical course for developers.

What I learned:
  • An understanding of Blockchain principles and practices and how they can be applied within a business environment 
  • An understanding of Blockchain and distributed ledger systems, the important concepts and key use cases of Blockchain, and how assets can be transferred in a Blockchain network 
  • Successful completion of the associated practical lab, demonstrating the ability to create working chaincode and deploy it to a blockchain network 
  • Experience of using Hyperledger Composer Playground to create and test a model prior to deployment 

Certificate available: Yes, for a fee, with badges hosted on the third-party YourAcclaim platform.


My learning plan for 2018 is currently:

Wednesday, 3 January 2018

What I "unlearned" in 2017

Photo by Matthew Spiteri on Unsplash
Inspired by this tweet, I have decided to do a follow-up to what I have learned in 2017, with what I have "unlearned".

I really like this idea, as looking back I suspect that most of the times I've truly learned something, I've also been able to let something go. In 2018 I am going to be much more mindful about whether fear or learning drives adding new ideas, skills, and practices. I feel that it is much easier to layer on new skills as you learn them without thinking about what in your tool kit is no longer useful ... or at least whether the effort it takes outweighs the impact of no longer doing it.

The main thing I have let go of this year is worrying about agile/scrum ceremonies and artifacts. Along with the main team I work with, I have moved to a much more flow-driven way of working, making the number of WIP items, and the value of the useful changes, the deciding factors on when to deploy to production. Because of the way the items fit together it generally works out at about 10 items, which corresponds to a time-frame of roughly 4 weeks. We deploy to dev on each commit and to QA on a much more regular basis.

To put it really simply, the team pull work items from me as they need them, and I then pull releases of working software as I want them. It hasn't completely eliminated the coordination and communication effort of planning time-sensitive or urgent fixes, but now that activity only happens as needed.

Other than that I don't think I have been deliberate enough in my learning to reflect on this over 2017. So that's something I've learned in 2018 already!




Monday, 1 January 2018

Data, analytics and AI in 2018: Some hopes and pointers


When pondering what to write about looking forward to 2018 I had a shortlist of three hot topics:

  1. AR
  2. Blockchain
  3. Artificial Intelligence (AI)
I didn't choose AR as I think it will remain a specialist tool. Although cool apps like Star Chart, which my family love, already exist, and Pokemon Go showed how addictive usage in games can be, it's still early days for toolkits like ARKit to produce a breakthrough app.

Blockchain is still probably at least a year off. Given the co-ordination needed in business process innovation, it takes a bit longer to get into the mainstream, and it appears that processing speed is also a bit of an impediment at the moment. I am watching this field with interest though, as it has the potential to change the way companies process transactions. 

Which leaves AI. I chose this not just because it's been my key interest my whole adult life, but also because it is making another big step into the mainstream. Most people will already have interacted with AI for years through its use in detecting credit fraud, computer game opponents, and recommendation services. More recently, voice assistants such as Siri and Alexa have added language processing. This year self-driving cars are going to push computer vision and real-time decision making.

I would expand this topic to include data science and analytics. AI relies on data, so good data hygiene and processing is vital, and doing something interesting with that data often means analysing it to show the information within. I suspect that another nudge for investment in 2018 is going to be GDPR, as marketers need to provide more value in exchange for permission to use data and send communications.


So, reading about what might be coming up in 2018, I would like to share three interesting posts from consultancies about AI and data analytics. The first is Top 10 AI technology trends for 2018 from PwC - an exciting year and prospects to look forward to! To me, Explainable AI is the next goal for providing a competitive advantage. Using AI to predict outcomes is great, but if the model can't explain why it has made a prediction it is difficult to know how to improve it, or the limits of how far you can trust it. This is a pretty big difference to how humans learn and provide feedback.
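As a minimal sketch of my own of the simplest kind of explanation (the feature names and weights below are made up, and real explainable-AI tooling goes much further than this), a linear model at least lets you break a prediction into per-feature contributions:

```python
# Illustrative only: per-feature contributions for a linear scoring model.
# Feature names and weights are hypothetical.
weights = {"age_of_account": 0.8, "failed_logins": 2.5, "purchase_value": 0.01}
bias = -1.0

def predict_with_explanation(features):
    contributions = {name: weights[name] * value for name, value in features.items()}
    score = bias + sum(contributions.values())
    return score, contributions

score, why = predict_with_explanation(
    {"age_of_account": 0.2, "failed_logins": 3, "purchase_value": 250}
)
print(f"score={score:.2f}")
for name, contribution in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {contribution:+.2f}")  # which features drove the prediction
```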

The next article was a great overview of the current landscape for analytics, AI, and automation from McKinsey. Data analysis skills and under-utilised assets are definitely areas of weakness that 2018 could start improving. With all the potential that could be unlocked, it seems a no-brainer.

It’s difficult to sum up machine learning's evolution in a simple infographic, but I think PwC did a good job - just bear in mind that reality is a lot more nuanced. Also, AI predictions about the technology to be used more than 5 years out are usually wrong! It's hard to argue with the general trend of combining techniques. This is something I have observed since I started studying AI, when neural nets were abandoned for a decade in favour of other techniques. Now, with much larger corporate focus, and giants like Apple, Google, and Facebook all investing substantially and looking for a competitive edge, I'd hope that won't happen again.

But what about the data scientists building a lot of this infrastructure? One key concern I have picked out from looking around on Twitter to gauge opinion is below.

It's worth bearing in mind if you are introducing predictive analytics: your model reflects the data you had at the time and the choices made during the training process. How are you going to check its performance and effectiveness going forward, especially if you don't have an explainable AI?
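One simple way to keep checking, sketched here with made-up numbers rather than anything from a real system, is to compare the distribution of recent prediction scores against the scores seen at training time, for example with the population stability index (PSI):

```python
# Minimal drift check: population stability index between training-time scores
# and recent scores, both assumed to lie in [0, 1]. Numbers below are hypothetical.
from math import log

def psi(expected, actual, buckets=10):
    """Population stability index between two sets of scores in [0, 1]."""
    def proportions(scores):
        counts = [0] * buckets
        for s in scores:
            counts[min(int(s * buckets), buckets - 1)] += 1
        total = max(len(scores), 1)
        # Floor the proportions so the log term below stays defined.
        return [max(c / total, 1e-6) for c in counts]
    e, a = proportions(expected), proportions(actual)
    return sum((ai - ei) * log(ai / ei) for ei, ai in zip(e, a))

baseline_scores = [0.1, 0.2, 0.2, 0.4, 0.7, 0.9]     # captured at training time
recent_scores = [0.5, 0.6, 0.7, 0.8, 0.9, 0.95]       # scored in production last month
drift = psi(baseline_scores, recent_scores)
if drift > 0.2:  # a common rule of thumb; tune the threshold for your own context
    print(f"PSI {drift:.2f}: score distribution has shifted, review the model")
```

It doesn't tell you why the model has drifted, but it is a cheap alarm bell that the world no longer looks like your training data.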

The second is how important the foundation is: Sean Taylor highlights that a lot of the value of data scientists' work lies in how it is used downstream.

So think about the benefits that a good data analytics culture can bring. Not all of the value will come from where the work is done; if it's done right, the learning will also be happening in other roles.

In the past year I have been looking at some of the business aspects of AI, so this year my resolution is to brush up on the more technical aspects. Expect to see me write about the DataCamp Data Science courses and others that I take in 2018.



Wednesday, 27 December 2017

Lessons that 2017 taught me

Photo by Annie Spratt on Unsplash
I never used to see the point of "end of year round up" blog posts, but the journey that I've been on during the past year has led me to reflect on what I have learned.

The beginning of the year started off with me thinking about being data informed, balanced by a survey that showed this work had paid off. Since then I have put processes in place to help collect and report on the data that guides us and to detect changes. Learning R and creating a repo to share the data processing recipes has been the culmination of this.

Data isn't just an important topic for product management; combined with ethics, it's a topic that increasingly touches our lives. My interest was first piqued in my reading during April. Later in the year, as GDPR started to loom on the horizon, I took a course on Ethics and Law in Data and Analytics. This was a great course that covered not only some philosophical exploration of what it means to be "ethical", but also how the legal frameworks in the US and Europe tackle the same issue.

Alongside this I launched a side project. I have found that this is a great way to hone skills outside my usual work environment. It has also taught me about some of the differences between B2B and B2C. One of the things that is easier with B2C is that you can get feedback more easily and target potential users on social media. One thing that is easier with B2B is having a clearer idea of customers and their needs ... they even organise conferences to tell you what they want!

Finally, I have embraced the cloud! The biggest thing that I have gained in 2017 is using distributed apps. This has freed me in simple ways, like being able to access my "library" of interesting e-books and papers wherever I am. It has also been great for collaborating outside of working hours and an office setting. No more carrying around memory sticks!



Sunday, 17 December 2017

What to look for in innovation

Photo by Andy Kelly on Unsplash
This week I attended a webinar on AI in the aviation industry. I don't envy anyone doing a summary of AI in under an hour while leaving enough time for the rest of the webinar! It's a shame that one bit that gets missed is the role of supporting technology, or the ecosystem, in innovation.

Looking back, one of the big factors allowing AI to become useful has been the supporting technology: namely, processing speed and the availability of memory.

Taking a different industry, the Netflix business model was helped by increasing broadband speeds, better encoding formats, and again processing power. The change in the shape of its overheads was probably a key enabler: switching to an internet streaming business allowed the delivery mechanism to scale on demand.

Going to the root of both of these, Get off the grass by Hendy and Callaghan has the best history that I have read of Silicon Valley. The innovation didn't come from research into completely new technology; in theory, transistors could have existed years before. It was a combination of a skilled workforce and government procurement processes that enabled the technology to flourish.

So a key takeaway for me is that when looking for innovation, look for the enabling factors and the complementary products. Look for changes in the ecosystem that are precursors to wider availability, or for complementary products that change behaviour. 

Going back to AI, smartphones changed the way we access AI-powered software. My first encounter with voice dictation was on a desktop: the software had to be installed and was essentially fixed when shipped, with the developers unable to build on how it was used in the real world. Now Siri can take advantage of a speedy local processor coupled with a fast connection to centralised software. Like Amazon Alexa, Siri can learn from every single use of the core software.
