Last week I attended the Big Data Analytics Conference on Data Quality at the Francis Crick Institute.
It was a great event in an amazing space. It covered a huge range of applications, from using data to identify election fraud, through Big Data in medicine, to apps built to disrupt the financial services market. And I spoke about what Big Data could learn from official statistics.
It’s easy to have two cynical reactions to a Big Data event. First, that Big Data isn’t a thing at all, but just a buzzword designed to make not-very-new things sound new; and second, that the Government is miles behind everyone else.
I think these reactions miss the point. This event showed that Big Data is as much a cross-domain community as it is a set of specific techniques. And it’s clear from the event that this community is vibrant, committed and growing. As I tried to show in my slides, there’s lots to learn from how large data sets are used in Government – about the pitfalls of neglecting data quality, and also about the innovations that emerge when Government analysts link multiple datasets, as the Department for Education has done recently to look at ordinary working families.
Here are my slides [Big Data Presentation]. They capture a simple message: that focusing on the eternal principles of trustworthiness, quality and value is still the starting point of all good work with data and all good presentation of statistics.
So thanks to the organisers and speakers – it was a great event.