Posted on Friday 6th July 2012 9:47
The thinktank Policy Exchange has waded into the Big Data debate with a headline grabbing report claiming that greater use of big data analytics could save the UK government £33bn.
Its report, which was launched at Computing magazine’s Big Data Summit, makes the valid point that while the public sector devotes considerable time and resources to amassing data as part of running public services, it’s much less effective at joining those datasets up. As the report’s author, Chris Yiu, says: “finding ways to share or link this data together has the potential to save time for citizens and money for taxpayers. The government will need the capability to conduct analytics effectively, and the courage to pursue this agenda with integrity.”
Among the more eye-catching proposals contained in Policy Exchange’s report is a call to scrap the ten-yearly census and instead gather data on the UK’s changing population demographics by ‘mashing up’ information from the electoral roll and council tax registers. Similarly, the thinktank also suggests that problems such as traffic bottlenecks in airports and train stations could be averted by monitoring activity on Twitter.
While it’s great news that an influential policy body has bought into the idea that stores of unconnected data represent significant unlocked value to governments and taxpayers, I worry that this report underestimates the challenges and investment required to make these savings possible. Big data analytics is a powerful tool, but it’s not a panacea, especially not when it’s expected to interface with complex computer systems. The success of such systems also depends on the regular or automatic flow of clean, high-quality data, which still cannot be assured across the public sector. Take Policy Exchange’s census-scrapping mash-up as an example. This assumes that council tax data from all 353 local authorities can be amalgamated and cross-referenced against other demographic data which could be structured in fundamentally different ways. It’s a wonderful idea, but who’s going to write (and pay for) the APIs that make it possible if the single objective of the programme is to save money?
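To see why cross-referencing registers is harder than it sounds, consider a minimal Python sketch of the kind of join such a mash-up would need. All records and field names here are invented for illustration; real council tax exports and electoral roll extracts have their own (and differing) schemas, which is precisely the problem.

```python
# Hypothetical sketch: two registers describing the same households,
# keyed and structured differently. All data and field names are made up.

council_tax = [  # one authority's export: keyed on a property reference
    {"prop_ref": "A1", "address": "1 High St", "band": "C"},
    {"prop_ref": "A2", "address": "2 High St", "band": "D"},
]

electoral_roll = [  # another register: free-text addresses only
    {"elector": "J Smith", "addr": "1 High Street"},
    {"elector": "K Jones", "addr": "2 High St"},
    {"elector": "L Brown", "addr": "2 High Street, Flat B"},
]

def normalise(address: str) -> str:
    """Crude address normalisation -- the unglamorous work any
    cross-register 'mash-up' depends on."""
    return address.lower().replace("street", "st").strip()

# Index one dataset by normalised address, then attempt the join.
by_address = {normalise(r["address"]): r for r in council_tax}

matched, unmatched = [], []
for voter in electoral_roll:
    record = by_address.get(normalise(voter["addr"]))
    (matched if record else unmatched).append(voter)

# Records that fail to match are exactly where real projects get
# expensive: manual resolution, better identifiers, or new APIs.
print(f"matched: {len(matched)}, unmatched: {len(unmatched)}")
```

Even in this toy case, one record in three falls through because the two registers describe the same property differently; scaled to hundreds of authorities and millions of records, resolving those mismatches is where the unbudgeted cost lives.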
For an example of how an efficiency project in the public sector can easily spiral out of control, we don’t need to look any further than the NHS’s ill-fated National Programme for IT. This came unstuck when it emerged that getting government computer systems to talk to one another, let alone work together, was a vastly more expensive and time-consuming business than the technology evangelists had promised. Ten years on, it is reasonable to expect that better standards of project management in the public sector would smooth the implementation of big data analytics somewhat, but it won’t make the process painless.
It’s also a little misleading to suggest that a government in search of short-term savings could secure them through big data analytics alone, when most government agencies would be coming to such projects from a standing start and would need to think of them as a medium to long-term investment.
Big data, like all other hot technologies, has enormous latent power to transform the way organisations operate. If it is to deliver on its early promise, however, it’s crucial that we don’t over-promise on what it can do on its own. Yes, it could help the world’s governments govern in a much more efficient and joined-up way, but there’s a lot of groundwork to be done before that’s possible.