The problem with 'dumb data'
In the Business Intelligence (BI) sector, the challenge facing analysts is no longer one of size but one of complexity.
That’s the opinion of John Brand, vice president and principal analyst for the CIO Group of Forrester Research, who has been the keynote speaker at each of Microsoft’s Big Picture events in Christchurch, Wellington and Auckland this week.
BI is an oft-overlooked field, but it is one that Brand says is consistently among the top five subjects top-level executives want to talk about.
The challenge for years has been the volume problem, frequently referred to as the problem of ‘big data’.
However, Brand says trends happening elsewhere in the industry are turning this problem into one of ‘dumb data’ – how to integrate information from a variety of sources into something that makes sense.
"The business is now using a lot of data from outside the organisation,” Brand says, "and expecting to blend it with what’s been gathered on the inside.”
The problem has arisen from the uptake of cloud computing, where people collaborate on documents and participate in discussions online rather than within an organisation, and from social networks, which have encouraged a ‘hosting and posting’ approach to information.
"Next-generation users are coming in and asking, why would you transfer that data?” Brand says.
"It just doesn’t make sense to them. It’s an amazing difference in the way we think about technology.”
So how do you deal with dumb data? Brand suggests looking at the processes rather than the information itself: for example, breaking BI down into repositories (raw data), collections (filtered data) and views (reports).
"It should never be about a single version of the truth, because you’ll never get there,” Brand says.
"It should be about a common understanding.”
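The layering Brand describes can be sketched in a few lines of code. This is only an illustration of the idea, not any real BI product's model; the field names and figures below are invented for the example.

```python
# Repository: raw data exactly as gathered, from inside or outside
# the organisation (values here are invented for illustration).
repository = [
    {"region": "NZ", "channel": "online", "sales": 120},
    {"region": "NZ", "channel": "retail", "sales": 80},
    {"region": "AU", "channel": "online", "sales": 200},
]

# Collection: a filtered slice of the repository.
nz_collection = [row for row in repository if row["region"] == "NZ"]

# View: a report built from a collection.
nz_sales_view = {
    "region": "NZ",
    "total_sales": sum(row["sales"] for row in nz_collection),
}

print(nz_sales_view)  # {'region': 'NZ', 'total_sales': 200}
```

The point of the separation is that different teams can share the same repository while building their own collections and views, arriving at a common understanding rather than a single enforced version of the truth.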
Users can also think more about the impact the information will have on productivity.
"People need to ask, ‘what fundamental change would this information make to the running of this business?’”