I remember, in the 1980s, being issued with my first desktop PC. It had a hard drive that held a total of 40MB. It was luxury. I could hold all the data I needed about clients, along with documents and spreadsheets. Nowadays we won’t even look at a USB stick unless it can hold gigabytes of data.
Similarly, our clients’ databases of the time, containing several thousand customers and their purchase histories, were the epitome of ‘big data’, requiring at minimum a mid-range computer.
What changed the view of ‘big data’ has been the explosion of digital information, chiefly through the development of the world wide web and the industry’s ability both to meet the storage volume requirements and to reduce the cost. Back in 2007, Information Week stated that “for the first time, the amount of digital information generated will surpass the storage capacity.” Since then, it has been estimated that 90% of all data in use today has been accumulated within just the last two years.
The term ‘big data’ implies volume, and it relates not just to the amount of data being received but also to the interaction between those data elements to derive new values and so create more data. However, consideration of ‘big data’ goes beyond volume alone. As far back as 2001, Gartner, the technology research firm, identified that volume had to be considered along with ‘variety’ and ‘velocity’. We are seeing that today, with variety comprising more and more unstructured data – for example in notes, blogs and social media postings, or photographs, videos and audio with all their attendant metadata (who took it, where, when, size, etc.). Velocity relates to the speed of the flow of information.
Consider these examples of ‘big data’ in relation to your own organisations. eBay receives 100 terabytes of new data every day, YouTube receives 48 hours of uploaded video every minute, and the sensors at CERN’s Large Hadron Collider deliver data 40 million times per second. That is big.
However, the real measure of how big your data is must be how you use it to derive knowledge and insight, identify trends and deliver nuggets of commercial opportunity. What counts as ‘big data’ will depend on your data management proficiency and the capabilities of the technology you have; if it’s too big for you, then it is ‘big data’!
Unless you are really dealing with many terabytes, or even the unimaginably large petabytes, of data, don’t be too concerned about the technology and complexities of ‘big data’ if you can manage the data you have and the data you need satisfactorily. Consider rather how you can best use what you have effectively and efficiently, and ask yourself whether you are collecting all the data you need to meet your objectives.
By understanding how to derive additional value from your data, you will turn it into information; by interpreting that information with your own tacit knowledge of your members, your industry and your markets, you will turn the information into knowledge.
The key is to have a robust data strategy that enables you to define your data sources and to satisfy yourself that you are acquiring the right data from each touchpoint. Put in place processes for evaluating the data’s importance, and therefore the role it plays in achieving your objectives, along with the analytics capability to deliver insight.
Data – whether ‘big’ or not – is only worth collecting if you are going to do something beneficial with it. I remember seeing an old photograph of a US military mess hall with a large sign over the trays of food saying: “Take all you want, but eat all you take”. It’s the same with data.
©Michael Collins 2014