For the last decade, big data has been a buzzword for organisations beginning their move into the digital world. Companies from sectors as diverse as retail and healthcare believed that big data – and AI – could solve their data challenges through in-depth analytics and sheer computing power. But big data isn’t a one-size-fits-all solution, as many organisations have come to realise.

Data remains the foundation for organisations of all sizes and continues to play a key role in predicting behaviour, driving innovation, spotting trends, and cutting costs, among much else. But does data have to be big to be powerful? In this blog, I’ll analyse the rise – and fall – of big data’s hype and answer the question we’re all asking: is big data still relevant?

Big data vs. small data – what’s the difference?

Big data – in case you’ve missed it – is data that is too complex, varied, rich, or high in volume to be easily interpreted by traditional statistical methods (or, crucially, by the human eye), and that requires specialist software and expertise to manage.

In its raw form, big data offers little insight without refinement. But once it is broken down into smaller parts, it can feed useful insights into real-time dashboards. Although this process can be costly and time-consuming, larger organisations will always have big data sets that require analysis, so big data will remain prevalent in those businesses.
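To make that refinement step concrete, here is a minimal sketch of one common approach, assuming a hypothetical raw event log (an events.csv with timestamp, region, and revenue columns) and using pandas to reduce a large data set to a small, dashboard-ready summary:

```python
import pandas as pd

# Hypothetical raw event log: one row per transaction, far too large
# to eyeball directly (assumed columns: timestamp, region, revenue).
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

# Reduce the big data set to a small, dashboard-ready summary:
# hourly revenue per region.
summary = (
    events
    .set_index("timestamp")
    .groupby("region")
    .resample("1h")["revenue"]
    .sum()
    .reset_index()
)

print(summary.head())
```

The small summary table, not the raw event log, is what a real-time dashboard would actually display.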

In comparison, small data is easily interpreted by humans thanks to its volume and format, and analysing it involves more precise, bite-sized metrics. Despite its small volume, small data can be collected, analysed, and acted on relatively quickly, making it useful for supporting rapid business decisions.

Although big data has become more widespread, Forbes predicts that we will see an increased focus on small data as this approach allows a more agile and scalable business model for many companies, alongside opening the door for more refined data models. Small data also works hand-in-hand with AI and TinyML, a subfield of machine learning that uses traditional ML algorithms to help process smaller data sets faster and more accurately on limited hardware, such as a mobile phone.
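As an illustration of the TinyML idea, the sketch below trains a deliberately tiny neural network on a small synthetic data set and converts it with TensorFlow Lite, a common toolchain for running models on constrained hardware. The data, model shape, and file name are illustrative assumptions, not a prescribed recipe:

```python
import numpy as np
import tensorflow as tf

# Small synthetic data set: 500 samples, 10 features, binary label.
X = np.random.rand(500, 10).astype("float32")
y = (X.sum(axis=1) > 5).astype("float32")

# A deliberately tiny model; small data rarely justifies anything bigger.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=10, batch_size=32, verbose=0)

# Convert to TensorFlow Lite with default optimisation so the model
# fits on constrained hardware such as a phone or microcontroller.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("tiny_model.tflite", "wb") as f:
    f.write(tflite_model)
```

A model this size typically converts to just a few kilobytes, which is what makes on-device inference realistic.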

Looking ahead, many businesses – particularly small start-ups – will want to use machine learning, and TinyML offers them the opportunity to do so without major cost implications. This suggests that small data will become a more popular choice for organisations as they look to cut costs and extract maximum value from their data sets.

Why are businesses moving towards small data?

Since the mass disruption of the COVID-19 pandemic, data and analytics leaders have been forced to change the way they look at and think about their data. Now more than ever, leaders need the tools and processes in place to bounce back from unexpected disruption, and small data plays into this new plan. Small data will amplify business leaders’ ability to identify key technology trends and priorities – enabling greater agility.

Additionally, the collection of extremely large data sets over the past decade has meant that many organisations have been unable to make full use of their data, which has proved costly and has wasted resources. As technology continues to evolve, organisations will become less reliant on big data for insights and keener to get the most out of every piece of interpretable small data they hold.

The preference for small data has been echoed by Gartner, which predicts that, by 2025, 70% of organisations will have shifted their focus from big data to small data. The shift towards automation, combined with the use of smaller, more relevant data sets, will have a large impact on business decisions, including the ability to be agile and responsive.

Small data and AI

Organisations are aware that enormous amounts of data are available to them, but they are often unsure how to use it, especially when it is unstructured. This is where AI has become integral, as it can analyse small and large data sets alike, whether the data is structured or unstructured.
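As a rough illustration of that point, the sketch below combines structured columns with an unstructured free-text column in a single scikit-learn pipeline. The customer records and column names are hypothetical:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Hypothetical mix of structured (age, spend) and unstructured (feedback) data.
df = pd.DataFrame({
    "age": [25, 40, 31, 58],
    "spend": [120.0, 80.5, 210.0, 45.0],
    "feedback": ["great service", "slow delivery", "love the app", "won't return"],
    "churned": [0, 1, 0, 1],
})

# One pipeline handles both kinds of data: TF-IDF turns free text into
# numbers, while the structured columns pass through unchanged.
preprocess = ColumnTransformer([
    ("text", TfidfVectorizer(), "feedback"),
    ("numeric", "passthrough", ["age", "spend"]),
])
model = Pipeline([("prep", preprocess), ("clf", LogisticRegression())])
model.fit(df[["age", "spend", "feedback"]], df["churned"])

print(model.predict(df[["age", "spend", "feedback"]]))
```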

The current shift towards small data will also give organisations greater control over the quality of the data they feed into an AI model. Many organisations are accumulating large volumes of data but receiving only generic, non-specific outputs from which they can gain no valuable insight, leaving data wasted. This contrasts with smaller data sets, which can be used to train AI to produce more refined models that are functional and applicable within the business.
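To illustrate, here is a minimal sketch of training on a small, well-curated data set; it uses scikit-learn’s bundled iris data (150 rows) as a stand-in for the kind of curated small data described above:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# A small, clean, well-labelled data set (150 rows) stands in for
# curated small data.
X, y = load_iris(return_X_y=True)

# A simple, interpretable model; with quality data it needs little tuning.
model = LogisticRegression(max_iter=1000)

# Cross-validation gives a quick, honest read on how refined the model is.
scores = cross_val_score(model, X, y, cv=5)
print(f"Mean accuracy: {scores.mean():.2f}")
```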

Small data techniques are constantly advancing, bringing increased accuracy, efficiency, and transparency – all of which save time and yield high-quality data. Because of this, small and wide data could sit at the forefront of data strategies for many years to come. I believe it is imperative that organisations continue to have access to big, small, and wide data sets; however, the pandemic has shown that organisations need to adapt, and small data fits well into a flexible model.

For more insights into big and small data, read Lorien’s Data Domination whitepaper here. Alternatively, view our latest data roles, or if you have any queries on the blog, please contact me at nathan.ballin@lorienglobal.com.