Data, Jul 02, 2019

How to design and implement big data programmes

Mark Atterbury

The pressure is on for many organisations to be ‘seen to be using big data’. 

Designing and implementing big data programmes is a unique challenge, and one worth taking on: "data is crucial to the way government delivers services for citizens, improves its own systems and processes, and makes decisions" (NAO, 2019).


It is not about technology  

The difference now is that the challenge has evolved beyond gripes about technology and integration (hands up if you've built a successful Hadoop data 'lake') to encompass people and ethics as well.

We are told in the media that business decisions are increasingly data-driven, but our ability to find ever-smaller needles in ever-bigger haystacks is still limited by employees' skills as analysts in leveraging information, and by their ability to navigate the growing demands of responsible data handling and access.

Companies have realised that metadata, lineage and data governance help address this; they have also realised that, despite the promises of AI and data science, making effective and appropriate use of big data still demands a great deal of hard work and cost. Speed of interpretation and inference are everything, and the job of analysts and business decision makers in delivering insight from big data solutions is only getting harder.

There is a need to grow in-house core data capabilities 

What does this mean for the enterprise? At Credera, we are seeing a growing trend of clients wanting to take back ownership of analytics, data capabilities and data stores from outsourced suppliers and systems integrators. The aim is to bring their staff closer to information, understand its value, and improve their capability in analytical decision making.

Coupled with the convergence of data standards, this makes integration easier, encourages the adoption of open source technologies, and gives organisations greater self-sufficiency. Many, however, still struggle to understand how to achieve it.

Empowering your people to try and test ideas 

With the maturing of cloud, ephemeral platforms, automation, agile and DevOps practice, we believe the industry is at a tipping point. Learning to herd your data critters, and accepting that it is pointless (and impossible) to tame the data beast, is key. The days of running a costly big data and analytics estate are gone.

To succeed with data, businesses must empower their staff to break down the data sources and the problem, and to leverage technology that no longer needs to single data out as 'big'.

This means changing skills, knowledge and ways of working. It also means investing in a data and analytics practice that doesn't just straddle business and technology, but blurs the line between them.

Orienting around blended data and analytics expertise, with multi-disciplinary teams responsible not just for delivering value but for qualifying it, maintains delivery standards, contributes back to communities, and allows knowledge to be shepherded.

Unicorns don't exist, but by enabling staff to test ideas and leverage solutions, they can grow sustainably themselves - and you can produce plenty of horses.

Credera has significant experience in delivering successful big data projects within the public and private sectors. Read one of our case studies below:

Releasing real-time data for the government

