The six key components in a modern data supply chain

Narendra Mulani, Senior Managing Director, Accenture Analytics

There were once grand Palladian windows of opportunity for success in business, but these have all but been reduced to tiny slivers. The cause? The sheer velocity of market change brought about by shifts in consumer behaviour and technological advances. To succeed in this new environment, firms need the ability to make wise decisions – fast. Recognising this need, companies are developing, or looking to develop, a modern data supply chain to move, manage and mobilise their data, propelling their opportunity to make real-time, data-driven decisions.

When a data supply chain is at the forefront of a firm’s operations, tangible business outcomes can emerge as information travels seamlessly across different platforms. By liberating their data, companies can uncover the business opportunities it holds and generate actionable insights to outperform competitors.

Selecting the right architectural building blocks for your business

An effective enterprise data supply chain is built specifically with desired business outcomes in mind. To pursue those end goals, businesses should leverage their existing systems and add data acceleration technologies that improve the enterprise’s data movement, processing and interactivity – further accelerating the path to insight. To realise such a data management system, businesses will have to navigate the crowded and sometimes complicated data component marketplace. Their success here will depend on recognising that the collaboration, rather than the isolation, of these instruments is the best way to capitalise on their complementary advantages.

There are six main components to consider when creating an accelerated data supply chain:

The big data platform (BDP). A BDP is a cluster of computers that collectively facilitate data traffic and information processing. These capabilities can be extended by integrating query engine software into this component, creating structured data tables and common standards for queries.

Ingestion. This is all about empowering organisations to capture, store and move large amounts of information as quickly and efficiently as possible. It works by transferring data from the source to a storage facility where a queueing system processes the information so that an end user can pick it up when required.
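The source-to-queue-to-consumer flow described above can be sketched in a few lines. This is a minimal, hypothetical illustration – the names (`ingest`, `consume`) and the use of Python's standard-library queue stand in for whatever enterprise ingestion tooling a firm actually deploys.

```python
import queue
import threading

def ingest(source, q):
    """Transfer records from the data source into the queueing system."""
    for record in source:
        q.put(record)
    q.put(None)  # sentinel: signal that the source is exhausted

def consume(q, sink):
    """Pick up queued records when required and move them into storage."""
    while True:
        record = q.get()
        if record is None:
            break
        sink.append(record)

# Hypothetical source data: three captured events.
source = [{"id": i, "event": "click"} for i in range(3)]
q = queue.Queue()
sink = []

producer = threading.Thread(target=ingest, args=(source, q))
consumer = threading.Thread(target=consume, args=(q, sink))
producer.start(); consumer.start()
producer.join(); consumer.join()
```

The queue decouples capture from consumption: the producer can run at source speed while the end user picks records up at its own pace.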

Complex event processing (CEP). This piece of the architecture focuses on the analysis of data drawn from actions, such as site clicks and video feeds. The value here is rooted in the CEP’s ability to use multiple streams of data simultaneously to draw conclusions, allowing organisations to master real-time analytics.
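A toy sketch of the idea, under stated assumptions: two timestamped event streams (site clicks and video views, both invented here) are merged into one time-ordered stream, and a conclusion is drawn only when a condition spans both streams.

```python
import heapq

# Hypothetical event streams as (timestamp, kind) pairs.
clicks = [(1, "click"), (5, "click"), (9, "click")]
views = [(4, "video_view"), (8, "video_view")]

# Merge the streams into a single time-ordered sequence.
merged = list(heapq.merge(clicks, views))

# Cross-stream rule (illustrative): flag a click that follows a
# video view within 2 time units.
alerts = []
last_view = None
for ts, kind in merged:
    if kind == "video_view":
        last_view = ts
    elif kind == "click" and last_view is not None and ts - last_view <= 2:
        alerts.append(ts)
```

Real CEP engines evaluate such patterns continuously over live streams rather than over finished lists, but the multi-stream correlation shown is the core of the technique.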

In-memory databases (IMDB). IMDBs are databases held in a computer’s main memory, a feature that makes them inherently faster than their disk-based predecessors. They are also simpler and more efficient, as they require fewer instructions to access data and run from a single location.
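The contrast with disk-based storage is easy to demonstrate with SQLite, whose `:memory:` mode keeps the entire database in main memory – a small stand-in for the enterprise-grade IMDBs the article has in mind (the table and figures below are invented for illustration).

```python
import sqlite3

# ":memory:" creates a database that lives entirely in RAM; no file,
# no disk I/O on each access.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 9.5), (2, 20.0)],
)

# Queries run against main memory rather than a storage facility on disk.
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
conn.close()
```

Swapping `":memory:"` for a filename gives the disk-based equivalent, which is the only change needed to compare the two approaches.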

Cache clusters. This component category promotes a simple and quick route to accessing the most needed data. By building up query data over time, a repeated information request can be processed faster as the system can bypass the need to return to the original data source.
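The bypass mechanism described above is essentially memoization. A minimal single-node sketch (a cache cluster distributes the same idea across machines; the `query_source` function here is a hypothetical stand-in for a slow trip to the original data source):

```python
# Counter showing how often the original data source is actually hit.
calls = {"count": 0}

def query_source(key):
    """Stand-in for an expensive request to the original data source."""
    calls["count"] += 1
    return key.upper()

cache = {}

def cached_query(key):
    """Serve from cache when possible; fall back to the source on a miss."""
    if key not in cache:
        cache[key] = query_source(key)  # miss: one trip to the source
    return cache[key]  # hit: the source is bypassed entirely

first = cached_query("sales")
second = cached_query("sales")  # repeated request, served from cache
```

After the two calls, the source has been contacted only once – the repeated request never leaves the cache.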

Appliances. An appliance is a fixed arrangement of hardware, software and support services which, combined, provide an all-in-one data and analytics solution. This system relies on a common database for both online transactions and analytics processes to drive the acceleration, processing, and interactivity of data traffic simultaneously. Businesses that lack the IT expertise to manage their own high-performing database solutions will find this ‘plug-and-play’ concept an excellent route to data acceleration.

Where do we go from here?

Data acceleration is set to become a core component of a company’s modern data supply chain, generating data insights at a quicker pace than competitors can match. Notably, executives should not get comfortable with their established technology architecture. As customer expectations and business objectives are always changing, they should be prepared to continually navigate the landscape of acceleration solutions – the goal is to keep seeking the combination of tools that puts the next opportunity within reach.
