The digital era has produced a deluge of data that poses enormous challenges for organizations. British mathematician Clive Humby’s phrase ‘Data is the new oil’ has become one of the most quoted maxims in business. Organizations that learn to capture and analyze data properly will be in a dominant position. But the technological challenge of homogenizing and then analyzing data captured in different places and formats across a firm is often underestimated. This has given rise to a technology trend – stitching data together using data fabrics and data meshes – that companies will need to learn and apply in the coming years. This article focuses on how data fabric and data mesh architectures connect data and build links between all the services and microservices of an enterprise. It is the first feature in our EY Tech Trends series, which looks at the technologies that will define our work and lives in the next few years.
Stitching data together
Since the beginning of the pandemic, an increasing number of people have been spending more time on the internet for work, personal requirements, and pleasure. A natural consequence of this is that an enormous amount of data is being generated constantly. In India alone, according to the Economic Survey of 2021-22, there are more than 830 million internet users. That is more than twice the entire population of the US. India generates more data than any other country in the world, save China.
This data, however, is not being captured in any central application. As customers began accessing more services online, corporations added a plethora of new applications to give easy access to their services and microservices, many of them built on the cloud. The number of applications being built across all industries, particularly financial services, has grown exponentially, leading to a leap in the quantity of consumer data being generated.
Increased generation and availability of data are changing the ways in which data is used. Traditionally, when organizations built their technology systems, there would be one core application and, at best, a few peripheral ones. Today, the core itself may be distributed across 20 different applications. That creates the challenge of stitching together the data that lives in each of them.
The advent of the cloud has added to the complexity. Many of the new capabilities required to run a business or offer specific services to customers are now available as bespoke solutions on the cloud. This means organizations need to knit together dozens of different applications – some in core systems residing on-premise, some in edge computing, and others on the cloud in a data center. Interconnectivity is essential not only for analytical purposes but even for basic operations.
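The stitching challenge described above can be made concrete with a minimal sketch: assembling a single view of one customer from several systems of record. The source names here (core_banking, crm_cloud, edge_payments) and their fields are hypothetical stand-ins for on-premise, cloud, and edge applications; a real data fabric would do this via connectors and a metadata catalog rather than in-process functions.

```python
# Minimal sketch of stitching one customer's data from multiple applications.
# Source names and fields are hypothetical illustrations, not a real schema.

def stitch_customer_view(customer_id, sources):
    """Merge records for one customer from several source systems into a
    single view, namespacing each field by the system it came from."""
    unified = {"customer_id": customer_id}
    for source_name, fetch in sources.items():
        record = fetch(customer_id)
        if record is None:
            continue  # this system holds no data for the customer
        for field, value in record.items():
            # Prefix fields with the source so conflicting names are preserved
            unified[f"{source_name}.{field}"] = value
    return unified

# Stand-in fetchers; in practice these would be API or database calls.
sources = {
    "core_banking":  lambda cid: {"name": "A. Kumar", "balance": 1200},
    "crm_cloud":     lambda cid: {"email": "a.kumar@example.com"},
    "edge_payments": lambda cid: None,  # no record in this system
}

view = stitch_customer_view("C-001", sources)
print(view)
```

Namespacing fields by source is one simple way to keep provenance visible, which is a core concern of both data fabric and data mesh designs: the consumer can always tell which application a value came from.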