Data is at the heart of business. Without data, a company cannot fully understand what is affecting it, nor make sound strategic decisions. For this reason, ever larger investments are allocated to the acquisition, collection, and management of data.
This push has driven the rise and establishment of the Big Data paradigm: models for collecting and analyzing data whose volume, velocity, and variety cannot be handled with traditional approaches. Such data can come from many different sources, from phones to factory detectors to smart-city sensors.
With Big Data, we have discovered the value of having all-round knowledge of users, business processes, and internal and external situations; knowledge that becomes richer and more exhaustive as the volume of available data grows. Or at least on paper…
Over the past decade, companies have set up complex Big Data management models and tools, looking for ways to “capture” as much information as possible. The cost of collecting, processing, aggregating, and analyzing all this data has grown exponentially, along with the data itself; the value, unfortunately, has not always grown at the same pace. Why?
Fast Data: the answer to new business needs
Alongside informational completeness, a different need has emerged for companies, one that is complementary yet even more pressing: the speed of information. The market is moving faster than ever, and today’s main challenges are keeping up with its pace, responding promptly to change, and continuously offering new products and services to users, 24/7.
It is therefore more useful to have the essential information immediately, at the moment of need, than to have a complete picture of all the data later on. For this reason, a new paradigm is spreading with increasing relevance: Fast Data. Fast Data can be seen as a subset of Big Data: the set of essential data that must always be available and up to date.
Unlike the traditional approach, which relies on batch collection and processing, this data is collected, processed, and made usable instantly. It thus enables well-informed, timely decision-making and triggers automated processes in real time.
This choice inevitably involves trade-offs: the Big Data paradigm favors informational completeness, to be processed retrospectively through historical analysis, while the Fast Data paradigm focuses on the rapid availability of essential information, deferring the analysis of contextual data to a later time.
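The trade-off between the two paradigms can be sketched in a few lines of Python. This is a toy illustration, not a real pipeline: the event structure and the alert threshold are invented for the example.

```python
# Toy contrast between batch (Big Data) and streaming (Fast Data) processing.
# The event format and the threshold are illustrative assumptions.

events = [{"sensor": "s1", "value": v} for v in (10, 55, 30, 80, 20)]

# Big Data style: collect everything first, analyze retrospectively.
def batch_analysis(all_events):
    values = [e["value"] for e in all_events]
    return {"count": len(values), "mean": sum(values) / len(values)}

# Fast Data style: act on each event the moment it arrives.
def stream_analysis(event_stream, alert_threshold=50):
    alerts = []
    for e in event_stream:
        if e["value"] > alert_threshold:  # immediate, per-event decision
            alerts.append(e["sensor"])
    return alerts

print(batch_analysis(events))   # full picture, but only after the fact
print(stream_analysis(events))  # essential signal, available per event
```

The batch function gives the complete historical view; the streaming function gives up completeness in exchange for an answer at the moment each event occurs.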
Fast Data, a game changer for companies
Choosing to adopt a Fast Data based system can bring numerous advantages in a competitive market like today’s.
From a business point of view, implementing Fast Data solutions enables so-called data-driven decisions, or real-time decisions, based on up-to-date data available 24/7.
We can see an excellent practical application, for instance, in the fresh food supply chain, which deals with perishable products for which timing is essential. With the traditional method, the procurement of production resources, their transformation into finished products, and distribution to the market are carried out according to a predictive strategy.
For example, the analysis of historical data yields hypotheses on the supermarkets’ demand for a product at different times of the year, which are used to plan warehouse stock levels, shipments, and orders.
With a Fast Data architecture, instead, the company has real-time data: the supply chain manager knows, at any time, how much product is stored in each warehouse and where each item is in the production chain. The company can then decide according to current needs, for example where to ship a container that has just arrived at the harbor, optimizing its choices and reducing waste and costs.
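A decision like routing that container can be sketched as a lookup against the real-time stock view. The warehouse names, stock levels, and routing rule below are all invented for illustration.

```python
# Illustrative sketch: routing an incoming container to the warehouse
# that currently has the most free capacity. All names and numbers
# are invented; a real system would read these from the Fast Data layer.

# Real-time stock view, kept up to date by the streaming pipeline.
stock = {"warehouse_north": 120, "warehouse_south": 35, "warehouse_east": 80}
capacity = {"warehouse_north": 200, "warehouse_south": 200, "warehouse_east": 200}

def route_container(stock, capacity):
    """Pick the warehouse with the largest free capacity right now."""
    return max(capacity, key=lambda w: capacity[w] - stock[w])

print(route_container(stock, capacity))  # -> warehouse_south
```

With a nightly batch view, the same decision would be made against yesterday's numbers; here it reflects the stock as it stands when the container arrives.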
How to improve the user experience with Fast Data
Fast Data technologies also bring many benefits to the end user: these benefits can be traced back to the realization of a smoother, faster, and more complete user experience. For instance, think about using a home banking app or a phone billing and account management app.
In applications based on traditional systems, “business critical” information – such as the subscription of a policy or a change of phone billing plan – is only updated periodically (usually in nightly update windows), and users must wait hours or even whole days to see their information refreshed. Moreover, many actions are completely disabled outside the systems’ operating hours.
An application based on Fast Data, on the other hand, makes business-essential information available in real time, 24 hours a day: users can act at any moment and always see up-to-date information. This is the “always on”, omnichannel user experience that today’s users expect from a modern company.
How to create a Fast Data architecture
So how can you introduce Fast Data technologies into your company? Is it necessary to retire the old management systems and build a more modern application suite from scratch? Fortunately not: the cost and complexity of such an operation would be enough to discourage any IT manager.
One of the advantages of Fast Data solutions lies in event-driven architectures: built on data-streaming platforms such as Apache Kafka or Google Pub/Sub, they work independently of the underlying IT systems while acting as an integration layer.
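The event-driven pattern can be illustrated without a real broker. The minimal in-memory publish/subscribe sketch below stands in for the role Kafka or Pub/Sub would play; topic names and event payloads are invented.

```python
from collections import defaultdict

# Minimal in-memory publish/subscribe broker, standing in for a real
# streaming platform such as Kafka or Pub/Sub (illustration only).
class MiniBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Each event is pushed to every consumer as soon as it is
        # produced: consumers never query the source systems directly.
        for handler in self.subscribers[topic]:
            handler(event)

broker = MiniBroker()
received = []
broker.subscribe("orders", received.append)
broker.publish("orders", {"order_id": 1, "status": "shipped"})
print(received)  # the consumer sees the event immediately
```

A real platform adds persistence, partitioning, and delivery guarantees on top of this basic decoupling between producers and consumers.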
The Fast Data message broker captures the application data, with all its modifications, transmits it, and saves it in its own database. The information thus “liberated” from the management software can be instantly processed and exposed through a simple API to the “consumer” apps, according to business needs.
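That pattern, applying captured change events to a local read-optimized store and serving it through a thin API, can be sketched as follows. The event shape and field names are assumptions made for the example.

```python
# Sketch of a Fast Data read model: change events captured from the
# management software are applied to a local store, which a simple
# API function then serves. Field names are illustrative assumptions.

customer_view = {}  # the integration layer's own database (here, a dict)

def apply_change_event(event):
    """Merge a change captured from the source system into the local view."""
    record = customer_view.setdefault(event["customer_id"], {})
    record.update(event["changed_fields"])

def get_customer(customer_id):
    """The 'simple API' consumer apps call: no round trip to the mainframe."""
    return customer_view.get(customer_id)

apply_change_event({"customer_id": "c42", "changed_fields": {"plan": "basic"}})
apply_change_event({"customer_id": "c42", "changed_fields": {"plan": "premium"}})
print(get_customer("c42"))  # always reflects the latest captured change
```

Because reads hit the local view rather than the source systems, the channels stay responsive and available even when the underlying management software is offline.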
The result is what the research company Gartner calls a “Digital Integration Hub”: an architecture capable of decoupling the channels from the IT systems, greatly improving application performance and reducing the cost of calls to the mainframes.