Accelerating Insights to Action – Part I

Achieving faster insights can only happen if delays and bottlenecks that exist throughout data life cycles are addressed using better practices and modern technologies.


In prior blog posts I’ve mentioned the speed of data and information movement and the speed of information in decision making. While these are two entirely different things, they are very often interrelated. You will also hear me say that today every company is a data company, and that companies derive value and success from the key insights held within the mountains of data they own.

In this next series of blog posts I discuss the value of fact-based, data-driven decision making and how speed to insight is a key competitive advantage. Speed is often the difference in becoming a category leader or, in some cases, in being profitable.

If a company can analyze data to discover a trend in customer preferences ahead of its competitors, it can gain an edge, potentially delivering higher market share, customer loyalty, and profitability. On the cost-saving side, if a company can use predictive models to detect a fraud scheme or an attack on its service before it has a chance to do significant damage, it can save costs and avoid public embarrassment.

Achieving faster insights can only happen if delays and bottlenecks that exist throughout data life cycles are addressed using better practices and modern technologies. 

Preparing for speed 

Preparation happens on both the organizational front and the technology front, and it starts with setting clear project objectives.

Organizations that regard themselves as very successful are able to identify objectives, quantify them, and measure the value they deliver. As projects move toward deliverables, organizations are implementing agile, DataOps, and other methods to help them better organize projects and move faster to create value.

Technology advances are also key. Companies must reduce latency in data preparation, transformation, and the development of data pipelines. This is one place where both types of speed matter. For companies using or sharing large data sets, it is important to be able to move those data sets quickly across distances so the data is available closer to the user. In addition, organizations need to learn how to use data catalogs, metadata repositories, and data virtualization more effectively, including for governance. With expense and scalability as two of the main issues facing organizations today, it is not surprising that cloud-based data transport, data management, integration, transformation, and development are popular. When seeking such solutions, both the technology and the business/pricing model must be scrutinized.
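
To make the latency point concrete, here is a minimal, vendor-neutral sketch of one common tactic: pushing column selection and filtering into the read step of a pipeline so that less data has to move in the first place. It assumes a Python environment with pandas and pyarrow, and the bucket path and column names are hypothetical.

```python
# Minimal sketch (hypothetical bucket and columns): prune columns and filter at read
# time so less data crosses the network before transformation begins.
# Assumes pandas + pyarrow are installed; reading from s3:// also requires s3fs.
import pandas as pd

SOURCE = "s3://example-bucket/orders/2024/"  # hypothetical cloud location

def load_recent_orders(source: str = SOURCE) -> pd.DataFrame:
    # Read only the columns downstream steps need, and filter during the read
    # instead of pulling the full data set and trimming it afterward.
    df = pd.read_parquet(
        source,
        columns=["order_id", "customer_id", "amount", "order_ts"],
        filters=[("order_ts", ">=", pd.Timestamp("2024-01-01"))],
    )
    # Light transformation close to the source keeps later pipeline stages fast.
    df["amount"] = df["amount"].astype("float64")
    return df
```

The same idea applies whatever the engine you use: the less data a pipeline has to ship and reshape, the lower the end-to-end latency.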

Moving to the cloud solves some problems but surfaces and highlights other issues, such as governance and finding the right balance between centralization and self-service environments. 

Data itself is getting faster as organizations begin to analyze new sources including streaming data coming from sensors, websites, mobile devices, geolocations, and more. Some organizations use streaming, real-time analytics and AI to automate decisions and deliver actionable recommendations to users. Organizations should focus on well-defined objectives and devote attention to their big-picture strategy to avoid letting complexity slow innovation. This is where it is important to think big, start small, and scale up.
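
As a rough illustration of that stream-and-score pattern, the sketch below consumes a stand-in event stream and triggers an automated action when a simple score crosses a threshold. The event source, scoring rule, and threshold are all placeholders; a production system would read from a streaming platform such as Kafka or Kinesis and score events with a trained model.

```python
# Illustrative sketch only: events arrive continuously and a scoring function flags
# the ones that warrant an automated action or a recommendation to a user.
import random
import time
from typing import Iterator

def event_stream() -> Iterator[dict]:
    # Stand-in for a live feed of sensor, clickstream, or transaction events.
    while True:
        yield {"device_id": random.randint(1, 5), "reading": random.gauss(50, 10)}
        time.sleep(0.1)

def score(event: dict) -> float:
    # Placeholder model: distance from an expected baseline of 50.
    return abs(event["reading"] - 50) / 50

def act_on(event: dict) -> None:
    print(f"ALERT: device {event['device_id']} reading {event['reading']:.1f}")

if __name__ == "__main__":
    for event in event_stream():
        if score(event) > 0.4:   # hypothetical decision threshold
            act_on(event)        # automated action or actionable recommendation
```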

Speed to Insight Is More Than Just Being “Fast”

Nearly everyone agrees that if business executives, managers, and frontline personnel did not have to wait for the most current insights, or were not forced to use yesterday’s or last month’s information when they really need the very latest, they could make more timely decisions, seize fleeting opportunities, and serve customers and partners more effectively. It is therefore important for your organization to invest in solutions, cloud services, and practices that enable faster analytics and data consumption as well as faster data integration, preparation, transformation, and processing. The problem is that faster is not better if the data and information are not accurate, complete, or fit for the purpose.

For certain use cases such as data science, exploratory analytics, and AI, the faster that users and AI programs can get access to raw, live data that’s just been recorded or streamed in real time, the better. That said, for many users and applications, data quality, accuracy, completeness, and relevance are more important than just pure speed. Users want to know if they can trust the data; they often need to know where it came from, how it relates to other data, and how it has been transformed and aggregated. Before the data can flow, organizations also need to ensure that it is secure and governed in accord with regulations. I’ll discuss data cleanliness in a separate post.
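
As a small illustration of putting trust ahead of raw speed, the sketch below runs basic completeness, validity, and uniqueness checks before a data set is released to consumers. The column names and tolerances are hypothetical and would come from your own data contracts or governance rules.

```python
# Minimal "trust before speed" sketch: simple quality gates run before data flows on.
# Columns and thresholds are hypothetical examples.
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    issues = []
    # Completeness: required fields must be populated.
    for col in ("order_id", "customer_id", "amount"):
        missing = df[col].isna().mean()
        if missing > 0.01:                      # hypothetical tolerance of 1%
            issues.append(f"{col}: {missing:.1%} missing")
    # Validity: amounts should be non-negative.
    if (df["amount"] < 0).any():
        issues.append("amount: negative values present")
    # Uniqueness: order_id should identify exactly one row.
    if df["order_id"].duplicated().any():
        issues.append("order_id: duplicates present")
    return issues

sample = pd.DataFrame(
    {"order_id": [1, 2, 2], "customer_id": [10, None, 12], "amount": [5.0, -1.0, 3.0]}
)
print(validate(sample))  # lists the completeness, validity, and uniqueness problems found
```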

Reducing time to insight depends on applying technologies and appropriate practices that improve matters at each phase of the life cycle. Organizations must take into account the type of user and their context; terms such as “fast,” “right time,” and “real time” can have different meanings depending on these factors. For some, simply getting the data or insights at the time they need them (the right time) is best; for others, reducing latency to the smallest possible interval between the data’s creation and its availability for analysis and visualization is best.

As technologies advance, including through opportunities created by cloud computing, traditional ways of working with data need to be reconsidered. 

In the next installment in this series I will explore the organizational and company culture issues that factor into speed to insight.

See it in action.

RStor’s fast access and lightning-quick geo-dispersed storage distribution provide faster workflows and disaster recovery at some of the lowest price points in the industry.
