
data has a story to tell...but it gets lost in translation

BIG DATA

In the past decade, companies have worked hard to bring their data together into data lakes and data warehouses. They now have plenty of data, but few insights. The challenge is this: the data is brought together from multiple sources, in the format in which it was produced, for the purposes those source technologies were designed to serve. That data was never intended to serve the purpose of managing an entire company.

In larger companies with a history of M&A, this challenge multiplies. We often talk to companies using 5-6 ERPs and more than one CRM. They are also likely to have a group structure and sales teams in multiple subsidiaries. Such complexity is very difficult to manage while preserving situational awareness, and cross-selling opportunities are often missed.

The structure and format of the data are inherited from the source technologies, which makes the data difficult to structure and combine. To create insights, situational awareness of the company's position, or predictive analytics, you need to transform the data: connect, clean, enrich, and structure it to serve that purpose. This is why 90-95% of agentic AI initiatives fail to scale and reach production. According to research, 100% of failed cases were at least partly failing because of data-related challenges.

This article dives deeper into the infrastructure-level requirements and best practices for solving these challenges. The story is based on the lessons we have learned from building the 180ops Revenue Intelligence Platform.

 

THE FUNDAMENTALS FOR DATA-DRIVEN MANAGEMENT

To enable data-driven management, predictive analytics, and agentic AI, you need to follow this path:

1. Clear vision

What is the technology expected to deliver? How does that turn into value? What kind of answers do you want to get from your data? You need to be very clear about your expectations, because there are hundreds of decisions to be made, and your definition of done will dictate them. You should go all the way to defining Data as a Product (DaaP): the end products you expect from the technology. A data product is a reusable, active, and standardized data asset designed to deliver measurable value to its users. These end products need to be traceable, trackable, and evidence-based, and the production of each one is likely to have specific technical requirements.

To read your data and understand the context and the story it tells, you need to know the customer journey the story is about.
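To make "traceable, trackable and evidence-based" concrete, here is a minimal sketch of what a data product definition could look like in code. Everything in it (the DataProduct class, its fields, the wallet_size_analysis example) is an illustrative assumption, not the 180ops schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical DaaP definition: field names are illustrative only.
@dataclass
class DataProduct:
    name: str                    # e.g. "wallet_size_analysis"
    owner: str                   # accountable team or person
    consumers: list[str]         # who the product delivers value to
    sources: list[str]           # upstream systems feeding the product
    refresh_schedule: str        # e.g. "daily"
    evidence: list[str] = field(default_factory=list)  # lineage references

    def record_evidence(self, ref: str) -> None:
        """Keep the product traceable: every output links back to its inputs."""
        self.evidence.append(f"{datetime.now(timezone.utc).isoformat()} {ref}")

wallet_size = DataProduct(
    name="wallet_size_analysis",
    owner="revenue-analytics",
    consumers=["sales", "management"],
    sources=["erp_eu", "erp_us", "crm"],
    refresh_schedule="daily",
)
wallet_size.record_evidence("bronze/erp_eu/invoices/2025-01-31")
```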

2. Data recipe

Your vision will dictate the prioritization of data sources and types, and the requirements for structuring the data. Keep in mind that your offerings can relate to customer relationships in very different ways: they may have very different ideal customer profiles (ICPs), and their roles may differ significantly, both relative to each other and in overall market positioning. To create a clear, functioning data recipe capable of delivering value for your use cases, you need every bit of business and market understanding you have. Creating a data recipe is deeply rooted in business and market understanding; technology is an enabler, but the knowledge comes from you.
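One way to make a data recipe tangible is as a declarative configuration that ranks sources and states structuring requirements per use case. The sketch below assumes a simple dict-based recipe; the source names, priorities, and keys are hypothetical examples, not a real 180ops recipe.

```python
# Hypothetical data recipe: source priorities and structuring requirements
# for one use case. All names and values are illustrative.
DATA_RECIPE = {
    "use_case": "cross_sell_whitespace",
    "icp_segments": ["industrial_oem", "distributors"],
    "sources": [  # ordered by priority for this use case
        {"name": "erp_eu", "priority": 1, "fields": ["invoice_lines", "sku"]},
        {"name": "crm", "priority": 2, "fields": ["accounts", "opportunities"]},
        {"name": "firmographics", "priority": 3, "fields": ["industry", "size"]},
    ],
    "structuring": {
        "grain": "account_month",       # one row per account per month
        "taxonomy": "jobs_to_be_done",  # see the Processes step below
    },
}

def prioritized_sources(recipe: dict) -> list[str]:
    """Return source names in the order the recipe says to ingest them."""
    return [s["name"] for s in sorted(recipe["sources"], key=lambda s: s["priority"])]

print(prioritized_sources(DATA_RECIPE))  # ['erp_eu', 'crm', 'firmographics']
```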

3. Processes

The only way to automate anything is to have a clear process for every end product (DaaP). These processes are best described with the medallion model (a minimal code sketch follows this list):

  • Bronze level: The data in as-is format. The starting point for data transformations and structuring. Vision and purpose dictate what data is required to produce the expected data products.
  • Silver level: Cleaning, structuring, connecting, enriching, transforming, and producing new data through production processes. Each step is a process in its own right. Each production process is unique and has specific technical requirements; for example, the use of neural networks serves a purpose in some cases, but in others the DaaP requirements cannot be fulfilled with them.

    One of the key tasks is defining a taxonomy for your offerings. You may have 150,000 SKUs in your offering, but that is not manageable or valuable for understanding customer behaviour or for operational management. We use "jobs-to-be-done" as the key for grouping offerings into a taxonomy. This level allows us to understand the customer's jobs, which of those jobs we are serving, and to what extent. It also allows us to recognize white spots: cross-selling opportunities.

    The choice of technology is very important: neural networks deliver outcomes, but no explanation for them. As a manager, understanding why something happens is even more important than the outcome itself. The why is the set of risks and opportunities you need to influence and manage; getting the outcome without knowing why doesn't really help you. You also need to recognize that producing data means tens or hundreds of processes running simultaneously, and the technology used for each process needs to serve that specific purpose. That is why defining the data products is also the foundation for selecting the technical solution.

    Agents and LLMs are great at leveraging already-created data, but they are not the best option for producing fundamental data products. AI models also drift and can deliver different outcomes from one day to the next. As a fundamental management foundation, these technologies were never designed to serve that purpose.
  • Gold level: The end products ready for delivery and distribution. This is where human understanding steps in very strongly. Getting the right answer is just the first step; a lot more is needed to make an impact. You need to deliver:
    • The right answer
    • To the right person
    • At the right time
    • In an understandable format
    • Via technologies that are used to make decisions
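To ground the medallion levels, here is a minimal, self-contained sketch of a bronze-to-silver-to-gold flow, including a jobs-to-be-done taxonomy lookup in the silver step. Everything here (the SKU map, field names, the toy aggregation) is an illustrative assumption, not 180ops production code.

```python
# Minimal medallion sketch: bronze (as-is) -> silver (clean + taxonomy)
# -> gold (a deliverable answer). All names and data are illustrative.
JOBS_TO_BE_DONE = {  # hypothetical taxonomy: SKU -> customer job
    "SKU-1001": "keep machines running",
    "SKU-2042": "keep machines running",
    "SKU-3310": "reduce energy costs",
}

bronze = [  # raw invoice lines, exactly as produced by a source ERP
    {"account": "ACME", "sku": "SKU-1001", "amount": 1200.0},
    {"account": "ACME", "sku": "sku-3310 ", "amount": 300.0},  # dirty key
]

def to_silver(rows: list[dict]) -> list[dict]:
    """Clean keys and enrich each line with its jobs-to-be-done group."""
    out = []
    for r in rows:
        sku = r["sku"].strip().upper()
        out.append({**r, "sku": sku, "job": JOBS_TO_BE_DONE.get(sku, "unmapped")})
    return out

def to_gold(rows: list[dict]) -> dict:
    """Aggregate revenue per account and job: a shape a salesperson can act on."""
    gold: dict = {}
    for r in rows:
        key = (r["account"], r["job"])
        gold[key] = gold.get(key, 0.0) + r["amount"]
    return gold

print(to_gold(to_silver(bronze)))
# {('ACME', 'keep machines running'): 1200.0, ('ACME', 'reduce energy costs'): 300.0}
```

Note that every step here is rule-based and explainable: each gold number can be traced back to specific bronze rows, which is the property the Silver-level discussion above argues a black-box model cannot give you.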

4. Architecture

The first three steps give you the functional specifications and requirements for the architecture. You need to determine how the solution fits into your existing architecture, and how you handle governance, data and cyber security, updates, access control and logs, and multiple other considerations (article about this here). At this point, however, you have an idea of what you are actually creating and what is required to build it.

Architecture is very important, but the magic happens in the processes. Even for a single data product, e.g. in our case the wallet size analysis (the offering-level potential for an offering), the environments and foundations for producing that single outcome vary tremendously. That is why you may need a dozen processes to produce that single data product.
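As an illustration of "a dozen processes behind one data product", a simple registry could map each data product to the ordered, environment-specific processes that produce it. The registry and process names below are hypothetical.

```python
# Hypothetical registry: one data product, many environment-specific processes.
PROCESSES = {
    "wallet_size_analysis": [
        "ingest_erp_eu", "ingest_erp_us",   # different ERPs need different loaders
        "normalize_currencies", "dedupe_accounts",
        "map_jobs_to_be_done", "estimate_offering_potential",
        "publish_gold",
    ],
}

def run(product: str, registry: dict) -> None:
    """Run each process for a product in order; each step is its own unit."""
    for step in registry[product]:
        print(f"running {step} for {product}")  # placeholder for the real process

run("wallet_size_analysis", PROCESSES)
```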

You need to pay special attention to APIs and data distribution, meaning how you make the data understandable. This final step is where you get lost in translation with the data. Data visualization and the tools to filter and analyze it are the fundamental final steps capable of telling the story and delivering the insights. An MCP API is another way of using agents to deliver specific answers and guidance; however, agents require the fundamentals to be in place before they can deliver truly valuable answers.

For scalability, you don't want yet another service your salespeople need to log into to get answers. You need to deliver the answers into the technologies they already use for account management. This means an MCP API for Copilot, Agentforce, and other agentic tools, add-ons to CRM technologies, replication of data to master data management systems, reporting-tool development, and so on.
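To show roughly what exposing a gold-level answer over MCP could look like, here is a sketch assuming the official MCP Python SDK (the mcp package) and its FastMCP helper; the server name, the wallet_share tool, and the in-memory GOLD store are hypothetical, not the 180ops implementation.

```python
# Sketch: exposing a gold-level data product through MCP, so tools like
# Copilot or Agentforce can query it. Assumes `pip install mcp`; the tool
# and data below are illustrative only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("revenue-intelligence")

GOLD = {  # stand-in for the real gold-layer store
    ("ACME", "keep machines running"): 1200.0,
    ("ACME", "reduce energy costs"): 300.0,
}

@mcp.tool()
def wallet_share(account: str) -> dict:
    """Return revenue per customer job for one account, from the gold layer."""
    return {job: amount for (acct, job), amount in GOLD.items() if acct == account}

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for an MCP-capable client
```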

 

THE 180ops JOURNEY, SO FAR

Currently, we are deploying our third architecture generation. We have tried, failed, and learned with ML tools, neural networks, algorithms, LLMs and agents, and advanced mathematics, and we are now running the third version of our UI. It has taken us 3.5 years and countless hours to get to this point, even though every member of the team has 20-plus years of experience and we have known what we intended to build from day one.

We've also learned that the story data tells is different for every company. The stories differ between offerings, and the choices companies need to make differ as well. Our role is to uncover that story and deliver guidance for management, sales, marcom, and customer success.

Global analyst firms are telling us that we have created something unique and valuable, something no one else has done before. We will be featured in some magic quadrants in the spring of 2026.

We have created powerful tools and knowledge to serve you and deliver value in a month, at a fixed cost and with low risk. You can now take advantage of our learning and bypass these challenges by contacting us. Collaboration with us is not about building; it is about mass customization and configuration. The foundations are already available. We can deliver results within one month of data delivery or access to data. Let's work together :)

Note: 180ops is a transactable offering in Microsoft Azure and a Microsoft partner. Our profile in Azure is available here.
