Data everywhere, insight nowhere: What's wrong with your dashboards, and why you need a data product manager





Over the past decade, companies have spent billions of dollars on data infrastructure: petabyte-scale warehouses, real-time pipelines, machine learning (ML) platforms.

And yet, ask your operations lead why churn spiked last week, and you will probably get three conflicting dashboards. Ask finance to reconcile success metrics across support systems, and you will hear, “It depends on who you ask.”

In a world drowning in dashboards, one truth keeps surfacing: the data is not the problem; the missing piece is product thinking.

The quiet collapse of “data as a service”

For years, data teams operated like internal consultancies: reactive, ticket-based and overloaded. The “data as a service” (DaaS) model worked well when data requests were small and the stakes were low. But as companies became “data-driven,” the model buckled under the weight of its own success.

Take Airbnb. Before it launched its internal metrics platform, teams across finance, ops and elsewhere each pulled their own versions of metrics such as:

  • Nights booked
  • Active users
  • Available listings

Even simple key performance indicators (KPIs) varied depending on filters, sources and who was asking. In leadership reviews, different teams presented different numbers, which led to arguments over which metric was “right” instead of decisions about what to do.

This was not a technology failure. It was a product failure.

The consequences

  • Data distrust: Analysts second-guess themselves. Dashboards get abandoned.
  • Human routers: Data scientists spend more time explaining discrepancies than generating insights.
  • Redundant pipelines: Engineers rebuild similar datasets across teams.
  • Decision drag: Leaders delay or skip action because of inconsistent inputs.

Why data trust is a product problem, not a technical problem

Most data leaders think they have a data quality problem. Look closer, and you will find a data trust problem:

  • Your experimentation platform says a feature hurt retention, but product leaders don’t believe it.
  • Ops sees a dashboard that contradicts their lived experience.
  • Two teams use the same metric name but different logic.

The pipelines run. The SQL is sound. But no one trusts the outputs.

That is a product failure, not an engineering one, because the systems were never designed for usability, interpretability or decision-making.
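To make the “same name, different logic” failure concrete, here is a minimal, hypothetical sketch; the table, columns and team names are invented, and both queries are technically valid. Nothing errors out, yet the two “active users” numbers will rarely match.

```python
# Hypothetical example: two teams define "active users" against the same
# events table. Both queries run cleanly, so no pipeline alert ever fires,
# but the numbers disagree, and so do the dashboards built on top of them.

ACTIVE_USERS_GROWTH = """
SELECT COUNT(DISTINCT user_id)
FROM events
WHERE event_type = 'session_start'                  -- "active" = opened the app
  AND event_ts >= CURRENT_DATE - INTERVAL '30' DAY
"""

ACTIVE_USERS_FINANCE = """
SELECT COUNT(DISTINCT user_id)
FROM events
WHERE event_type = 'purchase'                       -- "active" = paid us money
  AND event_ts >= CURRENT_DATE - INTERVAL '28' DAY  -- and a different window
"""

if __name__ == "__main__":
    # Same metric name, two incompatible definitions: a trust problem,
    # not a data quality problem.
    print("growth definition:", ACTIVE_USERS_GROWTH)
    print("finance definition:", ACTIVE_USERS_FINANCE)
```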

Enter: the data product manager

A new role has emerged across major companies: the data product manager (DPM). Unlike general PMs, DPMs operate across fragile, invisible terrain. Their job is not to ship dashboards. It is to ensure the right people have the right insight at the right time to make a decision.

But DPMs don’t stop at piping data into dashboards or formatting tables. The best of them go further: they ask, “Does this actually help someone do their job better?” They define success not in terms of outputs, but outcomes. Not “Did it ship?” but “Did it materially improve someone’s workflow or the quality of their decisions?”

In practice, this means:

  • Don’t just define users; observe them. Ask how they think the product works. Sit next to them. Your job is not to ship a dataset; it is to make your customer more effective. That requires a deep understanding of how the product fits into the real context of their work.
  • Identify canonical metrics and treat them like APIs (versioned, documented, governed), and make sure they are tied to downstream decisions such as $10 million budget calls or go/no-go product launches. (A minimal sketch of what such a metric contract might look like follows this list.)
  • Build internal interfaces, such as feature stores and clean-room APIs, not as infrastructure but as real products, with contracts, SLAs, users and feedback loops.
  • Say no to projects that feel like progress but don’t matter. A data pipeline that no team uses is technical debt, not progress.
  • Design for durability. Most data products fail not because of bad modeling, but because of fragile systems: undocumented logic, brittle pipelines and shadow ownership. Build on the assumption that your future self, or your replacement, will thank you.
  • Solve horizontally. Unlike domain PMs, DPMs must constantly work across team boundaries. One team’s lifetime value (LTV) logic is another team’s budget input. A simple metric update can have second-order consequences across marketing, finance and operations. Navigating that complexity is the job.
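As a sketch of the “metrics as APIs” idea above, here is one hypothetical shape a versioned metric contract could take. The field names, owner, consumers and SQL are illustrative assumptions, not any specific platform’s schema; a real implementation (a semantic layer, an internal metric registry) would carry far more detail.

```python
# Hypothetical sketch of a "metric as a product" contract. Every name here
# (fields, owner, consumers, SQL) is illustrative, not a real platform's API.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class MetricContract:
    name: str                          # canonical metric name
    version: str                       # bumped on any change to the logic
    owner: str                         # accountable team, not an individual
    definition_sql: str                # single source of truth for the logic
    freshness_sla_hours: int           # how stale the metric is allowed to get
    downstream_decisions: list[str] = field(default_factory=list)
    changelog: list[str] = field(default_factory=list)


# Example registration: the metric is documented, versioned and explicitly
# tied to the decisions that depend on it.
nights_booked = MetricContract(
    name="nights_booked",
    version="2.1.0",
    owner="core-metrics",
    definition_sql="SELECT SUM(nights) FROM bookings WHERE status = 'confirmed'",
    freshness_sla_hours=24,
    downstream_decisions=["quarterly budget review", "go/no-go launch review"],
    changelog=["2.1.0: exclude cancelled-then-rebooked stays to avoid double counting"],
)

if __name__ == "__main__":
    print(f"{nights_booked.name} v{nights_booked.version}, owned by {nights_booked.owner}")
```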

Inside companies, DPMs are redefining how internal data systems get built, governed and adopted. They are not there to clean the data. They are there to make organizations trust it again.

Why did this take so long?

For years, we mistook activity for progress. Data engineers built pipelines. Data scientists trained models. Analysts built dashboards. But no one asked, “Will this insight actually change a business decision?” Or worse: we asked, and no one had the answer.

Because executive decisions now run through data

In today’s enterprise, almost every major decision (budget shifts, new launches, org restructuring) passes through the data layer first. But that layer is often poorly governed:

  • The metric definition changed last quarter, but no one knows when or why.
  • Experimentation logic differs across teams.
  • Attribution models contradict one another, each with plausible logic.

DPMs don’t own the decision; they own the interface that makes the decision legible.

DPMs ensure that metrics are interpretable, assumptions are transparent and tools match real workflows. Without them, decision paralysis becomes the norm.

Why this role is accelerating in the AI era

AI will not replace DPMs. It will make them essential:

  • 80% of AI project effort still goes to data preparation (Forrester).
  • As large language models (LLMs) scale, the cost of garbage inputs compounds. AI doesn’t fix bad data; it amplifies it.
  • Regulatory pressure (EU regulation, California’s consumer privacy law) is pushing companies to treat internal data systems with product-grade rigor.

DPMs are not traffic coordinators. They are architects of trust, interpretability and the foundation of responsible AI.

So what now?

If you are a CPO, CTO or head of data, ask:

  • Who owns the data systems that drive our biggest decisions?
  • Are our internal metrics and APIs versioned, discoverable and governed?
  • Do we know which data products go unadopted, quietly eroding trust?

If you can’t answer these clearly, you don’t need more dashboards.

You need a data product manager.

Seojon Oh is a data product manager at Uber.



