The Limitations of Data Science for Product Success

Transitioning from support to a core function

The last decade has seen the divide between tech and commercial teams thin almost to the point of nonexistence. And I, for one, am in favor of it. Not every tech team works in a tech company, and blurring the lines between the commercial and technological means that we can build and ship product safe in the knowledge that it will be well received, widely adopted (not always a given), and contribute meaningfully to the bottom line. Name a better way to motivate a high-performance tech team, and I’ll listen.

It’s a change that was accelerated, if not caused, by data tech. We’ve spent decades working through big data, business intelligence, and AI hype cycles. Each introduced new skills, problems, and collaborators for the CTO and their team to get to grips with, and each moved us just a little further from the rest of the organization; no one else can do what we do, but everyone needs it done.

Technical teams are not inherently commercial, and as these roles expanded to include building and delivering tools to support various teams across the organization, this gap became increasingly apparent. We’ve all seen the stats about the number of data science projects, in particular, that never get productionized — and it’s little wonder why. Tools built for commercial teams by people who don’t fully understand their needs, goals, or processes will always be of limited use.

Introducing lean value

This waste of technology dollars was easy to justify in the early days of AI, when investors wanted to see investment in the technology rather than outcomes, but the tech has matured and the market has shifted. Now, we have to show actual returns on our technology investments, which means delivering innovations that have a measurable impact on the bottom line.

The growing pains of the data tech hype cycles have delivered two incredible boons to the modern CTO and their team (over and above the introduction of tools like machine learning (ML) and AI). The first is a mature, centralized data architecture that removes historical data silos across the business and gives us a clear picture — for the first time — of exactly what’s happening on a commercial level and how one team’s actions affect another. The second is the move from a support function to a core function.

This second one is important. As a core function, tech workers now have a seat at the table alongside their commercial colleagues, and these relationships help to foster a greater understanding of processes outside of the technology team, including what these colleagues need to achieve and how that impacts the business.

Leveraging LLMs to improve quality and speed up delivery

This, in turn, has given rise to new ways of working. For the first time, technical individuals are no longer squirreled away, fielding unconnected requests from across the business to pull this stat or crunch that data. Instead, they can finally see the impact they have on the business in monetary terms. It’s a rewarding viewpoint, and it has shaped an approach that maximizes this contribution and aims to generate as much value as quickly as possible.

We set quality levels we must achieve, but opting for efficiency over perfection means we’re pragmatic about using tools such as AI-generated code. GPT-4o can save us time and money by generating architecture and feature recommendations. Our senior staff then spend their time critically assessing and refining those recommendations instead of writing the code from scratch themselves.

Data lakehouses: lean value data architecture

There will be plenty who find that particular approach a turn-off or short-sighted, but we’re careful to mitigate risks. Each build increment must be production-ready, refined, and approved before we move on to the next. There is never a stage at which humans are out of the loop. All code, especially generated code, is overseen and approved by experienced team members in line with our own ethical and technical codes of conduct.
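To illustrate, here is a minimal sketch of how that human-in-the-loop gate could be encoded in a delivery pipeline. The `Increment` type and the function names are hypothetical, not part of any described tooling; the point is simply that generated code is never promoted without an explicit human sign-off.

```python
from dataclasses import dataclass

@dataclass
class Increment:
    """A single build increment: its code plus its review state."""
    code: str
    approved: bool = False

def review(increment: Increment, reviewer_ok: bool) -> Increment:
    # An experienced team member must explicitly approve the increment;
    # generated code is never promoted automatically.
    increment.approved = reviewer_ok
    return increment

def promote(pipeline: list[Increment]) -> list[Increment]:
    """Return only the increments a human reviewer has signed off on."""
    return [inc for inc in pipeline if inc.approved]
```

In a real pipeline the approval flag would come from a code-review tool or CI gate rather than a boolean argument, but the invariant is the same: nothing ships until a person has looked at it.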

Inevitably, the lean-value framework spilled over into other areas of our process, and embracing large language models (LLMs) as a time-saving tool led us to data lakehousing: a portmanteau of “data lake” and “data warehouse.”

A seat at the table

Standardizing data and structuring unstructured data to deliver an enterprise data warehouse (EDW) is a years-long process, and it comes with downsides. EDWs are rigid, expensive, and have limited utility for unstructured data or varied data formats.

A data lakehouse, by contrast, can store both structured and unstructured data, and using LLMs to process that data cuts the time required to standardize and structure it, turning it into valuable insight far faster. The lakehouse provides a single platform for data management that can support both analytics and ML workflows, and it requires fewer resources from the team to set up and manage. Combining LLMs and data lakehouses speeds up time to value, reduces costs, and maximizes ROI.
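To make the pattern concrete, the sketch below shows the standardization step under stated assumptions: `llm_extract` is a hypothetical stand-in for a real model call (in practice you would prompt a model such as GPT-4o to return JSON matching your schema), and the field names are illustrative. It turns free-text records into schema-conforming rows ready to land in a lakehouse table.

```python
# Hypothetical stand-in for an LLM call that extracts structured fields
# from free text. A real implementation would call a model provider and
# parse its JSON response; here it is stubbed for illustration.
def llm_extract(text: str) -> dict:
    return {"product": "widget", "sentiment": "negative", "summary": text[:40]}

# The target table schema: every row must expose exactly these columns.
SCHEMA = {"product", "sentiment", "summary"}

def standardize(raw_records: list[str]) -> list[dict]:
    """Turn unstructured text into schema-conforming rows for a lakehouse table."""
    rows = []
    for text in raw_records:
        fields = llm_extract(text)
        # Enforce the schema so downstream analytics and ML workflows
        # always see consistent, queryable columns.
        if set(fields) == SCHEMA:
            rows.append(fields)
    return rows
```

In production, the resulting rows would be appended to an open table format (such as Delta Lake or Apache Iceberg) rather than held in memory, so the same data serves both BI queries and ML training.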

The lean-value approach is a framework with the potential to change how technology teams integrate AI insight with strategic planning. It allows us to deliver meaningfully for our organizations, motivates high-performing teams and ensures they’re used to maximum efficiency. Critically for the CTO, it ensures that the return on technology investments is clear and measurable, creating a culture in which the technology department drives commercial objectives and contributes as much to revenue as departments such as sales or marketing.

FAQs

What is lean value?

Lean value is a project management methodology that focuses on ruthless prioritization to maximize value for organizations.

How does data lakehousing differ from enterprise data warehousing?

Data lakehouses can store both structured and unstructured data, providing a more versatile and cost-effective solution compared to traditional enterprise data warehouses.

Why is it important for technical teams to have a seat at the table?

Having a seat at the table allows technical teams to better understand the business processes and objectives, leading to more effective collaboration and value delivery.


Credit: venturebeat.com
