Operations is the often-overlooked hero of profitable growth. Antonia Landi and Elizabeth Press (myself) connected over the insight that operational excellence is the key to business success, be it in product, data, on the factory floor or in the newsroom. Even creative agencies have processes. Ops and processes will become a passport to play as legislators catch up with technology (NIS2 and DORA, for example) and ISO 27001 becomes standard business hygiene in many industries.
Large language models and generative AI are disrupting how data is done. I (Elizabeth Press from D3M Labs) spoke with Leonid Nekhymchuk (Leo), CEO and Co-Founder of Datuum.ai, about how large language models will transform data operations. Datuum uses AI to connect data sources with target models and automate mapping, making data integration less time-consuming and less expensive.
Data Mesh – How to prevent it from turning into a money-devouring mess – a podcast
Data Mesh is an analytical data architecture and operating model where data is treated like a product and owned by the teams who produce it, i.e. the business domains. How can organizations embark on their data mesh journeys without exploding their budgets and ultimately creating a big, messy, expensive data swamp? Listen to the podcast. Read the blog.
Data quality is a business problem as well as a tech problem. It is the biggest enemy of data-driven business and machine learning. Bad quality data can block a data project or render a machine learning use case unusable, and thus waste money, human resources and time. Tackling data quality needs to be a targeted, systemic and ongoing effort, rather than a huge, one-time cathartic event.
Beyond the algorithm, the realities of operationalizing AI – A podcast interview with Elizabeth Press
The AI mystique might be the biggest obstacle to AI adoption. The image of the artisanal data scientist who conjures a magical algorithm from an alchemy of code impedes discussion on what is needed to commercialize and scale AI solutions. AI needs to be treated like a product: an item to be manufactured and scaled on an industrial level.
Technical debt in your data pipeline will impact your organization in ways that will annoy stakeholders, make the working lives of analysts tedious and frustrate data engineers. This debt can cause embarrassment in front of boards and investors, as numbers can be mismatched and unexplainable. And worse.
Matt Brady, Founder of Zuma Recruiting, and I talked about data strategy. We start by covering data strategy and roadmaps before discussing how to treat data, data roles and where data should sit in an organization. Data teams add the most value to their organization when they are part of a holistic company strategy discussion and work as strategic partners with stakeholders.
Some insights I gleaned from this spring’s Big Data World: the right mix of governance and freedom in architecture is still up for debate, end-to-end solutions are often-heard recommendations, low- and no-code tools are expanding access to data, and the customer journey could be seen as a source of revenue.
Operational KPIs that will let you know your Data Team is creating impact (rather than fixing & firefighting)
Data teams are usually busy, but are they impactful? Just because your data team is burning through tickets does not mean that it is creating impact, especially if it is stuck fixing and firefighting. Impact can be broken down into prioritization, coverage and quality. KPIs such as the statistical re-do rate, analytical throughput rate and effective analytical throughput rate will help you quantify the impact of your data organization. This framework, along with external validation from stakeholders, is helpful for root-cause analysis and for making business cases to invest in improvements.
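As a minimal sketch of how such KPIs might be computed, the Python below assumes simple working definitions (re-do rate as the share of analyses that had to be redone, throughput as analyses delivered per week, effective throughput counting only work that was right the first time). These definitions are illustrative assumptions, not the article's own formulas.

```python
def redo_rate(redone: int, total: int) -> float:
    """Share of analyses that had to be redone, e.g. due to data quality issues."""
    return redone / total


def analytical_throughput(completed: int, period_weeks: float) -> float:
    """Analyses delivered per week, regardless of whether they needed rework."""
    return completed / period_weeks


def effective_analytical_throughput(completed: int, redone: int,
                                    period_weeks: float) -> float:
    """Throughput counting only analyses that did not need rework."""
    return (completed - redone) / period_weeks


# Example: 40 analyses delivered in 8 weeks, 10 of which had to be redone.
print(redo_rate(10, 40))                           # 0.25
print(analytical_throughput(40, 8))                # 5.0
print(effective_analytical_throughput(40, 10, 8))  # 3.75
```

A widening gap between raw and effective throughput is one quantitative signal that the team is stuck in fixing-and-firefighting mode rather than creating new impact.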