Generative AI won't replace data analysts
Generative AI isn't going to replace data analysts. It can help analysts be more effective, but it lacks the human insight and knowledge needed to do the job properly.
Generative AI will not replace data analyst jobs, nor will it replace people in many other fields, especially ones requiring human empathy and insight. Data analysis may seem like a technical role on a surface level, but the reality of the job is nuanced.
The process involves more than crunching numbers; it requires an understanding of the human elements behind the data. Whether it's analyzing customer behavior or detecting fraudulent activity, a human analyst's ability to empathize and understand the motivations, fears, ambitions and interests of others can lead to compelling insights. Insights can go beyond what is immediately apparent in the raw data and require an element of human judgment and understanding that AI currently lacks.
AI can process massive amounts of data and provide quantitative analysis. It cannot understand the subtleties of human behavior, cultural nuances or the complexities of human motivations and desires. Those human factors often have a significant effect on the data, and interpreting them is where human analysts excel.
Generative AI tools such as ChatGPT and Bard can generate text to a human-like standard, and they have the potential to automate some of the tasks that data analysts currently perform. Generative AI also has limitations, such as its inability to understand the full context of data. As a result, data analysts must interpret the results of generative AI and make decisions based on the data.
Current abilities and limits of generative AI
Generative AI cannot perform the nuanced work of analysts. Analyst work requires a synthesis of visual, numerical and tacit knowledge, which cannot be conveyed through text alone. The training data that generative AI models use limits the text they can generate. Generative AI also cannot analyze raw data or produce original visualizations, and any insights it provides come from language patterns in the training data.
Another concern with generative AI models is accuracy. Without human oversight, the text outputs from AI can contain logical gaps, biased perspectives and factual errors inherited from the training data. Accuracy depends on the quality and diversity of that training data: Biased or inaccurate training data produces biased or inaccurate outputs as well.
AI models also struggle to keep up with the real world: Training or retraining a model takes a great deal of computing power, time and money. As the world changes, the AI model falls behind until it's retrained. Even GPT-4 was trained only on data up to 2021. The irony is that ChatGPT has little insight into its own effects on business, the economy and intellectual life since its launch in November 2022.
Generative AI models also lack the critical thinking skills and insight to question the validity or relevance of their source material, which is an essential skill for a data analyst. A core component of data literacy is checking the quality of data and identifying potential biases.
As a result of these limitations, generative AI models should not be a substitute for human analysts. Instead, the models are a tool to help analysts generate text, identify patterns and explore data. With human oversight, generative AI models can be an asset. Without human engagement, they mostly churn out repetitive, formulaic summaries of their existing knowledge.
How generative AI affects the analyst role
Because of these limitations, it will be some time before AI replaces human analysts. But analysts can use AI as a valuable assistant in their work right now.
Code generation
Generative AI can suggest code to extract, clean and analyze data, which helps automate some repetitive tasks. It lacks the deep understanding of context, business goals and interdependencies required to design complex, scalable and maintainable code architectures. Still, it can help an analyst who needs to work across multiple languages or diverse architectures by generating code for quick review.
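For example, an analyst might ask a model to draft a routine cleaning step. A minimal sketch of the kind of code it could return, assuming a hypothetical sales export with made-up column names, might look like the following -- and the analyst would still review every step before running it:

```python
import pandas as pd

# Hypothetical AI-suggested cleaning code for a raw sales export.
# The file name and column names are illustrative, not from a real system.
df = pd.read_csv("sales_export.csv", parse_dates=["order_date"])

# Standardize column names and tidy up text fields
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
df["region"] = df["region"].str.strip().str.title()

# Drop exact duplicates and rows missing the order amount
df = df.drop_duplicates()
df = df.dropna(subset=["order_amount"])

# Flag refunds rather than silently dropping them -- whether that is the
# right treatment is a judgment call for the analyst, not the model.
df["is_refund"] = df["order_amount"] < 0

df.to_parquet("sales_clean.parquet", index=False)
```

The value is speed: the model drafts the boilerplate in seconds, while the decisions embedded in it -- what counts as a bad row, how to treat refunds -- remain the analyst's call.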
Data modeling
Given the right information, AI can propose data structures such as tables, especially for analytic designs such as star and snowflake schemas. Although AI can identify patterns within data and suggest tables, the task of defining efficient and effective data structures often needs human intervention.
AI can struggle to get it right the first time because it does not have the same understanding of the data as a human analyst. An analyst understands the nature of the data, its relationships and how to best model it for a specific use case. That knowledge is essential for defining efficient and effective data structures. Describing every necessary detail to the AI program might be more work than it's worth, whereas the human analyst can work from an intuition for potential use cases.
Human analysts often are wrong on the first iteration too, but they start with a richer understanding of the problem.
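As a concrete illustration, once the tables are agreed on, the mechanical split of a flat extract into a small star schema is exactly the kind of code AI can draft and an analyst can quickly verify. The sketch below uses hypothetical column names and assumes the cleaned file from the earlier example:

```python
import pandas as pd

# Minimal star-schema sketch: split a flat sales extract into one fact
# table and two dimension tables. All column names are hypothetical.
sales = pd.read_parquet("sales_clean.parquet")

# Product dimension: one row per product, with a surrogate key
dim_product = (
    sales[["product_id", "product_name", "category"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_product["product_key"] = dim_product.index + 1

# Customer dimension: one row per customer
dim_customer = (
    sales[["customer_id", "region"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact table: measures plus foreign keys to the dimensions
fact_sales = (
    sales.merge(dim_product, on=["product_id", "product_name", "category"])
         .merge(dim_customer, on=["customer_id", "region"])
         [["order_date", "product_key", "customer_key", "order_amount"]]
)
```

Drafting this split is quick work for an AI assistant; deciding the grain of the fact table and which attributes belong in each dimension is where the analyst's intuition about future use cases earns its keep.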
One interesting use of AI is to recommend analytical methods. Analysts must validate that a suggested method suits the problem and account for business needs, data constraints and possibly even budget constraints for compute and storage.
Suppose an AI system is analyzing customer purchase data to increase sales. It sifts through massive data sets and identifies a pattern: Customers who buy a laptop often also buy a wireless mouse. Consequently, the AI suggests that bundling the products together in a promotional offer might lead to increased sales.
The human data analyst -- using specific business knowledge and experience -- can complement the AI-generated insight. They know that the laptop has a high profit margin and the mouse has a low one. A bundle could increase sales but might dent overall profit. They might suggest a tweak to the AI's strategy: Instead of a bundle, offer the mouse at a discount, perhaps with a coupon, only after the customer has bought a laptop. The proposal maintains the laptop's profitability, and overall sales might still increase because of the perceived deal. The human analyst can also provide context about supply chain constraints, seasonal trends or upcoming marketing campaigns the AI might not be aware of.
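The back-of-the-envelope check behind that judgment is easy to make concrete. With entirely hypothetical prices, margins and uptake rates, a rough comparison of the two promotions might look like this:

```python
# Hypothetical prices, margins and uptake rates -- illustrative only.
laptop_price, laptop_margin = 1200.00, 0.25   # high-margin item
mouse_price, mouse_margin = 40.00, 0.05       # low-margin item

def total_profit(laptop_discount, mouse_discount, attach_rate, laptops_sold):
    """Profit when a share of laptop buyers also take the mouse offer."""
    laptop_profit = laptops_sold * (laptop_price * laptop_margin
                                    - laptop_price * laptop_discount)
    mouse_profit = laptops_sold * attach_rate * (mouse_price * mouse_margin
                                                 - mouse_price * mouse_discount)
    return laptop_profit + mouse_profit

# AI's suggestion: bundle both items at 10% off
bundle = total_profit(laptop_discount=0.10, mouse_discount=0.10,
                      attach_rate=0.60, laptops_sold=100)

# Analyst's tweak: full-price laptop, mouse coupon after purchase
coupon = total_profit(laptop_discount=0.00, mouse_discount=0.15,
                      attach_rate=0.45, laptops_sold=100)

print(f"Bundle promotion:     {bundle:,.2f}")
print(f"Post-purchase coupon: {coupon:,.2f}")
```

Even a toy model like this shows where the cost of the bundle really sits: discounting the high-margin laptop. That is exactly the kind of framing the analyst can feed back into the next prompt.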
With that new insight, analysts can prompt the AI again and see whether it has more, or similar, recommendations.
The near-term future of AI in data analysis
AI can enhance -- rather than replace -- the role of data analysts. Analysts can dedicate more time to strategic work as automation helps carry out routine data tasks. But AI is not accountable for its own errors. Responsibility and blame still rest with humans.
Human judgment, coupled with a healthy dose of skepticism and business acumen, continues to be an indispensable asset AI cannot replace. Smart analysts can use AI as a tool to augment their abilities rather than perceive it as a threat to their roles.
Today, AI can automate repetitive tasks, provide insights into large data sets, help draft initial reports, write code snippets and propose potential routes for analysis. As AI advances, the industry can look forward to more sophisticated assistance in data analysis: suggesting potential data sources, generating effective test data, or even driving operational and tactical decisions.
Even if generative AI reduces the number of analysts required, the key role of the human analyst remains. Their knowledge of the specific context, ability to apply critical thinking and deep understanding of human needs are not made obsolete by new advances. If anything, the human analyst is more essential than ever to ensure the organization harnesses generative AI's potential effectively and responsibly.
Donald Farmer is the principal of TreeHive Strategy, where he advises software vendors, enterprises and investors on data and advanced analytics strategy. He has worked on some of the leading data technologies in the market and in award-winning startups. He previously led design and innovation teams at Microsoft and Qlik.