The history of economic transformation is not primarily a history of inventions. It is a history of infrastructure, power, and consequences. Coal, oil, and more recently social media did not simply enable growth. They reshaped institutions, concentrated influence, and created externalities that took decades to confront.
Artificial intelligence now sits at a similar inflection point. It is being adopted at speed, embedded deeply into decision-making, and trusted implicitly in areas ranging from finance and employment to healthcare and public administration. For the accountancy and data professions, this raises a familiar but urgent question: how do we govern a powerful enabling resource before its unintended consequences become systemic?
Looking back is instructive, not because history repeats itself exactly, but because it reveals recurring patterns in how societies respond to transformative capabilities.
Coal – infrastructure, power, and path dependency
Coal powered the industrial revolution, but its deeper impact lay in how it structured society. As Jeremy Paxman has observed in his history of Britain’s coal industry, coal was not just fuel; it was infrastructure and power combined. Coal seams determined where towns grew, which industries flourished, and who held economic and political influence. Britain did not industrialise simply because it was innovative, but because it sat on coal and built an economy around exploiting it.
Coal created immense prosperity, but also deep dependency. Entire regions became monocultural, tied to a single resource and employer. Risks were normalised rather than questioned. Pit disasters killed miners every year. Pollution accumulated gradually.
The costs were tolerated because the benefits were immediate and systemic.
Most importantly, coal locked Britain into a path dependency. Once the infrastructure existed, alternative energy futures were crowded out. Decline, when it came, was slow, painful, and socially destructive. The 1984–85 miners’ strike still reverberates through the old mining regions today. The failure was not change itself, but the absence of planning for transition.
When a resource becomes embedded as infrastructure, its influence persists long after its drawbacks are understood.
Oil – scale, monopoly, and geopolitical leverage
Oil succeeded coal not by replacing it overnight, but by extending its logic. It was lighter, more portable, and capable of powering global mobility. Oil enabled cars, aviation, mechanised warfare, and international trade at unprecedented scale.
But oil also demonstrated how control of infrastructure equates to control of markets. The dominance of vertically integrated firms such as Standard Oil was built not merely on extraction, but on pipelines, transport, and distribution. Infrastructure became a gatekeeper. Without regulatory intervention, monopoly followed.
The oil era also made visible a recurring feature of industrial revolutions: private gain raced ahead of public governance. Environmental damage, labour exploitation, and later climate change were treated as externalities rather than costs. Only once trust eroded and public backlash mounted did regulation arrive, through antitrust action, labour protections, and environmental law. Accounting standards lagged too: FRS 12 (Provisions, Contingent Liabilities and Contingent Assets) only came into force in 1998.
For today’s professionals, the parallel is not abstract. Whenever a new capability becomes essential to economic participation, questions of access, fairness, and accountability inevitably follow.
Social media – attention as a resource
Social media introduced a less tangible but equally powerful resource – attention and behavioural data. Platforms promised democratisation and connection, but their underlying economics were extractive. Where coal mined energy, social media mined engagement.
Algorithms optimised for scale, speed, and emotional intensity, not for truth, wellbeing, or civic cohesion. The result was widespread adoption followed by rising concern about misinformation, mental health, and democratic integrity.
As with coal and oil, governance lagged behind impact. Regulation arrived only after trust had been damaged, through data protection law, antitrust scrutiny, and growing calls for platform accountability. The lesson was not that social media was inherently harmful, but that systems optimised without regard to social outcomes eventually undermine the institutions they rely upon.
Artificial intelligence – the black box era
AI differs from earlier revolutions in one critical respect. It does not merely power systems, it shapes decisions. AI models influence credit approvals, recruitment, pricing, medical diagnosis, and risk assessment. In doing so, they increasingly function as intermediaries between evidence and judgement.
This creates a new form of opacity. Many modern AI systems are effectively black boxes. Their outputs are statistically powerful but difficult to explain, even to their creators.
For professions grounded in assurance, audit, and accountability, this presents a profound challenge. The risks are already visible: bias can be embedded at scale; errors go undetected; responsibility becomes diffuse.
As with coal smoke or algorithmic news feeds, harm is often incremental rather than dramatic, which makes it easy to overlook until it is entrenched.
At the same time, AI is generating new forms of concentration. Data-rich organisations become model-rich organisations. Control accrues upstream, to those who own training data, compute, and platforms, rather than to those who generate value through use. As with previous eras, the people creating the raw material rarely own the means of production.
Patterns that matter
Across coal, oil, social media, and AI, a consistent pattern emerges.
- Adoption is rapid and enthusiastic.
- Infrastructure creates gatekeepers.
- Externalities are underestimated or normalised.
- Regulation arrives late, often after trust has been damaged.
The differences between these technologies matter, but the logic of transformation does not change. Each era shows how difficult it is to retrofit governance once systems are embedded.
Implications for the data and analytics profession
For ICAEW members working with data, analytics, and AI, this history is not academic. It translates directly to professional responsibility.
First, data and AI should be treated as infrastructure, not merely tools. Infrastructure demands stewardship, auditability, and long-term thinking.
Second, explainability and transparency are not optional. In high-stakes contexts, black-box decision-making is incompatible with accountability. Professionals should challenge systems that cannot be interrogated or explained.
Third, transition matters. Just as the decline of coal devastated communities that were unprepared, AI-driven change risks fracturing professions and regions if reskilling and adaptation are left too late.
Finally, public trust is a strategic asset. Once lost, it is difficult to recover. Professions that embed ethical governance early are more likely to retain legitimacy as AI adoption accelerates.
Choosing a different path
Coal built cities. Oil built empires.
Social media reshaped how we relate to one another.
AI will shape how decisions are made.
History suggests that the greatest risk is not technological progress itself, but our tendency to treat its consequences as someone else’s problem, to be addressed later. This time, we have the benefit of hindsight.
The challenge for the data and analytics community is to apply that hindsight now, while choices are still design decisions rather than fixed realities.
The black box is not just a technical problem; it is a governance one.
How we choose to open it, constrain it, and take responsibility for it will define the next era of professional trust.
Sources and background material
The first part of this article draws on the excellent book “Black Gold: The History of How Coal Made Britain” by Jeremy Paxman.