I had a front-row seat to the social media revolution in World Affairs roles at Twitter and Meta. Now the same mistakes are being made with artificial intelligence.


By Sean Evins



I’m not a technology naysayer. Far from it. But here we go again.

A new era of technology is emerging. Artificial intelligence is reshaping economies, industries, and governance. And like last time, we’re moving fast, breaking things, and building the plane as we fly it, to borrow the industry’s favorite phrases. Those mantras helped spur innovation, but they also left us living with the unintended consequences.

For more than a decade, I worked in the engine room of the social media revolution, first with the U.S. government, then at Twitter and Meta. I led teams working with governments all over the world as they grappled with platforms they didn’t understand. At first, it was intoxicating. The technology moved faster than institutions could keep up. Then came the problems: misinformation, algorithmic bias, polarization, and political manipulation. By the time we tried to regulate these platforms, it was too late. They had become too large, too entrenched, and too essential.

The lesson? If you wait until a technology is ubiquitous to think about safety, governance, and trust, you’ve already lost control. And yet we are about to repeat the same mistakes with AI.

New intelligence infrastructure

For years, artificial intelligence was treated as a purely technical concern. Not anymore. It has become foundational to everything from energy to defense. Foundation models are improving, deployment costs are falling, and the risks are rising.

The same mantras are back: build fast, go early, scale aggressively, win the race. Only now we are not disrupting media; we are rewiring the basic infrastructure of society.

Artificial intelligence is not just a product. It is a public good. It shapes how resources are allocated, how decisions are made, and how institutions operate. The consequences of making mistakes are much greater than with social media.

Some of the dangers seem eerily familiar: models trained on opaque data without outside oversight; algorithms optimized for performance at the expense of safety; closed systems making decisions we do not fully understand; a global governance vacuum while capital flows faster than regulation.

And again, the prevailing narrative is: “We’ll figure it out as we go along.”

We need a new playbook

The social media era’s playbook of moving fast, asking forgiveness later, and resisting regulation won’t work with AI. We’ve seen what happens when platforms scale faster than the institutions meant to govern them.

This time, the stakes are higher. Artificial intelligence systems are not just shaping communication. From how energy moves across the grid to how resources are allocated during crises, they are beginning to shape physical reality.

Energy as a case study

Energy is the best example of an industry where infrastructure is destiny. It is complex, regulated, mission-critical, and global. It is also the sector that will enable or limit the next stage of artificial intelligence.

AI racks in data centers draw 10 to 50 times more power than traditional server racks. Training a single large model can consume as much electricity as roughly 120 homes use in a year. AI workloads are expected to double or triple global data center electricity demand by 2030.
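For a sense of where that 120-homes figure comes from, here is a rough back-of-envelope check. The constants below are widely cited approximations rather than figures from this piece, so treat the sketch as an illustration of scale, not a measurement:

```python
# Rough back-of-envelope check of the training-energy claim above.
# Both constants are widely cited approximations, not measured figures.

TRAINING_RUN_MWH = 1_300        # ballpark estimate for one large-model training run
US_HOME_ANNUAL_KWH = 10_600     # approximate average annual U.S. household usage

homes_equivalent = (TRAINING_RUN_MWH * 1_000) / US_HOME_ANNUAL_KWH
print(f"One training run ~ {homes_equivalent:.0f} homes' annual electricity use")
# Prints roughly 123, consistent with the "about 120 homes" figure above.
```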

AI is already embedded in systems that optimize grids, predict outages, and integrate renewable energy sources. But without the right oversight, we may face scenarios in which AI systems prioritize industrial customers over residential neighborhoods during peak demand, or emergencies in which an AI makes thousands of rapid-fire decisions that leave entire areas without power, with no one able to explain why or override the system. This is not about choosing sides. It’s about designing systems that work together safely and transparently.
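What “safe and transparent” can mean in practice is easier to see in a sketch. The code below is purely illustrative and assumes nothing about any real utility’s systems: every name, threshold, and feeder is hypothetical. The point is the shape of the design: an AI-proposed action only takes effect inside human-set bounds, and every proposal, whether applied or refused, lands in an audit log so “why did the lights go out?” always has an answer.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SheddingDecision:
    """A load-shedding action proposed by an optimizer (hypothetical example)."""
    feeder_id: str
    megawatts: float
    reason: str

# Guardrails set by human operators, not by the optimizer itself (illustrative values).
MAX_SHED_PER_FEEDER_MW = 50.0
PROTECTED_FEEDERS = {"hospital_district", "residential_north"}

audit_log: list[dict] = []

def apply_decision(decision: SheddingDecision, operator_approved: bool) -> bool:
    """Apply an AI-proposed decision only if it stays inside pre-set bounds,
    or if a human operator explicitly approves; record everything either way."""
    within_bounds = (
        decision.feeder_id not in PROTECTED_FEEDERS
        and decision.megawatts <= MAX_SHED_PER_FEEDER_MW
    )
    allowed = within_bounds or operator_approved

    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "feeder": decision.feeder_id,
        "megawatts": decision.megawatts,
        "reason": decision.reason,
        "applied": allowed,
        "operator_approved": operator_approved,
    })
    return allowed

# Example: the optimizer proposes shedding a protected residential feeder.
# Without explicit operator approval, the guard refuses, and the refusal is logged.
proposal = SheddingDecision("residential_north", 30.0, "peak demand relief")
print(apply_decision(proposal, operator_approved=False))   # False
```

The specifics matter far less than the pattern: bounds and overrides live outside the model, and the record of what was decided, by whom, survives the moment.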

Don’t repeat the past

We’re still early. We have time to shape the systems that will govern this technology. But that window is closing, so we must act differently.

We must understand that incentive structures shape outcomes in invisible ways. If models prioritize efficiency without safeguards, we risk building systems that reinforce bias or push reliability to the breaking point.

We must govern from the beginning, not the end. Regulation should not be a retrofit, but a design principle.

We must treat infrastructure as infrastructure. Energy, computing and data centers must be built with long-term governance in mind, not short-term optimization.

We can’t rush AI into critical systems without robust testing, red-teaming, and auditing. Once widely deployed, harmful design choices are almost impossible to reverse.

We must bring public, private, and multilateral actors together, including through cross-sector gatherings such as ADIPEC, a global energy platform that convenes governments, energy companies, and technology innovators to debate the future of energy and artificial intelligence.

No company or country can solve this problem alone. We need common standards and interoperable systems that can evolve over time. The social media revolution showed what happens when innovation outpaces institutions. With AI, we have to choose a different path. Yes, we will move quickly. But let’s not break the systems we depend on. Because this time, we’re not just building networks. We are building the next foundation for the modern world.

The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions or beliefs of Fortune.

The Fortune Global Forum returns October 26–27, 2025, in Riyadh. CEOs and global leaders will gather at a dynamic, invitation-only event to shape the future of business. Apply for an invitation.


