How can we tackle the systems-level impacts of AI?

In February 2021, in Austin, TX, I turned off our kitchen lights so my new husband could blow out his birthday candles. The lights didn’t come back on.

It was the 2021 Texas Energy Crisis, when winter storms brought down the power grid and left millions without power, many for days in record low temperatures.

What began as a milestone birthday, our first together, became the start of a survival ordeal. By the third day we were huddled in our freezing kitchen in hats and scarves, rationing canned beans and hoping our last container of water would hold out until public utilities were restored, or until we could find somewhere with water in stock (Austin’s water treatment plant had failed).

During all that time, I mostly felt disbelief: that this could happen; that the people who built and regulated these systems hadn’t adequately planned for this; that society’s infrastructure could just suddenly break down. It was profoundly isolating.

The experience left many of us with a kind of PTSD when it comes to our electrical grid.

Now there are new reports of AI and cryptocurrency data centers in Texas putting increased strain on the grid. According to KUT, they’re drawn to Texas by business-friendly incentives but offer little in return: they employ few workers, and they draw a lot of power.

So when you use AI, your request could be routed to a data center in Texas, putting Texans at higher risk of more blackouts.


Unrecognized choices

This is the kind of choice we rarely realize we’re making when it comes to how we deploy AI.

We’ve started to recognize the quality and legal issues AI brings into our own product ecosystems. But incorporating AI into our products also means we are integrating ourselves into much larger systems: systems with far-reaching and often hard-to-understand consequences.

Such as the fact that when you run an AI model this summer, you could be causing a blackout at my house!

As human-centered professionals in tech, we cannot predict every outcome, but that doesn’t mean we should absolve ourselves of the responsibility to try.

How can we tackle this difficult problem? How do you address the societal impacts of your own work?

In my emerging technology strategy work, I increasingly include specialized subject matter experts, as this is often the only way to anticipate issues that fall outside the traditional scope of tech.

I’ve experienced what can happen when major systems fail, and I’m happy to share how I’ve approached these issues for the technologies I work on. If I can support you in doing this work at your company, let me know.

Llewyn Paine