I’m asking the same question you are here: Surely there are negative externalities to this miraculous technology.
The environment? The legal rights of creators? The fate of creativity itself in the face of automation?
There are externalities, and we must be cognizant of them.
Central among them: What happens when the “end result of our creative vision” is also the process itself?
Let’s set aside movies for a moment, since videos have lots of moving parts we’d be happy to automate away, and talk purely about the most hot-button issue in creative circles: Isn’t the work of generating an image the creative process itself? And if we’re bypassing that entirely, then what can be said to be creative about the end result at all?
These are the sort of questions I want to address in this Statement of Use, because there is no single, straightforward answer.
If you’ve read this far, then you’re interested in grappling with these questions too. And I thank you for that, because at least it means we can come to understand each other, even if we don’t ultimately agree on the answers.
In this section, I’d like to talk about the negative externalities of genAI and common misconceptions about them.
Is AI Bad for the Environment?
This canard is widely touted by anti-AI activists as unquestioned fact, and it often (deliberately) confuses the two types of cost involved in an AI model: the training (what goes into creating the tool in the first place) and the inference (our daily use of the tool).
In short, even if you amortize the training cost into the inference cost (which makes the confusion between the two irrelevant), the impact of this aggregate cost on the environment is orders of magnitude less than that of most other human activities: making cellphones; flying planes or driving cars; producing cement and steel; golfing; running microwaves, AC units, fridges, or coffeemakers; eating hamburgers; even taking a shower.
It’s hard to convey here what is meant by “orders of magnitude” (each order is a factor of ten), especially for someone like me who has degrees in writing and can barely add two numbers together.
A good place to start is this article by Masley, which goes into painstaking detail about how marginal the environmental impact of LLMs and image generation is (again, factoring in training, and indeed the entire energy impact of every data center on earth) compared to virtually all the other truly environment-destroying activities we engage in as a species. The examples in the article focus on ChatGPT, but Masley’s numbers take image generation into account as well, derived from dozens of sources you can verify yourself.
Here are some fun facts from that article that aren’t too math-oriented; you can read the article in full to find the math these analogies spin out from:
- Being “mindful” with your chatbot usage is kind of like filling a large pot of water to boil for food and, before boiling it, taking a pipette and removing a few tiny drops at a time to “only use the water you need,” or stopping your shower a tenth of a second early for the sake of the climate.
- Deciding that you’re going to stop using AI for the sake of the climate is like going around your home and randomly unscrewing a single LED bulb, or pausing your microwave a few seconds early to save the planet. It’s so small that it’s a meaningless distraction.
- If you were running ChatGPT’s servers in your home, you would need to send 19,600 prompts to raise your energy bill by one dollar. That’s one prompt every single second for over five hours.
- If I choose not to take a flight to Europe, I save ten million ChatGPT prompts. This is like stopping more than 100 people from using ChatGPT for their entire lives. Preventing ChatGPT prompts is a hopelessly useless lever for the climate movement to try to pull. We have so many tools at our disposal to make the climate better. Why make everyone feel guilty over something that won’t have any impact?
- Printing a physical book uses 5,000 Wh, so even just sitting down and reading a book you bought for six hours (using 833 Wh per hour) is going to use more energy per minute than ChatGPT, unless you prompt ChatGPT 1,000 times per hour, or once every three seconds for a full hour. Switching to ChatGPT from another activity is almost always going to decrease the total energy I use every day. This isn’t an argument that you should only use ChatGPT!
- If you want to send 2,500 ChatGPT prompts and feel bad about it, you can simply not buy a single additional piece of paper. If you want to save a lifetime’s worth of chatbot prompts, just don’t buy a single additional pair of jeans.
- A digital clock uses one million times more power (1 W) than an analog watch (1 µW). “Using a digital clock instead of a watch is one million times as harmful to the climate” is correct, but misleading. The energy digital clocks use rounds to zero compared to travel, food, and heating and air conditioning. Climate guilt about digital clocks would be misplaced. The relationship between Google and ChatGPT is similar to that between watches and clocks: one uses more energy than the other, but both round to zero.
- Many people have trouble visualizing the aggregate results of everyday activities. If there were a single national microwave, it would use as much energy every day as Seattle. Data centers make this aggregate energy use visible, but the aggregate energy of most other ways we spend our time is invisible. Data centers only look like they’re using way more energy because we can’t directly see all the energy of the other things we do gathered together into specific buildings […] All data centers worldwide emitted about 0.5% of the world’s annual emissions … by 2030, all data centers worldwide are projected to consume 8% of the water currently consumed by the [entire] US golf industry [3% as of 2023], or 8% of US steel production, or 1% of America’s total irrigated corn.
- There is nowhere in America where data center operational use of water has increased household water prices at all.
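Most of those comparisons are simple arithmetic, and their internal consistency is easy to check yourself. Here is a minimal sketch in Python; the input figures are the article’s, not measurements of mine, and the per-prompt energy at the end is only what the book comparison implies, not a measured number:

```python
# Back-of-envelope checks on the article's figures.

# "19,600 prompts ... one prompt every single second" -- how long is that?
prompts_per_dollar = 19_600
hours_at_one_per_second = prompts_per_dollar / 3600
print(f"{hours_at_one_per_second:.1f} hours")  # 5.4 hours

# "Printing a physical book uses 5,000 Wh" spread over six hours of reading:
book_wh = 5_000
reading_hours = 6
wh_per_reading_hour = book_wh / reading_hours
print(f"{wh_per_reading_hour:.0f} Wh/hour")  # 833 Wh/hour

# Reading only beats prompting past ~1,000 prompts per hour, which
# implies a per-prompt cost of roughly:
implied_wh_per_prompt = wh_per_reading_hour / 1_000
print(f"{implied_wh_per_prompt:.2f} Wh/prompt")  # 0.83 Wh/prompt

# Digital clock (1 W) vs. analog watch (1 µW): a factor of one million,
# yet both round to zero next to travel, food, and heating.
clock_w, watch_w = 1.0, 1e-6
print(f"{clock_w / watch_w:,.0f}x")  # 1,000,000x
```

The point of running the numbers isn’t precision; it’s that every one of these quantities is small enough that the ratios, not the absolute values, are what carry the argument.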
A recent popular blog post was titled “Why Saying ‘AI Uses the Energy of Nine Seconds of Television’ is Like Spraying Dispersant Over an Oil Slick.” The author’s main point is that each individual AI prompt can use so little energy only because of a vast and expanding background buildout of AI infrastructure, so simply reporting (as I do) that an AI prompt uses as much energy as a few seconds of a microwave hides the more ominous reason it’s so cheap in the first place: by using AI, you’re complicit in that infrastructure buildout. This criticism would make more sense to me if everything else in society didn’t also have a vast, sprawling physical infrastructure supporting it. “Nine seconds of TV” is backed by huge networks of electronics systems, by enormous amounts of money and people-hours spent making the most entertaining TV possible, and by lavish (often wasteful) lifestyles enabled by TV profits. And obviously, TV advertising encourages people to buy more stuff from other complex supply chains.
So Is AI Bad for the Environment?
Obviously, I'm not an environmental scientist. As I said above, I can barely add two numbers together. But I don't think it's a logical fallacy to base my opinions on environmental subjects on the research of environmental scientists; we engage in this kind of reasoning every day, on every subject we aren't experts in. History may prove that Masley and all the sources he cites (written by people credentialed in the subject matter) are wrong. In that case, I would revise my opinion to reflect reality.
But currently, if the above is the actual reality of genAI’s impact on the environment, then I hope you can see why I’m not moved by the claim that “AI is bad for the environment.” At best, it’s a wildly misleading statement born of ignorance; at worst, the opposite is true, at least for my use of AI. Using genAI means I don’t invest in traditional methods of producing content, methods whose products and processes are orders of magnitude worse for the environment. So if you really care about the environment, would you rather I create a carbon footprint that is orders of magnitude larger than the one I’m currently creating?
Finally: while this does not apply to my use of AI in developing OSR+, it’s worth noting that AI has the potential to help us with discoveries and optimizations in environmental science that could ultimately combat global warming.
If we accept that its impact on the environment is marginal, then shouldn’t we try our best to use it so we can make those discoveries?
Other Reading
Masley’s article, however well-researched, is only one meta-analysis.
Here are other sources to explore (which include ones Masley references) that either surface the data his claims are based on, or come to the same conclusions:
- On the numbers:
- Goldman Sachs and Google
- International Energy Agency on data center emissions
- “Environmental Burden of US Data Centers in the AI Era” (Harvard & UCLA, under peer review)
- Congressionally mandated 2024 United States Data Center Energy Usage Report by Lawrence Berkeley National Laboratory
- On the energy costs of communicating with AI, via Frontiers
- On energy efficiency in AI: Stanford HAI report on rapid gains in energy efficiency in AI and Recalibrating global data center energy-use estimates
- Further corroboration by Oxford climate data scientist of Our World in Data
- The carbon emissions of writing and illustrating are lower for AI than humans, and Reconciling the contrasting narratives on the environmental impact of large language models, both from Nature.com
- How reporting on AI energy usage has been oversimplified and as a result misleading: You’re Thinking About AI and Water All Wrong (via Wired) and Rethinking Concerns About AI’s Energy Use (via the Center for Data Innovation)
- How political opportunism exaggerates the AI energy "crisis": The False AI Energy Crisis
- More nuance on data centers and electricity demand from the Information Technology & Innovation Foundation