OpenAI’s CEO says kindness comes with a price


Summary

  • Politeness in ChatGPT has hidden costs.
  • Using manners with AI can impact energy usage.
  • Kindness in prompts isn’t necessary but signifies respect.

It doesn’t cost anything to be kind — unless that kindness is directed at your friendly neighborhood chatbot.

Last week, OpenAI’s CEO, Sam Altman, disclosed that users peppering their prompts with polite phrases like ‘please’ and ‘thank you’ is driving up electricity costs into the ‘tens of millions’ of dollars. Energy usage and conservation paint a fairly black-and-white picture in the name of efficiency, but the admission also sparks thought about how people interact with, and sometimes genuinely fear, AI language models.


I’ve been using AI tools since long before OpenAI unleashed the ChatGPT kraken on the world, and I’ve always been careful to mind my manners. No, it’s not because I’m afraid that ChatGPT will become our AI overlord someday (please don’t; my packed schedule can’t take a real-life Skynet Judgment Day). It’s because basic manners have been ingrained in me since I could speak, and technology isn’t going to break that habit.

I also feel like manners, like muscles, can atrophy if you don’t use them consistently. Say you’re in ChatGPT or Google Gemini all day, every day, and you consciously drop every polite phrase for the sake of time and keystrokes. What if that becomes the habit, and you accidentally fire off an email to your boss that’s all command and no etiquette? No corporation will squeeze the “please” out of me, thank you very much.

How does being polite waste energy?

Understanding the environmental impact of AI





A grey water drop. (Pavlo T / Unsplash)

Altman’s post was in response to user @tomieinlove pondering the impact of manners on the AI model’s energy footprint. It’s a valid question, yet one that people don’t often think about when they’re messing around on ChatGPT. If it’s online, it’s not impacting the environment, right? Wrong.

According to data from The Washington Post, a 100-word email generated by an AI chatbot using GPT-4 consumes 519 milliliters of water, or just over one standard bottle. Send one such email every week, and the toll rises to about 27 liters a year. If one in 10 people used ChatGPT to write that one weekly email for a year, the combined consumption would roughly equal the water used by every household in Rhode Island over a day and a half.
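If you want to sanity-check the roughly 27-liter figure, the arithmetic is simple. Here’s a quick back-of-the-envelope sketch using only the numbers cited above (the 519 ml per email comes from The Washington Post’s reporting):

```python
# Back-of-the-envelope check of the yearly water toll cited above.
ML_PER_EMAIL = 519      # ml of water per 100-word GPT-4 email (WaPo figure)
WEEKS_PER_YEAR = 52     # one such email per week

# Convert milliliters to liters for the annual total.
yearly_liters = ML_PER_EMAIL * WEEKS_PER_YEAR / 1000

print(f"~{yearly_liters:.1f} liters per year")  # ~27.0 liters
```

That works out to 26.988 liters, which the article rounds to 27.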

Why is ChatGPT so thirsty? To generate each response that GPT-4 spits out, ChatGPT routes your prompt through servers that give off substantial amounts of heat, and OpenAI relies on water-based cooling systems to bring them back down. If you’re a ChatGPT regular, think about how many prompts you run every day. OpenAI’s water bill definitely makes me feel better about mine every month.

And that’s just individual queries; it doesn’t even touch the energy it takes to train the models up front.


How could being polite be bad?

What’s so evil about ‘please’?
