Sam Altman: OpenAI Spent Millions of Dollars Addressing Users’ Use of “Please” and “Thank You” in ChatGPT

By admin | Apr. 21, 2025 | 5 Mins Read

OpenAI CEO Sam Altman reveals a surprising fact: global users' overly polite language habits are driving up computational costs, costing the AI giant millions of dollars annually.

(Background: OpenAI releases its most powerful reasoning models, o3 and o4-mini: capable of thinking with images and automatically selecting tools, with breakthrough performance in mathematics and coding)

(Context: OpenAI is secretly building its own social platform, targeting its rival Elon Musk’s X)

In the field of artificial intelligence, efficiency and cost control are eternal themes. Sam Altman, CEO of OpenAI, recently pointed to an unexpected source of cost in a public exchange: users' polite language. Altman noted that many users habitually include polite phrases such as "please," "thank you," and "could you please help me…?" when using ChatGPT. These phrases are meaningful in human interaction, but for a language model they translate into additional computational cost.

The Cost of Human Politeness

Altman mentioned that responding to these polite words costs tens of millions of dollars. According to reports, large language models like ChatGPT process user prompts by breaking the text into small units called tokens for understanding and generation.

A token can be a part of a word, a complete word, or a punctuation mark. The longer the user’s input, the more tokens it contains, and the model consumes a certain amount of computational resources for processing each token. Although this cost may only account for a small portion of OpenAI’s overall massive expenditures (including model training, server maintenance, and R&D investments), it reveals that even minor user behavior patterns can have significant economic impacts in large-scale AI applications.

Token Energy Economics

When a user submits a question or command to ChatGPT, the system first breaks the text into a sequence of tokens. For example, "Please tell me how the weather is today?" might be split into tokens like ["Please", "tell", "me", "how", "the", "weather", "is", "today", "?"], while the more concise "How's the weather today?" might need only ["How", "'s", "the", "weather", "today", "?"]. Both carry the same meaning, but the longer, more polite phrasing consumes extra tokens, and every extra token adds cost.
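As a rough illustration, the sketch below uses OpenAI's open-source tiktoken library with the cl100k_base encoding to count the tokens in a polite prompt versus a terse one. The prompts and the choice of encoding are illustrative assumptions; ChatGPT's production models may tokenize slightly differently, so treat the counts as indicative rather than exact.

```python
# pip install tiktoken
import tiktoken

# cl100k_base is the encoding used by several recent OpenAI chat models;
# production ChatGPT models may tokenize slightly differently.
enc = tiktoken.get_encoding("cl100k_base")

polite = "Please tell me how the weather is today? Thank you!"
terse = "How's the weather today?"

for prompt in (polite, terse):
    token_ids = enc.encode(prompt)                  # text -> list of token ids
    pieces = [enc.decode([t]) for t in token_ids]   # show each token as text
    print(f"{len(token_ids):2d} tokens: {pieces}")
```

The polite version carries several extra tokens, and at inference time each extra token means extra GPU work.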

The model then runs inference over these tokens. In general, the more tokens in the input, the more information the model must process up front, and longer inputs can also lead to longer responses, further increasing compute on the output side. Processing each token requires matrix calculations on powerful GPUs.

Let ChatGPT Calculate the “Politeness Cost”

Sam Altman's own answer was a wry "you never know…". But if we ask ChatGPT itself to estimate how much is spent each year on these polite phrases, what figure do we get?

Here is an estimation provided by ChatGPT:

Assumptions

  • Daily active users of ChatGPT (assumed): 10 million
  • Average number of dialog turns per person per day: 20 turns
  • Average occurrence of “please” or “thank you” per turn: 1 instance
  • Each “please/thank you” triggers the model to generate an additional 10 tokens (the model responds with “You’re welcome” or “Of course”)
  • Average energy consumption per token is approximately 0.00003 kWh (based on OpenAI and external research estimates)

Rough Calculation

Daily additional tokens:

10 million x 20 turns x 1 instance x 10 tokens = 2 billion tokens / day

Annual additional tokens:

2 billion x 365 ≈ 730 billion tokens / year

Annual additional energy consumption:

730 billion x 0.00003 kWh ≈ 21,900,000 kWh / year

Converted to electricity cost (estimated at $0.10 per kWh):

21.9 million kWh x $0.10 ≈ $2,190,000 / year

Under the assumption of 10 million daily active users (the actual figure is now around 20-30 million), these polite phrases alone would add roughly $2 million per year in electricity costs. Scaled to the real user base, and including GPU hardware, cooling, and other data-center overhead beyond raw electricity, the tens of millions of dollars Altman cited becomes plausible.
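For readers who want to adjust the assumptions, here is a minimal sketch of the same back-of-the-envelope arithmetic in Python. Every input value is an assumption carried over from the list above, not a measured figure.

```python
# Back-of-the-envelope "politeness cost" estimate.
# All inputs are assumptions from the article, not measured values.
daily_active_users = 10_000_000      # assumed DAU
turns_per_user_per_day = 20          # dialogue turns per user per day
polite_phrases_per_turn = 1          # "please"/"thank you" per turn
extra_tokens_per_phrase = 10         # extra tokens generated in response
kwh_per_token = 0.00003              # assumed energy per token
usd_per_kwh = 0.10                   # assumed electricity price

extra_tokens_per_day = (daily_active_users * turns_per_user_per_day
                        * polite_phrases_per_turn * extra_tokens_per_phrase)
extra_tokens_per_year = extra_tokens_per_day * 365
kwh_per_year = extra_tokens_per_year * kwh_per_token
usd_per_year = kwh_per_year * usd_per_kwh

print(f"{extra_tokens_per_day:,.0f} extra tokens/day")     # 2,000,000,000
print(f"{extra_tokens_per_year:,.0f} extra tokens/year")   # 730,000,000,000
print(f"{kwh_per_year:,.0f} kWh/year")                      # 21,900,000
print(f"${usd_per_year:,.0f}/year in electricity")          # $2,190,000
```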

The Payoff of Politeness

Many users responded to Altman's comments by pointing out that, cost aside, people naturally project their politeness norms onto artificial intelligence. ChatGPT's smooth, natural conversational style makes it easy for users to subconsciously treat it as a human-like conversational partner, and polite habits follow.

Human society has deep-rooted norms for communication, and asking people to abandon them when talking to a machine carries its own cost, so users tend to follow them unconsciously. There is also a practical upside: more polite prompts can elicit friendlier or more cooperative responses, producing answers that better satisfy the user.

In summary, Altman's remark that user politeness costs ChatGPT millions of dollars is an intriguing one. It reveals the staggering computational cost behind large-scale AI services and the pressure on providers to keep supplying compute, while user expectations keep pushing the models to feel more natural. In the era of artificial intelligence and large language models, consumers may well be the biggest winners.

Related Reports

  • GPT-5 Delayed! OpenAI First Launches o3 and o4-Mini, Sam Altman Reveals: Integration is More Challenging Than Expected
  • OpenAI Announces Major Development: Agents SDK Open to Support MCP, Linking Everything a Key Step Forward
  • OpenAI Unlocks Deep Research: Paid Users Can Query 10 Times Per Month; Microsoft Releases Multimodal AI Agent Magma