source · blog
Three Observations
src_altman_three_observations
https://blog.samaltman.com/three-observations
reliability 0.40
authors: Sam Altman
published: 2025-02
accessed: 2026-04-19
Notes
Personal blog post by OpenAI CEO Sam Altman. Primary testimony regarding his own views, but advocacy-coded on economic claims. Slight upward adjustment from the blog prior (0.35) because Altman is a primary source on OpenAI's framing and roadmap.
Intake provenance
- method: httpx
- tool: afls-ingest/0.0.1
- git sha: 604c9dfd252a
- at: 2026-04-19T18:50:11.272955Z
- sha256: b7809c2770a6…
Evidence from this source (5)
- weight: 0.90
method: primary_testimony · locator: paragraphs on policy and empowerment
“directionally, as we get closer to achieving AGI, we believe that trending more towards individual empowerment is important; the other likely path we can see is AI being used by authoritarian governments to control their population through mass surveillance and loss of autonomy.”
- weight: 0.75
method: primary_testimony · locator: Observation 2
“The cost to use a given level of AI falls about 10x every 12 months... the price per token dropped about 150x in that time period.”
- weight: 0.70
method: expert_estimate · locator: Observation 3
“The socioeconomic value of linearly increasing intelligence is super-exponential in nature. A consequence of this is that we see no reason for exponentially increasing investment to stop in the near future.”
- weight: 0.70
method: expert_estimate · locator: Observation 1
“The intelligence of an AI model roughly equals the log of the resources used to train and run it... the scaling laws that predict this are accurate over many orders of magnitude.”
- weight: 0.90
method: primary_testimony · locator: closing paragraphs on distribution
“the balance of power between capital and labor could easily get messed up, and this may require early intervention. We are open to strange-sounding ideas like giving some 'compute budget' to enable everyone on Earth to use a lot of AI.”