A $20/month user costs OpenAI $65 in compute. AI video is a money furnace

(aedelon777.substack.com)

35 points | by Aedelon 2 hours ago

5 comments

  • pjdesno 1 hour ago
    Look up the Osborne 1, the first "portable" (i.e. luggable) computer. They went out of business not only because they lost money on each unit, but because of how many they sold. Then they pre-announced their next model, which killed all demand for the existing one, and they were toast.
    • ssl-3 1 hour ago
      It's a fascinating story, but is it really related?

      IIRC, they were making decent-enough profits with the Osborne 1 at the beginning. It was never intended to be a loss-leader.

      It was only after the Osborne 2 was announced (way too early) that existing orders got cancelled, and inventory was sold at fire-sale prices in sheer desperation to generate any value from the well they'd accidentally poisoned.

      (For those who don't know, the company imploded before the Osborne 2 was finished.)

  • asdff 29 minutes ago
    Sounds like a great way for an AI company to kill off a competing AI company. You could probably do this "organically": take your $20/mo user, use that money directly to buy that user a subscription to the competitor's product, and serve them a wrapper.

    Not sure if it would work, but it would at least have been a great plot for Silicon Valley if that show were still around.

  • davikr 38 minutes ago
    A $20/month Codex user probably costs thousands of dollars in compute (across multiple providers). I think they just want to weather it until compute is eventually cheap enough; it has to be, otherwise it'll always be unsustainable.
    • asdff 25 minutes ago
      Or the alternative long con: lock in enterprise customers and raise prices. Seriously that is the golden goose, big institutions will just buy like lemmings even if they are already buying the redundant competing product.
  • motbus3 1 hour ago
    Some friends and I did a "just for fun" calculation on what price AI should really have, using some of our business and infrastructure experience.

    The three of us have a decent number of years in adjacent fields; still, this is more of a "trust me bro" comment. Anyway, we came to a subscription price of 120-150 USD/mo, and we did this 6 months ago, when the world wasn't yet the chaos it is right now. If those numbers had to be adjusted, a quick calculation would already put it close to the 200 USD/mo mark, so there'd be a decent margin after taxes.

    That said, if we are anywhere close to correct on this, I think increasing the price of the product by 10x would drastically reduce the number of users, which would in turn drastically reduce the hardware required.

    And even if we are off by a factor of two, a 5x price increase would cause a similar effect.

    My speculation is that it needs to be cheap because they need as much human-generated content as possible, as they are running out of data and the models have plateaued. We don't see models getting 10x smarter anymore, and maybe we're seeing them get smaller or more specialised.

    Ofc, disruptive research might come up, but my guess is that this price is both an incentive and a requirement for this business to not break apart.

  • Terr_ 2 hours ago
    > generating a 10-second AI video costs roughly 160 times more than generating an equivalent amount of text

    Hold up, "equivalent" how? It can't be based on the "cost" of generation, or else it would be a 1x factor by definition. Perhaps "costs" in this case refers to the unprofitable gap between revenues and expenses?

    > Table 2

    Weird, so it looks like some person just arbitrarily decided that 1K GPT-4 text tokens "is equivalent to" 10s of Sora 2 video?

    That doesn't seem very rigorous.

    • motbus3 1 hour ago
      Let me type and think

      (I put it in Gemini for English translation.) The 1080p, most expensive tier is $0.70 per second. Since Sora 2 runs at 30 FPS, each second of video costs roughly 2.3 cents per frame.

      A single 1920x1080 static image is 765 tokens, but video models use spacetime compression. Instead of a raw 22,950 tokens per second (765 tokens x 30 frames), a second of 1080p video equates to roughly 10,000 "latent tokens" due to temporal redundancy. Adding 20 tokens per second of audio, we get roughly 10,020 tokens per second of output.

      At $0.70 per second for ~10,020 tokens, Sora 2 costs approximately $0.00007 per token, so 10 seconds of video would cost $7.00 for roughly 100,200 tokens. In comparison, GPT-5.4-pro at $15 per 1M output tokens costs $0.000015 per token, so generating 100,200 tokens of text would cost only about $1.50. That puts Sora 2 at roughly 4.6x more expensive than GPT-5.4-pro per token generated.

      However, if we ignore the compression and price all 22,950 raw tokens per second, the per-token cost comes to about $0.00003, still roughly 2x GPT-5.4-pro.
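      A minimal Python sketch of that arithmetic (the per-second price, the 765-token image figure, the 10,000 latent-token rate, and the GPT-5.4-pro price are all the comment's assumptions, not published numbers):

```python
# Back-of-the-envelope check of the Sora-vs-text cost comparison above.
# Every input here is the comment's assumption, not a published figure.

SORA_PRICE_PER_SEC = 0.70       # USD per second, 1080p top tier (assumed)
FPS = 30
TOKENS_PER_1080P_IMAGE = 765    # assumed tokens for one static frame
LATENT_TOKENS_PER_SEC = 10_000  # after spacetime compression (assumed)
AUDIO_TOKENS_PER_SEC = 20
TEXT_PRICE_PER_M = 15.0         # USD per 1M output tokens (assumed)

raw_tokens_per_sec = TOKENS_PER_1080P_IMAGE * FPS                     # 22,950
video_tokens_per_sec = LATENT_TOKENS_PER_SEC + AUDIO_TOKENS_PER_SEC   # 10,020

video_cost_per_token = SORA_PRICE_PER_SEC / video_tokens_per_sec
text_cost_per_token = TEXT_PRICE_PER_M / 1_000_000

ten_sec_tokens = 10 * video_tokens_per_sec          # tokens in 10s of video
ten_sec_video_cost = 10 * SORA_PRICE_PER_SEC        # what Sora charges
same_tokens_as_text = ten_sec_tokens * text_cost_per_token

ratio = video_cost_per_token / text_cost_per_token  # latent-token comparison
raw_cost_per_token = SORA_PRICE_PER_SEC / raw_tokens_per_sec
raw_ratio = raw_cost_per_token / text_cost_per_token  # uncompressed comparison

print(f"video: ${video_cost_per_token:.6f}/token, text: ${text_cost_per_token:.6f}/token")
print(f"10s of video: ${ten_sec_video_cost:.2f} vs ${same_tokens_as_text:.2f} as text")
print(f"latent-token ratio: {ratio:.1f}x, raw-token ratio: {raw_ratio:.1f}x")
```

      On these inputs the latent-token comparison lands near 4.7x and the raw-token one near 2x; the spread shows how much the answer depends on which token equivalence you pick.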

    • PaulHoule 2 hours ago
      Well I guess you could say there is some amount of text that entertains you as much as a 10s Sora video. Judged in terms of time a fast reader might read 50 words in 10s and that is what, 100 tokens? If somebody wants to fudge that up by a factor of 10 (picture is worth a thousand words or something) you get where they are.

      Now personally I am not entertained by motion-for-the-sake-of-motion Instagram reels; they actually make me queasy despite having a cast-iron stomach and having taught myself not to get sick in VR. So if that's 10s of entertainment, leave me out. I don't care if Tom Cruise is whaling on Brad Pitt, or the other way around for that matter, but boy do I want to see the body thetans burst out of Cruise's body when OT III goes horribly wrong.

      My reaction to the article was funny. I mean, I saw that 160x thing and thought it was bogus, and of course it is all AI generated and poorly formatted to boot but I did like the overall message. It does remind me of the early 2010s when a lot of sites with photo-based content (including mine) were going out of business because the revenue wasn't enough to pay the hosting costs and a few newcomers like Instagram were survivors and Google was obviously cleaning up with video on YouTube. From the viewpoint of business models for AI video I think there are two questions:

      (i) how many times can you get people to watch the same video, i mean, no matter how expensive it is, if you get enough views/ad impressions/other revenue you are OK

      (ii) how does it compete with some other way to generate the video?

      The picture that the $20 subscription costs $65 to serve doesn't sound too crazy to me. I mean, there might be somebody who gets 3x the value out of a 10s Sora video that somebody else doesn't, or the cost could come down by a factor of 3.

    • trillic 2 hours ago
      It's a well known fact that 1 Picture == 1000 words.
      • goodmythical 55 minutes ago
        I've often used this in silly pseudo-proofs demonstrating that words have little to no value.

        Given that a picture is worth 1000 words, a film (being a string of pictures) at 24fps is 129,600 pictures in 90 minutes, i.e. about 129.6 million words; if viewing a film costs $15, a word can be rented for roughly $0.000000116, or about 86,400 words per penny.

        Paperback novels, by contrast, charge about a thousand times more per word: 70k words for a little over $8, or 100k words for just under $12, works out to roughly $0.000116 per word.
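        Checking the joke arithmetic (ticket price, fps, and the 1000-words-per-picture rate are all the thread's playful assumptions):

```python
# Pricing words via movie tickets, per the pseudo-proof above.
# All inputs are the thread's assumptions, not real data.

FPS = 24
MINUTES = 90
TICKET_USD = 15.0
WORDS_PER_PICTURE = 1000

pictures = FPS * 60 * MINUTES            # frames shown in a 90-minute film
words = pictures * WORDS_PER_PICTURE     # "words" at 1000 per picture

usd_per_picture = TICKET_USD / pictures  # cost to rent one picture
usd_per_word = TICKET_USD / words        # cost to rent one "word"
words_per_penny = 0.01 / usd_per_word

print(f"{pictures} pictures, ${usd_per_picture:.6f}/picture")
print(f"${usd_per_word:.9f}/word, ~{words_per_penny:,.0f} words per penny")
```

        Note the per-picture rate (~$0.000116) is a thousand times the per-word rate, which is where this kind of back-of-the-envelope math is easiest to slip on.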

        That said, I have nothing but the vaguest sense of what an average movie or book costs these days. Are movies $15? Does walmart still have the $5 bin?

        What about books? I know that the last time I was in a book store I was somewhat shocked by the prices but that was years ago.

        Although, the local used goods store probably still sells both media for $1/ea. If that's the case, there's an easy frugality argument: at $1 each, the 90-minute movie is worth ~130 million words against most novels topping out under 100k.

      • CrzyLngPwd 1 hour ago
        30 pictures a second for reasonable video, haha

        Just burn money.
