It's cheap now. But if you take all the training costs into account, then at these prices they cannot make a profit in any way. This is called dumping to capture the market.

infecto
No doubt the complete cost of training and of getting to where we are today has been significant, and I don't know how the accounting will look years from now, but you are just making up the rest based on feelings. We know OpenAI is operationally profitable on purely the runtime side; nobody knows how that will look when accounting for R&D, but you have no qualification to say they cannot make a profit in any way.
guappa
Except they have to retrain constantly, so why would you not consider the cost of training?
infecto
In the medium to long term that R&D matters. In the short term it's not as important a metric. I absolutely agree that from an underwriting perspective one would ideally consider those costs, but I also think it's dishonest to simply say they are bleeding money, end of story.

They don't have to retrain constantly, and that's where opinions like yours fall short. I don't believe anyone has a concrete vision of the economics in the medium to long term. It's biased ignorance to hold a strong position in either the down or the up case.

NoOn3 OP
Yes, if you do not take into account the cost of training, I think it is very likely profitable. The cost of running the models is not that high. This is just my opinion based on open models, and I admit that I have not carried out accurate calculations.
diggan
> But if you take into account all the training costs

Not everyone has to pay that cost, as some companies release weights for download and local use (like Llama), and others go even further and release open source models + weights (like OLMo). If you're a provider hosting those, I don't think it makes sense to take the training cost into account when planning your own infrastructure.

Although I don't think it makes much sense personally, it seemingly makes sense for other companies.

dist-epoch
There is no "capture" here, it's trivial to switch LLM/providers, they all use OpenAI API. It's literally a URL change.
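
For illustration only: with the OpenAI Python SDK, a provider switch can literally be a base_url change, assuming the other provider exposes an OpenAI-compatible endpoint. The URL and model names below are placeholders, not vouched-for values.

```python
# Sketch: switching between OpenAI-compatible providers by changing base_url.
from openai import OpenAI

# Provider A: OpenAI itself.
client = OpenAI(api_key="sk-...", base_url="https://api.openai.com/v1")

# Provider B: any OpenAI-compatible host -- same code, different URL and model.
# client = OpenAI(api_key="...", base_url="https://llm.example.com/v1")

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # swap for the other provider's model name
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```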
jamessinghal
This is changing; OpenAI's newer API (Responses) is required to carry reasoning tokens in the context across calls, to get reasoning summaries, and to use some of the OpenAI-provided tools. Google's OpenAI compatibility layer supports Chat Completions, not Responses.

As LLM developers continue to add unique features to their APIs, the shared API (currently OpenAI's) will only support the minimal common subset, and many will probably deprecate their compatibility APIs. Devs will have to rely on SDKs to offer compatibility.
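
A rough sketch of the divergence being described, using the OpenAI Python SDK; the model names are placeholders and the reasoning options reflect my understanding of the Responses-only features, so treat them as assumptions rather than a spec.

```python
from openai import OpenAI

client = OpenAI()

# Chat Completions: the shape most "OpenAI-compatible" providers emulate.
chat = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize this thread."}],
)
print(chat.choices[0].message.content)

# Responses: the newer shape, needed for reasoning summaries and some
# built-in tools; most compatibility layers do not cover it.
resp = client.responses.create(
    model="o4-mini",  # placeholder reasoning model
    input="Summarize this thread.",
    reasoning={"effort": "medium", "summary": "auto"},  # no Chat Completions equivalent
)
print(resp.output_text)
```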

dist-epoch
It's still trivial to map to a somewhat different API. Google has its Vertex/GenAI API flavors.

At least for now, LLM APIs are just JSONs with a bunch of prompts/responses in them and maybe some file URLs/IDs.
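
As a sketch of the kind of JSON-to-JSON mapping meant here, this converts OpenAI-style chat messages into a Gemini-style generateContent body; the field names follow the public Gemini REST docs as I recall them, so treat them as illustrative rather than authoritative.

```python
# Map OpenAI-style messages to a Gemini-style request body (illustrative).
def openai_to_gemini(messages: list[dict]) -> dict:
    contents = []
    system_parts = []
    for m in messages:
        if m["role"] == "system":
            # Gemini keeps system text out of the turn list.
            system_parts.append({"text": m["content"]})
        else:
            contents.append({
                "role": "model" if m["role"] == "assistant" else "user",
                "parts": [{"text": m["content"]}],
            })
    body = {"contents": contents}
    if system_parts:
        body["systemInstruction"] = {"parts": system_parts}
    return body
```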

jamessinghal
It isn't necessarily difficult, but it's significantly more effort than swapping a URL, which is what I was originally replying to.
lelanthran
> There is no "capture" here, it's trivial to switch LLM/providers, they all use OpenAI API. It's literally a URL change.

So? That's true for search as well, and yet Google has been top-dog for decades in spite of having worse results and a poorer interface than almost all of the competition.
