DeepSeek launched a free, open-source large language model in late December, claiming it was trained in about two months at a compute cost of under $6 million.
Maybe? Depends on what costs dominate operations. I imagine electricity in China is cheap, but building new data centres there is likely much cheaper, percentage-wise, than in countries like the US.
That’s becoming less true. The cost of inference has been rising with bigger models, and even more so with “reasoning models”.
Regardless, at the scale of 100M users, big one-off costs start looking small.
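To make that concrete, here's a back-of-the-envelope sketch: the $6M and 100M figures come from the thread above, while the per-user inference cost is a purely made-up assumption for illustration.

```python
# Amortising a one-off training cost across a large user base vs. an ongoing inference bill.
# The training cost and user count echo the numbers in this thread; the inference figure is invented.

ONE_OFF_TRAINING_COST_USD = 6_000_000            # headline claim, paid once
USERS = 100_000_000                              # scale mentioned above
ASSUMED_INFERENCE_COST_PER_USER_MONTH = 0.50     # hypothetical ongoing cost per user

training_cost_per_user = ONE_OFF_TRAINING_COST_USD / USERS
monthly_inference_bill = ASSUMED_INFERENCE_COST_PER_USER_MONTH * USERS

print(f"One-off training cost per user: ${training_cost_per_user:.2f}")    # ~$0.06, once
print(f"Assumed monthly inference bill: ${monthly_inference_bill:,.0f}")    # $50,000,000, recurring
```

Under those (invented) serving numbers, the one-off training spend is dwarfed within days by recurring inference costs, which is why the per-user cost of serving matters more at that scale.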
But I’d imagine any Chinese operator would handle that kind of scale much better? Or not?