DeepSeek launched a free, open-source large language model in late December, claiming it was developed in just two months at a cost of under $6 million.
Honestly, none of this means much yet. It might be some calculated move from China to give the finger to Nvidia, or to Biden, or to Trump's AI infrastructure announcement from a few days ago, or it might have some other motive entirely.
Maybe this "selloff" is engineered by the big Wall Street players (who work hand-in-hand with investor-friendly media) to panic retail investors so they can scoop up shares at a discount.
What I do know is that "AI" is a very fast-moving field, and stuff that was true a few months ago might not be true tomorrow - no one has a crystal ball, so we all just have to wait and see.
There could be some trickery on the training side, e.g. maybe they spent far more than $6M to train it.
But it is clear that they did it without access to the infrastructure that big tech has.
On the inference side, though, there is no trickery: anyone can verify how well the model runs, and people are running it locally without internet access.
Their hosted API is roughly 20x cheaper than OpenAI's, and if you run the model yourself, you only need a modest investment in relatively affordable hardware.
Put that statement in front of investors who aren't especially technical, and it could easily spook them into a sell-off.
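The "20x cheaper" claim above is easy to sanity-check with back-of-the-envelope arithmetic. The prices in this sketch are illustrative assumptions, not verified list prices - check each provider's pricing page for current numbers:

```python
# Back-of-the-envelope check on the "20x cheaper" claim.
# Prices below are illustrative ASSUMPTIONS (USD per 1M output tokens),
# not verified list prices -- consult each provider's pricing page.
OPENAI_PRICE_PER_M = 60.00    # assumed: a frontier OpenAI model
DEEPSEEK_PRICE_PER_M = 3.00   # assumed: DeepSeek's hosted API

def api_cost(output_tokens: int, price_per_m: float) -> float:
    """USD cost to generate `output_tokens` tokens at `price_per_m` USD per 1M tokens."""
    return output_tokens / 1_000_000 * price_per_m

ratio = OPENAI_PRICE_PER_M / DEEPSEEK_PRICE_PER_M
print(f"DeepSeek is ~{ratio:.0f}x cheaper at these assumed prices")
print(f"1M output tokens: ${api_cost(1_000_000, OPENAI_PRICE_PER_M):.2f} vs "
      f"${api_cost(1_000_000, DEEPSEEK_PRICE_PER_M):.2f}")
```

The exact multiple shifts with every pricing update, but the point stands: the gap is an order of magnitude, not a rounding error.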