How much of a discount are you expecting to start gaming on a 30k card with no video output?
Not for gaming, for running open source AI models and other AI shenanigans. My 4080 Super has been filling my gaming needs and will for years to come, but it’s not enough for my AI interests lol
The most I can get out of this 4080 is running a ~7B param model, but I want to run cooler shit like that new open source DeepSeek v3 that dropped the other day.
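Rough back-of-the-envelope sketch of why 16 GB tops out around a 7B model and why DeepSeek V3 class stuff is a whole different league (the per-parameter sizes and overhead factor below are ballpark assumptions, not measured numbers):

```python
# Ballpark VRAM math only; real usage also depends on context length,
# KV cache and runtime overhead.

def est_vram_gb(params_billions: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Weights * per-parameter size * fudge factor, in GB."""
    return params_billions * bytes_per_param * overhead

for name, params, bpp in [
    ("7B @ FP16", 7, 2.0),                     # ~17 GB: already too tight for a 16 GB 4080 Super
    ("7B @ 4-bit", 7, 0.5),                    # ~4 GB: comfortable
    ("DeepSeek V3 (671B) @ 4-bit", 671, 0.5),  # ~400 GB: multi-GPU / server territory
]:
    print(f"{name}: ~{est_vram_gb(params, bpp):.0f} GB")
```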
So you’re waiting for the AI bubble to burst because you can’t wait to run all the cool new AI models?
Yea, the underlying tech is what interests me and I have a few potential use cases. Use cases that I would never entrust a random company with. For example, the concept of MS Recall is cool, I’d never trust Microshit’s implementation though. But an open source local version where I’m in control of all the security implementation? Hell yea lol
That’s the problem. If the use case is super cool, and 99% of people have no knowledge (or motivation) to set it up for themselves, the online services will keep existing, and the bubble won’t really burst.
Even if some single companies fail (and they will, there are some truly horrendous ideas getting funding), the big players will buy the GPUs wholesale before they hit eBay.
Lol no, I mean it would be a bubble if it didn’t provide anything useful, or transformative, but that’s far from the truth.
Like it or not, even LLMs have been found to help in health treatments, mental support, workplace efficiency and so on
AI is here to stay, it’s basically the next industrial revolution
The problem is that the enterprise level cards can’t really perform at the consumer market level nor are they designed for it. Many don’t even have video outputs.
Nuclear energy could also be a weird side effect of the bubble.
I believe it is likely that there will be a burst at some point, just as with the dot-com burst.
But I think many people wrongly think that it will be the end of or a major setback for AI.
I see no reason why in twenty years AI won’t be as prevalent as “dot-coms” are now.
I agree, history always repeats itself. But perhaps the timing is different; it could be 20 years, 10 years, or 50 years, who knows.
Some current directions in AI, such as LLMs, seem to be dead ends, in the sense that those approaches can’t be incrementally improved much further to, for example, eliminate hallucinations, or to combine logic with those probability engines in a way that at the very least excludes the logically impossible from the results.
The dot-com stuff, on the other hand, was the very first bubble from the very first wave into a whole new technological direction that had just been unlocked and gave access to an entire branch of new ways of doing things - it was the result of the very first wave of investment around the technology domain of worldwide digital communications and all the other tech branches that became possible because of it.
Basically the Internet was like opening a door to various new areas of Tech (curiously, it wasn’t even all that amazingly complex as Tech goes, kind of like how a basic wheel isn’t exactly complicated, but look at all that became possible with its invention), whilst the current AI wave (mainly the latest wave of work in the branch of Neural Networks, which is over 3 decades old) is more like a handful of massively complicated solutions that are the product of decades of work in a specific direction, some of which work in such a way that they can’t be significantly improved further and hence can’t get past certain problems they have (the most obvious example being LLM hallucinations).
So whilst I do think that in 20 years there will be some prevalence of AI tech companies in the domains where this wave’s AI solutions work well enough (say, entity detection in images), I don’t think it will be anywhere comparable to what happened in the 20 years following the start of the new Tech Age that was triggered by the Internet.
Mind you, 2 decades is a lot of time in Tech terms, so maybe somebody will come up with a whole different approach to AI in the meanwhile that breaks through the inherent limitations of the current one, just don’t count on it.
Edit: just wanted to add that I was there when ARPANET morphed into the Internet and the dot-com bubble that came out of it. At the time everybody and their dog was playing around with making websites, people were trying new stuff on top of those websites, inventing new comms protocols, writing programs that talked to other programs over the network, creating entirely new business models anchored on making a website a storefront - the Internet was Freedom. This AI wave doesn’t feel at all like that - sure, plenty of people are deploying models created by others and trying them out, but very few are creating new models, and a lot of that tech comes pre-monetised and locked down by large companies who are trying to get money out of anything people do with it - the whole thing is not at all like the “we’ve opened this whole new domain, you guys figure out what to do with it” spirit that was the birth of the Internet.
You don’t want a used GPU that’s been running overclocked for years on end bro.
Multiple outlets including LTT and Gamers Nexus have debunked this.
The only thing you may have to do if you notice unusual performance is reapply thermal paste to the GPU, and that’s only because most thermal paste will dry out after years of sitting around or being used
The price of GPUs would go down as there would be less demand.
Aw, I was hoping the whole thing would rhyme after the first line.
GPU prices are gonna get cheaper, annnnyyyy day now folks, any day now
That’s what I’ve been saying since 2020; people don’t have patience any more.
I dumped my old af GPU for more than I paid for it because people have no chill.
I will sell my Polaris for a ridiculous amount of money. I will sell my Polaris for a ridiculous amount of money. I will sell my Polaris for a ridiculous amount of money.
Manifesting 🙏🙏🙏🙏
Just deserts.
Just deserts.
We aint found shit
What did OP mean by that?
Onlyflans
Just desserts
There may be a dip in prices for a bit, but since covid, more companies have realized they can get away with manufacturing fewer units and selling them for a higher price.
Same goes for the death of windows 10. I want me some cheap Linux boxes.
deleted by creator
Oh but I do, ironically, for the same use cases LMAO. I like to tinker with AI and I like Microshit’s concept of Recall and similar ideas, like having an AI to be able to search through all my documents with nothing but a sentence or idea of what I’m looking for
But ain’t no fucking way I’m going to give a closed source AI that I’m not running myself that level of access
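For what it’s worth, the local version of that search idea doesn’t take much. A minimal sketch, assuming the sentence-transformers package and its all-MiniLM-L6-v2 model; the docs dict is placeholder data standing in for real files:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs fine on CPU

# Placeholder corpus; in practice you'd index the text of your actual files.
docs = {
    "taxes_2023.txt": "Notes on 2023 tax deductions and receipts.",
    "trip_plan.md": "Packing list and itinerary for the hiking trip.",
    "server_notes.md": "How I set up the home server and its backups.",
}
doc_emb = model.encode(list(docs.values()), convert_to_tensor=True)

def search(query: str, top_k: int = 2):
    """Rank documents by cosine similarity to a natural-language query."""
    q_emb = model.encode(query, convert_to_tensor=True)
    scores = util.cos_sim(q_emb, doc_emb)[0]
    return sorted(zip(docs, scores.tolist()), key=lambda x: -x[1])[:top_k]

# "a sentence or idea of what I'm looking for", no exact keywords needed
print(search("that file about money I can get back from the government"))
```

Nothing leaves the machine, which is the whole point.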
to be able to search through all my documents with nothing but a sentence or idea of what I’m looking for
Like this?
Personally, I want to try to have a Llama or something rewrite voice-to-text prompts into Home Assistant commands. Should be very cool.
But yeah, I won’t pay the current prices to run it either, nor use the cloud.
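A minimal sketch of that pipeline, assuming an Ollama server on localhost with a llama3 model tag and a Home Assistant long-lived access token; the URLs and entity names are placeholders:

```python
import json
import requests

HA_URL = "http://homeassistant.local:8123"   # placeholder
HA_TOKEN = "YOUR_LONG_LIVED_TOKEN"           # placeholder

PROMPT = """Turn the user's request into a Home Assistant call.
Reply with JSON only, like {{"domain": "light", "service": "turn_on", "entity_id": "light.kitchen"}}.
Request: {text}"""

def to_command(text: str) -> dict:
    """Ask a local model (via Ollama's REST API) to emit a service call as JSON."""
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": PROMPT.format(text=text), "stream": False},
        timeout=60,
    )
    return json.loads(r.json()["response"])

def call_ha(cmd: dict) -> None:
    """Fire the service call against Home Assistant's REST API."""
    requests.post(
        f"{HA_URL}/api/services/{cmd['domain']}/{cmd['service']}",
        headers={"Authorization": f"Bearer {HA_TOKEN}"},
        json={"entity_id": cmd["entity_id"]},
        timeout=10,
    )

# e.g. from a voice-to-text transcript:
call_ha(to_command("it's way too dark in the kitchen"))
```

In practice you’d want to validate the JSON before firing it, since small models don’t always follow the “JSON only” instruction.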
When the price of those drops, the price of the ones that weren’t used for that purpose will also drop.
Why not, though? Does the silicon really age?
Most crypto mining outfits undervolt their cards for lower power usage. They aren’t cranking them as you say they are. A dead GPU doesn’t produce anything for you; cranking it up increases the chance that it will fail. You’re better off running it an extra 4 years at a lower voltage than cranking it for 1.
I thought the efficiency curve for GPUs peaked before 100%. If electricity is your primary cost, driving the GPUs at lower loads saves money.
So you might end up with GPUs that spent their entire life at a steady 80% load or something.
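Toy numbers (completely invented) to illustrate why that steady sub-100% setting wins once electricity is the main cost:

```python
POWER_PRICE = 0.15        # $/kWh, assumed
EARN_PER_MHS_DAY = 0.02   # $ earned per MH/s per day, assumed

# (setting, watts drawn, hashrate in MH/s) - invented curve where hashrate
# scales sub-linearly with power
profiles = [
    ("60% power limit", 180, 55),
    ("80% power limit", 240, 60),
    ("100% power limit", 300, 62),
]

for name, watts, mhs in profiles:
    revenue = mhs * EARN_PER_MHS_DAY
    power_cost = watts / 1000 * 24 * POWER_PRICE
    print(f"{name}: ${revenue - power_cost:.2f}/day profit, {mhs / watts:.2f} MH/s per W")
```

With numbers like these, the lowest power limit is the most profitable per day, which is exactly why the cards sit at a steady partial load.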
This was my understanding as well - that miners often underclock their GPUs rather than overclock them.
Yes.
They don’t exactly age, but top-of-the-line chips push very large currents through very small conductors. When you do that with DC current, your conductors deform over time, up to the point that they stop working correctly.
That said, you probably can get plenty of casual use out of them.
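What’s being described is essentially electromigration. Black’s equation is the usual rule of thumb for how it scales with current density and temperature; the constants below are illustrative placeholders, not values for any real chip:

```python
import math

K_BOLTZMANN = 8.617e-5  # eV/K

def mttf(current_density: float, temp_k: float, a: float = 1.0, n: float = 2.0, ea: float = 0.9) -> float:
    """Black's equation: MTTF = A * J^-n * exp(Ea / (k*T)), in arbitrary units."""
    return a * current_density ** -n * math.exp(ea / (K_BOLTZMANN * temp_k))

baseline = mttf(current_density=1.0, temp_k=350)
pushed = mttf(current_density=1.3, temp_k=370)   # ~30% more current, ~20 K hotter

print(f"relative lifetime when pushed: {pushed / baseline:.2f}x")  # roughly 0.1x
```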
Supposedly they do but I’ve had surprisingly good luck with used GPUs from ebay. I’m good with warning others against buying used GPUs on ebay though because then costs will stay lower for me.
Have you seen the price of new GPUs? Sure ya do. Maybe they only last a few years. That’s alright.
Unfortunately, this time around the majority of the AI build-up is GPUs that are likely difficult to accommodate in a random build.
If you want a GPU for graphics, well, many of them don’t even have video ports.
If your use case doesn’t need those, well, you might not be able to reasonably power and cool the sorts of chips that are being bought up.
The latest wrinkle is that a lot of that overbuying is likely to go towards Grace Blackwell, which is a standalone unit. Ironically, despite being a product built around a GPU and having a video port, that video port is driven by a non-NVIDIA chip.
My use case is for my own AI plans as well as some other stuff like mass transcoding 200TB+ of…“Linux ISOs” lol. I already have power/cooling taken care of for the other servers I’m running
I’ve already got my gaming needs satisfied for years to come (probably)
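For the transcoding half, a bare-bones sketch of the batch job, assuming an ffmpeg build with NVENC; the paths are placeholders:

```python
import subprocess
from pathlib import Path

SRC = Path("/mnt/isos")        # placeholder source library
DST = Path("/mnt/isos_hevc")   # placeholder output library

for src in SRC.rglob("*.mkv"):
    dst = DST / src.relative_to(SRC)
    dst.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src),
         "-c:v", "hevc_nvenc",   # let the GPU's encoder do the heavy lifting
         "-c:a", "copy",         # leave audio untouched
         str(dst)],
        check=True,
    )
```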
Gaming can work on one GPU and display on another; Thunderbolt eGPUs do it all the time.
Crypto miners watching the AI bubble.
Silverlight, what are you doing here? Go on, get outta here!
I just want the AI hate bubble to burst, the hype bubble probably needs to go as well but honestly I care less about that
I want the hype bubble to burst because I want them to go back to making AI for useful stuff like cancer screening and stop trying to cram it into my fridge or figure out how to avoid paying their workers, and the hate bubble isn’t going to stop until that does.
“I made an AI powered banana that can experience fear of being eaten and cuss at you while you do.”
Nah the hype will just make the POP louder.