Sorry if this is a dumb question, but does anyone else feel like technology - specifically consumer tech - kinda peaked over a decade ago? I’m 37, and I remember being awed between like 2011 and 2014 by phones, voice assistants, smart home devices, and what websites were capable of. Now it seems like much of this stuff either hasn’t improved all that much, or is straight up worse than it used to be. Am I crazy? Have I just been out of the market for this stuff for too long?

  • Valmond@lemmy.world · 7 days ago

    “Moore’s law is dead” is so lame, and wrong too.

    Check out SSDs, 3D memory, GPUs…

    Just because you don’t need to upgrade doesn’t mean things aren’t getting better (they are), only that you don’t need it or don’t feel it’s making useful progress for your use case. Concluding that technology isn’t advancing because you personally don’t need the upgrades is quite the egocentric worldview IMO.

    Others do need the upgrades, like the crazy demand for processing power in AI, weather forecasting, cancer research, etc.

    • JustEnoughDucks@feddit.nl · edited · 6 days ago

      GPU advances have also slowed way, way down.

      For many years, the YoY GPU gains came from node shrinks, and (simplifying to the top-tier card) that meant huge performance gains with only slightly more power usage. The last 4-5 generations have seen the exact opposite: huge power increases scaling almost one-to-one with the performance increases. That is literally stagnation. They are also reaching the limit of node shrinkage with current silicon technology, which leads to larger dies and way more heat just to get close to the same generational performance gain.
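
      To put rough numbers on that (purely illustrative, made-up figures, not real benchmarks): if performance and board power go up by nearly the same factor each generation, performance per watt barely moves, which is the whole point.

      ```python
      # Toy sketch with hypothetical numbers -- NOT real GPU benchmark data.
      # If performance and board power rise by roughly the same factor,
      # efficiency (perf per watt) barely improves across "generations".

      generations = [
          # (label, relative performance, board power in watts) -- all made up
          ("gen N",     1.00, 250),
          ("gen N + 1", 1.45, 350),
          ("gen N + 2", 2.00, 475),
      ]

      for label, perf, power in generations:
          efficiency = perf / power
          print(f"{label}: perf x{perf:.2f}, {power} W, perf/W = {efficiency:.4f}")

      # perf/W only creeps from ~0.0040 to ~0.0042:
      # most of the "gain" comes from simply spending more power.
      ```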

      Luckily, they found other uses for GPU acceleration. Just because there is an increase in demand for a new use case does not, in any way, mean that the development of the device itself is still moving at the same pace.

      That’s like saying a chair leg has reached new heights of technological advancement because the same leg was also used as the leg of a table.

      It is a similar story with memory. Outside of a couple of niche, ultra-expensive processes made for data centers, they are literally just packing more dies onto a PCB or stacking PCBs.

      My original comment was also correct. There is a reason why >10-year-old MCUs are still used in embedded devices today. That doesn’t mean it can’t still be exciting to find novel uses for the same technology.

      Again, stagnation ≠ bad

      The area where electronics technology really has progressed quite a bit is Signal Integrity and EMC. What we know and can measure now is pretty crazy, and it enables the ultra-high frequencies and high data rates that come out in the new standards.

      This is not about pro gamer upgrades. This is about the electronics (silicon-based) industry as a whole (I am an electronics engineer).