• Barx [none/use name]@hexbear.net
    88 points · 1 month ago

    Uncritical support for these AI bros refusing to learn CS and thereby making the CS nerds that actually know stuff more employable.

    • Imnecomrade [none/use name]@hexbear.net
      6 points · edited · 1 month ago

      Unfortunately, I am experiencing the opposite effect. I am an IT contractor tasked with writing scripts, and I have been applying for dev and IT jobs with a 2-year degree with no success for the past few years, while I watch old dinosaur fucks at my job who don’t even know what functions are. They use ChatGPT to write scripts without any modifications to work with the specific clusterfuck of an environment they created, and the broken scripts run in production because we don’t even have a testing environment. Meanwhile I have to clean up their mess and get paid much less than them, with no PTO or benefits. It’s absolutely maddening to be moderately skilled in programming and witness some of the dumbest people on the planet get CS jobs through nepotism or by impressing HR, mostly the former. It seems like my workplace will hire someone if and only if they are incredibly incompetent.

      And now the team of fucking morons is taking over packaging, so the one thing in my job that gave me control to fix things and helped with my sanity is being stripped away from me, which means I get to image laptops unsuccessfully because their scripts will not work. I might as well bring a personal laptop to work and practice programming, since I am going to be sitting around waiting for shit to get fixed for weeks at a time. Can’t wait to get the fuck out of the IT field and pursue electrical engineering someday. I’m not wasting 10-20 years of my life just to get a single promotion in a field dominated by cop-worshipping, white supremacist libertarians.

      Wish I could, for once, use a real programming language in a sane environment where I am hired to actually be a developer and not an IT person with 3 odd jobs that don’t help me build any transferable experience, while getting paid for less than one of them.

      Honestly, how is it possible to unionize white-ass techbros and old techfucks when they don’t hold themselves accountable, work against each other, refuse to keep up to date in their field or learn anything new, and scream fire over non-issues so that other teams have to struggle, especially when most techbros are against unions or anything remotely left of Hitler?

      • Barx [none/use name]@hexbear.net
        3 points · 1 month ago

        Absolutely, IT and tech work is full of failing up. And it is exacerbated by generational knowledge gaps that put vaguely millennial-aged people in the “sweet spot” for competent debugging and CS knowledge. The older generations never needed to update their skills but retained positions where those skills had once been needed, then passed those positions to a younger generation so they themselves could join a massively incompetent managerial class. And zoomers etc. never had to fight much with technical issues, so they went straight to frameworks and web stuff.

        Obviously there are exceptions in all cases, but this tendency has led to a situation where the Boomer/Gen X managerial class does the incompetent things you mention, and everything works only because one or two people under them put out the fires (usually but not always in the millennial range). When still-younger people get hired to replace them (i.e. so management can pay people less), the whole thing explodes, management does its best to fail upwards again, and everyone else gets to be unemployed for a while. This can be put off for a while by accidentally hiring competent zoomers, who inevitably get sick of that shit and get a different job.

        My comment is really just joking about a silly extreme of this, which is CS students failing to learn basic CS because they think AI solves everything for them. AI basically becomes the nerd they pay to do their homework poorly. Then they will get a job, hear someone say “this is all built on linked lists,” and have no idea what anyone is talking about, despite dropping $130k on a degree that covered that topic a bunch of times.

        PS sorry you have to be in that environment, it sounds very frustrating. Nothing is worse than being responsible for fixing something that the higher-up decision-makers broke. In my experience they even blame you for not fixing it fast enough, or for not preventing the problem in the first place, even though they forced the decision over your objections. Scapegoating is rewarded in these corporate environments; it is how the nepotism fail-children protect their petty fiefdoms.

        • Imnecomrade [none/use name]@hexbear.net
          2 points · edited · 1 month ago

          My workplace for some reason wants 20 years of IT manager experience for dev jobs, and many of the incompetent higher-ups were promoted from IT despite having no relevant experience, especially none from any other company, which is why they make the stupidest decisions. Management is equally incompetent. The software team that handles patches is led by a shitty manager who creates a terrible culture that seems to make even decent people incompetent, like some brain worm. I worked with another contractor on my team who was hired for IT networking with literally no experience. I haven’t had the same luck, probably because I don’t fit in with their culture.

  • FunkyStuff [he/him]@hexbear.net
    60 points · 1 month ago

    This is simply revolutionary. I think once OpenAI adopts this in their own codebase and all queries to ChatGPT cause millions of recursive queries to ChatGPT, we will finally reach the singularity.

    • hexaflexagonbear [he/him]@hexbear.net
      24 points · 1 month ago

      There was a paper about improving LLM arithmetic a while back (spoiler: its accuracy outside of the training set is… less than 100%) and I was giggling at the thought of AI getting worse for the unexpected reason that it uses an LLM for matrix multiplication.

      • FunkyStuff [he/him]@hexbear.net
        17 points · 1 month ago

        Yeah lol, this is a weakness of LLMs that’s been very apparent since their inception. I have to wonder how different they’d be if they had the capacity to stop using the LLM as the output for a second, switch to a deterministic algorithm to handle anything logical or arithmetical, then feed that back to the LLM.
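
        A minimal sketch in C of that routing idea, assuming a hypothetical front end that checks for integer arithmetic before anything reaches a model (`try_arithmetic` and its tiny "a op b" grammar are invented for illustration; anything it can't parse would fall through to the LLM):

        ```c
        #include <stdio.h>

        /* Hypothetical pre-LLM router: recognize "a op b" integer arithmetic
           and answer it deterministically. Returns 1 and writes the result on
           success; returns 0 so the caller can fall back to the model. */
        static int try_arithmetic(const char *prompt, long *out) {
            long a, b;
            char op;
            if (sscanf(prompt, "%ld %c %ld", &a, &op, &b) != 3)
                return 0; /* not arithmetic we recognize */
            switch (op) {
                case '+': *out = a + b; return 1;
                case '-': *out = a - b; return 1;
                case '*': *out = a * b; return 1;
                default:  return 0; /* unknown operator: punt to the model */
            }
        }
        ```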

        • nightshade [they/them]@hexbear.net
          10 points · edited · 1 month ago

          I’m pretty sure some of the newer ChatGPT-like products (the consumer-facing interface, not the raw LLM) do in fact do this. They try to detect certain types of inputs (e.g. math problems or requests for the current weather), convert them into an API request to some other service, and return that result instead of LLM output. Frankly, it comes across to me as an attempt to make the “AI” seem smarter than it really is by covering up its weaknesses.

          • FunkyStuff [he/him]@hexbear.net
            3 points · 1 month ago

            Yeah, Siri has been capable of that for a long time, but my actual hope would be that, rather than just handing the user the API response, the LLM could keep operating on that response and do more with it, composing several API calls. But that’s probably prohibitively expensive to train, since you’d have to do it billions of times to get the plagiarism machine to learn how to delegate work to an API properly.

  • bdonvr
    40 points · 1 month ago

    Can we make a simulation of a CPU by replacing each transistor with an LLM instance?

    Sure it’ll take the entire world’s energy output but it’ll be bazinga af

  • WhyEssEff [she/her]@hexbear.net
    39 points · edited · 1 month ago

    let’s add full seconds of latency to malloc, with a non-deterministic result. this is a great amazing awesome idea. it’s not like we measure the processing speeds of computers in gigahertz or anything

    • WhyEssEff [she/her]@hexbear.net
      26 points · 1 month ago

      sorry every element of this application is going to have to query a third party server that might literally just undershoot it and now we have an overflow issue oops oops oops woops oh no oh fuck

      • WhyEssEff [she/her]@hexbear.net
        23 points · edited · 1 month ago

        want to run an application? better have internet fucko, the idea guys have to burn down the amazon rainforest to puzzle out the answer to the question of the meaning of life, the universe, and everything: how many bits does a 32-bit integer need to have

        • WhyEssEff [she/her]@hexbear.net
          21 points · edited · 1 month ago

          new memory leak just dropped–the geepeetee says the persistent element ‘close button’ needs a terabyte of RAM to render, the linear algebra homunculus said so, so we’re crashing your computer, you fucking nerd

          • WhyEssEff [she/her]@hexbear.net
            23 points · edited · 1 month ago

            the way I kinda know this is the product of C-Suite and not a low-level software engineer is that the syntax is mallocPlusAI and not aimalloc or gptmalloc or llmalloc.

            • WhyEssEff [she/her]@hexbear.net
              22 points · edited · 1 month ago

              and it’s malloc, why are we doing this for things we’re ultimately just putting on the heap? overshoot a little–if you don’t know already, it’s not going to be perfect no matter what. if you’re going to be this annoying about memory (which is not a bad thing) learn rust dipshit. they made a whole language about it

              • WhyEssEff [she/her]@hexbear.net
                10 points · edited · 1 month ago

                if they’re proposing it as a C stdlib-adjacent method (given they’re saying it should be an alternative to malloc [memory allocate]) it absolutely should be lowercase. plus is redundant because you just append the extra functionality to the name by concatenating it to the original name. mallocai [memory allocate ai] feels wrong, so ai should be first.

                if this method idea wasn’t an abomination in and of itself that’s how it would probably be named. it currently looks straight out of Java. and at that point why are we abbreviating malloc. why not go the distance and say largeLanguageModelQueryingMemoryAllocator
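
                For what it’s worth, the deterministic version of the joke (just “overshoot a little,” as suggested upthread) fits in a few lines. To be clear, `aimalloc` is hypothetical and exists nowhere; it’s named here only to follow the stdlib-style convention argued for above:

                ```c
                #include <stdlib.h>

                /* Parody stub: no model, no network round trip.
                   Pad the caller's size guess by 25% and move on. */
                void *aimalloc(size_t hint) {
                    return malloc(hint + hint / 4);
                }
                ```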

  • miz [any, any]@hexbear.net
    32 points · 1 month ago

    this is definitely better than having to learn the number of bytes your implementation uses to store an integer and doing some multiplication by five.
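
    For reference, the allegedly hard incantation being replaced; `sizeof` already reports how many bytes this implementation uses for an int, so no model is consulted:

    ```c
    #include <stdlib.h>

    /* allocate room for five ints, portably, with zero LLM queries */
    int *alloc_five_ints(void) {
        int *p = malloc(5 * sizeof *p);
        if (p) {
            p[0] = 1;
            p[4] = 5; /* first and last of the five slots, both usable */
        }
        return p;
    }
    ```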

  • roux [he/him, they/them]@hexbear.net
    28 points · 1 month ago

    This right here is giving me flashbacks of working with the dumbest people in existence in college because I thought I was too dumb for CS and defected to Comp Info Systems.

  • keepcarrot [she/her]@hexbear.net
    26 points · 1 month ago

    One of the things I’ve noticed is that there are people who earnestly take up CS as something they’re interested in, but every time tech booms there’s a sudden influx of people who would be B- marketing/business majors coming into computer science. Some of them even do ok, but holy shit do they say the most “I am trying to sell something and will make stuff up” things.

        • FunkyStuff [he/him]@hexbear.net
          10 points · 1 month ago

          I might need reeducation because I think that image is probably the closest thing to an appropriate usecase for LLMs I’ve seen ever.

          • Zvyozdochka [she/her, pup/pup's]@hexbear.net
            8 points · edited · 1 month ago

            I don’t think the idea itself is terrible, I can see its use cases, but implementing it in MySQL directly is extremely silly. Imagine having to run every instance of your database server on beefy hardware just to handle the prompts; instead, do the actual processing on a separate machine with the appropriate hardware, then shove the result into the database with another query when it’s done.

              • invalidusernamelol [he/him]@hexbear.net
                2 points · edited · 1 month ago

                It’s nice to be able to do something like this without having to use an ORM. Especially if you need a version of the data that’s limited to a certain character size.

                Like having a replica on the edge that serves the 100 character summaries then only loads the full 1000+ character record when a user interacts with it.

                A summary of the review is also more useful than just the first 100 characters of a review.

                If the model that’s used for that is moderately light I could see it actually using less energy over time for high volume applications if it is able to lower overall bandwidth.

                This is all assuming that the model is only run one time or on record update though and not every time the query is run…