Someone used Midjourney to AI-generate images of politicians cheating on their spouses — though the creator claims it was well-intentioned.

  • donuts@kbin.social · 1 year ago

    Frankly I’m struggling to see even a single upside to AI at this point. Shit like this fucking sucks.

        • Aatube@kbin.social · 1 year ago

          Text generation, general advice, barebones stuff, easier way to create art, easier way to create, finding patterns and detecting things

        • Scubus@sh.itjust.works · 1 year ago

          Better combat in games, massive increases in technology as a result of AI designing things, the ability to test millions of potential medications at once, guidance counseling, grief counseling, counseling in general, once it’s gotten over its hallucination issues massive increases in intellectual development, the ability to fully automate supply chains, the ultimate sword and shield combo with MAD hopefully ending all physically violent wars, and the fact that eventually anything a human can do, an AI could do better.

          • Aatube@kbin.social · edited · 1 year ago

            I think you’re overestimating its potential… leaving very critical things to a machine is not a good idea, and I’m not sure how it will test medications

            • Scubus@sh.itjust.works · 1 year ago

              We leave critical things to humans all the time. We’re just machines with a shitload more failure points. And here is how it designs and tests multiple medications at once. I don’t think that article even touches on protein folding, which is another big one.

              • Aatube@kbin.social · 1 year ago

                I understand leaving critical things to predictable machines, but very critical things like counseling (preventing people from things like suicide) and nuclear war (which should already have experienced commanders) should not be left to unpredictable machines.

    • DaniAlexander@kbin.social · 1 year ago

      AI will give opportunities to the handicapped in the very near future. Imagine a paraplegic artist or coder or sculptor who can describe their ‘vision’ to a machine. They can do that now, even with a picture. Soon they can do that with a 3D machine. I don’t mean ‘make a painting of the mountains’ but instead ‘cadmium red mixed with yellow, brush stroke in circles’.

      When you think of AI try not to think of the bad actors. Try to think of the good things that can come from it. All the worlds that will be opened up for people.

      • DessertStorms@kbin.social · 1 year ago

        I don’t disagree with the point you’re making*, but please, #SayTheWord - we are disabled, not handicapped (note that at the end of this they also discuss a shift to person-first language, as in “person with disability”, which some people do prefer, but many others, myself included, still favour simply “disabled” or “disabled person/adult/child/whatever is relevant”).

        *I will just say that disabled people currently needing, in most cases, to exchange privacy and sometimes even security for access to these new technologies, just so the companies selling these devices can make even more money, is not something we should be ok with, and we should be fighting for accessibility that isn’t dependent on profiteering, but instead on the actual will to include disabled people in society.

        • LoafyLemon@kbin.social · edited · 1 year ago

          There are lots of open source projects involving AI that you can run on your personal computer. I think the community-driven projects are heading in the right direction, but it’s completely opposite for the ones owned by corporations as they’re only driven by profit margins, not people.

          • donuts@kbin.social · 1 year ago

            The problem with “open source” in the context of AI is that the source code is a much smaller factor than the training dataset. AI companies running around and scraping everybody’s data, as if they own anything they see, is a real problem that raises massive ethical and legal concerns.

          • DessertStorms@kbin.social · 1 year ago

            That’s great (genuinely), but unfortunately having to work outside of the mainstream brings its own hurdles. This isn’t on the same level, but consider Twitter vs Mastodon or Reddit vs Lemmy: the corporate solution is shiny and easy and requires little to no effort from the end user, while the other requires a little more understanding, effort, and comfort with technology, and might not appeal, or even be known, to many. Sure, people can look it up and learn it, but that looking and learning are hurdles, and when it comes to accessibility devices, those hurdles tend to be more significantly in the way.

            To be clear, I am not trying to shit on the open source stuff, I do genuinely think it’s great, but like so many of the solutions we currently have to work with, it’s a band-aid on a cancer. We need to remove the cancer.

        • DaniAlexander@kbin.social · 1 year ago

          I apologize that my choice of language was insulting to you. I am disabled (using your word, though I grew up with and am comfortable with my own terms), so I rarely think about terminology for myself. I’ll try to remember in the future.

          As for your point, while I do see a problem with excess profits on the backs of other people, I also realize that innovation does not come for free. However, you should probably look at open source AI. It is one of the fastest growing areas. I think if you are concerned about privacy and profits, it would do you good to work with campaigns that are trying to get legislation passed in this area.

          • DessertStorms@kbin.social · 1 year ago

            No worries, I wasn’t personally insulted, I just think the words we use are important. Here is a good piece that talks more about it.

            And thank you for the advice. I agree there are some smaller solutions coming through, but I worry that the environment they exist in (capitalism that already looks to exploit, with ableism on top of that) won’t allow them to become viable solutions. I think the problem is not one that can be solved with legislation; it (not just AI but the system it and we exist under) is a much larger problem that needs a much bigger solution, and that’s to abolish it and build better.

      • donuts@kbin.social · 1 year ago

        Unfortunately AI “art” is almost exclusively bad actors, using massive datasets of scraped and stolen work without consent, copyright, or license. It doesn’t have to be that way, and hopefully in the very near future the ethics and legality will be clarified and things will change, but right now AI “art” is simply plagiarism on an unprecedented, industrialized scale.

        By the way, there are quite a lot of disabled artists around today.

      • Machinist3359@kbin.social · 1 year ago

        You may be right in some ways, but I’d encourage you (or anyone) not to use theoretical disabled people as counterpoints. Ideally, cite something someone has said instead.

        I understand the impulse, but doing so often makes people sound more disabled than they are today and puts words in the community’s mouth.

        There are paraplegics writing and creating art today. There is a long list of needs they have from society which precedes AI assistance.

        More nefarious people (not saying you, to be clear) also do this to veil shitty tech or policies. “Think of the disabled, with targeted advertisements based on personal data we’ll make using the web less burdensome”

        • DaniAlexander@kbin.social · 1 year ago

          I think your point is kind of silly. There are lots of people who can do lots of things, but still people who can’t. I am also disabled, but I realize there are other people, also disabled, who cannot do what I can do. I think it’s pretty clear I was speaking of them, and I’m not sure why they are suddenly unimportant in terms of a discussion. In a discussion like this, the most incredible breakthroughs are the ones that should be touted, imo. And also in my opinion, the ability to create where you couldn’t before, the ability to express the imagination that has been locked inside your head, is the greatest gift AI will give.

          Maybe you don’t feel the same way. That’s fine but don’t discount people with disabilities who cannot write or create right now.

          • Machinist3359@kbin.social · 1 year ago

            To be clear, I’m not saying there’s no value to such improvements, but I specifically want people to exercise caution in the realm of the hypothetical.

            Rather, we should lift up actual evidence and the voices of the people affected. If such disabled people are hard to find, that’s a good reason to reframe. Sometimes the actual needs are much less hypothetical. Sometimes the hypothetical greatly overestimates the tech.

            To root this discussion, maybe link to a paraplegic speaking on creative AI tools? Or similar examples of AI being used for a11y today, which would indicate this trend is realistic and a priority.

            • DaniAlexander@kbin.social · 1 year ago

              Such people are not hard to find; it’s just that this discussion is never centered around them. Why? Because this was out of the realm of possibility. It just wasn’t on the radar for people.

      • donuts@kbin.social · 1 year ago

        You possibly haven’t considered the impact of unethical tech companies and governments using AI and pilfered genetic data to do any number of fucked up things.

        • Alleywurds@kbin.social · 1 year ago

          No, I definitely have, and I’ve written a good bit about and with AI. I just interviewed someone who works in AI for medicine last week for my podcast.

          Any tools can be used in fucked up ways, but that doesn’t mean that they have literally no positive uses either.

    • AndrewZabar@beehaw.org · 1 year ago

      It’s already over. Our entire society is gonna collapse in the next couple of decades because of AI and climate change. So… I dunno, brace yourself.

      Humanity has no self-restraint. If they can, they do. For money. For power. For advantage. For lust. Anything and everything.

      Sorry to be such a downer, but the show’s already over. If you think otherwise, please tell me who is going to prevent lies, misinformation and deceit on a mass scale from ripping society apart, and how? I hate to be right in this case, but I am. :-(

        • AndrewZabar@beehaw.org · 1 year ago

          Clicks = money = supersede any desire to remove content because of fact checking. Lies yield money as green as the money truth yields.

          • Aatube@kbin.social · 1 year ago

            More AI lies = People see patterns = Less clicks, eventual scandal = Bankrupt or tabloid status
            Not to mention most credible journalism outlets need to maintain credibility to keep their money

    • CoderKat@lemm.ee · edited · 1 year ago

      AI powered tooling is amazing. I already use it regularly for my work (I’m a programmer). It’s primarily in the form of intelligent auto complete (look up GitHub Copilot for an example). But AI can also do stuff like catch some bugs, automatically address code review comments, etc. I look forward to seeing it generate larger blocks accurately (in particular, I’d love it to automate test generation – it currently can only handle very basic cases).

      I’m sure other industries can benefit similarly. E.g.:

      • Video game level design could take in some assets you made in the theme of what you want and then generate slight variations. We’ve already had procedural generation of stuff like plants for ages (you can generate countless slightly different tree models, for example). This is just the next step into more complicated structures. For example, suppose you’re making a huge office space, like in Control. Many desks and whiteboards in that game suffer from asset reuse. AI could help give slight variations to make the setting feel more natural.
      • Graphic designers I’m sure already benefit from “magic eraser” functionality. It used to be time consuming to remove something from an image. Now it’s easy. I’m sure the next step is generally easier image editing, like moving objects in an image (Google demoed something like that at I/O).
      • Countless scientific uses, especially for chemistry and biology, because AI can be really great at constraint solving. We’re already seeing this. Specialized AI is better than doctors at diagnosing certain tumors, for example.
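      To make the “slight variations” point in the first bullet concrete: classic procedural variation is often just seeded parameter jittering, no AI required. Here’s a minimal sketch in Python — the DeskAsset fields and jitter ranges are made up for illustration, not any engine’s actual API:

```python
import random
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DeskAsset:
    """Hypothetical parameters for a desk prop; real engines expose far more."""
    width: float         # metres
    height: float        # metres
    wood_hue: float      # colour offset in [0.0, 1.0]
    clutter_items: int   # papers, mugs, etc. placed on top

def make_variants(base: DeskAsset, n: int, seed: int = 0) -> list[DeskAsset]:
    """Produce n slightly different copies of a base asset by jittering
    each parameter within small bounds, deterministically per seed."""
    rng = random.Random(seed)
    variants = []
    for _ in range(n):
        variants.append(replace(
            base,
            width=base.width * rng.uniform(0.95, 1.05),
            height=base.height * rng.uniform(0.97, 1.03),
            wood_hue=min(1.0, max(0.0, base.wood_hue + rng.uniform(-0.05, 0.05))),
            clutter_items=max(0, base.clutter_items + rng.randint(-2, 2)),
        ))
    return variants

base = DeskAsset(width=1.6, height=0.75, wood_hue=0.4, clutter_items=3)
desks = make_variants(base, n=100, seed=42)
```

      Seeding the RNG keeps the level reproducible across builds; the speculative next step the bullet describes would be a generative model proposing the jitter (or whole new meshes) instead of hand-tuned ranges.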