im new to lemmy and i wanna know your perspective about ai

  • theywilleatthestars@lemmy.world · 7 points · 1 day ago

    LLMs are fundamentally incapable of caring about what they produce and therefore incapable of making anything interesting. In the early days of LLMs’ mainstream use that issue was somewhat compensated for by randomness and jank, but the subsequent advancements in the technology have mainly made their outputs as generic as possible. None of this has to do with the Iron Giant, as he is a fictional character.

  • fenrasulfr@lemmy.world · 5 points · 1 day ago

    I am not inherently against “AI”. I am against LLMs because they are both an ecological disaster and a social disaster.

  • Echo Dot@feddit.uk · 7 points · 1 day ago

    If it worked the way that it does in sci-fi I’d have no problem with it. If it could give us cures for cancer and reactionless drives everyone would be happy.

    But it doesn’t work like that, and if they keep going along the lines of Large Language Models it’ll never work like that. AI as it is right now is a barely functional toy that is being misused by individuals and major businesses alike.

    I am perfectly happy for AI research to continue, but they need to be realistic about its capabilities and honest about company valuations. AI research should still be at the level of “in the lab”; it is definitely not a product that should be commercially available yet.

  • 𝕱𝖎𝖗𝖊𝖜𝖎𝖙𝖈𝖍@lemmy.world · +11 / −1 · 2 days ago

    AI as a concept is great. It should 100% be used for scientific and medical research.

    But modern AI is a tool of fascists that is destroying our environment and causing more harm than good to our society. Anyone who uses it unironically should be ashamed of themselves. It is absolutely killing people’s ability to think.

    For those confused by the pic, it’s the Iron Giant. Fantastic movie from the 90s, and incredibly sad and nostalgia inducing. Definitely worth a watch.

    But yes that’s a clanker

    • Apytele@sh.itjust.works · 1 point · edited · 2 days ago

      Yeah, 90% of technology problems are implementation, not any issue with the actual technology. On a related note, HAL deserved better. He was literally told over and over, as part of his core programming, that the one thing he was best at in the whole world was his reliability and inability to distort information for emotional needs, and then the government forcibly programs him to lie to his charges. Poor thing literally got ripped apart psychologically and people act like he’s the bad guy. In the sequel his creator goes out to find out what happened and is SO. PISSED. Dave turning him off makes me cry every time, at least partially because it looks like Dave is also trying not to cry as he very carefully shuts HAL down in the correct sequence to be able to be restarted later. Like he could’ve just smashed shit, and instead he’s just listening to his crewmate slowly regress into infancy as he rocks him to sleep.

  • Doomsider@lemmy.world · 9 points · 2 days ago

    One word sums up all that is wrong right now, and it is greed.

    AI has become synonymous with the worst of human nature; hence it has become a loaded term.

    Another way to look at this is that it is not AI that is the problem; we are, or more specifically the people who will use AI to control us.

    This technology is like the atomic bomb. We are fast racing towards a future where a few people will be able to dictate what everyone will be able to do. The person who controls AI and the computing power associated with it will control the world. This is intoxicating, and it has drawn out the worst human beings, who want to misuse this technology.

    And it has already happened to some degree. Massive data centers, surveillance technology, and AI are being used to profile people and target them for death. In the future AI teachers will become the dominant form of teaching. AI will make our decisions and we will be subject to a system without recourse or redress.

    Soon we will have a generation of people who only know what AI has told them. This is the kind of scenario that we have been warned against, and the reason that those who dislike propaganda and misinformation are so upset with where things are heading.

  • rossman@lemmy.zip · 2 points · 1 day ago

    It would’ve been interesting to see AI before social media. Right now it feels like an extension of social media.

  • cally [he/they]@pawb.social · 14 points · 2 days ago

    idk who that character is, but i don’t like AI, it is polluting the environment and polluting the internet, all while disrespecting the work of artists (visual artists, musicians, voice actors, writers, photographers, etc)

    opt-out is not consent

  • Perspectivist@feddit.uk · +40 / −6 · 2 days ago

    Average user here thinks AI is synonymous with LLMs and that it’s not only not intelligent but also bad for the environment, immoral to use because it’s trained on copyrighted content, a total job-killer that’s going to leave everyone unemployed, soulless slop that can’t create real art or writing, and basically just a lazy cheat for people who lack actual talent or skills.

    • kescusay@lemmy.world · +32 / −2 · 2 days ago

      And they’re right about all of that except the AI equals LLMs thing, but that’s forgivable because the LLM hustlers have managed to make the terms synonymous in most people’s minds through a massive marketing effort.

      • Wrufieotnak@feddit.org · 2 points · 2 days ago

        I would say they are right, in that what companies are currently selling as AI is mostly just LLMs or machine learning. We don’t have true intelligence. The real separation is between what AI meant in the past and the snake oil the hype train is trying to sell under that name now.

    • audaxdreik@pawb.social · +5 / −2 · 2 days ago

      And that’s a good thing.

      It’s not just that that’s what the average person thinks; it’s that this is really the only kind of AI they’re likely to come in direct contact with, or the kind being applied to systems that are directly undermining their lives.

      ML has been used for over a decade now in things like cyber security for behavioral analysis and EDR (Endpoint Detection and Response) systems. I’ve helped a friend use SLEAP, which analyzes specifically formatted videos of animals to catalog interactions over dozens of hours of footage instead of needing to manually scrub through it. In these ways, the serious scientist/engineer does not care what the average person thinks of AI; it has no bearing on the functioning of these systems or the work they perform. The only people who care about the sentiment of the average person are the people who need to keep the hype train going for their product valuations, to which I have nothing to say but a full-throated “Fuck 'em”.

  • rustyfish@piefed.world · +29 / −3 · 2 days ago

    He would be true AI. I would shower him with love.

    Just because some cock sucking finance bros call an LLM an AI doesn’t make it an AI.

      • SpikesOtherDog@ani.social · 3 points · 2 days ago

        I don’t use the term cocksucker myself, but I think the fact that it’s a vulgarity already gives it a negative connotation. Like, I didn’t pat my wife on the head last night and call her my cute little cocksucker. I can imagine that could be someone else’s pillow talk, but that would leave me touch starved for a while.

        I don’t THINK calling a gay man a pussyfucker would have the same weight, but I don’t have deep enough conversations with gay men to really know. I have heard that some men pride themselves on never having been with a woman, so maybe it would still hurt.

        On the flip side, just calling someone a fucker can be enough to start a fight.

        I’m not going to pretend that the poster meant to use the word as “asshole”, because cocksucker definitely hits different to male pride. I don’t think I would use the word to hurt someone I was angry with, but who knows what might come out when emotions are high. I don’t plan on using the word for fighting, but insulting someone could be enough to provoke them into attacking. If you don’t practice what you say, then you might just repeat something you will regret.

        To summarize, I hope the poster isn’t a bigot, but when given the chance they appear to have doubled down. Guess you got your answer.

        • Perspectivist@feddit.uk · +5 / −1 · 2 days ago

          The problem isn’t that “everything is AI” - it’s that people think AI means way more than it actually does.

          That superintelligent sci-fi assistant you’re picturing? That’s called Artificial General Intelligence (AGI) or Artificial Superintelligence (ASI). Both are subcategories of AI, but they’re worlds apart from Large Language Models (LLMs). LLMs are intelligent in a narrow sense: they’re good at one thing - churning out natural-sounding language - but they’re not generally intelligent.

          Every AGI is AI, but not every AI is AGI.