• UnderpantsWeevil@lemmy.world
      2 days ago

      As an enhancement to an existing suite of diagnostic tools, certainly.

      Not as a stand-in for an oncology department, though.

      • willington@lemmy.dbzer0.com
        1 day ago

        As an assist to an actual oncologist, only.

        I can see AI as a tool in some contexts, doing some specific tasks better than an unassisted person.

        But as a replacement for people, AI is a dud. I would rather be alone than have an AI gf. And yes, I am taking trauma and personal+cultural baggage into account. An LLM is also a product of our culture for the most part, so it will carry our baggage anyway. Even granting that it could in principle be trained to avoid certain kinds of baggage, I would still rather deal with a person, save for the simplest and lowest-stakes interactions.

        If we want better people, we need to enfranchise them and remove most paywalls from the world. Right now the world, instead of being inviting, is bristling with physical, cultural, and virtual fences, saying to us, “you don’t belong and aren’t welcome in 99.99% of the space, and the other 0.01% will cost you.” Housing, for now, is only a privilege. In a world like that it’s a miracle people are as decent as they are. If we want better people we have to deliberately, on purpose, choose broad-based human flourishing as a policy objective, and be ruthless toward any enemies of that objective. No amnesty for the billionaires and wannabe billionaires. Instead, they are trying to shove AI/LLMs and virtual worlds down our throats as replacements for an actually decent and inviting world.