• GBU_28@lemm.ee
    1 year ago

    Again, you aren’t seeing this because these models are being developed for private enterprise purposes.

    Regarding deep machine-code analysis, sure, that’s going to take work, but the whole hallucination thing is an off-the-shelf, rookie problem these days.

    • Rikudou_Sage@lemmings.world
      1 year ago

      It’s not, though. Hallucinations are inherent to the technology; they’re not a matter of training. Good training can greatly reduce their likelihood, but cannot eliminate them.
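
      To illustrate the point: language models sample the next token from a probability distribution, so a wrong-but-plausible token always keeps some nonzero probability. This is a minimal sketch with hypothetical logits (the scores and token names are made up, not from any real model); better training can widen the gap between right and wrong tokens, but sampling still assigns the wrong one a positive probability:

      ```python
      import math

      def softmax(logits, temperature=1.0):
          # Convert raw model scores into a probability distribution.
          exps = [math.exp(x / temperature) for x in logits]
          total = sum(exps)
          return [e / total for e in exps]

      # Hypothetical next-token scores: the model strongly prefers the
      # factually correct token, but the wrong one isn't impossible.
      logits = [5.0, 1.0]  # [correct, wrong]
      p_correct, p_wrong = softmax(logits)

      print(f"P(correct) = {p_correct:.3f}, P(wrong) = {p_wrong:.3f}")
      # P(wrong) is small but strictly positive; no amount of training
      # drives it to exactly zero under sampling-based decoding.
      ```

      Lowering the temperature or using greedy decoding shrinks the chance of picking the wrong token on any single step, but over long generations those small per-step probabilities still compound.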