db0@lemmy.dbzer0.com to Technology@lemmy.world · English · 8 months ago
AI bots hallucinate software packages and devs download them (www.theregister.com)
104 comments · cross-posted to: technology@lemmy.world, technology@beehaw.org
QuaternionsRock@lemmy.world · 8 months ago
https://en.m.wikipedia.org/wiki/Hallucination_(artificial_intelligence)
Prandom_returns@lemm.ee · 8 months ago
It’s as much a “hallucination” as Tesla’s Autopilot is an autopilot.
https://en.m.wikipedia.org/wiki/Tesla_Autopilot
I don’t propagate techbro “AI” bullshit peddled by companies trying to make a quick buck.
Also, in the world of science and technology a “standard” means something. Something that’s not a link to a Wikipedia page.
It’s still anthropomorphising software and it’s fucking cringe.
surewhynotlem@lemmy.world · 8 months ago
Oh man, I’m excited for you. Today is the day you learn words can have two meanings! Wait until you see what the rest of the dictionary contains. It is crazy! But not actually crazy, because dictionaries don’t have brains.
Prandom_returns@lemm.ee · 8 months ago
Wow, clever. Did you literally hallucinate this yourself or did you ask your LLM girlfriend for help?
And by literally, I mean figuratively.
Semi-Hemi-Lemmygod@lemmy.world · 8 months ago
You’re gonna be real pissed to find out that computer bugs aren’t literal bugs.
T156@lemmy.world · 8 months ago
Well, until a moth gets into your relays, anyhow.
Flying Squid@lemmy.world · 8 months ago
Although they did start out that way: https://education.nationalgeographic.org/resource/worlds-first-computer-bug/
Prandom_returns@lemm.ee · 8 months ago
I know it’s a big word, but surely you can google what anthropomorphization is? Don’t “ask” an LLM, those things output garbage. Just google it.
bbuez@lemmy.world · 8 months ago
Watch out, those software bugs may start crawling out of your keyboard.
laughterlaughter@lemmy.world · 8 months ago (edited)
Like, literal garbage? The one sitting in my kitchen bin?!?!?!
Flying Squid@lemmy.world · 8 months ago
You call it a large language model, but there are much bigger things; it’s only approximating a human language, and it isn’t a physical model.
deleted by creator