db0@lemmy.dbzer0.com to Technology@lemmy.world · English · 1 year ago

AI bots hallucinate software packages and devs download them (www.theregister.com)

cross-posted to: technology@lemmy.world, technology@beehaw.org
Prandom_returns@lemm.ee · 1 year ago

Wow, clever. Did you literally hallucinate this yourself or did you ask your LLM girlfriend for help?

And by literally, I mean figuratively.
Boomer Humor Doomergod@lemmy.world · 1 year ago

You’re gonna be real pissed to find out that computer bugs aren’t literal bugs.
Flying Squid@lemmy.world · 1 year ago

Although they did start out that way: https://education.nationalgeographic.org/resource/worlds-first-computer-bug/
T156@lemmy.world · 1 year ago

Well, until a moth gets into your relays, anyhow.
Prandom_returns@lemm.ee · 1 year ago

I know it’s a big word, but surely you can google what anthropomorphization is? Don’t “ask” an LLM; those things output garbage. Just google it.
bbuez@lemmy.world · 1 year ago

Watch out, those software bugs may start crawling out of your keyboard.
laughterlaughter@lemmy.world · 1 year ago (edited)

Like, literal garbage? The one sitting in my kitchen bin?!?!?!
Flying Squid@lemmy.world · 1 year ago

You call it a “large language model,” but there are much bigger things; it’s only approximating a human language, and it isn’t a physical model.
[deleted by creator]