You see, the music labels know that stealing from Spotify is morally correct.
With Recall you can search for a website you saw once, a link in a Discord channel, or an email, all at once in one place.
They have (or had) a suspiciously high confession rate too.
They’re really, really bad at context. The main failure case isn’t making things up; it’s that text or an image in one part of the result doesn’t fit with text or an image in another part, because they can’t even manage context across their own replies.
See images with three hands, where bow strings mysteriously vanish, etc.
5/6 not wearing them seems more statistically significant
“Two sources told Electrek that Tinucci was fighting back pressure from Musk to fire a bigger percentage of her team, and the CEO decided to let go of the entire team as an example.”
https://electrek.co/2024/05/01/elon-musk-throwing-weight-tesla-wrecking-ball/
Just because it’s open in another tab: anyone who wants to see what Threads does can read this aggressive view on Insta, Mosseri, and other tech industry managers.
Remotely at scale.
So yeah, you could assassinate someone like that, or you could break every car’s brakes at once and cause thousands of simultaneous car accidents, timed during some other infrastructure attack.
OP has a pencil in the top right; looks like it was edited.
The only take that makes sense
There’s a long, glorious history of things being “AI” until computers can do them, at which point the research area is renamed to something specific that describes its limits.
They did promise Skynet-style AI, though. They’ve misrepresented it a great deal.
You realise those robots were made by humans to win a war? That’s the trick: the danger is humans using AI or trusting it, not Skynet or other fantasies.
I’m scared of Second Variety
Until someone uses it for a little more than boilerplate, and the reviewer nods that bit through because it’s hard to review and not something a human (the person who “wrote” it) would get wrong.
Unless all AI-generated code is explicitly marked as AI-generated, this approach will eventually go wrong.