My one dark hope is that AI will be enough of an impetus for somebody to update the DMCA
> pay once, get access to everything everywhere
> thinks about Elsevier
OH GOD PLEASE NO
That doesn’t seem to be the same article
Turns out that whole idea of women being the primary bearers of hundreds of years of exploited reproductive labor might have had some weight to it, huh.
All that labor being redirected into “the economy” means that, at base, you’ll have fewer children.
This is interesting but I’ll reserve judgement until I see comparable performance past 8 billion params.
Sub-4 billion parameter models all seem to have the same performance regardless of quantization nowadays, so it’s a little hard to see potential in 3 billion.
I seriously doubt the viability of this, but I’m looking forward to being proven wrong.
It helps differentiate between GNU/Linux users and the five people who use GNU/Hurd
Judging by my bank account I’m transitioning to non-profit status as well.
In my experience these open models are where the real work is being done. The large supervised models like DALL-E are flashier, but there’s a lot more going on behind the scenes than the model itself, so it feels hard to gauge the real progress being made.
Typically revolutions only occur when a significant number of elites defect from the current regime.
Large numbers of dissatisfied people need a Schelling Point to rally around and coordinate effectively.
Best bet for revolution right now seems to be for more elite colleges to start withholding degrees over this Israeli thing.
It’s usually not the water itself but the energy used to “systemize” water from out-of-system sources
Pumping, pressurization, filtering, purifying all take additional energy.
The problem is that notably “powerful” AIs need pretty significant hardware to run well.
As an example, I think the Snapdragon NPUs can barely handle 7B models.
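To put numbers on that, here’s a back-of-the-envelope sketch of the memory needed just to hold a 7B model’s weights at common quantization levels (my own illustrative arithmetic, ignoring activations, KV cache, and runtime overhead):

```python
# Rough RAM needed to store 7B parameters at different precisions.
# This is a hedged estimate, not a measurement of any specific runtime.
params = 7_000_000_000

for name, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gib = params * bytes_per_param / 2**30
    print(f"{name}: {gib:.1f} GiB")
```

Even at 4-bit quantization that’s over 3 GiB of weights alone, which is a lot to ask of a phone NPU.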
Seems like the thing I’ve always considered true: you can turn a mediocre game into a masterpiece with the right application of music.
Not that I’m saying Stardew is mediocre, but good music seems to uplift a game more than any other part.
This is a good move for international open source projects. With multiple lawsuits currently ongoing in multiple countries around the globe, the intellectual-property status of code made using AI isn’t really secure enough to open yourself up to the liability.
I’ve done the same internally at our company. You’re free to use whatever tool you want, but if the tool you use spits out copyrighted code, and the law eventually decides that model users rather than model trainers are liable for model output, then that’s on you, buddy.
Doing god’s work here
ChatGPT is already multiple smaller models. Most guesses peg GPT-4 as an 8x220-billion-parameter mixture of experts, i.e. eight 220-billion-parameter models squished together.
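For anyone curious what “mixture of experts” means mechanically, here’s a toy sketch of top-k expert routing. All names and sizes are illustrative (GPT-4’s actual architecture is unpublished); real experts are full feed-forward networks, not single matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts = 8      # rumored expert count; illustrative only
d_model = 16       # toy hidden size (real models use thousands)
top_k = 2          # each token is routed to its top-k experts

# Each "expert" here is just a small weight matrix for simplicity.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route a single token vector x through its top-k experts."""
    logits = x @ router                          # router scores per expert
    top = np.argsort(logits)[-top_k:]            # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                     # softmax over chosen experts only
    # Weighted sum of the chosen experts' outputs; the other 6 never run.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)
```

The point is that only k of the 8 experts execute per token, so inference cost is much lower than the total parameter count suggests.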