

Is it pronounced like the blood thinner/rat poison (warfarin)?
When it was new it had a minor charm-- it was cheap and there were trillions of coins in circulation, which made it so penny-ante that people could have fun with it and experiment with the tech on a tiny budget.
I played a little with it back in 2014 or so. You could buy some by interacting with a Reddit bot, and I mined a few coins on a GTX 660 (a midrange gaming card for the day).
I recall sending 5000 coins to a local dog rescue that tried to join in on the novelty, and paying for some used RAM in part with it.
By then BTC was basically unplayable without a rack of ASICs and it was already moving past the “currency” phase straight to “speculative asset”.
There’s been a huge shift in male role models over the past few decades, and it always felt to me more like the people who could never fit into the old militaristic, athletic “conqueror”-style mould saying “we’ll invent our own definition of masculinity” than a direct, fully-bought-in progression.
This will leave people behind-- the ones who can’t find new “appropriate” idols or aren’t impressed by their achievements. The Linus Torvalds version of conquering the world is hardly the Genghis Khan version.
Maybe we need to find a way to broaden the modern pantheon to figures that can resonate with a traditional audience.
What the hell is with the “Thank you for your attention to this matter?”
You’re shitposting to a global media audience, not politely asking Facilities to restock the vending machine with Snickers bars.
Are you just in full Business Guy Autocomplete mode? A Bigly Language Model?
XFCE’s old panel was a distinct mimic of CDE’s. I liked it…
But now CDE is open source and NsCDE gives you the same look with a highly customised fvwm config if you don’t want to stick to the Motif universe.
You might look into the Kailh Box switches. The click leaf adds a distinct tactile bump, plus the sound factor. The Box Navy is almost painfully tactile.
Try RiscOS for a glimpse of a world most of us missed.
I suspect it could be seen as a proper noun.
If Acme and FooCorp create a bridge between their private network spaces, it’s an internet (common noun) but not the Internet (proper noun, referring to the one with Goatse).
Let’s find an English teacher. And yell at them for forcing us to read the same terrible novel in both 10th and 12th grades. Maybe after that, return to this subject.
I’d say it’s a bad thing because it’s the wrong threat model as a default.
More home users are in scenarios like “I spilled a can of Diet Sprite into my laptop, can someone yank the SSD and recover my cat pictures” than “Someone stole my laptop and has physical access to state secrets that Hegseth has yet to blurt on Twitch chat”. Encryption makes the first scenario a lot harder to recover from, and people with explicit high-security needs should opt into it or have organization-managed configs.
There is the technical argument that PoS was more energy efficient than running data centres full of ASICs or sometimes GPUs solely to produce proof-of-work.
It’s still different flavours of Let’s Pretend We’re Finance Except Without Grownup Boring Rules, but if we can avoid burning gigawatts and puffing up the cost of GPUs, there’s a case for it.
Even the Grinch didn’t go on TV to tell people he was stealing Christmas.
The Global Foundries split was probably a way to get AMD out of the hyper-capital-intensive fab business. And without a tier-1 customer, Global had less reason to pursue smaller nodes.
Intel has that national-champion thing to keep it afloat. I can imagine there are defence contracts that will never go to a “TSMC Arizona Division” and they’ll pay whatever it takes to keep that going.
It’s easier just to price in the fee than having to shut down or retool a project.
The problem with attribution is the difficulty of 1000% accurate compliance.
If you grab 100 lines of code from a repository, or five paragraphs from a story, there’s probably a claim there. If you grab a single word, there’s probably not. But in the middle, there’s a paralysis of uncertainty-- is n lines similar enough to create liability? Can you remember where you saw what reliably? You end up with a bias towards “over-attribution” and it becomes difficult to pare it back. Does everything need a full Git-style commit history? Are we forever stuck keeping a credit on a project because it’s difficult to prove you’ve fully scrubbed their contributions?
Focus on how we pay artists (ideally lush grants) and forget about credit. Maybe establish a culture where it’s voluntary and acceptable-- that people feel that they’re allowed to cite their raw materials, and reuse doesn’t make the work lesser-- but don’t try to use the courts to force people to try to remember and track where they saw something when they just want to create, or it creates a hostile environment.
Hey, the broken clock’s right!
IP law always had a built-in scale problem. Without a registration-required copyright model, and probably some sort of mandatory licensing rate system, the sheer logistics of finding and arranging rights made a lot of business models impractical. (For example, why aren’t modern bookstores just print-on-demand kiosks, and why don’t streaming services have All The Content? In large part because it would cost thousands to track down owners and negotiate terms for $1.87 in royalties, multiplied by every item in the catalog.)
This was ignorable for a long time, or even a commercial advantage for firms with access to large, pre-negotiated catalogs. The AI boom created a surprise market of non-incumbents who need to get access to a lot of IP in a streamlined manner.
If we open the door for bulk IP clearance to grant the AI bubble a stronger legal footing, it can also allow other, potentially more interesting business ideas to slip through.
Don’t tell him there have been women on the $1 coin since 1979, and, more recently, themed seasonal quarter reverses that alternate between illegible and just overly busy.
But what data would it be?
Part of the “gobble all the data” perspective is that you need a broad corpus to be meaningfully useful. Not many people are going to give you an $892 billion market cap when your model is a genius about a handful of narrow subjects that you could get deep volunteer support on.
OTOH maybe there’s a sane business in narrow, siloed AI products (cheap, efficient, and with more bounded expectations): the reinvention of the “expert system” with clear guardrails, the image generator that only does seaside background landscapes but can’t generate a cat to save its life, the LLM that’s a prettified version of a knowledgebase search and NOTHING MORE.
We’ve seen decentralized education and it tends to have problems with resourcing and economies of scale, and content policies get easily hijacked by loud people with personal vendettas.
That’s what baffles me with the DOGE fracas. How long will solidarity hold when there are some very clear winners and losers within their own class?
There are a lot of billionaires who have fat revenue streams coming out of the federal budget, and I don’t think they’re all eager to trigger some sort of Mad Max/Medieval social collapse just so they can be the Archduke of San Jose after America implodes. I doubt they all bought the Network State story.
A fair number of them, expecting to live for more than 10 years and wanting to remain rich, probably invested aggressively into “skate where the puck is going” businesses that are now being slaughtered in the name of doubling down on fossil fuels and uncompetitive domestic manufacturers. Will Elon eat their losses? Of course, he’s committing financial seppuku too.
Grok went into a new conspiracy k-hole?