It’s like a watermelon on a toothpick. I bet he was going home to cry on his oversized pillow.
At first glance, I figured JXL was another attempt at JPEG2000 by a few bitter devs, so I ignored it.
Yeah, my examples/descriptions were more intended to be conceptual for folks who may not have dealt with the nitty gritty. Just mental exercises. I’ve only done a small bit of image analysis, so I have a general understanding of what’s possible, but I’m sure there are folks here (like you) that can waaay outclass me on details.
These intermediate-to-deep dives are very interesting. Not usually my cup of tea, but this does seem big. Thanks for the info.
(fair warning - I go a little overboard on the examples. Sorry for the length.)
No idea on the details, but apparently it’s more efficient for multithreaded reading/writing.
I guess that you could have a few threads reading the file data at once into memory. While one CPU core reads the first 50% of the file, a second can be reading the second 50% (though I’m sure it’s not actually like that; it’s just a general example). Image compression usually works by some form of averaging over an area, so figuring out ways to chop the image up, such that those patches can load cleanly without data from the adjoining patches, is probably tricky.
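Something like this toy sketch, conceptually (the filename is made up, and a real decoder would split on codec-defined boundaries, not raw byte halves):

```python
# Toy sketch of the "two cores read two halves" idea.
import os
from concurrent.futures import ThreadPoolExecutor

def read_range(path, start, length):
    """Read `length` bytes starting at byte offset `start`."""
    with open(path, "rb") as f:
        f.seek(start)
        return f.read(length)

path = "image.jxl"                        # hypothetical file
size = os.path.getsize(path)
half = size // 2

# Two threads each pull in one half of the file at the same time.
with ThreadPoolExecutor(max_workers=2) as pool:
    first = pool.submit(read_range, path, 0, half)
    second = pool.submit(read_range, path, half, size - half)
    data = first.result() + second.result()
```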
I found this semi-visual explanation with a quick google. The image in 3.4 is kinda what I’m talking about. In the end you need equally sized pixels, but during compression, you’re kinda stretching out the values and/or mapping of values to pixels.
Not an actual example, but highlights some of the problems when trying to do simultaneous operations…
Instead of pixels 1, 2, 3, 4 being colors 1.1, 1.2, 1.3, 1.4, you apply a function that assigns the colors 1.1, 1.25, 1.25, 1.4. You now only need to store the values 1.1, 1.25, 1.4 (along with location). A 25% reduction in color data.

If you wanted to cut that sequence in half for 2 CPUs with separate memory blocks to read at once, you lose some of that optimization. Now CPU1 and CPU2 both need color 1.25, so it’s duplicated. Not a big deal in this example, but these bundles of values can span many pixels and intersect with other bundles (like color channels - blue might be most efficiently read in 3-pixel-wide chunks, green in 2-pixel-wide chunks, and red in 10-pixel-wide chunks). Now where do you chop those pixels up for the two CPUs? Well, we can use our “average 2 middle values in 4 pixel blocks” approach, but we’re leaving a lot of performance on the table with empty or useless values. So, we can treat each of those basic color values as independent layers.
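If it helps, here’s that toy scheme as a few lines of Python (same made-up numbers as above, not a real codec):

```python
# "Compress" a 4-pixel block by averaging the two middle values, then see
# what splitting the row between two workers costs.

def compress_block(block):
    """[a, b, c, d] -> [a, m, m, d], where m is the average of b and c."""
    m = (block[1] + block[2]) / 2
    return [block[0], m, m, block[3]]

row = [1.1, 1.2, 1.3, 1.4]        # original pixel "colors"
stored = compress_block(row)       # [1.1, 1.25, 1.25, 1.4]
print(sorted(set(stored)))         # [1.1, 1.25, 1.4] -> only 3 values to store

# Chop the row in half for two CPUs: each half now needs its own copy of
# 1.25, so part of the savings is lost to duplication.
cpu1, cpu2 = stored[:2], stored[2:]
print(cpu1, cpu2)                  # [1.1, 1.25] [1.25, 1.4]
```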
But, now that we don’t care how they line up, how do we display a partially downloaded image? The easiest way is to not show anything until the full image is loaded. Nothing nothing nothing Tada!
Or we can say we’ll wait at the end of every horizontal line for the values to fill in, display that line, then start processing the next. This is the old cliche of the picture slowly loading in, one line at a time. Makes sense from a human interpretation perspective.
But, what if we take 2D chunks and progressively fill in sub-chunks? If every pixel is a different color, it doesn’t help, but what about a landscape photo?
First values in the file: top half is blue, bottom green. 2 operations and you can display that. The next values divide each half in half again. If it’s a perfect blue sky (ignoring the horizon line), you’re done and the user can see the result immediately. The bottom half will have its values refined as more data is read, and after a few cycles the user will be able to see that there’s a (currently pixelated) stream right up the middle and some brownish plant on the right, etc. That’s the old “image loads in blurry and appears to focus” cliche.
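Here’s a rough sketch of that coarse-to-fine loop, if you want to play with it (assumes numpy and a 2D grayscale array; a real codec stores coefficient passes, not block averages):

```python
import numpy as np

def progressive_previews(image, passes=4):
    """Yield previews that sharpen each pass: block averages with halving blocks."""
    h, w = image.shape
    for p in range(passes):
        b = max(1, min(h, w) >> p)             # block size halves every pass
        preview = image.astype(float)          # fresh copy to paint averages into
        for y in range(0, h, b):
            for x in range(0, w, b):
                preview[y:y+b, x:x+b] = image[y:y+b, x:x+b].mean()
        yield preview                          # pass 0: giant blocks; later: detail
```

Feed it any 2D array and display each yielded preview - the first ones are huge blocks of average color, and detail fills in with each pass.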
All that is to say, if we can do that 2D chunk method for an 8k image, maybe we don’t need to wait until the full 8k resolution is loaded if we need smaller images for a set. Maybe we can stop reading the file once we have a 1024x1024 pixel grid. We can have 1 high res image of a stoplight, but treat it as any resolution less than the native high res, thanks to the progressive loading.
So, like I said, this is a general example of the types of conditions and compromises. In reality, almost no one deals with the files on this level. A few smart folks write libraries to handle the basic functions and everyone else just calls those libraries in their paint, or whatever, program.
Oh, that was long. Um, sorry? haha. Hope that made sense!
Oh, I’ve just been toying around with Stable Diffusion and some general ML tidbits. I was just thinking from a practical point of view. From what I read, it sounds like the files are smaller at the same quality, require the same or less processor load (maybe), are tuned for parallel I/O, can be encoded and decoded faster (with less difference in performance between the two), and support progressive loading. I’m kinda waiting for the catch, but haven’t seen any major downsides, besides less optimal performance for very low resolution images.
I don’t know how they ingest the image data, but I would assume they’d be constantly building sets, rather than keeping lots of subsets, if just for the space savings of de-duplication.
(I kinda ramble below, but you’ll get the idea.)
Mixing and matching the speed/efficiency and storage improvements could mean a whole bunch of wins. I/O is always an annoyance in any large-set analysis. With JPEG XL, there’s less storage needed (duh), more images in RAM at once, faster transfer to and from disk, fewer cycles wasted on waiting for I/O in general, the ability to store more intermediate datasets and more descriptive models, easier archiving of the raw photo sets (which might be a big deal with all the legal issues popping up), etc.

You want to cram a lot of data into memory, since the GPU will be performing lots of operations in parallel. Accessing the I/O bus must be one of the larger time sinks, and CPU load becomes a concern just for moving data around.
I also wonder if the support for progressive loading might be useful for making more efficient, low resolution variants of high resolution models. Just store one set of high res images and load them in progressive steps to make smaller data sets. Like, say you have a bunch of 8k images, but you only want to make a website banner based on the model from those 8k res images. I wonder if it’s possible to use the progressive loading support to halt reading in the images at 1k. Lower resolution = less model data = smaller datasets to store or transfer. Basically skipping the downsampling.
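As a sketch of what I mean, here’s the prefix-reading trick with Pillow and a progressive JPEG as a stand-in (stock Pillow doesn’t read JPEG XL, and the filename/byte budget are made up):

```python
import io
from PIL import Image, ImageFile

ImageFile.LOAD_TRUNCATED_IMAGES = True      # tolerate the missing tail of the file

def load_prefix(path, byte_budget):
    """Decode only the first `byte_budget` bytes of a progressively coded image."""
    with open(path, "rb") as f:
        prefix = f.read(byte_budget)        # stop reading early, like halting at 1k
    img = Image.open(io.BytesIO(prefix))
    img.load()                              # decodes whatever passes the prefix holds
    return img

lowres = load_prefix("photo.jpg", 64 * 1024)   # hypothetical file, first 64 KiB only
```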
Any time I see a big feature jump, like better file size, I assume the trade off in another feature negates at least half the benefit. It’s pretty rare, from what I’ve seen, to have improvements on all fronts.
Even better, this must be fantastic when you’re training AI models with millions of images. The compression level AND performance should be a game changer.
Yeah, that looks more reasonable. The original graph makes it look like there have been ~5x the number of deaths in the last few years compared to ~10 years ago. Adjusted for population growth, it’s ~2-3x.
That’s still really concerning and makes the point the article was making, while being much more accurate and defensible when scrutinized. Thanks for that!
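To put toy numbers on that adjustment (the death counts here are invented - only the shape of the math matters, though AZ’s population really did roughly double over that stretch):

```python
pop_then, deaths_then = 3_700_000, 40        # ~1990
pop_now, deaths_now = 7_200_000, 200         # ~2020

rate_then = deaths_then / pop_then * 100_000   # ~1.1 per 100k
rate_now = deaths_now / pop_now * 100_000      # ~2.8 per 100k

print(deaths_now / deaths_then)   # 5.0x raw increase (what the graph shows)
print(rate_now / rate_then)       # ~2.6x after normalizing for population
```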
Exactly. I stumbled across this report from the AZ Dept of Health which breaks it down into deaths per 100k people, and the data still supports the author’s point. The report then goes on to divide up the population by age, residents vs visitors, county, etc.
Hell, the FT author could have just included a plot of the population growth, which was pretty linear. Not great, but better than nothing.
Grinds my gears.
Just thought I’d add this report from the AZ health department. It breaks down the factors MUCH better and comes to a similar, but not quite as extreme, conclusion. Only part of it is normalized for population, but it gives an idea of how to scale the numbers.
Yes. Hot air is thinner, so there’s less lift on aircraft wings. There’s actually a conversion they’re supposed to use that basically says, “At this temp, treat the plane as if it’s actually at this other, much higher, altitude.”
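The usual rule of thumb (an approximation, not the exact chart lookup pilots actually do) is about 120 ft of extra “altitude” for every degree C above standard temperature:

```python
# Rough rule-of-thumb density altitude: standard temp is 15 C at sea level,
# lapsing ~2 C per 1000 ft, and each degree above that adds ~120 ft.
def density_altitude_ft(pressure_alt_ft, oat_c):
    isa_temp_c = 15 - 2 * (pressure_alt_ft / 1000)
    return pressure_alt_ft + 120 * (oat_c - isa_temp_c)

# A 1,500 ft field on a 45 C Arizona day behaves like a field at ~5,500 ft.
print(density_altitude_ft(1500, 45))   # ~5460
```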
Here’s one of the recent videos I’ve seen mentioning it (around 5 min in they mention the “density altitude”). I’m not a pilot and just find the stuff interesting.
I’m not advocating for better or worse. In the end, the data shows what it shows. I’m just saying that there was essentially no “analysis”, making any interpretation inappropriate.
Hey, more people should survive, thanks to newer medical treatments and a greater concentration of population around cities.
On the flip side, there’s a larger portion of the population that’s older and from out of state.
In between, there’s the chance that the threat of heat-related health problems is much diminished due to widespread access to air conditioning. But that also means more people haven’t had first hand experience with heat exhaustion/stroke, and don’t realize how quickly things can go from kinda bad to dead.
Yeah, it can be as simple as the death certificates requiring only a primary cause of death.
Old man collapses from a heart attack while trying to change a tire on a hot desert road? Cause of death: heart attack. If more details are requested, they could probably get away with just claiming age-related health issues. The guy is dead, no foul play, the case is closed.
The libs are making us slaves to those damn thermomasters! They better not take away my freedom to boil off those 3 remaining brain cells!
Very much this, and especially over this period. More universal diagnostics, more emphasis on secondary causes and contributors, etc.
And it works the other way, too. Fewer people should die per capita based on faster EMS response times, better medicine, more urban living, etc.
The big one for me is age. I never really heard of people retiring to Arizona until the late 90s. It was always Florida before then. The over 50 crowd is 36% now vs 23% in 1970.
I agree. And shit like this makes me distrust financial reporting in general. It’s akin to not accounting for inflation in financial graphs.
And yes, the risk adjustment can be as complex as they want to make it, but when I clicked, I was expecting a study of some type. Probably my bias kicking in. My first thought was, “Are they kidding?” Then I saw it was from a news source and thought, “Oh, okay… no wait. Still, they know this is bad, right?”
Still gets those nummy clicks, I guess.
I’m 110% on board with global warming, but this graph is misleading.
The author needs to at least correct for population changes (heat deaths per X residents). Even better would be to account for changing demographics, like age and county. From this random stats website, it looks like there has been a dramatic increase in the proportion of older residents since 1970. Old people are more likely to die, so more elders = more deaths.
If I wasn’t about to head to bed, I might try to fix it, but… sleep.
Oh, and I’m pretty sure there has been an increase in small plane crashes in AZ. The hot air is much thinner than most pilots are used to, so they tend to forget to account for changes in thrust and climb rates. I’m pretty sure a couple happened in just the last few weeks.
“Black Company” fighting the good fight, it seems.
Ah, yes, you’re right! Thanks for that.
smh
That’s fucking tragic. Makes me want to whip out the ole Hacker Manifesto.
Kids will never again know the fun of dealing with long distance calling plans and the barely usable international calling that used to cost half your rent for a 15 minute conversation.
Probably based on the Cap’n Crunch whistle pay phone hack.
Someone correct me if I’ve missed a few bits, but here’s the story…
First, a little history.
Payphones were common. If you’re younger, you’ve probably seen them in movies. To operate one, you picked up the handset, listened for the dial tone (to make sure no one had yanked the cord loose), inserted the amount shown by the coin slot, and then dialed. You had a limited amount of time before an automatic message would ask you to add more money. If you dialed a long distance number, a message would play telling you how much more you needed to insert.
There were no digital controls to this - no modern networking. The primitive “computers” were more like equipment you’d see in a science class. So, to deal with the transaction details, the coin slot mechanism would detect the type of coin inserted, mute the microphone on the handset, and transmit a series of tones. Just voltage spikes. The muting prevented the background noise from interfering with the signal detection. Drop a quarter in the slot and you’d hear the background noise suddenly disappear followed by some tapping sounds (this was just bleed through).
It’s also relevant to know that cereals used to include a cheap little toy inside. At one point, Cap’n Crunch included a whistle that produced a tone at 2600Hz.
The story goes that someone* figured out that the tones sent by the payphones were at 2600Hz - same as the whistle. You could pick up a payphone handset, puff into the whistle a certain number of times, and it would be detected as control signals (inserting money).
That’s right! Free phone calls to anywhere. I’m hazy on the specifics, but I’m pretty sure there were other tricks you could do, like directly calling restricted technician numbers, too. The 2600Hz tone was special because the long distance network used it in-band to mark a trunk as idle - play it mid-call and the switch thought the call had ended, so billing stopped while the line actually stayed open.
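For the curious, the tone itself is trivial to make today (a toy sketch - modern networks signal out-of-band, so this does nothing but make an annoying beep):

```python
import math
import struct
import wave

RATE = 44100         # samples per second
FREQ = 2600          # the famous frequency, in Hz
SECONDS = 2

# Build two seconds of a 16-bit sine wave at 2600Hz...
frames = b"".join(
    struct.pack("<h", int(32767 * 0.5 * math.sin(2 * math.pi * FREQ * n / RATE)))
    for n in range(RATE * SECONDS)
)

# ...and write it out as a mono WAV file.
with wave.open("tone2600.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)        # 2 bytes = 16-bit samples
    w.setframerate(RATE)
    w.writeframes(frames)
```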
It knocked the idea of phone hacking, or “phreaking”, from a little known quirk to an entire movement. Some of the stuff was wild, and if you’re interested, look up the different “boxes” that people distributed blueprints for. Eventually, the phone companies caught on and started making it harder to get at the wires, along with building more sophisticated coin receptacles.
If you ever saw the magazine 2600 back in the 90s and early 00s, that’s where the name came from.
All that is to say, if you knew nothing about technology and watched a guy whistle into a phone to get special access, you’d probably be freaked out. Who knows what that maniac could do with a flute!
Goddamn that was poetic, ya cunt.