Is the streamed data stored in a local cache? Surely the bandwidth costs are going through the roof with the server sending data to every single player.
I think there’s one key thing you missed: you have never bought a copy of the game on Steam! It has always been a license; Valve is simply making that fact explicit now because of legal changes.
So the next question: is this retroactive?
The answer to that is a solid no.
For SSDs this has historically not been the case; there’s no way in hell you could have bought a 1 TB SSD for under $200 a decade ago.
Last time I asked around about this, the answer was surprisingly “probably not much”! When a low-power x86 chip (like those mobile chips) is idling (which is pretty much all the time if all you’re doing is hosting a server on it), it draws very little power, about the same as an idling Pi. It’s when the frequency ramps up that performance per watt gets noticeably worse on x86.
Edit: My personal test showed that my x86 laptop fared slightly worse than my Pi 3 in idle power (~2 watts higher, it seems), but that laptop is oooooooold.
I don’t think it is feasible to build one without massive scope creep. Grand strategies have very different focuses depending on the era.
It’s not just DNS. I have this rule in my firewall:
udp dport 15600 counter drop comment "Block Samsung TV shenanigans"
So far, it has blocked 20575 packets (constituting 1304695 bytes) in 6 days and 20 hours.
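For anyone who wants to replicate it, here is a minimal nftables sketch; the table and chain names here are placeholders from my own setup, and whether you want the input or forward hook depends on where the firewall sits relative to the TV:

table inet filter {
    chain forward {
        # filter traffic passing through this box; use the input hook instead
        # if the TV is talking to the firewall machine itself
        type filter hook forward priority 0; policy accept;
        # count and drop the TV's UDP chatter on port 15600
        udp dport 15600 counter drop comment "Block Samsung TV shenanigans"
    }
}

You can read the counter back later with something like nft list ruleset | grep 15600.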
Anticheats can be very invasive: they can theoretically scan every file on your computer (whether that is done in practice I don’t know, but it sure feels like it has been), take screenshots regularly, send off your hardware information, and so on. So yeah, if you are someone who takes security seriously…
Ah, you read my comment wrong! I didn’t mean to suggest Gecko is closed source. I just wanted another web engine that is also open source.
Servo was an experimental testbed for Mozilla in some ways (like trying out a new CSS engine and porting it back to Gecko if it worked). So it’s quite normal for people to be unaware of it; it was never meant for the public.
But later on it was abandoned by Mozilla and stuck in limbo, until it got picked up by the Linux Foundation. Now it’s a standalone project, and I wish them well. We really need a new FOSS web engine.
I recall reading somewhere that earlier compilers had a hard limit on the length of function names due to memory constraints.
One of the issues at hand is that X11, the predecessor of Wayland, does not have a standardized way to tell applications what scale they should use. Applications on X11 get the scale from environment variables (completely bypassing X11), or from Xft.dpi, or by providing in-application settings, or they guess it using some unorthodox means, or simply don’t scale at all. It’s a huge mess overall.
It is one of the more-or-less fundamentally unfixable parts of the protocol, since it wants everything to be in the same coordinate space (i.e. 1 pixel is 1 pixel everywhere), which is… quite unsuitable for modern systems.
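Just to illustrate how ad hoc it is, here are a few of the knobs I mean, set up for a 2x scale (not exhaustive, and every toolkit reads its own):

! ~/.Xresources: Xft.dpi at 2x the 96 dpi baseline, honored by some toolkits
Xft.dpi: 192

# environment variables that bypass X11 entirely
export GDK_SCALE=2          # GTK apps: integer UI scale
export QT_SCALE_FACTOR=2    # Qt apps: global scale factor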
Wayland does operate the way you describe, and applications supporting Wayland will work properly in HiDPI environments.
However, a lot of people and applications are still on X11 for various reasons.
LoDPI applications are either tiny or upscaled (= blurry), aren’t they?
Yeah, I get the display server part. What I meant was that 200% scaling gets you a 1920x1080 logical resolution: LoDPI applications continue to be blurry, just as if you had set your actual resolution to 1080p, but HiDPI applications will enjoy the enhanced visual acuity.
Even on smaller screens like 14" ones, the benefit of a very high resolution (e.g. 4K) is still quite visible IMO, especially when it comes to text rendering. But it could very well just be my eyes.
It’s not even Linux’s fault. Plenty of apps support HiDPI on Linux.
It’s the developers who think that LoDPI-only is still acceptable when it’s already 2024.
Isn’t scaling to 200% the same as lowering the resolution to half? And you lose the high DPI for apps that support it too.
Agreed. HiDPI is the way to go, and we should appreciate Framework for putting it in their laptops instead of continuing to use shitty 1366x768 screens.
Xorg is the reason OP is facing the scaling issues. OP, try forcing the apps to run on native Wayland if they support it but don’t default to it; the Wayland page on the Arch Wiki has instructions for that. It immensely improved my HiDPI experience.
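A few of the usual switches, in case it helps (exact behavior depends on the app and toolkit versions, so do check the wiki):

export MOZ_ENABLE_WAYLAND=1               # Firefox: use the native Wayland backend
export QT_QPA_PLATFORM="wayland;xcb"      # Qt apps: prefer Wayland, fall back to X11
export ELECTRON_OZONE_PLATFORM_HINT=auto  # recent Electron apps: pick Wayland when available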
…Why is there a Dunkin’ Donuts inside a hospital?
Agencies that are still living in the 90s…
Modern electronic devices are far more railroaded than they were back in the day tho.
Want to download an application? There’s the App Store. No need to download random .exes from sketchy websites (and learn what a “computer virus” is the hard way)
Downloaded a picture? It’s instantly inside your gallery. Back then we needed to find a folder called “Download” or “My Documents” using something called the Explorer!
The iPhone and Android dumbed a lot of things down and made them easier to take in, but I feel like that has had a detrimental effect on digital literacy.
Usually I sympathize with sentiments like this (“people use X because of circumstances beyond their control”), but browsers are not one of those cases.
If you have a website that requires the use of Chrome, then just use Chrome for that website! It’s not an either-or thing – you can install both browsers and use Firefox as the primary one.
And that’s what makes this statement so problematic. You don’t gain anything by staying exclusively on Chrome when it and Firefox can work alongside each other.