I don’t think it is feasible to build one without massive scope creep. Grand strategy games have very different focuses depending on the era.
It’s not just DNS. I have this rule in my firewall:
udp dport 15600 counter drop comment "Block Samsung TV shenanigans"
So far, it has blocked 20575 packets (constituting 1304695 bytes) in 6 days and 20 hours.
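For anyone wanting to replicate it: that’s nftables syntax, and the rule would sit inside an input chain roughly like this (the table/chain names here are just placeholders, adjust to your own ruleset):

table inet filter {
    chain input {
        type filter hook input priority 0; policy accept;
        udp dport 15600 counter drop comment "Block Samsung TV shenanigans"
    }
}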
Anticheats can be very invasive: they can theoretically scan all the files on your computer (whether that’s done in practice, I don’t know, but it sure feels like it has been), take screenshots regularly, send your hardware information, etc. So yeah, if you are someone who takes security seriously…
Ah you got my comment wrong! I didn’t mean to suggest Gecko is closed source. I just wanted another web engine that is also open source.
Servo was an experimental ground for Mozilla in some ways (like testing out a new CSS engine and porting it back to Gecko if it worked). So it’s quite normal for people to be unaware of it; it was not meant for the public.
But later on it was abandoned by Mozilla and stuck in limbo, until it got picked up by the Linux Foundation. Now it’s a standalone project and I wish them well. We really need a new FOSS web engine.
I recall reading somewhere that earlier compilers had a hard limit on the length of function names due to memory constraints.
One of the issues at hand is that X11, the predecessor of Wayland, does not have a standardized way to tell applications what scale they should use. Applications on X11 get the scale from environment variables (completely bypassing X11), or from Xft.dpi, or by providing in-application settings, or they guess it using some unorthodox means, or simply don’t scale at all. It’s a huge mess overall.
It is one of the more-or-less fundamentally unfixable parts of the protocol, since it wants everything to be on the same coordinate space (i.e. 1 pixel is 1 pixel everywhere, which is… quite unsuitable for modern systems.)
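To give an idea of the mess, these are the kinds of knobs involved (the values are just illustrative for a 2x setup, and none of this is standardized by the protocol itself):

! in ~/.Xresources, for toolkits that read Xft.dpi (192 = 2x the baseline 96)
Xft.dpi: 192

# environment variables, where each toolkit invents its own
GDK_SCALE=2           # GTK: integer UI scale
GDK_DPI_SCALE=0.5     # GTK: counteract font double-scaling
QT_SCALE_FACTOR=2     # Qt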
Wayland does operate the way you describe, and applications supporting Wayland will work properly in HiDPI environments.
However, a lot of people and applications are still on X11 for various reasons.
LoDPI applications are either tiny or upscaled (= blurry), aren’t they?
Yeah, I get the display server part. What I meant was that 200% scaling gets you a 1920x1080 logical resolution: LoDPI applications continue to be blurry, just as if you had set your actual resolution to 1080p, but HiDPI applications will enjoy the extra sharpness.
Even on smaller screens like the 14" ones, the benefit of a very high resolution (e.g. 4K) is still quite visible IMO, especially when it comes to text rendering. But it could very well just be my eyes.
It’s not even Linux’s fault. Plenty of apps support HiDPI on Linux.
It’s the developers who think that LoDPI-only is still acceptable when it’s already 2024.
Isn’t scaling to 200% the same as lowering the resolution to half? And you lose the high DPI for apps that support it too.
Agreed. HiDPI is the way to go, and we should appreciate Framework for putting that in their laptops instead of continuing to use shitty 1366x768 screens.
Xorg is the reason why OP is facing the scaling issues. OP, try to force the apps to run on native Wayland if they support it but don’t default to it; the Wayland page on the Arch wiki has instructions on that. Doing so immensely improved my HiDPI experience.
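Roughly the kind of thing the wiki lists, in case it saves someone a search (the exact variable depends on the app’s toolkit, so double-check per app):

MOZ_ENABLE_WAYLAND=1               # Firefox
QT_QPA_PLATFORM=wayland            # Qt 5/6 apps
SDL_VIDEODRIVER=wayland            # SDL2 games
ELECTRON_OZONE_PLATFORM_HINT=auto  # newer Electron apps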
…Why is there a Dunkin’ Donuts inside a hospital?
Agencies that are still living in the 90s…
Modern electronic devices are far more railroaded than they were back in the day, though.
Want to download an application? There’s the App Store. No need to download random .exes from sketchy websites (and learn what a “computer virus” is the hard way)
Downloaded a picture? It’s instantly inside your gallery. Back then we needed to find a folder called “Download” or “My Documents” using something called the Explorer!
iPhone and Android made a lot of things dumber and easier to take in, but I feel like that has had a detrimental effect on digital literacy.
The “quit having fun” meme is ironically becoming as cringey as the thing it is originally complaining about.
You will help the community more by telling non-Linux people why Linux gaming is better, and this meme does the exact opposite: "oh, Linux can’t play some games, yada yada. But we are still better! Switch over!" What’s the logic there?
What’s the purpose of this meme other than circlejerking?
Disclaimer: I am a Linux user myself; started with Debian and am now using Arch Linux.
I will share some advantages I experienced in Linux gaming:
Alt-tabbing old fullscreened games won’t mess with my monitor.
The compatibility of Wine when it comes to some older games is wild. SimCity 4 actually crashed less when I played it on Linux.
Better performance across the board. Granted, it’s a mere 5% difference, but I’ll take it, why not?
Indeed, the Ryzen laptops are very nice! I have one (the 4800H) and it lasts ~8 hours on battery, far more than what I expected from laptops of this performance level. My last laptop barely achieved 4 hours of battery life.
I had stability issues in the first year but after one of the BIOS updates it has been smooth as butter.
If proper SATA ever goes away, I’d wager that there will still be SATA-to-USB adapters on sale. Heck, people still find ways to connect floppy drives to their modern PCs.
This is why I try my damnedest not to write in weakly typed languages.
string + object makes no logical sense, but the language will be like "no biggie, you probably meant string + string, so let’s convert the object to a string"! And so all hell breaks loose when the language’s assumption is wrong.
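A tiny sketch (TypeScript/JavaScript flavoured, names made up) of the kind of silent coercion I mean:

const user = { id: 42, name: "Ada" };

// probably intended something like "User: Ada"...
const label = "User: " + user;

// ...but you get "User: [object Object]", and nothing complains
console.log(label);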
Last time I asked around about this question, the answer was surprisingly “probably not much”! When a low-power x86 chip (like those mobile chips) is idling (which is pretty much all the time if all you are doing is hosting a server on it) it consumes very little power, about the same level as an idling Pi. It is when the frequency ramps up that performance-per-watt gets noticeably worse on x86.
Edit: My personal test showed that my x86 laptop fared slightly worse than my Pi 3 in idling power (~2 watts higher it seems), but that laptop is oooooooold.
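If anyone wants to do a similarly rough comparison: on many Linux laptops the battery’s instantaneous discharge rate is exposed via sysfs, so you can eyeball idle draw while unplugged (the BAT0 name and exact files vary between machines; for a Pi you’d need an inline USB power meter instead):

# reported in microwatts on most machines; divide by 1,000,000 for watts
cat /sys/class/power_supply/BAT0/power_now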