• 0 Posts
  • 34 Comments
Joined 1 year ago
Cake day: July 16th, 2023

  • As everybody else has said, Debian is working as intended. To respond to the actual post though, Debian is working exactly as it always has.

    If you think Debian used to be good, you must really love it now. It is better than ever.

    Unlike in the past, the primary drawback of Debian Stable ( old package versions ) now has multiple viable solutions. Others have rightly pointed out things like the Mozilla APT package and Flatpaks. Great solutions.

    My favourite solution is to install Arch via Distrobox. You can then get all the stability of Debian everywhere you need it and, anytime you need additional packages or newer packages, you can install them in the Arch distrobox. Firefox is a prime candidate. You are not going to get newer packages or a greater selection than via the Arch repos / AUR ( cue Nix rebuttals ).
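    For illustration, that Distrobox workflow looks roughly like this ( the container name is my own choice, and `distrobox` plus a container runtime like Podman or Docker must already be installed on the Debian host ):

    ```shell
    # Create an Arch Linux container on top of the Debian host
    distrobox create --name arch --image archlinux:latest

    # Enter the container; your home directory is shared with the host
    distrobox enter arch

    # Inside the container, install a newer package from the Arch repos...
    sudo pacman -S firefox

    # ...then export it so it appears as a normal app on the Debian host
    distrobox-export --app firefox
    ```
    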



  • I would say that you make a decent argument that the ALU has the strongest claim to the “bitness” of a CPU. In that way, we are already beyond 64 bit.

    For me though, what really defines a CPU is the software that runs natively. The Zen4 runs software written for the AMD64 family of processors. That is, it runs 64 bit software. This software will not run on the “32 bit” x86 processors that came before it ( like the K5, K6, and original Athlon ). If AMD released an AMD128 instruction set, it would not run on the Zen4, even if the hardware were technically capable of it.

    The Motorola 68000 only had a 16 bit ALU but was able to run the same 32 bit software that ran on later Motorola processors that were truly 32 bit. Software written for the 68000 was essentially still native on processors sold as late as 2014 ( 35 years after the 68000 was released ). This was not some kind of compatibility mode; these processors were still using the same 32 bit ISA.

    The Linux kernel that runs on the Zen4 will also run on 64 bit machines made 20 years ago as they also support the amd64 / x86-64 ISA.

    Where the article is correct is that there does not seem to be much push to move on from 64 bit software. The Zen4 supports instructions to perform higher-bit operations but they are optional. Most applications do not rely on them, including the operating system. For the most part, the Zen4 runs the same software as the Opteron ( released in 2003 ). The same pre-compiled Linux distro will run on both.















  • Other than having to know Rust, adding Rust to a C code base is not difficult. They play well together.

    There is no need to rewrite old code but, once Rust is there, you are free to.

    Linux is a bit of a special case as you cannot just blindly use the Rust standard library.

    Having to have a Rust tool chain to build with may or may not be an issue.

    For some use cases, like BSD or the Linux kernel, platform support is also a consideration.
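    As a sketch of that interop claim: a Rust function can be exported under the C ABI and called from existing C code with nothing more than a declaration on the C side. The function name and checksum logic here are invented for illustration ( and in the kernel you would use `core::slice` rather than `std`, since, as noted above, the Rust standard library is off the table there ):

    ```rust
    /// Exported under the C ABI; existing C code can declare it as
    /// `uint32_t byte_sum(const uint8_t *data, size_t len);`
    /// and link against the compiled Rust library.
    #[no_mangle]
    pub extern "C" fn byte_sum(data: *const u8, len: usize) -> u32 {
        // Safety contract: the C caller must pass a valid pointer/length pair.
        let bytes = unsafe { std::slice::from_raw_parts(data, len) };
        bytes.iter().fold(0u32, |acc, &b| acc.wrapping_add(u32::from(b)))
    }
    ```

    No rewrite of the C side is needed; new Rust code simply joins the existing build, which is exactly why incremental adoption works.
    
    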



  • I cannot answer your question because it proceeds from an assumption I cannot relate to. In my view, Linux is much easier to set up, and I have immediate access to a huge software library and am immediately productive.

    Installing Windows is much more of a hassle ( the licensing alone ) and, even once installed, you have a system that does nothing useful and requires much more time installing software before you can accomplish anything. Every time you turn around, it is throwing up arbitrary and artificial roadblocks.

    Unless it is already installed, I personally cannot fathom why people would want to spend their time installing Windows.


  • Like the other SVN, the author has been around forever in the Linux space. And yet his articles always come across like those of somebody who first discovered it a few months ago.

    Somehow I find myself rankling at even his smallest editorials as well. Take this one: “Then, there’s GCC Front-End for Rust, which can be loaded by the existing rustc frontend, but benefits from GCC optimizations.” Ok, sure, as long as we do not mind losing all the LLVM optimizations. Why not just say that Rust uses LLVM today and that there is a project to allow the use of the GCC backend as an option? Leave it to the reader to decide which ecosystem or optimizations they prefer, as this article is not even about that. It is like he Googled Rust and added everything he could find to the article.

    I can think of a reason to use the GCC version of Rust with the kernel. If you are using GCC to compile the C parts of the kernel, you also need LLVM for Rust. A GCC based Rust would eliminate that requirement. Presumably this also allows both the C and Rust code to target the same platforms. GCC and LLVM platform support is not the same. Those would be relevant points, but neither of those is about “GCC optimizations”.

    That is just one example.

    Basically, I find his articles to be as misleading as they are informative and that is not what I want from the technical press.