I’m making this post after endless frustrations with learning Rust, and I’m about to just go back to TypeScript. Looking at Rust from the outside, you’d think it was the greatest thing ever created. Everyone loves this language to the point of it being a literal cult, and its popularity is skyrocketing. It’s been the most loved language on Stack Overflow for years on end. Yet I can’t stand working in it; it gets in my way all the time for pointless reasons, mostly due to the bad ergonomics of the language. Below are most of the issues I’ve encountered:

  • Cargo is doing too many things at once. It’s a build system, but also a package manager, but it also manages dependencies? I don’t even know what to call it.

  • The syntax is very confusing for no reason. You can’t just look at Rust code and immediately know what it does: having to pollute your code with &, ? and .clone() everywhere to deal with ownership, using :: to refer to static methods instead of a “static” keyword. Rust’s syntax is badly designed compared to most other languages I’ve used. In a massive codebase with tons of functions and moving parts this is unreadable. Let’s take a look at hashmaps vs. JSON:

use std::collections::HashMap;

let mut person = HashMap::new();
person.insert(String::from("name"), String::from("Joe")); // all values in one HashMap must share a single type
person.insert(String::from("age"), String::from("23"));

Supposedly bad TypeScript:

const person = {
  name: "joe",
  age: 23
}

JS is way more readable. You can just look at it and immediately know what the code is doing, even if you’ve never coded before. That’s good design, so why do people love Rust and dislike TypeScript?

  • Similarly, async code starts to look really ugly and overengineered in Rust.

  • Multiple string types like &str, String and str, instead of just one “str” type.

  • i8, i32, i64, f32, f64 instead of a single unified “number” type like in TypeScript. Even in C you can just write “int” and be done with it, so it’s not really a “low level” issue.

  • Having to use #[tokio::main] to make the main function async (which should just be built-in functionality; by the way, tokio adds insane bloat to your program), yet you literally can’t write async code without it (see the sketch after this list). Also, what’s the point of making the main function async other than third-party libraries requiring it?

  • Speaking of bloat, a basic GET request in a low-level language shouldn’t be 32 MB; it’s around 16 KB with C and libcurl, despite the C program being more lines of code. Why is it so bloated? This makes using Rust for serious embedded systems unfeasible and C a much better option.

  • With cargo you literally have to compile everything instead of them shipping proper binaries. Why??? This just fries your CPU and makes larger libraries impossible to write. It should be on the maintainer to build the package beforehand and publish the binary. Note that I don’t mean dependencies; I mean programs installed with cargo install. There is no reason those shouldn’t be compiled beforehand.
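
For reference, here’s the kind of thing the tokio bullet above is about, as a minimal sketch (assuming tokio is declared as a dependency in Cargo.toml):

// a minimal sketch; assumes tokio = { version = "1", features = ["full"] } in Cargo.toml
#[tokio::main]
async fn main() {
    // with the runtime in place, futures can be awaited directly in main
    let sum = async { 1 + 1 }.await;
    println!("{sum}");
}

// roughly what the attribute expands to, without the macro:
// fn main() {
//     tokio::runtime::Runtime::new().unwrap().block_on(async { /* ... */ });
// }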

Another major issue I’ve encountered is libraries in Rust, or the lack thereof. Every single library in Rust is half-baked. Axum doesn’t even have a home page and its docs are literally a README on crates.io; how’s that going to compare to Express or .NET with serious industry backing? If you write an entire codebase in Axum and then the one dev maintaining it decides to quit due to no funding, then what do you do? No GUI framework is as stable as something like Qt or GTK; literally every Rust project has like one dev maintaining it in his free time and “expect breaking changes” in the README. Nothing is stable or enterprise-ready with a serious team and money backing it.

As for “memory safety”, it’s a buzzword. Just use a garbage collector. Garbage-collected languages are basically invulnerable to memory issues unless you write an infinite loop, and they’re suitable for 99% of applications.

“But muh performance, garbage collectors are slow!”

Then use C or C++ if you really need performance. Both of them are way better designed than Rust. In most cases, though, it’s just bikeshedding. We’re not in 1997 with 10 MB of RAM to work with; 9 times out of 10 you don’t need to put yourself through hell to shave a few megabytes off the bundle size of a web app. There are apps with billions of users that run fine on PHP. Also, any program you write should be extensively tested before release, so you’d catch those memory errors if you aren’t being lazy and shipping broken software to the public. So literally, what is the point of Rust?

From the outside looking in, Rust is the most overwhelming proof possible to me that programmers are inherently hobbyists who like tinkering rather than actually making real-world apps that solve problems. Because it’s a hard language, it’s complicated, and it’s got one frivolous thing it can market (“memory safety!”), and if you master it you’re better than everyone else because you learned something hard. That’s enough for the entire programming space to rank it year after year as the greatest language, while rewriting minimal C programs in Rust and quadrupling their memory usage. And the thing is, that’s fine; the issue I have is people lying and saying Rust is a drop-in replacement for JS and the single greatest language ever created, like come on, it’s not. Its syntax and poor third-party library support prove that better than I ever can.

“Oh, but in Rust you learn more about computers/low-level concepts, you’re just not good at coding”

Who cares? Coding is a tool to get shit done, and I think devs forget this way too often. If one works easier than the other, why does learning lower-level stuff matter? It’s useless knowledge unless you specifically go into a field where you need low-level coding. TypeScript is easy; Rust is not. TypeScript is therefore better at getting things made quickly, the resource usage doesn’t matter to 99% of people, and the apps look good and function well.

So at this point I’m seeing very little reason to continue. I shouldn’t have to fight a programming language, mostly over issues caused by a lack of financial backing in third-party libraries or badly designed syntax, and I’m about to just give up and move on. But I’m in the minority here; apparently everyone loves dealing with hours and hours of debugging basic problems because it makes you a better programmer, or there’s some information I’m just missing. IMO, though, Rust devs need to understand there’s serious value in actually making things with code, in the ergonomics and clean design of a language, and in having serious third-party support and widespread usage of libraries. When you’re running a company you don’t have time to mess around with syntax quirks; you need things done, stable, and out the door, and I just don’t see that happening with Rust.

If anyone makes a serious comment/counterargument to any of my claims here I will respond to it.

  • ExperimentalGuy@programming.dev · 3 months ago

    I think one thing to mention is that Rust is highly specific in what it does. In most of the examples you mentioned (string types, tokio::main), you can essentially just say that Rust is more explicit. When initializing an integer variable in C using int, it’s not specified what the integer is used for or whether it’s signed or not. With i32 or uint16_t you can see how it’s specified. Putting #[tokio::main] before your main function just specifies that you’re using the tokio asynchronous executor for your async code. In the case of string types, they all have different implementations, which just helps with being specific.
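
    As a rough sketch of what that explicitness looks like in practice (the values here are just illustrative):

    // in C, `int` only guarantees a minimum width (at least 16 bits); the exact size is platform-defined.
    // In Rust, width and signedness are spelled out in the type name itself:
    fn main() {
        let port: u16 = 8080;        // unsigned, exactly 16 bits
        let offset: i8 = -4;         // signed, exactly 8 bits
        let count: u64 = 1_000_000;  // unsigned, exactly 64 bits
        println!("{port} {offset} {count}");
    }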

    The reason I like Rust is because I know what’s happening when I read it. Did I have to read the whole async book to understand how the tokio::main stuff works? Yes. But now I understand exactly how it works. The problem I have with using JavaScript is that it doesn’t have that degree of explicitness (is that a word?). At the end of the day, if you’re using it for a personal project or you’re arguing for language supremacy, it really just comes down to personal preference.

    • Valmond@lemmy.world · 3 months ago

      When is an int not 32 bits nowadays, and, seriously, not signed??

      I mean, on some obscure platform it might be 16 bits, but unsigned??!

      Edit: OMG yes, there are different-sized data types, I know that lol! Default types are not unsigned 11-bit things, though. SMH.

      • Doom4535@lemmy.sdf.org · 3 months ago

        Enter embedded programming: believe it or not, there is a ton of low-level C code being written. Also, try adding a new networking protocol; there are many cases where bit structure matters. I recently wrote a small bit of code for a project that used bit packing so that we could fit inside a single Ethernet frame and didn’t have to deal with fragmentation and the networking overhead it caused.

        For context, what is your past programming background and what are you trying to do? While Rust is a great language, it may not be the right tool for what you’re trying to do if these are things that you view as unnecessary.

        • Valmond@lemmy.world · 3 months ago

          I have done low-level dev, put stuff in WAN frames (576 bytes IIRC) and meddled with Ethernet frames (1500 bytes); now you tell me where on earth you find a not-totally-obscure int that’s unsigned by default.

          That was my statement you know.

          For Rust, it feels like someone fell in love with C++ template metaprogramming (you know, the … variadic templates and all that) and wanted a better language for that. Good for them if they like it.

          • Doom4535@lemmy.sdf.org · 3 months ago

            Soooo, an int on most architectures is actually signed, usually a two’s complement signed 32-bit value; but the spec does not require it to be 32 bits, and some platforms might use a 16-bit value instead (think older 8-bit microcontrollers). That’s why C/C++ have ‘int32_t’, ‘uint32_t’, etc.; it just sounds like you haven’t used these number types (check stdint.h). Rust just requires you to use the more explicit number format by default (which I personally prefer, because I have had to jump between embedded and Linux development).

            The multiple string types are annoying at first, but it’s probably better to realize that they are really just two types (String and str), with the ampersand (&) ones being references (think pointers in C/C++). A String is like a C++ std::string, and a str is more like a C string (a fixed-size-ish array of chars).
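
            Roughly, a minimal sketch of that split (the variable names are just for illustration):

            // String is owned; &str is a borrowed view
            fn main() {
                let owned: String = String::from("hello"); // owned, growable, heap-allocated (like std::string)
                let view: &str = &owned;                   // a borrowed view of the same bytes
                let literal: &str = "world";               // string literals are &str too
                println!("{view} {literal} {}", owned.to_uppercase());
            }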

      • bamboo@lemm.ee · 3 months ago

        I do systems programming work, sometimes with constrained-memory scenarios. We always want to use the smallest types we can for any task, and unless negative numbers are necessary, we always prefer unsigned. That means a lot of u8 and u16 unless we know a value is likely to need more bits to be represented. It probably doesn’t matter as much in web programming, but that’s not Rust’s niche (or, well, not its original niche).

        • Valmond@lemmy.world · 3 months ago

          Well, where are those unsigned-by-default ‘int’s I said don’t exist, and that everyone seems to think I’m wrong about?

          Don’t get me wrong, unsigned integers are useful (that’s why we hate Java, btw), but that was not really the question.

          Also, if you’re using 16-bit ints because you have memory constraints, then you’re doing it wrong. All modern compilers can handle any number of bits as long as it’s less than the base size, so you can have a 3-bit int, another 3-bit int and two 1-bit ints, nicely packed into a byte. You use the :3 bit-field syntax.

          • bamboo@lemm.ee · 3 months ago

            Your default types for that are i32 or u32. It’s the exact same number of characters, yet it encodes more precise information.

            I’m aware of packing, but for my specific niche the main bottleneck is the CPU, and it’s important to minimize memory usage to improve data locality, increasing cache hit rates and ultimately increasing CPU throughput. Any gains we would make by packing such small values would likely be eliminated by the cost of unpacking them, unless it’s a flags-like value where we are primarily comparing individual bits.

            • Valmond@lemmy.world · 3 months ago

              So not default??

              For the packing/unpacking, sure it all depends on where you want the cost.

          • Doom4535@lemmy.sdf.org · 3 months ago

            The constraint on memory isn’t in the compiler, it’s in the available RAM and flash on ultra-low-cost/low-power microcontrollers; you can find some with less than 1 KByte of RAM, so that 32-bit int could have been four 8-bit ints in your electric toothbrush. Packing bits is space-efficient but not compute-efficient, and it takes several extra clock cycles on an 8-bit microcontroller; it’s better to store values in the native 8-bit word size for these devices. Furthermore, on more powerful systems, using the smaller size can let you optimize for SIMD instructions and actually do multiple operations per clock beyond what you could do with the 32-bit size.

            There are reasons for these types to exist; as was mentioned elsewhere, if you don’t care you can always just use i32 or i64 for your code and be done with it.

      • taaz@biglemmowski.win · 3 months ago

        When is an int not 32 bits nowadays

        The C standard does not actually define the exact sizes of long, int and so on; it’s just what is now most popular (it does have some limitations and requirements on these types, though).

        • bamboo@lemm.ee · 3 months ago

          It does define minimum sizes for different types. An int for example is at least two bytes, whatever size those might be!

        • Valmond@lemmy.world · 3 months ago

          Duh, I know. It’s just that it’s now the standard that your ‘int’ is signed; you know, as I said, ‘show me otherwise’. It’s also by far most commonly 32 bits.

            • Valmond@lemmy.world · 3 months ago

              Downvote all you want, even things you don’t understand lol

              In 99% of C/C++ compilers, if you write “int” then it is treated as a signed 32-bit data type. If you want something else you need to specify it; unsigned char, for example, is an unsigned 8-bit data type (again, on 99% of C/C++ compilers). Thus int defaults to a signed 32-bit data type, which makes ‘int’ a default. I don’t know how to explain that better to a developer. If you don’t understand, please do tell.

              • Anatol@chaos.social · 3 months ago

                @Valmond I’m aware of that, but that’s specific to those languages. C/C++ don’t get to define whether programming in general has a “default int”.

                • Valmond@lemmy.world · 3 months ago

                  In C/C++ the default int definitely exists, and it’s a signed 32-bit in the overwhelming majority of cases; what are you talking about?

                  • Anatol@chaos.social · 3 months ago

                    @Valmond the conversation used to be about Rust. You asked about “default int”, a concept from C/C++. I am talking about this not being a *universal* concept. It is specific to those languages.

      • FizzyOrange@programming.dev · 3 months ago

        On Arduino it’s 16 bits. (At least the original ATMEL ones; I dunno if they’ve finally moved to ARM yet.)

        This lost me quite a lot of time when I tried to use their SD card library on a 32-bit Arduino and it hung due to some code assuming int was 16 bits.

        • Valmond@lemmy.world · 3 months ago

          Unsigned?

          I mean, the Nintendo DS math library was 16-bit too, it happens, but unsigned? Never heard of it.

          • Diva (she/her)@lemmy.ml · 3 months ago

            I use unsigned and fixed-point sub-16-bit integers (you can make them almost any size you want) in gateware; you can optimize things there, and one of those optimizations is not using a ton of huge numbers to do math when you don’t need to.

              • Diva (she/her)@lemmy.ml · 3 months ago

                Maybe it’s a translation issue? I just meant that for FPGA work, int is 32-bit signed by default, but it’s way more common to use logic variables, which are unsigned and of arbitrary length (>=1).

                • Valmond@lemmy.world · 3 months ago

                  C/C++ compilers. Not FPGAs.

                  Anyway, int is a signed 32-bit very, very often; that’s all my point was. C, GPGPU, C#, Java, …

                  • Diva (she/her)@lemmy.ml · 3 months ago

                    Makes sense. I keep hearing about people using Rust for microcontrollers, and I’ve even had someone tell me it could be used for FPGA code, but I’m skeptical of the utility.