• 7 Posts
  • 983 Comments
Joined 1 year ago
Cake day: March 22nd, 2024




  • Machine learning has been a field for years, as others said, yeah, but Wikipedia would give a better overview of the topic than I can. In a nutshell, it’s largely about predicting outputs based on trained input examples.

    It doesn’t have to be text. For example, astronomers use it to find certain kinds of objects in raw data feeds. Object recognition (identifying things in pictures with little bounding boxes) is an old art at this point. Series prediction models are a thing, and LanguageTool uses a tiny model to detect commonly confused words for grammar checking. And yes, image hashing is another, though not entirely machine learning based. IDK what Tineye does in their backend, but there are some more “oldschool” approaches using traditional programming techniques, generating signatures for images that can be easily compared in a huge database.
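    To make the “oldschool” signature idea concrete, here’s a minimal sketch of difference hashing (dHash), one well-known non-ML approach. This is an illustration, not whatever Tineye actually runs; real systems also resample the image down to something like 9x8 pixels first, which I skip here by taking a tiny grayscale grid directly.

    ```python
    # dHash sketch: one bit per horizontal neighbor pair ("is the left pixel
    # brighter?"), packed into an integer signature. Similar images produce
    # signatures with a small Hamming distance, which is cheap to compare
    # across a huge database.

    def dhash(gray_rows):
        """Compute a difference-hash signature from a grid of grayscale values."""
        bits = []
        for row in gray_rows:
            for left, right in zip(row, row[1:]):
                bits.append(1 if left > right else 0)
        # Pack the bits into a single integer.
        return sum(bit << i for i, bit in enumerate(bits))

    def hamming(a, b):
        """Number of differing bits; small distance = visually similar."""
        return bin(a ^ b).count("1")

    img      = [[10, 20, 30], [30, 20, 10]]
    tweaked  = [[10,  5, 30], [30, 20, 10]]  # one pixel changed
    inverted = [[30, 20, 10], [10, 20, 30]]  # rows swapped

    print(hamming(dhash(img), dhash(img)))       # 0 (identical)
    print(hamming(dhash(img), dhash(tweaked)))   # 1 (near-duplicate)
    print(hamming(dhash(img), dhash(inverted)))  # 4 (very different)
    ```

    The nice property is that the comparison step is just bit math, no ML inference needed at query time.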

    You’ve probably run ML models in photo editors, your TV, your phone (voice recognition), desktop video players or something else without even knowing it. They’re tools.

    Separately, image similarity metrics (like LPIPS or SSIM) that measure the difference between two images as a number (where, say, 1 would be a perfect match and 0 totally unrelated) are common components in machine learning pipelines. The metrics themselves are often not machine learning based, barring exceptions like LPIPS and VMAF (which Netflix developed for video).
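    As a sketch of what such a metric computes, here’s a toy version of SSIM. The real metric averages scores over local windows across the image; computing it once globally, as below, keeps the example short while using the standard formula and stabilizing constants for 8-bit pixels.

    ```python
    # Toy global SSIM: 1.0 means a perfect match, lower means less similar.
    # x and y are flat lists of grayscale pixel values (0-255).

    def ssim(x, y):
        n = len(x)
        c1, c2 = (0.01 * 255) ** 2, (0.03 * 255) ** 2   # stabilizers, L = 255
        mx, my = sum(x) / n, sum(y) / n                  # means
        vx = sum((p - mx) ** 2 for p in x) / n           # variance of x
        vy = sum((p - my) ** 2 for p in y) / n           # variance of y
        cov = sum((p - mx) * (q - my) for p, q in zip(x, y)) / n
        return ((2 * mx * my + c1) * (2 * cov + c2)) / (
            (mx * mx + my * my + c1) * (vx + vy + c2)
        )

    a = [52, 55, 61, 59, 79, 61, 76, 61]
    print(ssim(a, a))                     # 1.0 for a perfect match
    print(ssim(a, [255 - p for p in a]))  # much lower for the inverted image
    ```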

    Text embedding models do the same with text. They are ML models.
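    The downstream comparison works the same way regardless of what produced the vectors: cosine similarity scores how closely two vectors point in the same direction. A real pipeline would get the vectors from an ML embedding model; the toy bag-of-words counts below are just a stand-in so the sketch runs on its own.

    ```python
    # Cosine similarity over "embeddings". The embed() here is a fake,
    # word-count stand-in for a real embedding model, purely for illustration.

    import math
    from collections import Counter

    def embed(text):
        """Stand-in 'embedding': word counts over the words in the text."""
        return Counter(text.lower().split())

    def cosine(u, v):
        """Cosine similarity of two sparse vectors: 1 = same direction."""
        dot = sum(u[w] * v[w] for w in u)
        norm = math.sqrt(sum(c * c for c in u.values()))
        norm *= math.sqrt(sum(c * c for c in v.values()))
        return dot / norm

    a, b, c = "the cat sat", "the cat sat down", "stock prices fell"
    print(cosine(embed(a), embed(b)))  # high: near-duplicate sentences
    print(cosine(embed(a), embed(c)))  # 0.0: no shared words
    ```

    With a real embedding model, “stock prices fell” and “shares dropped” would also score high despite sharing no words, which is the whole point of learning the vectors.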

    LLMs (aka models designed to predict the next ‘word’ in a block of text, one at a time, as we know them) in particular have an interesting history, going back to (if I even remember the name correctly) BERT in Google’s labs. There were also tiny LLMs people ran on personal GPUs before ChatGPT was ever a thing, like the infamous Pygmalion 6B roleplaying bot, a finetune of GPT-J 6B. They were primitive and dumb, but it felt like witchcraft back then (before AI Bro marketers poisoned the well).











  • This is so stupid.

    To me, “AI” in a car would be like highlighting pedestrians in a HUD, or alerting you if an unknown person messes with the car, or maybe adjusting mood lighting based on context. Or safety features.

    …Not a chatbot.

    I’m more “pro” (locally hostable, task specific) machine learning than like 99% of Lemmy, but I find the corporate obsession with cloud instruct textbots bizarre. It would be like every food corp living and breathing succulents. Cacti are neat, but they don’t need to be strapped to every chip bag, every takeout, every pack of forks.


  • I feel like there’s a “bell curve” for Linux gaming enjoyment.

    If you’re even a little techy (as in, you don’t use your PC begrudgingly while mostly living in iOS or whatever), the switch will feel like a relief. But many PC users aren’t; they aren’t interested in what an OS or a file system is, they just want League or Sims to pop up, and that’s it.

    …And then there’s me. I use Linux for hours every day, and I’m pretty familiar with the graphics stacks and such… But I need the performance of the stripped, neutered Windows install I dual boot for the weird, modded sim games I sometimes play. And frankly, it’s more convenient for many titles I need to get up and running quickly for co-op or whatever. There are also tools like SpecialK that don’t work on Linux and help immensely with certain games/displays.


  • Not everyone’s a big kb/mouse fan. My sister refuses to use one on the HTPC.

    Hence I think that was its not-insignificant niche: couch usage. Portable keyboards are really awkward and clunky on laps, and the Steam Controller is way better and more ergonomic than an integrated trackpad.

    Personally I think it was a smart business decision, because of this:

    It doesn’t have 2 joysticks so I just buy an Xbox one instead.

    No one’s going to buy a Steam-branded Xbox controller, but making it different gives people a reason to. And I think what killed it is that it wasn’t plug-and-play enough, e.g. it didn’t work out of the box with many games.