  • For LLMs, I’ve had really good results running Llama 3 in the Open Web UI docker container on an Nvidia Titan X (12GB VRAM).

    For image generation though, I agree more VRAM is better, but the algorithms still struggle with large image dimensions, so you wind up needing to start small and iteratively upscale, which afaik works OK on weaker GPUs but can still run into problems. (I’ve been using the Automatic1111 mode of the Stable Diffusion Web UI docker project.)

    I’m typing on my phone so I don’t have the links to the git repos atm, but you basically clone them and run the docker compose files. The readmes are pretty good!
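
    Once the containers are up, you can also poke the model from outside the web UI. This is just a rough sketch, assuming the usual Ollama backend that Open Web UI sits in front of (default port 11434) and that the llama3 model has already been pulled:

    ```python
    import requests

    # Quick sanity check against the local Ollama API (assumed setup:
    # Ollama exposed on its default port 11434, llama3 already pulled).
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",
            "prompt": "Say hi in one sentence.",
            "stream": False,  # return a single JSON object instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])
    ```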



  • I loved my course on patterns. It was tough, but I now regularly feel like I can bring real mastery of this tricky subject to my software projects. The course used a variety of techniques:

    • We read the seminal Design Patterns book by Gamma et al. for an overview of the concepts.
    • Every week, we’d incorporate three patterns into a preexisting XML processor project. My final version had something like 25 patterns, which was challenging to keep working amid all the refactoring. (You don’t have to do them cumulatively, but I enjoyed it.)
    • We’d have to ask pattern-specific questions of our classmates in forum threads, and occasionally we’d be assigned to answer some.
    • We each wrote up our own pattern. (I designed one based on my experiences handling data exchange between web apps and clients.)

    Together, this taught us:

    • How the patterns could concretely look in practice.
    • Pros, cons, and other considerations for each.
    • Similarities, differences, and nuances. (We’d joke that everything was the Template pattern if you squinted; see the sketch after this list.)
    • The impact of modifications to the patterns.
    • How to recognize, create, hone, collaborate on, and share patterns.
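
    For the curious, here’s a rough Python sketch of the Template Method idea behind that joke. The exporter classes and their steps are made-up illustrations, not anything from the actual course project:

    ```python
    from abc import ABC, abstractmethod

    class ReportExporter(ABC):
        """Template Method: the base class fixes the overall algorithm;
        subclasses fill in the variable steps."""

        def export(self, records: list[dict]) -> str:
            # The invariant skeleton: header, then each row, then footer.
            parts = [self.header()]
            parts.extend(self.render_row(r) for r in records)
            parts.append(self.footer())
            return "\n".join(parts)

        @abstractmethod
        def header(self) -> str: ...

        @abstractmethod
        def render_row(self, record: dict) -> str: ...

        def footer(self) -> str:
            # Optional hook with a default implementation.
            return ""

    class XmlExporter(ReportExporter):
        def header(self) -> str:
            return "<report>"

        def render_row(self, record: dict) -> str:
            return f'  <row id="{record["id"]}">{record["name"]}</row>'

        def footer(self) -> str:
            return "</report>"

    class CsvExporter(ReportExporter):
        def header(self) -> str:
            return "id,name"

        def render_row(self, record: dict) -> str:
            return f'{record["id"]},{record["name"]}'

    # Same skeleton, two different fills:
    print(XmlExporter().export([{"id": 1, "name": "widget"}]))
    print(CsvExporter().export([{"id": 1, "name": "widget"}]))
    ```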

    I appreciate this approach because patterns are an inherently fuzzy subject.