• U7826391786239@lemmy.zip
        2 months ago

        yea, i’m surprised, 32GB is goddamn ridiculous for anything, let alone for a shitty hp branded autocorrect

        fuck AI, fuck HP, and fuck “laptop subscription”

        the saddest thing is, people will sign up, if for no other reason than they have no other option

        • partial_accumen@lemmy.world
          2 months ago

          yea, i’m surprised, 32GB is goddamn ridiculous for anything, let alone for a shitty hp branded autocorrect

          32GB is actually considered the bare minimum for most of the common locally run LLM models. Most folks don’t run an LLM locally; they use a cloud service, so they don’t need a huge pile of RAM on their own machine. However, privacy-focused or heavy users with cost concerns might choose to run an LLM locally so they’re not paying per token. Cloud vs. local is comparable to renting a car when you need it vs. buying one outright. If you only need a car once a year, renting is clearly the better choice. If you’re driving to work every day, then buying the car yourself is clearly the better deal overall.
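          To put a rough number on why 32GB lands in the "bare minimum" range for local LLMs: weight memory scales with parameter count times bytes per parameter (which depends on quantization), plus runtime overhead for the KV cache and buffers. The function below is a back-of-the-envelope sketch, not from any official tool; the 20% overhead factor is an assumption.

          ```python
          def estimate_llm_ram_gb(params_billions: float,
                                  bytes_per_param: float,
                                  overhead: float = 1.2) -> float:
              """Rough RAM estimate for running an LLM locally.

              params_billions: model size in billions of parameters
              bytes_per_param: ~2.0 for fp16, ~0.5 for 4-bit quantization
              overhead: fudge factor for KV cache / runtime buffers (assumed ~20%)
              """
              return params_billions * bytes_per_param * overhead

          # A 7B model at 4-bit fits easily in 8GB of RAM:
          print(estimate_llm_ram_gb(7, 0.5))    # ~4.2 GB

          # A 70B model at 4-bit is where 32GB stops being enough:
          print(estimate_llm_ram_gb(70, 0.5))   # ~42 GB
          ```

          By this rough math, 32GB comfortably runs quantized models up to the ~30B range, which covers most popular local models today.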

          You are perfectly fine not liking AI, but you’re also out of touch if you think 32GB is too big for anything. Lots of other use cases need 32GB or more and have nothing to do with AI.

          I agree with your frustration with subscription laptops. I hope people don’t sign up for them.

          • FireWire400@lemmy.world
            2 months ago

            That’s not really true. You can just as easily download lightweight models and run them with LM Studio.