


  • You didn’t, I did. The starting models cap at 24, but you can spec the biggest one up to 64GB. I should have clicked through to the customization page before reporting what was available.

    That is still cheaper than a 5090, so it’s not that clear cut. I think it depends on what you’re trying to set up and how much money you’re willing to burn. Sometimes literally: the Mac will also be more power efficient than a honker of an Nvidia 90-class card.

    Honestly, all I have for recommendations is that I’d rather scale up than down. I mean, unless you also want to play kickass games at insane framerates with path tracing or something. Then go nuts with your big boy GPUs, who cares.

    But for LLM stuff strictly I’d start by repurposing what I have around, hitting a speed limit, then scaling up, maybe to something with a lot of shared RAM (including a Mac Mini if you’re into those), and rinsing and repeating. I don’t know that I personally am in the market for AI-specific multi-thousand-dollar APUs with a hundred-plus gigs of RAM yet.


  • Thing is, you can trade off speed for quality. For coding support you can settle for Llama 3.2 or a smaller deepseek-r1 and still get most of what you need on a smaller GPU, then scale up to a bigger model that will run slower if you need something cleaner. I’ve had a small laptop with 16 GB of total memory and a 4060 mobile serving as a makeshift home server with an LLM and a few other things and… well, it’s not instant, but I can get the sort of thing you need out of it.

    Sure, if I’m digging in and want something faster I can run something else in my bigger PC GPU, but a lot of the time I don’t have to.

    Like I said below, though, I’m in the process of trying to move that to an Arc A770 with 16 GB of VRAM that I had just lying around because I saw it on sale for a couple hundred bucks and I needed a temporary GPU replacement for a smaller PC. I’ve tried running LLMs on it before and it’s not… super fast, but it’ll do what you want for 14B models just fine. That’s going to be your sweet spot on home GPUs anyway; anything larger than 16 GB and you’re talking 3090, 4090 or 5090, pretty much exclusively.
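    For what it’s worth, the napkin math behind that sweet spot is simple. A rough sketch using rule-of-thumb numbers (about a gigabyte per billion parameters at 8-bit, half that at 4-bit, plus some slack for KV cache and runtime), not exact figures for any particular quant:

    ```python
    # Back-of-envelope VRAM estimate for a quantized local model.
    # Rule-of-thumb numbers only, not exact for any specific quant or runtime.

    def approx_vram_gb(params_billion: float, bits_per_weight: float, overhead_gb: float = 1.5) -> float:
        """Weights plus a rough allowance for KV cache and runtime overhead."""
        weights_gb = params_billion * bits_per_weight / 8  # 1B params at 8-bit ~ 1 GB
        return weights_gb + overhead_gb

    for name, size in [("7B", 7), ("14B", 14), ("32B", 32)]:
        print(f"{name}: ~{approx_vram_gb(size, 4):.0f} GB at 4-bit, ~{approx_vram_gb(size, 8):.0f} GB at 8-bit")
    ```

    A 14B model at 4-bit lands around 8–9 GB, which is why it fits on a 16 GB card with room left for context, while 32B-class models start spilling past what a single consumer GPU will hold.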


  • This is… mostly right, but I have to say, Macs with 16 gigs of shared memory aren’t all that; you can get plenty of other alternatives with similar memory setups, although not as fast.

    A bunch of vendors are starting to lean into this by providing small, weaker PCs with a BIG pool of shared RAM. That new Framework desktop with an AMD APU specs up to 128 GB of shared memory, while the Mac Minis everybody is hyping up for this cap at 24 GB instead.

    I’d strongly recommend starting with a mid-sized GPU on a desktop PC. Intel ships the A770 with 16 GB of VRAM and the B580 with 12, and they’re both dirt cheap. You can still get a 3060 with 12 GB for similar prices, too. I’m not sure how they benchmark relative to each other on LLM tasks, but I’m sure one can look it up (or just measure it, see below). Cheap as the entry-level Mac Mini is, all of those are cheaper if you already have a PC up and running, and the total amount of dedicated RAM you get is very comparable.
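    If you do want to compare cards yourself, the laziest thing I know is to time a generation on each one. A minimal sketch, assuming a local Ollama server on its default port with a model already pulled (the model name and prompt are just placeholders):

    ```python
    # Rough tokens/sec check against a local Ollama server (default port assumed).
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3.2", "prompt": "Explain mmap in one paragraph.", "stream": False},
        timeout=600,
    )
    stats = resp.json()
    # eval_count = tokens generated, eval_duration = generation time in nanoseconds
    print(f"{stats['eval_count'] / (stats['eval_duration'] / 1e9):.1f} tokens/s")
    ```

    Same prompt, same model, different GPU gives you a number that’s at least comparable between cards, even if it’s not a proper benchmark.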




  • But they fussed about Call of Duty.

    If I’m annoyed about anything it’s that. Gamers are so often using these ostensible customer protection or political affinity issues as a cudgel for what is ultimately a branding preference. This results in excusing some crappy stuff from people they semi-irrationally like (loot boxes in Steam games are fine! we don’t talk about GenAI in InZOI!) but giving extreme amounts of crap to companies they semi-irrationally dislike, even for relatively positive things they do.

    I’d mind less if the difference was based on size or artistic quality, but dude, InZOI is from Krafton. I don’t know that the PUBG guys are the plucky indies I want to stretch my moral stances to support.




  • Hey, at least you’re honest about it.

    I don’t shill for software, man. Not for free, anyway.

    But, you know, I talk to enough people about tech stuff to know that Linux getting name-dropped generates at most a brief flicker of recognition in like 95% of adults, not some half-remembered decades-old stereotypes. There just isn’t enough awareness to support misconception here. And some of the misconception isn’t that “mis” in the first place, by the standards of non-technical normies.

    FWIW, I’d love a free, usable mainstream OS alternative to Apple and Microsoft. I don’t think Linux as currently designed is built to be that effectively, but it’d sure be nice if somebody figured it out. Someone that isn’t Google trying to open yet another revenue stream for ads.


  • Man, scale is such a hard thing to get intuitively.

    I mean, yeah, Linus Sebastian has a huge following. It’s a huge following of self-selected nerds, though. Most people have no idea who he is. Wouldn’t even know what he’s talking about if you showed it to them.

    And that was one thing that he did once. That mostly nobody cared about unless they were already an active Linux fan. Which is itself a tiny niche.

    Humans just have a hard time parsing when things are big or small, particularly if it’s things they are a part of. This is not stupidity, it’s just how human perception works. It works both ways, too. A lot of modern media is about having these parasocial relationships with huge media personalities and thinking you’ve found some hidden gem only to find out that your grandma follows them already.

    It’s not that we’re dumb as a species, it’s that we’ve created this ecosystem built specifically to exploit human perceptual limits for profit and now it’s all we have. It kinda sucks.

    Sorry, I went places there, but this whole thread (and honestly, the entire Lemmy linux community) makes me think about this constantly.


  • You’d think, but at least in my Manjaro install I had the same experience, if not a bit worse, trying to share an exFAT drive as an NTFS drive. I don’t recommend it either way.

    I definitely play enough games without full Linux support that I wouldn’t have switched fully, even if I didn’t need Windows for work. The anticheat issues are one thing, but with a high-end Nvidia card I found a bunch of proprietary features either didn’t work or underperformed compared to Windows. Mix that with an HDR, VRR display and it was a bit of a mess.

    Linux was snappier for desktop office work most of the time, though.


  • Hosting the games on NTFS and loading them into Steam from there under Linux is possible. It is inconsistent and a hassle, though; the usual workaround is sketched at the end of this comment.

    I will say the setup the OP suggests is totally doable, but when I’ve had it that way it turned out to be easier to just do everything else on Windows than to flip back and forth, so after I updated some hardware I haven’t been in a hurry to set up Linux again.

    I’d say it’s more convenient to do this long term if you have two PCs. Maybe a laptop for Linux work and a desktop with a powerful GPU for gaming. Being able to have both on sleep and quickly switching back and forth is less likely to make you (well, me, at least) lazy than having to reboot each time.
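    Since I mentioned the workaround: the usual advice I’ve seen is to keep Proton’s compatdata prefixes on a native Linux filesystem and symlink them into the Steam library sitting on the NTFS drive, since the prefixes are what tend to choke on NTFS. A quick sketch, with paths that are just examples from my layout:

    ```python
    # Sketch of the usual NTFS-library workaround: keep Proton prefixes (compatdata)
    # on a Linux filesystem and symlink them into the Steam library on the NTFS drive.
    # Both paths are examples; adjust to your own mounts.
    from pathlib import Path

    ntfs_steamapps = Path("/mnt/games/SteamLibrary/steamapps")   # library on the NTFS drive
    native_prefixes = Path.home() / ".proton-compatdata"         # lives on ext4/btrfs

    native_prefixes.mkdir(parents=True, exist_ok=True)
    link = ntfs_steamapps / "compatdata"
    if not link.exists():
        link.symlink_to(native_prefixes, target_is_directory=True)
    ```

    The games themselves stay on the NTFS partition; only the Wine prefixes move, and those are where most of the flakiness comes from in my experience.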




  • Those goalposts are moving at supersonic speeds, man.

    “AI driven NPCs” are just chatbots, and generative AI is generative AI. I thought the issue with GenAI was supposed to be that the data for training was of dubious legitimacy (which it certainly still is for these models) and that they were cutting real artists, writers and developers out of the workforce (which these by definition do).

    Nobody seemed to be particularly fine with Stable Diffusion when that came out, and it could be run locally too. I guess we’ve found the level of convenience at which activism will just deal with it.

    Which, again, is fine. I don’t have a massive hate boner against GenAI, even if I do think it needs specific regulation for both training and usage. But there is ZERO meaningful difference between InZOI using AI generation for textures, dialogue and props and Call of Duty using it to make gun skins. Those are the same picture.



  • Yeah, there were a few attempts in the 00s (including several NSFW ones, for some reason). It’s definitely tough to get right. I see the on-paper appeal of InZOI, in that it seems to be going for the same “we’ll do what Maxis won’t” appeal the original Cities: Skylines had. It’s just that with The Sims you risk finding out there was a good reason for what they weren’t doing, I guess.

    I don’t know what’s going on at Maxis. I don’t know that rolling a whole modern platform, games-as-a-service approach into Sims 4 retroactively is the right call, regardless of whether it’s due to a lack of capacity to do it or a strategic choice. I am pretty sure that a lot of the stuff in InZOI isn’t doing it for me, though. Those two ideas can be held at once.


  • I see how some of the weirdness in InZOI is in “so bad it’s hilarious” territory.

    I am not an anti-GenAI zealot, myself. I actually think a few of the ways they use it there are perfectly valid and make sense to support user generation… but they are almost certainly a moderation nightmare that is about to go extremely off the rails. Others are more powerful than The Sims on paper, but the UI seems bonkers and borderline unusable.

    I can see the idea of wanting another Sims successor, or both a successor and a competitor, but it’s hard to see the treatment as anything but hypocritical at this point. If anything, I think it shows that there is a reason why there is such a gap between The Sims’ success and how many viable competitors have surfaced. Turns out The Sims is REALLY hard to get right. Even Sim City, which feels more complex at a glance, was much easier to clone or improve.