this post was submitted on 12 Mar 2026
1729 points (99.1% liked)
Programmer Humor
you are viewing a single comment's thread
With 32 and 64 GB systems I've never run out of RAM, so the RAM isn't the issue at all.
Optimization just sucks.
Have you ever tried running a decent sized LLM locally?
Decent sized for what?
Creative writing and roleplay? Plenty, but I try to fit it into my 16 GB VRAM as otherwise it's too slow for my liking.
Coding/complex tasks? No, that would need 128 GB and upwards, and it would still be awfully slow. Unless you use a Mac with unified memory.
For image and video generation you'd want to fit it into GPU VRAM again, system RAM would be way too slow.
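The VRAM numbers above follow from a simple back-of-the-envelope rule: weight memory is roughly parameter count times bytes per weight, plus some overhead for the KV cache and runtime buffers. A rough sketch (the 1.2 overhead multiplier is an assumption; real usage also varies with context length and quantization format):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Approximate memory (GB) needed to run a model: weights plus a
    fudge factor for KV cache and runtime buffers (assumed 20%)."""
    weight_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits = 1 GB
    return weight_gb * overhead

# A 70B model at 4-bit quantization wants roughly 42 GB -- well past
# 16 GB of VRAM -- while a 13B model at 4-bit fits in about 8 GB.
print(round(estimate_vram_gb(70, 4), 1))  # ~42.0
print(round(estimate_vram_gb(13, 4), 1))  # ~7.8
```

By this rule of thumb, the "128 GB and upwards" figure for coding models corresponds to something like a 70B+ model at higher precision, or a much larger model quantized.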
I use a Mac with unified memory, so that distinction slipped my mind.