brucethemoose

joined 1 year ago
[–] brucethemoose@lemmy.world 9 points 12 hours ago

I mean... No one on any Lemmy instance thinks the US has any leg to stand on.

Even righties I know think the country has kinda gone to hell. Not that any are here.

[–] brucethemoose@lemmy.world 4 points 13 hours ago (1 children)

That’s not what I’m saying. They’ve all but outright said they’re unprofitable.

But revenue is increasing. Now, if it stops increasing like they’ve “leveled out”, that is a problem.

Hence it’s a stretch to assume they’d cut prices on a more expensive model, since that would basically pop their bubble well before 2029.

[–] brucethemoose@lemmy.world 5 points 14 hours ago (4 children)

Or it might not. It would be a huge short term risk to do so.

As FaceDeer said, we truly don't know.

[–] brucethemoose@lemmy.world 7 points 14 hours ago

To be fair, OpenAI's negative profitability has been extensively reported on.

Your point stands though; there's no evidence they're trying to decrease revenue. On the contrary, that would be a huge red flag to any vested interests.

[–] brucethemoose@lemmy.world 2 points 14 hours ago* (last edited 14 hours ago)

BTW you can share NTFS partitions with Windows and Linux, if that's your SO's concern.

I do this, and it works really well! This used to be an issue in Mint because its kernel was kinda old, but no more.

Obviously getting into partitioning is a lot of configuration for most people, but still.
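For reference, a minimal sketch of what the shared mount could look like via the in-kernel ntfs3 driver (kernel 5.15+, which is why older Mint kernels had trouble). The device path and mount point here are hypothetical placeholders, and Windows Fast Startup should be disabled so the partition isn't left in a hibernated, dirty state:

```
# /etc/fstab — device and mount point are examples, adjust to your setup
/dev/nvme0n1p4  /mnt/shared  ntfs3  defaults,uid=1000,gid=1000  0  0
```

The `uid`/`gid` options make the files owned by your regular user instead of root, which is usually what you want for a shared data partition.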

[–] brucethemoose@lemmy.world 14 points 15 hours ago* (last edited 15 hours ago) (19 children)

I don’t buy the research paper at all. Of course we have no idea what OpenAI does because they aren’t open at all, but DeepSeek's published papers suggest it’s much more complex than one model per node… I think they recommended something like a 576-GPU cluster, with a scheme to split experts across it.

That, and going by the really small active parameter count of gpt-oss, I bet the model is sparse as heck.

There’s no way the effective batch size is 8, it has to be waaay higher than that.
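To put rough numbers on "sparse as heck": here is a back-of-the-envelope sketch using the publicly reported gpt-oss-120b parameter counts (~117B total, ~5.1B active per token; treat both figures as approximate):

```python
# Rough MoE sparsity arithmetic. The counts below are the publicly
# reported gpt-oss-120b figures and are approximate.
total_params = 117e9   # total parameters in the model
active_params = 5.1e9  # parameters actually used per token

# Fraction of the model that fires on any given token.
active_fraction = active_params / total_params
print(f"active fraction per token: {active_fraction:.1%}")

# A dense model of the same total size would cost roughly this many
# times more compute per token.
compute_savings = total_params / active_params
print(f"per-token compute vs dense: ~{compute_savings:.0f}x cheaper")
```

Around 4% of the weights active per token is what makes huge serving batches economical: each request only touches a few experts, so the scheduler can pack many requests per GPU.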

[–] brucethemoose@lemmy.world 3 points 15 hours ago* (last edited 15 hours ago) (2 children)

Fair.

And it’s the default.

Eventually, OEMs might find all this stuff is hurting sales and start offering Linux. I think that would be huge, as 99% of buyers will just stick to the default.

[–] brucethemoose@lemmy.world 2 points 15 hours ago* (last edited 15 hours ago) (4 children)

Even if true, that doesn't mean you have to put up with it everywhere else.

I dual boot Linux. These days I only boot an extremely neutered Windows for games or anything HDR, and basically nothing else. Honestly, a lot of old Windows stuff works better in Wine anyway.

[–] brucethemoose@lemmy.world -3 points 1 day ago* (last edited 1 day ago)

Do people in China really think this?

Turn on Fox News, or Left comedy shows. Or major influencers.

Americans' biggest threat is… other Americans. Specifically the other side. I haven’t heard my mega-conservative or mega-liberal family, or anyone really, even utter the word “China” in a while. Honestly, the only place I see it now is finance news, and they’re just jawboning to move stocks anyway.

Trump and senators do say it sometimes I guess, but TBH it’s mostly on deaf or bored ears.

Can’t speak for the UK, but I imagine they are starting to look across the pond with worry.

[–] brucethemoose@lemmy.world 1 points 1 day ago

I honestly think Zuck is more of a coward than the Palantir CEO and that techno feudalism crowd. He’s just so obviously insecure in all his decision making it’s unreal.

[–] brucethemoose@lemmy.world 1 points 1 day ago* (last edited 1 day ago) (2 children)

Zuckerberg is such a coward.

He bends over backwards for even the slightest change in wind, like VR or a fascist govt. He dropped open-weight llama at the first experimental stumble.

Mark my words: if mega liberals got into power, he’d fire this guy and act as woke as can be.

[–] brucethemoose@lemmy.world -1 points 2 days ago* (last edited 2 days ago)

I mean, you might as well do it right then. Use free, crowd-hosted roleplaying finetunes, not a predatory OpenAI frontend.

https://aihorde.net/

https://lite.koboldai.net/

Reply/PM me, and I’ll spin up a 32B or 49B instance myself and prioritize it for you, anytime. I would suggest this over ollama as the bigger models are much, much smarter.
