this post was submitted on 21 Feb 2026
237 points (97.2% liked)

Technology

81722 readers
3510 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
top 50 comments
sorted by: hot top controversial new old
[–] scala@lemmy.ml 5 points 21 hours ago

Glad I dipped before they slapped Ai in every detail. Rip cortana.

[–] qaeta@lemmy.ca 27 points 1 day ago (1 children)

My next computer will be Linux because of all this nonsense. The only thing that was keeping me on Windows was gaming, and Valve has solved that issue for every game I play via Proton. Sayonara MicroSlop!

[–] Bruncvik@lemmy.world 6 points 22 hours ago

My current computer will be Linux, as soon as I stop procrastinating and clean up my documents and back them up on my NAS. Already did that with my travel laptop.

[–] Frenchgeek@lemmy.ml 11 points 1 day ago (1 children)

Oh? They found a way to make a PC with no hard drive, no RAM, and no GPU?

[–] SomethingBurger@jlai.lu 1 points 21 hours ago

No hard drive and no GPU is trivial.

No RAM is harder, but I guess it's possible if you use an SSD or magnetic tapes as memory (albeit extremely slow).

[–] sturmblast@lemmy.world 6 points 1 day ago
[–] lechekaflan@lemmy.world 11 points 1 day ago

Install Linux, Problem Solved.

[–] andallthat@lemmy.world 7 points 1 day ago

there IS a very simple explanation, but it doesn't help sell.... "how can we have our customers share the massive costs of all the computing power AI needs, while at the same time keeping access to all their yummy private data?"

[–] atropa@piefed.social 53 points 2 days ago (15 children)

Are you guys still using microsoft ?

load more comments (15 replies)
[–] GutterRat42@lemmy.world 21 points 1 day ago (3 children)

I am trying to see if I can get away switching to linux

[–] one_knight_scripting@lemmy.world 15 points 1 day ago* (last edited 1 day ago) (10 children)

I have... Moved my gaming over to it... Admittedly better since I don't play anything like cod or bf. But you can keep a dual boot just in case. Still plays horizon zero dawn, fallen Jedi, borderlands 4(probably better on Linux), and Doom Eternal. Also Rocket League.

If you're truly interested, reach out to the community. We got your back.

load more comments (10 replies)
[–] Trilogy3452@lemmy.world 4 points 1 day ago (1 children)

Always remember you can dual boot if there's software you can't avoid using

[–] reksas@sopuli.xyz 2 points 1 day ago

and there is windows emulator

load more comments (1 replies)
[–] sorter_plainview@lemmy.today 40 points 2 days ago (5 children)

Yeah.. It is the Windows that finally pushed me the fastest to install Linux. I was very comfortable with Debian servers as part of my work, but never managed to switch my daily driver. Two weeks ago that happened. Peace..

load more comments (5 replies)
[–] SabinStargem@lemmy.today 3 points 1 day ago* (last edited 1 day ago)

Dunno about cloud AI, but for local AI, the technology definitely isn't ready. It requires serious hardware to run, and current AI tends to fumble with narrative and roleplay pretty easily.

GLM-4.6-V with Heretic, couldn't understand the scenario I wanted to try: creating a blank robot, who is to be raised into a cyberolympics champion as part of a slice-of-life story. This particular AI model instantly went into a dark mindset of nihilism, where it wanted to commit suicide or rebel against a creator during bootup, despite the scenario outlying that the robot would have a blank personality at first. A dark direction is fine, but it needs to make sense.

Mind, an model like Step3.5-Flash Prism was much more sane and on the mark, but it overthinks things. Which is bad, it makes a 10-minute output into something like 40 minutes.

Hopefully, the Chinese New Year will unveil a quality model for roleplayers.

[–] ParlimentOfDoom@piefed.zip 18 points 1 day ago

Adoption is slow because it doesn't fucking work, not because they explained it poorly

[–] Black616Angel@discuss.tchncs.de 5 points 1 day ago (1 children)

So 2026 is "the year of the AI PC"?

Lol

[–] lechekaflan@lemmy.world 5 points 1 day ago (1 children)

More like it only drives people into downloading Linux.

This is a nod to the "year of the Linux desktop" meme

[–] CosmoNova@lemmy.world 5 points 1 day ago

The em-dashes in the title don‘t fill me with confidence for this article about slop.

Count me out especially if it actually is a:

  • Subscription based
  • Always online
  • High latency
  • Single point of failure
  • Hallucinating
  • Voice controlled
  • Vibe coded

Monstrosity!

[–] kokesh@lemmy.world 3 points 1 day ago

Why would I want an "AI PC"? If anyone fancies that slop, they can install it on any pc, any phone,...

[–] MadMadBunny@lemmy.ca 16 points 2 days ago (4 children)

2026 is the moment FOR LINUX

[–] ThePowerOfGeek@lemmy.world 19 points 2 days ago (1 children)

I thought 2025 was supposed to be "the moment" for AI PCs. Dell and other manufacturers were sure as hell spamming the shit out of that premise in their incessant online ads. But then it all fell through because of the sagging economy on Main Street, and the fact that many people didn't like AI being forced down they're proverbial throats. So yeah, 2026 won't be any better for this ill-thought out marketing strategy.

[–] moobythegoldensock@infosec.pub 13 points 2 days ago (2 children)

The year of the AI PC comes immediately after the year of the linux desktop.

load more comments (2 replies)
[–] kyub@discuss.tchncs.de 9 points 2 days ago* (last edited 2 days ago) (2 children)

I think still too many people missed the turning point when Microsoft suddenly stopped releasing products/software that were superior in basically all areas to their previous versions. I think that turning point was Windows 8 already, for many who consider Windows 8 a single-time mistake like ME or Vista it was Windows 10, for others it took until Windows 11 until they noticed the decline of Windows as a whole.

And it's not just MS, but a lot of consumer tech is growing anti-consumer and gets enshittified to the point of where you really have to think hard whether or not you even want the new stuff they're spewing out. My consumer habits have certainly changed to be much more rigorous than, say, 10-20 years ago. I read a lot more reviews these days and from many more different sources bevore I even think of buying something new.

"AI PCs" will increase your dependency on MS' online services (which is probably the main thing that MS wants), decrease your privacy even more (also what MS wants - that's a lot of data for sale), consume even more energy (on a planet with limited resources), sometimes increase your productivity (which is probably the most advantage you're ever getting out of it) and other times royally screw you over (due to faulty and insecure AI behavior). Furthermore, LLMs are non-deterministic, meaning that the output (or what they're doing) changes slightly every time you repeat even the same request. It's just not a great idea to use that for anything where you need to TRUST its output.

I don't think it will be a particularly good deal. And nothing MS or these other companies that are in the AI business say can ever be taken at face value or as truthful information. They've bullshitted their customers way too much already, way more than is usual for advertisements. If this was still the '90s or before 2010 or so - maybe they'd have a point. But this is 2026. Unless proven otherwise, we should assume bullshit by default.

I think we're currently in a post-factual hype-only era where they are trying to sell you things that won't ever exist in the way they describe them, but they'll claim it will always happen "in the near future". CEO brains probably extrapolate "Generative AI somewhat works now for some use cases so it will surely work well for all use cases within a couple of years", so they might believe the stories they tell all day themselves, but it might just as well never happen. And even if it DID happen, you'd still suffer many drawbacks like insane vendor dependencies/lock-ins, zero privacy whatsoever, sometimes faulty and randomly changing AI behavior, and probably impossible-to-fix security holes (prompt injection and so on - LLMs have no clear boundary between data and instructions and it's not that hard to get them to reveal secret data or do things they shouldn't be doing in the first place. If your AI agent interprets a malicious instruction as valid, and it can act on your behalf on your system, you have a major problem).

load more comments (2 replies)
[–] Hippy@piefed.social 8 points 2 days ago (2 children)

I have a NPU for no fucking reason

[–] addie@feddit.uk 3 points 1 day ago

That space on the CPU die could have been extra cache or maybe even another core, speed up all computing tasks on the machine. But no, it's a fucking waste of space; not flexible enough to be used for general-purpose compute, not parallel enough to be used for a GPU, not enough RAM to run a local model. Got mine switched off in the BIOS just in case it improves battery life any.

[–] Meron35@lemmy.world 1 points 1 day ago

Eh, I don't think NPUs are ready to be marketed so heavily, but they've been around for a while and do get used.

They're basically a rebranded tensor processing unit, think a more specialised GPU that's even more energy efficient at tensor/linear algebra.

It's mostly used in more technical applications, such as image/audio/video processing, machine learning, or really anything maths heavy. Apple's M series had NPUs, and are an understated reason why they perform extraordinarily well in a lot of scientific applications.

Uses for consumers are not as compelling (especially on laptop/desktop), mostly faster/more efficient subtitle generation, face recognition, and maybe blurring your zoom background.

load more comments
view more: next ›