this post was submitted on 20 Jan 2026
606 points (99.0% liked)

Technology

78923 readers
3390 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
 

Big tech boss tells delegates at Davos that broader global use is essential if technology is to deliver lasting growth

[–] makyo@lemmy.world 9 points 15 hours ago (2 children)

Is that true? I haven’t heard MS say anything about enabling local LLMs. Genuinely curious and would like to know more.

[–] IcedRaktajino@startrek.website 5 points 15 hours ago* (last edited 15 hours ago) (1 children)

Isn't that the whole shtick of the AI PCs no one wanted? Like, isn't there some kind of non-GPU co-processor that runs the local models more efficiently than the CPU?

I don't really want local LLMs, but I won't begrudge those who do. Still, I wouldn't trust any proprietary system's local LLM not to feed personal info back for "product improvement" (which, for AI, means your data becomes training data).

[–] Feyd@programming.dev 2 points 14 hours ago

NPU: Neural Processing Unit

[–] tal@lemmy.today 2 points 15 hours ago* (last edited 15 hours ago)

That's why they have the "Copilot+ PC" hardware requirement: they're using an NPU on the local machine.

searches

https://learn.microsoft.com/en-us/windows/ai/npu-devices/

Copilot+ PCs are a new class of Windows 11 hardware powered by a high-performance Neural Processing Unit (NPU) — a specialized computer chip for AI-intensive processes like real-time translations and image generation—that can perform more than 40 trillion operations per second (TOPS).
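In practice, "using the NPU" usually means routing inference through something like ONNX Runtime. A minimal sketch, assuming the QNN execution provider that targets the Qualcomm NPUs in current Copilot+ machines (the model path and backend options here are placeholders, not Microsoft's actual setup):

```python
# Minimal sketch (not Microsoft's implementation): load a quantized ONNX model
# and ask ONNX Runtime to run it on the NPU via the QNN execution provider,
# falling back to CPU if the NPU path isn't available.
# "model.onnx" and the backend options are placeholders.
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=[
        ("QNNExecutionProvider", {"backend_path": "QnnHtp.dll"}),  # Hexagon NPU backend
        "CPUExecutionProvider",  # fallback
    ],
)

# Check which provider actually got picked.
print(session.get_providers())
```

If the QNN provider isn't available, it silently falls back to CPU: same model, same output, just much slower.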

That NPU isn't... terribly beefy, though. Like, I have a Framework Desktop with an APU and 128GB of memory that schlorps down 120W or something, and it substantially outdoes what you're going to do on a laptop. And that in turn is weaker computationally than something like the big Nvidia hardware going into datacenters.

But it is doing local computation.
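To put very rough numbers on that comparison: for local LLMs, generation speed is mostly limited by memory bandwidth rather than TOPS, so a back-of-envelope sketch looks like this (every bandwidth and model size below is an assumed ballpark figure, not a benchmark):

```python
# Back-of-envelope only: token generation is mostly memory-bandwidth bound, so
# tokens/sec is roughly (memory bandwidth) / (bytes read per token, ~ model size).
# Every figure below is an assumed ballpark, not a measured number.

def tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    return bandwidth_gb_s / model_size_gb

model_70b_q4 = 40.0  # ~40 GB for a 70B model quantized to 4-bit (assumed)
model_8b_q4 = 5.0    # ~5 GB for an 8B model at 4-bit (assumed)

# A 70B model doesn't even fit in a typical 16-32 GB laptop, which is why the
# 128 GB desktop matters at all; the laptop line below uses the small model.
print(tokens_per_sec(120.0, model_8b_q4))    # ~24 tok/s, ~120 GB/s laptop memory (assumed)
print(tokens_per_sec(250.0, model_70b_q4))   # ~6 tok/s, ~250 GB/s APU memory (assumed)
print(tokens_per_sec(3000.0, model_70b_q4))  # ~75 tok/s, ~3 TB/s datacenter GPU (assumed)
```

The point being: the big win of the 128GB box is that large models fit at all, and the datacenter cards then win again on raw bandwidth and capacity.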