this post was submitted on 20 Mar 2026
25 points (72.7% liked)

Technology

Researchers have developed a new kind of nanoelectronic device that could dramatically cut the energy consumed by artificial intelligence hardware by mimicking the human brain.

The researchers, led by the University of Cambridge, developed a form of hafnium oxide that acts as a highly stable, low‑energy ‘memristor’ — a component designed to mimic the efficient way neurons are connected in the brain.
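The synaptic role described above can be sketched in a few lines. This is a hypothetical illustration (not the paper's method): a memristor crossbar stores a weight matrix as programmable conductances, and applying input voltages to the rows produces column currents equal to a matrix-vector product via Ohm's and Kirchhoff's laws — the analog operation that makes such devices attractive for neural-network inference.

```python
import numpy as np

# Illustrative sketch, values invented: a crossbar of memristors holds a
# weight matrix as conductances G (siemens). Driving row voltages V gives
# column currents I = G^T @ V — an analog dot product, computed in place.
rng = np.random.default_rng(0)
G = rng.uniform(1e-9, 5e-8, size=(4, 3))  # nanosiemens-scale conductances
V = np.array([0.1, 0.2, 0.0, 0.1])        # input voltages (volts)

I = G.T @ V  # column currents (amperes): one matrix-vector product
print(I)
```

The point of the hardware is that this multiply-accumulate happens physically in the array, rather than being simulated as above, so no energy is spent shuttling weights between memory and processor.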

top 21 comments
[–] phutatorius@lemmy.zip 1 points 2 days ago

Not using unnecessary and shitty LLMs would reduce AI energy usage even more.

[–] Lost_My_Mind@lemmy.world 28 points 4 days ago (1 children)

Yeah. I can believe that forces within the human brain could help AI reduce its power consumption.

Step 1) Turn off AI.

Step 2) There is no step 2.

At least, that's what my brain thought.

[–] FreddiesLantern@leminal.space 1 points 2 days ago

And it didn’t delete any system files in the process?

[–] veeesix@lemmy.ca 24 points 4 days ago (1 children)

I was half-expecting that new material to be hubris.

[–] technocrit@lemmy.dbzer0.com 2 points 2 days ago

"AI" huh? Keep throwing money in that grifter black hole.

[–] Zak@lemmy.world 15 points 4 days ago (1 children)

could dramatically cut the energy consumed by artificial intelligence hardware

Decreasing the cost of using a resource almost always results in more use of that resource.

Laboratory tests showed the devices could reliably endure tens of thousands of switching cycles

That's not very many when GPUs perform trillions of operations per second.
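The arithmetic behind this concern can be sketched with illustrative numbers (not from the paper): if a cell with tens of thousands of write cycles of endurance were reprogrammed continuously like working memory, it would wear out almost instantly — the endurance budget only works out if weights are written rarely and read many times.

```python
# Back-of-the-envelope wear-out estimate (invented numbers, for illustration):
endurance_cycles = 5e4   # "tens of thousands" of switching cycles
rewrite_rate_hz = 1e6    # hypothetical 1 MHz continuous reprogramming

lifetime_s = endurance_cycles / rewrite_rate_hz
print(lifetime_s)        # 0.05 — seconds of life if used like volatile memory
```

Read operations, by contrast, don't consume write endurance, which is why the inference-only usage suggested in the reply below this sort of comment is the plausible fit.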

[–] ryannathans@aussie.zone 5 points 4 days ago (1 children)

It'd probably be far more appropriate for an analogue system where it isn't being constantly switched, but rather is what the model is burned onto

[–] very_well_lost@lemmy.world 3 points 4 days ago (1 children)

This seems like such a glaringly-obvious solution to lower inference cost that surely there must be some fundamental flaw in it... otherwise all of the big AI firms would be doing it, right?

Right...?

[–] ryannathans@aussie.zone 2 points 4 days ago

Takes a while for the technology to become available in ASICs; we still don't have purpose-designed silicon for AI. We're still using repurposed GPUs with scaled-up tensor cores for pretty much all AI workloads

[–] felixwhynot@lemmy.world 8 points 4 days ago

I feel like memristors have been a buzzword for like 20 years now… am I wrong? Is it for real this time?

[–] Prox@lemmy.world 7 points 4 days ago

Relevant

AI boosters are no longer allowed to explain what’s good about AI using the future tense. You can no longer say “it will,” “could,” “might,” “likely,” “possible,” “estimated,” “promise,” or any other term that reviews today’s capabilities in the language of the future.

[–] db2@lemmy.world 5 points 4 days ago (2 children)

Shooting a large rocket full of tech bros directly into the sun would have a similar effect.

[–] adespoton@lemmy.ca 4 points 4 days ago (1 children)

Shooting a rocket directly into the sun would waste as much energy as current AI data centers, because it would have to shed all the earth’s momentum.

Better to just use a volcano.

[–] db2@lemmy.world 3 points 4 days ago

We can sell tickets for people to take turns pushing. A dollar per participant sounds reasonable to me.

[–] KairuByte@lemmy.dbzer0.com 2 points 4 days ago (1 children)

Oh great, now we’re gonna start polluting the sun? Can’t we contain our garbage to a single planet?

[–] bleistift2@sopuli.xyz 2 points 4 days ago

All of the things you’d be polluting the sun with are already there.

[–] brendansimms@lemmy.world 4 points 4 days ago (1 children)

Cool tech, but very much in its infancy. Good for them on getting a hype article, but this in no way affects anything about computing today.

[–] technocrit@lemmy.dbzer0.com 1 points 2 days ago

Good for them on getting a hype article

No thanks. There's more than enough bullshit in "AI".

[–] eleitl@lemmy.zip 1 points 3 days ago

Abstract

The escalating energy consumption of existing artificial intelligence hardware has become a serious global issue that demands immediate action. Neuromorphic computing offers promises to drastically reduce this footprint. Here, we introduce multicomponent p-type Hf(Sr,Ti)O2 thin films for energy-efficient, resistive switching–based neuromorphic devices. We demonstrate interfacial memristors with ultralow switching currents (≤~10⁻⁸ A), exceptional cycle-to-cycle and device-to-device uniformities, and retention >10⁵ s. They reveal hundreds of ultralow conductance levels with a modulation range of >50 (without reaching any saturation) and reproducibly satisfy unsupervised learning rules. This performance originates from incorporating a self-assembled p-n heterointerface between p-type Hf(Sr,Ti)O2 and n-type TiOxNy, resulting in a fully depleted space-charge layer asymmetrically extended into Hf(Sr,Ti)O2, a large built-in potential, and extremely low saturation current density under reverse bias. Ultralow conductance modulation is controlled by tuning the p-n heterointerface's energy-barrier height through electro-ionic charge migration. This materials-engineering strategy addresses energy consumption and variability in existing memristors, opening a pathway toward energy-efficient neuromorphic computing systems.
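The "unsupervised learning rules" the abstract refers to are typically spike-timing-dependent plasticity (STDP), which memristive synapses are commonly benchmarked against. The following is a hypothetical sketch of a pair-based STDP rule with invented parameters; the paper's actual test protocol may differ.

```python
import math

# Pair-based STDP sketch (parameters invented for illustration):
A_PLUS, A_MINUS = 0.01, 0.012  # potentiation / depression amplitudes
TAU = 20e-3                    # plasticity time constant (seconds)

def stdp_dw(dt):
    """Weight change for a spike-time difference dt = t_post - t_pre."""
    if dt > 0:   # pre fires before post: strengthen the synapse
        return A_PLUS * math.exp(-dt / TAU)
    else:        # post fires before pre: weaken it
        return -A_MINUS * math.exp(dt / TAU)

print(stdp_dw(5e-3), stdp_dw(-5e-3))
```

In a memristive implementation, each weight update would correspond to nudging a device between adjacent conductance levels — which is why having hundreds of finely spaced, non-saturating levels matters.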