this post was submitted on 13 Aug 2025
5 points (100.0% liked)

Technology

I hadn't heard about thermodynamic computing before, but it sounds pretty interesting. As IEEE Spectrum explains, "the components of a thermodynamic chip begin in a semi-random state. A program is fed into the components, and once equilibrium is reached between these parts, the equilibrium is read out as the solution. This computation style only works with applications that involve a non-deterministic result ... various AI tasks, such as AI image generation and other training tasks, thrive on this hardware." It sounds almost like quantum computing to my layperson's ears. [edit: fixed link]
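As a layperson's analogy in plain Python (my own toy sketch, not how the actual hardware works), the "start semi-random, relax to equilibrium, read out the answer" idea looks a bit like noisy relaxation toward the minimum of an energy function. The matrix and vector here are made-up example values:

```python
import numpy as np

# Toy software analogy only: the "program" is the energy function
# E(x) = 0.5 x^T A x - b^T x, whose minimum is the solution of A x = b.
# A and b are made-up example values, not anything from a real chip.
rng = np.random.default_rng(0)

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])        # symmetric positive definite
b = np.array([1.0, 4.0])

x = rng.normal(size=2)            # components start in a semi-random state
step, noise = 0.01, 0.05          # relaxation rate and thermal-noise scale

samples = []
for t in range(20000):
    grad = A @ x - b                                   # force pulling toward equilibrium
    x += -step * grad + noise * np.sqrt(step) * rng.normal(size=2)
    if t > 5000:                                       # after equilibration, collect samples
        samples.append(x.copy())

print("equilibrium read-out:", np.mean(samples, axis=0))   # hovers around the true solution
print("exact solution:      ", np.linalg.solve(A, b))
```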

top 4 comments
[–] Quazatron@lemmy.world 2 points 2 days ago* (last edited 2 days ago)

In the early days of computing, people experimented with different signaling schemes between electronic components (valves and later transistors). Decimal and ternary were tried and abandoned because binary is much easier to implement and far more resistant to noise.

What we do today is simulate non-deterministic (noisy) signals in LLMs using deterministic (binary) signals, which is a massively inefficient way to do it.
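To illustrate what I mean (a toy sketch with made-up numbers, not real model code): even the "random" part of an LLM, sampling the next token, is produced entirely by deterministic binary arithmetic plus a pseudo-random number generator.

```python
import numpy as np

# The "noise" is emulated: a deterministic PRNG shuffling bits.
rng = np.random.default_rng(42)

logits = np.array([2.0, 1.0, 0.1])               # made-up scores for 3 hypothetical tokens
probs = np.exp(logits) / np.exp(logits).sum()    # softmax, all exact float arithmetic

token = rng.choice(len(probs), p=probs)          # "random" choice, computed deterministically
print("sampled token index:", token, "probabilities:", probs.round(3))
```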

I expect more chips with a neuron-like architecture will be coming out in the next few years. It would certainly be a benefit for the environment, because the cat isn't going back in the bag.

[–] deegeese@sopuli.xyz 1 points 2 days ago (1 children)

A sloppier compute architecture, needed to drive down the cost of a sloppier method of computing.

[–] finitebanjo@lemmy.world 1 points 2 days ago (1 children)

If it makes AI cheaper then great because AI is a massive fucking waste of power, but other than that I am grossed out by this tech and want none of it.

[–] randomblock1@lemmy.world 1 points 2 days ago

up to 1000x energy consumption efficiency in these workloads

Seems like a win to me