[–] tempest@lemmy.ca 2 points 4 days ago (1 children)

The real issue, and what those AI-dependent companies are banking on, is capturing a user base: when OpenAI starts to reach the end of the road on LLM improvements and moves into the extraction phase, it can buy these little companies to ingest their user bases.

Everyone else in that space will be instantly fucked, since they'll be competing directly with OpenAI while also paying its margin, but that's the bet they're making.

[–] CatsPajamas@lemmy.dbzer0.com 1 points 4 days ago

They're banking on diffusion eliminating the hallucination problem, but it's still too onerous to run very large models that way. Autoregressive LLMs are a dead end, but one that's still a long way off; LLMs in general are not, and we'll keep learning a lot about them as we keep deploying them. Anyone who thought we were already at a dead end should try the new Gemini; it's a GPT-3.5-to-GPT-4 level of improvement.
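
For anyone not familiar with the distinction being drawn here, this is a rough, purely illustrative sketch (no real model involved; every function, vocabulary, and probability below is invented for the example) of the two decoding styles: an autoregressive model commits to one token at a time and can't revise earlier choices, while a diffusion-style model starts from a fully masked sequence and refines all positions over several steps.

```python
# Toy contrast between autoregressive and diffusion-style text generation.
# Everything here is a stand-in; no actual language model is used.
import random

VOCAB = ["the", "cat", "sat", "on", "mat"]
MASK = "<mask>"

def fake_next_token(prefix):
    # Stand-in for an autoregressive model's next-token prediction.
    return random.choice(VOCAB)

def fake_denoise(tokens):
    # Stand-in for one diffusion/unmasking step: fill some masked positions.
    return [random.choice(VOCAB) if t == MASK and random.random() < 0.5 else t
            for t in tokens]

def autoregressive_decode(length):
    # Each token is sampled conditioned only on what came before;
    # an early mistake can never be revisited.
    out = []
    for _ in range(length):
        out.append(fake_next_token(out))
    return out

def diffusion_decode(length, steps=8):
    # Start fully masked and iteratively refine the whole sequence;
    # positions can keep being filled in across later steps.
    tokens = [MASK] * length
    for _ in range(steps):
        tokens = fake_denoise(tokens)
    return tokens

if __name__ == "__main__":
    print("autoregressive :", autoregressive_decode(6))
    print("diffusion-style:", diffusion_decode(6))
```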