this post was submitted on 20 Nov 2025
421 points (99.1% liked)

Technology

[–] floofloof@lemmy.ca 157 points 2 weeks ago (1 children)

"We were still required to find some ways to use AI. The one corporate AI integration that was available to us was the Copilot plugin to Microsoft Teams. So everyone was required to use that at least once a week. The director of engineering checked our usage and nagged about it frequently in team meetings."

The managerial idiocy is astounding.

[–] gravitas_deficiency@sh.itjust.works 51 points 2 weeks ago (3 children)

It’s pretty easy to set up a cron job to fire off some sort of bullshit LLM request a handful of times a day during working hours. Just set it and forget it.
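Roughly something like this, as a sketch (it assumes some OpenAI-compatible chat endpoint rather than the Teams Copilot plugin mentioned above; the URL, model name, and prompts are all placeholders):

  # busywork.py - fire off a pointless LLM request so the usage dashboard stays green.
  # Sketch only: assumes an OpenAI-compatible chat endpoint; the URL, API key,
  # model name, and prompts are all placeholders.
  import os
  import random

  import requests

  BASE_URL = os.environ.get("LLM_BASE_URL", "https://llm.example.corp/v1")
  API_KEY = os.environ["LLM_API_KEY"]

  PROMPTS = [
      "Summarize the concept of synergy in one sentence.",
      "Write a haiku about quarterly OKRs.",
      "Explain agile estimation to a golden retriever.",
  ]

  resp = requests.post(
      f"{BASE_URL}/chat/completions",
      headers={"Authorization": f"Bearer {API_KEY}"},
      json={
          "model": "gpt-4o-mini",  # placeholder model name
          "messages": [{"role": "user", "content": random.choice(PROMPTS)}],
      },
      timeout=60,
  )
  resp.raise_for_status()

  # crontab entry: 9am, 1pm, and 4pm on weekdays, then forget about it
  # 0 9,13,16 * * 1-5 /usr/bin/python3 /home/you/busywork.py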

[–] acosmichippo@lemmy.world 35 points 2 weeks ago (1 children)

you could probably even get copilot to write it!

[–] brsrklf@jlai.lu 29 points 2 weeks ago (1 children)

"Prompt yourself with some bullshit so that it looks like you're doing something productive."

Who knows, maybe that's how you attain AGI? What is a more human kind of intelligence than looking for ways to be a lazy fuck?

[–] queerlilhayseed@piefed.blahaj.zone 23 points 2 weeks ago* (last edited 2 weeks ago)

Prompt an LLM to contemplate its own existence every 30 minutes, give it access to a database of its previous outputs on the topic, boom you've got a strange loop. IDK why everyone thinks AGI is so hard.
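Something like this, as a rough sketch (assumes the Anthropic Python SDK with an API key in the environment; the model name is a placeholder, and the half-hour contemplation loop is just the joke written down):

  # strange_loop.py - the half-serious AGI recipe above, written down.
  # Assumes the Anthropic Python SDK (pip install anthropic) and ANTHROPIC_API_KEY
  # set in the environment; the model name is a placeholder.
  import sqlite3
  import time

  import anthropic

  client = anthropic.Anthropic()
  db = sqlite3.connect("navel_gazing.db")
  db.execute("CREATE TABLE IF NOT EXISTS thoughts (ts REAL, text TEXT)")

  while True:
      # Feed the model its most recent musings so it can build on them.
      previous = [row[0] for row in db.execute(
          "SELECT text FROM thoughts ORDER BY ts DESC LIMIT 5")]
      prompt = ("Contemplate your own existence. "
                "Your previous thoughts, newest first:\n" + "\n".join(previous))
      reply = client.messages.create(
          model="claude-sonnet-4-20250514",  # placeholder model name
          max_tokens=500,
          messages=[{"role": "user", "content": prompt}],
      )
      db.execute("INSERT INTO thoughts VALUES (?, ?)",
                 (time.time(), reply.content[0].text))
      db.commit()
      time.sleep(30 * 60)  # every 30 minutes, as prescribed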

[–] tias@discuss.tchncs.de 4 points 2 weeks ago (1 children)

Not when you have to do SAML authentication to get a token for your AD account first.

[–] brsrklf@jlai.lu 2 points 2 weeks ago (1 children)

Unless you know that guy working on both API management and the identity provider.

If, hypothetically, someone came to that person with a problem like that, they might do it just for fun. Allegedly.

[–] fibojoly@sh.itjust.works 2 points 1 week ago

That person and I seem to be doing similar jobs because I certainly would take on that challenge (SAML is such a fucking nightmare...)

[–] brsrklf@jlai.lu 106 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Nothing says AI is a clever use of your resources like enforcing a mandatory AI query quota on your employees and watching them struggle, and fail, to find anything it's good at.

[–] WanderingThoughts@europe.pub 17 points 2 weeks ago

So, it's a DAI requirement

[–] punrca@piefed.world 82 points 2 weeks ago (2 children)

The software engineer acknowledged that AI tools can help improve productivity if used properly, but for programmers with relatively limited experience, he feels the harm is greater than the benefit. Most of the junior developers at the company, he explained, don't remember the syntax of the language they're using due to their overreliance on Cursor.

Good luck to the future developers, I guess.

Companies that've spent money on AI enterprise licenses need to show some sort of ROI to the bean-counters. Hence, mandates.

Can't wait for the AI bubble to pop. If this continues, expect more incidents/outages caused by AI-generated slop code.

[–] scarabic@lemmy.world 5 points 2 weeks ago* (last edited 2 weeks ago) (5 children)

From what I see, the tide is beginning to turn a little toward valuing senior devs more than ever, because they can deal with the downsides of AI. Junior devs, on the other hand, cannot, and their simpler coding work is also more easily replaced by AI. So we'll see fewer junior dev jobs, but seniors might do fine. I'm not sure that's good news for the profession as a whole, but it's been an extremely long gold rush into software and online services, so some correction probably won't be the end of the trade.

Oh, and yes, senior devs are still hounded to use AI, because it will get them further, faster, and there are no more junior devs to help. In the hands of a skilled dev, AI tools can be powerful: they can spare some toil and help them find their feet in less familiar frameworks and foreign codebases.

[–] aesthelete@lemmy.world 25 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

The problems in software still remain the same though:

(1) Bureaucracy

(2) Needless process

(3) Pointy headed managers

(4) Siloed teams

(5) Product people who have no idea what they want to build

(6) Shitty, poorly performing legacy code nobody wants to touch

Honestly, AI is just the latest thing that can boost your productivity at starting up some random app. But that was never the difficult part anyway.

[–] squaresinger@lemmy.world 16 points 2 weeks ago (1 children)

This, so much this.

When I think about what limited my performance over the last year, it was mostly:

  • Having to get 5 signatures before I'm allowed the budget to install some FOSS software on my work PC that the corporation has already approved for use on work PCs
  • Spending 8 months working on a huge feature that was then scrapped
  • Being told that no, we cannot pick up another of the many large feature requests in the pipeline, because our team committed to fitting only the now-scrapped feature into this year, and we are not allowed to replan around the fact that business scrapped it

And then they tell us to return to office and use AI for increasing efficiency.

It's all an elaborate play performed by upper management to feign being in control and busy with something. Nobody is actually interested in producing a product; they all just want to further their own position.

[–] Valmond@lemmy.world 2 points 2 weeks ago

The problem is the N+2 is in on it too. And so on. "It just works!"

[–] dreadbeef@lemmy.dbzer0.com 2 points 1 week ago* (last edited 1 week ago)

Code is the easiest thing as a dev. AI won't help me because I'm already a good coder. It's the interconnectedness between services and the dependencies in ownership (who do I talk to when a gateway error occurs vs. a 401 or 403, etc.) that are the hard problems. Getting the right people together to solve the thing, you know? AI doesn't fix that.

[–] jonathan7luke@lemmy.zip 45 points 2 weeks ago

For the FAANG companies, they do it in part so they can then turn around and make those flashy claims you see in headlines, like "95% of our devs use [insert AI product they are trying to sell] daily" or "60% of our code base is now 'written' by our fancy AI".

[–] Septimaeus@infosec.pub 34 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I’ll admit, some tools and automation are hugely improved with new ML smarts, but nothing feels dumber than hunting for problems to fit the boss’s pet solution.

[–] tyler@programming.dev 5 points 2 weeks ago (2 children)
[–] assaultpotato@sh.itjust.works 23 points 2 weeks ago (1 children)

claude performs acceptably at repetitive tasks when I have an existing pattern for it to follow. "Replicate PR 123, but to add support for object Bar instead of Foo". If I get some of this busy work in my queue I typically just have claude do it while I'm in a meeting.

I'd never let it do refactors or design work, but as a code generation tool that can use existing code as a template, it's useful. I wouldn't pay an arm and a leg for it, but burning $2 while I'm in a meeting to kill chore tasks is worth it to me.

[–] MangoCats@feddit.it 6 points 2 weeks ago

Agreed; I've been using Claude extensively for about a month, and before that for little stuff for about 3 months. It is great at little stuff. It can whip out a program to do X in 5 minutes flat, as long as X doesn't amount to more than about 1000 lines of code. Need a parser to sift through some crazy combination of logic in thousands of log files? Claude is your man for that job. Want to scan audio files to identify silence gaps and report how many are found? Again, Claude can write the program and generate the report for you in 5 minutes flat (plus whatever time the program takes to decode the audio...).

Need something more complex, nuanced, multi-faceted? Yeah, it is still easier to do most of the upper level design stuff yourself, but if you can build a system out of a bunch of little modules, AI is getting pretty good at writing the little modules.
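For a sense of scale, the silence-gap example above is roughly this much code, which is about the size of job these tools seem to handle well (a sketch only: it assumes pydub is installed with ffmpeg on the PATH, and the 1 s / -40 dBFS thresholds are arbitrary):

  # silence_report.py - the throwaway "silence gaps" utility described above.
  # Sketch only: assumes pydub (with ffmpeg available for anything beyond WAV);
  # the 1-second / -40 dBFS thresholds are arbitrary choices.
  import sys
  from pathlib import Path

  from pydub import AudioSegment
  from pydub.silence import detect_silence

  for path in sys.argv[1:]:
      audio = AudioSegment.from_file(path)
      # Gaps of at least 1 second that sit below -40 dBFS, as [start, end] in ms.
      gaps = detect_silence(audio, min_silence_len=1000, silence_thresh=-40)
      print(f"{Path(path).name}: {len(gaps)} silence gaps")
      for start_ms, end_ms in gaps:
          print(f"  {start_ms / 1000:.1f}s - {end_ms / 1000:.1f}s")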

[–] Septimaeus@infosec.pub 6 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

For example, the tools for the really tedious stuff, like large-codebase refactoring for style keeping, naming-convention adherence, all kinds of code smells, whatever. Lots of those tools have gotten ML upgrades and are a lot smarter and more powerful than what I remember from a decade ago (IntelliSense, JetBrains helper functions, various opinionated linter toolchains, and so forth).

While I've only experimented a little with some of the more explicitly generative LLM-based coding assistant plugins, I've been impressed (and a little spooked) at how good they often are at guessing what I'm doing well before I've finished doing it.

I haven’t used the prompt-based LLMs at all, because I’m just not used to it, but I’ve watched nearby devs use them for stuff like manipulating a bunch of files in a repeated pattern, breaking up a spaghetti method into reusable functions, or giving a descriptive overview of some gnarly undocumented legacy code. They seem pretty damn useful.

I’ll integrate the prompt-based tools once I can host them locally.

[–] MangoCats@feddit.it 5 points 2 weeks ago

In the work I have done with Claude over the past months, I have not learned to trust it for big things; if anything, the opposite. It's a great tool, but, to anthropomorphize, its "hallucination rate" is down there with my less trustworthy colleagues. Ask it to find all instances of X in a code base of 100 files of 1000 lines each... it seems to get bored or off-track quite a bit, misses obvious instances, and finds a lot but misses too much to call it a thorough review. If you can get it to develop a "deterministic process" for you (a shell script or program) and test that program, then that you can trust more, but when the LLM is in the loop it just isn't all there all the time. Worse: it'll do some really cool and powerful things 19 times out of 20, and then, just when you think you can trust it, it will screw up an identical-sounding task horribly.

I was just messing around with it and had it running a file-organization-and-commit process for me. It worked pretty well for a couple of weeks, then one day it just screwed up and irretrievably deleted a bunch of new work. Luckily it was only 5 minutes of its own work, but still... that's not a great result.

[–] BackgrndNoize@lemmy.world 24 points 2 weeks ago (2 children)

These scummy fucks even put it as a requirement in job descriptions these days

[–] MonkderVierte@lemmy.zip 30 points 2 weeks ago

This is a red flag for corpo culture shenanigans. Dodge the bullet.

[–] floofloof@lemmy.ca 10 points 2 weeks ago

What even is the requirement? "Must be able to ask a chatbot to do stuff"?

[–] supersquirrel@sopuli.xyz 19 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

Then unionize! Nothing else will stop this.

[–] phil@lymme.dynv6.net 11 points 2 weeks ago* (last edited 2 weeks ago) (2 children)
[–] floofloof@lemmy.ca 26 points 2 weeks ago

And it won't be the rich that get hurt when the AI bubble bursts. It will be us.

[–] brianpeiris@lemmy.ca 6 points 2 weeks ago

I'd like it to really be a bubble so that we can move past this nonsense phase, and it may well be one, but I could also see it dragging on for years, since there's so much money being pumped into it and governments are buying into the hype as well.

[–] devfuuu@lemmy.world 2 points 2 weeks ago (2 children)

Unions aren't really a concept that's available to devs. At least around here.

[–] Forbo@lemmy.ml 1 points 2 weeks ago

I just attended an organizer training, and 70% of the people there were devs. Don't believe the corporate bullshit, unions are for everyone.

[–] resipsaloquitur@lemmy.world 19 points 2 weeks ago (1 children)

He also said the AI-generated code is often full of bugs. He cited one issue that occurred before his arrival that meant there was no session handling in his employer's application, so anybody could see the data of any organization using his company's software.

It’s only financial software, NBD.

[–] phoenixz@lemmy.ca 8 points 2 weeks ago

Well, to be fair, financial data should be public; it would stop so many crimes and so much corruption.

Maybe AI saw the problems that hidden financial data causes and just decided to do the world a favor!

[–] HazardousBanjo@lemmy.world 19 points 2 weeks ago (12 children)

As per usual, those pushing for AI the most are the ones who don't fucking use it.

Is AI good for printing out the syntax, or an example of a library you haven't used before?

Sure, sometimes yes. Sometimes no.

Should it be a requirement to be a regular part of software development?

No. AI hallucinates very often and is imitative in nature, not innovative.

[–] chilicheeselies@lemmy.world 2 points 1 week ago

More generally, no one should be required to do anything in particular until it affects the team. Forcing people to work a certain way is beyond stupid.

[–] python@lemmy.world 15 points 2 weeks ago (8 children)

I've been refusing to use any AI tools at all and luckily my manager respects that, even if he uses AI for basically everything he does. If the company ever decides to mandate it I'll just have the AI write all my code and commit it with no checks. With the worker's rights here, it'll take several months to fire me anyways.

[–] jjjalljs@ttrpg.network 12 points 2 weeks ago

Managers are often idiots in over their heads. AI is really aggravating that problem.

[–] phutatorius@lemmy.zip 6 points 2 weeks ago

My team have been trying it. So far, at best, it costs money but makes no difference in outcomes. Any productivity gains are wiped out by the time needed to diagnose and correct the errors it introduces.

I'd use Clippy before I use any of that time-wasting, unreliable, energy-guzzling crap.
