this post was submitted on 21 Aug 2025
499 points (98.3% liked)

Technology


Money quote:

Excel requires some skill to use (to the point where high-level Excel is a competitive sport), and AI is mostly an exercise in deskilling its users and humanity at large.

[–] jubilationtcornpone@sh.itjust.works 113 points 19 hours ago (2 children)

There are things that could be done to improve Excel. For instance, fully integrate python and allow it to be used to create custom functions. Then, maybe one day, VBA can ride off into the sunset where it belongs.
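
For a rough sense of what that could look like, here's a minimal sketch using the third-party xlwings add-in, which already lets you expose Python functions as Excel worksheet functions on Windows today; the function name and logic are made up for illustration, and native first-party support could look entirely different:

```python
# Minimal sketch: a Python function exposed as a custom Excel worksheet
# function via the third-party xlwings add-in (illustrative only).
import xlwings as xw


@xw.func
@xw.arg("values", ndim=1)   # read the range as a flat list
@xw.arg("weights", ndim=1)
def weighted_average(values, weights):
    """Callable from a cell as =weighted_average(A1:A10, B1:B10)."""
    total = sum(weights)
    if total == 0:
        return None  # avoid a division-by-zero error in the sheet
    return sum(v * w for v, w in zip(values, weights)) / total
```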

Adding Copilot to Excel is not an improvement, because Copilot and all the other LLM-based platforms frequently barf out totally incorrect information about how to do something in Excel.

"You do that using formula."

No, I can't, you worthless pile of shit, because THAT FORMULA DOESN'T EXIST.

[–] CameronDev@programming.dev 62 points 18 hours ago (6 children)

Integrated Python scripts in Excel sound like a malware developer's dream.

[–] rollerbang@lemmy.world 9 points 8 hours ago (2 children)

I mean... Yeah, but the same can be said for VB?

[–] CameronDev@programming.dev 2 points 1 hour ago

Yeah, but there's lots more tooling and libraries for Python. It's just one more attack surface 🤷

[–] dual_sport_dork@lemmy.world 3 points 3 hours ago

Especially since VBA can make calls to the Windows API directly and through that avenue do all kinds of funky things to your system.

[–] turkalino@lemmy.yachts 15 points 15 hours ago* (last edited 15 hours ago) (1 children)

Surely there’s some sort of sandboxing that could be done? Like start by disallowing sys calls entirely

[–] CameronDev@programming.dev 6 points 14 hours ago

Definitely, but sandboxes can be escaped, and you can't protect everything via a sandbox. Apparently it's all cloud anyway, but even if it were local and sandboxed, exploits like Rowhammer and Spectre could still pose further risks.

It's taken years to get browser sandboxes to where they are, and even they get broken every so often.
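
To put a concrete (and generic, nothing to do with how Excel actually does it) example behind "sandboxes can be escaped": the naive trick of stripping builtins from an exec/eval environment looks airtight, but Python's object graph still leads back out.

```python
# Toy demonstration that naive Python "sandboxing" leaks.
# Stripping __builtins__ blocks the obvious stuff...
try:
    eval("open('secrets.txt')", {"__builtins__": {}})
except NameError as err:
    print("blocked:", err)

# ...but attribute access needs no builtins, so the object graph is still
# reachable and leads back to powerful classes (warnings machinery, codecs,
# importers, file wrappers, ...), the classic starting point for escapes.
classes = eval("().__class__.__base__.__subclasses__()", {"__builtins__": {}})
print(len(classes), "classes reachable from inside the 'sandbox'")
print(any(c.__module__ == "warnings" for c in classes))  # True on CPython
```

Which is part of why serious isolation gets pushed down to the OS/VM level, or, as here, out to someone else's cloud.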

[–] Semi_Hemi_Demigod@lemmy.world 31 points 18 hours ago (2 children)

And a nightmare for an application developer told to make some app scale with a spreadsheet for a database.

[–] CameronDev@programming.dev 39 points 18 hours ago (1 children)

Could result in some very cursed codebases.

"We dont use git, we just update the excel spreadsheet"

[–] Gork@sopuli.xyz 23 points 17 hours ago

I've worked at places where they did that anyway lol

[–] frongt@lemmy.zip 15 points 17 hours ago (1 children)
[–] Zwuzelmaus@feddit.org 7 points 12 hours ago

Is that creepy thing still alive?

[–] elvith@feddit.org 8 points 15 hours ago (3 children)

They foresaw that. Python in Excel doesn't run locally; it runs in the cloud and then returns the result to you: https://support.microsoft.com/en-us/office/introduction-to-python-in-excel-55643c2e-ff56-4168-b1ce-9428c8308545

[–] magikmw@piefed.social 4 points 7 hours ago

That's even worse!

[–] echodot@feddit.uk 2 points 6 hours ago (1 children)

That's the worst possible solution to that problem. Why can't they just develop their own scripting language that's Turing complete but doesn't have any system calls?

[–] chillhelm@lemmy.world 2 points 4 hours ago (1 children)

Or just use Lua compiled without the system calls, which is what many video games do. It's 2025; there's no need to create new domain-specific languages.
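
For what the Lua route can look like from the host side, here's a small sketch using the lupa Python binding; it strips the os/io libraries out at runtime rather than compiling them out (the weaker cousin of what game embedders do), and everything here is illustrative:

```python
# Sketch: an embedded Lua interpreter with its system-facing libraries
# removed, via the lupa binding. An embedder compiling Lua in could simply
# not open os/io at all; nil-ing them out is the runtime equivalent.
from lupa import LuaRuntime

lua = LuaRuntime()

# Remove everything that touches the filesystem or can run programs.
lua.execute("os = nil; io = nil; dofile = nil; loadfile = nil; require = nil")

print(lua.eval("2 + 2"))     # plain computation still works -> 4
print(lua.eval("type(os)"))  # "nil": no os.execute, os.remove, ...
```

Real game embeddings usually go further and only open the libraries they actually want scripts to have, but the idea is the same.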

[–] tux0r@feddit.org 2 points 2 hours ago

Or use embedded Lisp, like all the cool kids.

[–] CameronDev@programming.dev 9 points 15 hours ago (1 children)

Still sounds like you'd be shipping your data to the cloud, where it can be exfiltrated.

It would potentially be a great phishing tool: just trick someone into putting sensitive data into a precooked Excel file, and it gets exfiltrated.

[–] elvith@feddit.org 4 points 14 hours ago

Currently it's only for business customers, who probably use OneDrive or SharePoint anyway, so it's not as if they need this to exfiltrate data. But for a phishing/hacking attempt? There are probably some nice possibilities.

[–] jubilationtcornpone@sh.itjust.works 12 points 18 hours ago* (last edited 17 hours ago)

Fair point. Of course that's already a problem with Excel. It would probably have to be disabled by default just like VBA macros.

[–] Godort@lemmy.ca 2 points 18 hours ago

Yeah, no doubt.

Having access to Visual Basic is dangerous enough, let alone Python.

[–] Melvin_Ferd@lemmy.world -1 points 18 hours ago* (last edited 18 hours ago) (3 children)

Yea like what? It's been a big increase in workflow for me.

[–] FauxPseudo@lemmy.world 6 points 16 hours ago (2 children)

Increase in workflow? Like there are more steps to perform the same task? Because workflow isn't work volume or units of output. It's the process that gets the work done.

Did the increase in "workflow" get you more money or more work for the same money?

[–] Melvin_Ferd@lemmy.world 1 points 6 hours ago

Like, I spend less time trying to build formulas, and I can create formulas and tools I normally wouldn't, because I can have a conversation about what I want to do and it provides suggestions.

[–] tja@sh.itjust.works 1 points 15 hours ago

I mean... they responded in agreement to a comment that said it's not an improvement. So it seems to me that it also would not increase the money they get out of it.

[–] ryannathans@aussie.zone 6 points 18 hours ago (1 children)

How did it improve your workflow?

[–] Melvin_Ferd@lemmy.world 1 points 6 hours ago (1 children)

I have a conversation about what I want to do and it provides suggestions, formulas, or tools I wasn't even aware of that had been holding up my productivity.

[–] tux0r@feddit.org 2 points 3 hours ago (1 children)
[–] Melvin_Ferd@lemmy.world 1 points 3 hours ago* (last edited 3 hours ago) (1 children)

You know it's easier, more efficient, and better to have a conversation with an AI than to read an entire manual that won't contain the thing you need, right?

You know, all the things it's advertised to do.

[–] tux0r@feddit.org 2 points 2 hours ago (1 children)

Only to find the "AI" hallucinating functions that won't work. Or won't do the thing you were told they do.

[–] Melvin_Ferd@lemmy.world 1 points 2 hours ago* (last edited 2 hours ago) (1 children)

Most times it works. When it doesn't, it'll still get me 90% of a solution and I can use the manual or other means to finish it. Again a much faster and better approach. I don't need to spend hours of my time reading manuals that barely touch on the knowledge I need just to bash keys and hope for the best. Plus it's conversational so it engages other parts of the brain as a learning tool.

These fucking people use a rubber duck for the same thing

[–] tux0r@feddit.org 2 points 2 hours ago (1 children)

When it doesn’t, it’ll still get me 90% of a solution and I can use the manual or other means to finish it. Again a much faster and better approach.

So you prefer a "90% solution and then read the manual" to "read the manual anyway"?

[–] Melvin_Ferd@lemmy.world 1 points 20 minutes ago (1 children)

Yes. Why would I need to read something that is 99% information that isn't relevant to me or my problem?

[–] tux0r@feddit.org 1 points 11 minutes ago

Because along the way you'll learn the solutions to the other problems you'll have someday.

[–] theunknownmuncher@lemmy.world 6 points 18 hours ago (1 children)

Lol you shared your personal experience and got downvoted... lmao even

[–] Melvin_Ferd@lemmy.world -1 points 6 hours ago* (last edited 6 hours ago) (1 children)

Lemmy is propaganda against AI at this point. Not sure who paid for it, but it has all the markers. It feels like being in the comment section of NY Post articles.

Same energy as talking online about immigrants, nuclear energy, or Marvel.

It's using a community to post toxic and dystopian articles over and over again. Lemmy technology communities are extremely vile. Not sure why it happened, but it's turned toxic.

[–] echodot@feddit.uk 4 points 6 hours ago (1 children)

There isn't propaganda against AI; the backlash is totally grassroots, because companies are overselling it.

[–] Melvin_Ferd@lemmy.world -2 points 6 hours ago* (last edited 5 hours ago) (2 children)

No it isn't. There is 100% propaganda and media targeting communities to spread it.

The gap between people's opinions towards AI in everyday life vs. people on Lemmy is massive, and a good indicator that Lemmy is astroturfed to be toxic towards it. People who are influenced cannot see it; outsiders can, though. It's like seeing right-wingers talk about immigrants. They'll never be able to see how their news and media influence them. That is their truth, and it's as true to them as hatred of AI is to lemmings in places like c/technology.

Look at the articles posted, the headlines, the appeals used, the comments. It has all the markers of an astroturf campaign.

[–] echodot@feddit.uk 2 points 1 hour ago (1 children)

The gap between people's opinions towards AI in everyday life vs. people on Lemmy is massive, and a good indicator that Lemmy is astroturfed

By who? Your conspiracy theory makes no sense. Why would anyone want to do that?

[–] Melvin_Ferd@lemmy.world 1 points 23 minutes ago* (last edited 21 minutes ago)

You really can’t imagine why corporations and political groups who spend billions paying people to manufacture narratives and flood feeds might hate the idea of ordinary people suddenly having their own free, on-demand content factory, fact-checker, and megaphone?

That's on both sides of the political spectrum. These AI tools are not just Google chat. You can build with them rapidly. Is it some revolutionary thing? No.

But can it be a game changer in some areas? Absolutely.

They moved rapidly with the media on this. Compare headlines for AI to any other yellow-journalism topic. They're identical.

[–] lightnsfw@reddthat.com 4 points 5 hours ago (1 children)

Lemmy is pretty consistent with the people I know IRL in terms of opinions on AI.

[–] Melvin_Ferd@lemmy.world -1 points 5 hours ago (3 children)

Not where I am. I haven't met anyone IRL who has any spite towards AI. They think it's interesting and have tried it a few times. But nobody is out there saying fuck AI.

[–] theunknownmuncher@lemmy.world 3 points 4 hours ago* (last edited 3 hours ago)

No, I'd definitely agree that AI sentiment overall is pretty negative. I am not such a hardliner, but they are definitely out there. I don't see it as astroturfing at all; to even suggest this is ironic, because LLMs are the ultimate astroturfing tool. The institutions capable of astroturfing do support AI and are using it. What institution or organization are you accusing of anti-AI astroturfing, exactly? This question requires an answer for that claim to be taken seriously.

IMO the problem is not LLMs themselves, which are very compelling and interesting for strictly language processing and enable software use cases that were almost impossible to implement programmatically before; the problem is how LLMs are being used incorrectly for use cases they are not suited for, due to the massive investment and hype. "We spent all this money on this, so now we have to use it for everything." It's wrong. LLMs are not knowledge stores, they are provably bad at summarization and as a search interface, and they should especially not be used for decision-making in any context. And people are reacting to the way LLMs are being forced into all of these roles.

People also take strong issue with the perceived violation of intellectual property from training on copyrighted material, viewing AI-generated art as derivative and as theft.

Plus, there are very negative consequences to generative AI that aren't yet fully addressed. Environmental impact. Deepfakes. They're a propaganda machine; they can be censored and reflect biases of the institutions that control them. Parasocial relationships, misguided self-validating "therapy". They degrade human creativity and become a crutch. Impacts on education and cheating. Replacement of jobs and easier exploitation of workers. Surveillance.

All of these things are valid and I hear them all from people around me, not just on the internet.

[–] HugeNerd@lemmy.ca 2 points 4 hours ago

I fed AI all my Lemmy posts and asked it for a portrait of the artist. Not bad, down to my 6 fingers.

[–] lightnsfw@reddthat.com 3 points 5 hours ago

That was the initial impression of it. Now that we've had more experience with it and learned that it can't be relied on, perception has changed. It is oversold and the costs are not worth what we are getting out of it.