this post was submitted on 12 Jan 2026
638 points (96.8% liked)

Technology


Across the world, schools are wedging AI between students and their learning materials; in some countries, more than half of all schools have already adopted it (often an "edu" version of a model like ChatGPT or Gemini), usually in the name of preparing kids for the future. Yet no consensus exists on what preparing them for the future actually means when it comes to AI.

Some educators have said that they believe AI is not that different from previous cutting-edge technologies like the personal computer and the smartphone, and that we need to push the "robots in front of the kids so they can learn to dance with them" (paraphrasing Harvard professor Houman Harouni). This framing ignores that AI is by far the most disruptive technology we have yet developed. Any technology whose own experts and developers (including Sam Altman a couple of years ago) warn of the need for serious regulation to avoid potentially catastrophic consequences probably isn't something we should take lightly. In very important ways, AI isn't comparable to the technologies that came before it.

The reasoning we're hearing from educators in favor of AI adoption doesn't offer solid arguments for rushing it broadly into virtually all classrooms rather than, say, offering optional courses in AI for those interested. Nor does it sound like the sort of rigorous academic vetting many of us would expect of the institutions tasked with the important responsibility of educating our kids.

ChatGPT was released roughly three years ago. Anyone who uses AI generally recognizes that its actual usefulness is highly subjective. And as much as it might feel like it's been around for a long time, three years is hardly enough time to have a firm grasp on what something that complex actually means for society or education. It's really a stretch to say it's had enough time to establish its value as an educational tool, even if we had come up with clear and consistent standards for its use, which we haven't. We're still scrambling and debating about how we should be using it in general. We're still in the AI wild west, untamed and largely lawless.

The bottom line is that the benefits of AI to education are anything but proven at this point. The same can be said of the vague notion that every classroom must have it right now to prevent children from falling behind. Falling behind how, exactly? What assumptions are being made here? Are they founded on solid, factual evidence or merely speculation?

The benefits to Big Tech companies like OpenAI and Google, however, seem fairly obvious. They get their products into the hands of customers while they're young, potentially cultivating brand loyalty early. They get a wealth of highly valuable data on them. They may even get to experiment on them, as they have been caught doing before. And they reinforce the corporate narrative behind AI — that it should be everywhere, a part of everything we do.

While some may want to assume that these companies are doing this as some sort of public service, their track record reveals a more consistent pattern of actions focused on market share, commodification, and the bottom line.

Meanwhile, there are documented problems educators are contending with in their classrooms as many children seem to be performing worse and learning less.

The way people of all ages use AI has been shown to encourage "offloading" thinking onto it — which is not far from the opposite of learning. Even before AI, test scores and other measures of student performance were already falling. This seems like a terrible time to risk making our children guinea pigs in a broad experiment with poorly defined goals and unregulated, unproven technologies that, in their current form, may be more of an impediment to learning than an aid.

This approach has the potential to leave children even less prepared for the unique and accelerating challenges our world is presenting us with — challenges that will require the very critical thinking skills currently being eroded (in adults and children alike) by the technologies being pushed as learning tools.

This is one of the many situations happening right now that terrify me when I try to imagine the world we might actually be creating for ourselves and future generations, particularly given my own experiences and what I've heard from others. One quick look at the state of society today will tell you that even we adults are becoming increasingly unable to determine what's real anymore, in large part thanks to the way our technologies are influencing our thinking. Our attention spans are shrinking, and our ability to think critically is deteriorating along with our creativity.

I am personally not against AI. I sometimes use open-source models, and I believe there is a place for it if done correctly and responsibly. But we are not regulating it even remotely adequately. Instead, we're hastily shoving it into every classroom, refrigerator, toaster, and pair of socks in the name of making it all smart, as we ourselves grow ever dumber and less sane in response. Anyone else here worried that we might end up digitally lobotomizing our kids?

[–] SharkStudiosSK@lemmy.draktis.com 12 points 1 week ago (2 children)

This may be an unpopular opinion, but in my class today even the teacher was using AI... to prepare the entire lecture. I believe learning material should be prepared by the teacher, not some AI. Honestly, I see everybody using AI today to make the learning material, and then the students use AI to solve the assignments. The way the world is heading, everybody will just kind of "represent" an AI, not even think for themselves. Sure, use AI to find information quickly or something, but don't depend on it entirely.

[–] user224@lemmy.sdf.org 8 points 1 week ago* (last edited 1 week ago)

I asked a lecturer a question; I think it was about what happens when you bit-shift signed integers.
He asked an LLM and read out the answer.
Similarly, he asked an LLM how to determine the memory size allocated by malloc. It said that was not possible, and that was the answer. But a 2009 answer from Stack Overflow begged to differ.
At least he actually tried it out when I told him.

But at this point I've even had my father send me LLM-written slop that was clearly bullshit (made-up information about a non-existent internal system at our college), which he probably didn't even read, since he copied everything including the "AI answers may be inaccurate" disclaimer.

[–] PierceTheBubble@lemmy.ml 11 points 1 week ago* (last edited 1 week ago) (2 children)

It becomes more apparent to me every day that we might be headed towards a society dynamically managed by digital systems: a "smart society", or rather a Society-as-a-Service. This seems like the logical conclusion if you extend the line from "smart buildings" to "smart cities". With IoT sensors and unified digital platforms, data is continuously gathered on the population, analyzed, and its extractions stored indefinitely (in pseudonymized form) in the many data centers currently being constructed. That data is then used to dynamically adapt the system, replacing the "inefficient" democratic process and public services as a whole. Of course, the open-source (too optimistic?) model used is free of any bias; however, nobody has access to the resources required to verify that claim. But given that big tech has historically never shown any signs of tyranny, a utopian outcome can safely be assumed... Or I might simply be a nut whose brain is making nonsensical connections with no basis in reality.

[–] Meron35@lemmy.world 8 points 1 week ago (1 children)

Nope, this is exactly how surveillance capitalism works

[–] prole@lemmy.blahaj.zone 6 points 1 week ago

So Brave New World, only way stupider. No thanks.

[–] SethTaylor@lemmy.world 11 points 1 week ago* (last edited 1 week ago)

I've never seen anything make more people act stupid faster. It's like they're in some sort of frenzy. It's like a cult.

It's been around three years, yet everyone talks about it like life never has and never will exist without it, and like you're useless to society if you don't use it.

So stupid I don't have a nice, non-rage-inducing way to describe it. People are simply idiots and will fall for any sort of marketing scam

"AI: not even once"

[–] NigelFrobisher@aussie.zone 8 points 1 week ago

At work now we’re having team learning sessions that are just one person doing a change incredibly slowly using AI while everyone else watches, but at least I can keep doing my regular work if it’s a Teams call. It usually takes the AI about 45 minutes to decide what I immediately knew needed doing.

[–] lechekaflan@lemmy.world 7 points 1 week ago (2 children)

Through AI as glorified meme generators, the oligarchies are now steering millions of people to become... cows.

[–] Cryxtalix@programming.dev 6 points 1 week ago (2 children)

I think, therefore I am. If they don't think, I'm not so sure.

AI keeps getting easier and more capable, so there's really no reason to adopt it early for fear of missing out. AI never lets anyone miss out; the end goal is quite literally for it to be usable by babies and animals. Any preparation you do today is preparation you won't need in the near future, as AI strives to take over everything.

Feel free to set AI aside and work on yourself. You won't miss out. AI won't let you miss out.

[–] Disillusionist@piefed.world 6 points 1 week ago

I think you'd probably have to hide out under a rock to miss out on AI at this point. Not sure even that's enough. Good luck finding a regular rock and not a smart one these days.

[–] iagomago@feddit.it 5 points 1 week ago (2 children)

As a teacher in a school that has been quite aggressively pushing AI into our curriculum, I have to turn a blind eye to it when it comes to one simple factor of education as a work environment: bureaucracy. Gemini has so far been a lifesaver for checking the accuracy of forms and for producing standardized, highly readable versions of tests and texts, assessment grids, and all the menial shit we are required to produce (which detracts a substantial amount of time from the core of the job, which would be working with the kids).

[–] UnderpantsWeevil@lemmy.world 8 points 1 week ago* (last edited 1 week ago) (1 children)

I mean, the bitter truth of all this is that the downsizing and resource-ratcheting of public schools created an enormous labor crisis prior to the introduction of AI. Teachers were swamped with prep work for classes, expected to juggle multiple subjects of expertise at once, and made simultaneously educator and disciplinarian for class sizes that kept mushrooming with budget cuts. Students are subject to increasingly draconian punishments that keep them out of class longer, resulting in poorer outcomes in schools with harsher discipline. And schools use influxes of young new teachers to keep wages low, at the expense of experience.

These tools take the pressure off people who have been in a cooker since the Bush 43 administration and the original NCLB school privatization campaign. AI in schools as a tool to bulk process busy work is a symptom of a deeper problem. Kids and teachers coordinating cheating campaigns to meet arbitrary creeping metrics set by conservative bureaucrats are symptoms of a deeper problem. The education system as we know it is shifting towards a much more rigid and ideologically doctrinaire institution, and the endless testing + AI schooling are tools utilized by the state to accomplish the transformation.

Simply saying "No AI in Schools" does nothing to address the massive workload foisted on faculty. It does nothing to address how Teach-The-Test has taken over the educational philosophy of public schooling. And it does nothing to shrink class sizes, retain professional teachers for the length of their careers (rather than firing older teachers to keep salaries low), or maximize student attendance rates - the three most empirically supported ways to improve educational quality.

AI is a crutch for a broken system. Kicking the crutch out doesn't fix the system.

[–] GMac@feddit.org 4 points 1 week ago* (last edited 1 week ago) (2 children)

Sweden has been leading the way in removing screens and digital services from schools. Worth reading this: https://www.afterbabel.com/p/sweden-went-all-in-on-screens-in?publication_id=1221094 Plenty of data referenced for the reasons why...


Previous tech presented information and made it faster and more available; it still only processed information. AI, however, claims to do the creativity and decision-making for you. Once you've done that, you've removed humans from every part of the equation except as passive consumers, unneeded for any production.

How you plan on running an economy based on that structure remains to be seen.
