Photographer1: Sam, could you give us a goofier face?
*click* *click*
Photographer2: Goofier!!
*click* *click* *click* *click*
He looks like someone in a cult. Wide open eyes, thousand yard stare, not mentally in the same universe as the rest of the world.
I have to test it with Copilot for work. So far, in my experience its "enhanced capabilities" mostly involve doing things I didn't ask it to do extremely quickly. For example, it massively fucked up the CSS in an experimental project when I instructed it to extract a React element into its own file.
That's literally all I wanted it to do, yet it took it upon itself to make all sorts of changes to styling for the entire application. I ended up reverting all of its changes and extracting the element myself.
Suffice it to say, I will not be recommending GPT-5 going forward.
That's my problem with "AI" in general. It's seemingly impossible to "engineer" a complete piece of software when using LLMs in any capacity that isn't editing a line or two inside singular functions. Too many times I've asked GPT/Gemini to make a small change to a file and had to revert the request because it'd take it upon itself to re-engineer the architecture of my entire application.
I make it write entire functions for me: one prompt = one small feature, or sometimes one or two functions that are part of a feature, or one refactoring. I make manual edits fast and prompt the next step. It easily does things for me like parsing obscure binary formats, threading a new piece of state through the whole application to the levels it's needed, or doing massive refactorings. Idk why it works so well for me and so badly for other people; maybe it loves me. I only ever used 4.1 and possibly 4o in free mode in Copilot.
It's a lot of people not understanding the kinds of things it can do vs the things it can't do.
It was like when people tried to search early Google with plain-language queries ("What is the best restaurant in town?") and got bad results. The search engine had limited capabilities, and understanding language wasn't one of them.
If you ask an LLM to write a function to print the sum of two numbers, it can do that with a high success rate. If you ask it to create a new operating system, it will produce hilariously bad results.
You can’t blame the user when the marketing claims it’s replacing entire humans.
AI assumes too fucking much. I used it to set up a new 3D printer with Klipper to save some searching.
Half the shit it pulled down was Marlin-oriented, and then it had the gall to blame the config it gave me for it, like I wrote it.
"motherfucker, listen here..."
We moved to M365 and were encouraged to try new elements. I gave Copilot an Excel sheet and told it to add 5% to each percent in column B and not to go over 100%. It spat out jumbled-up data all reading 6000%.
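For contrast, the requested transformation is a one-liner. A minimal sketch, assuming "add 5%" means adding 5 percentage points and using made-up column values:

```python
def bump_percent(values, bump=5.0, cap=100.0):
    """Add `bump` percentage points to each value, never exceeding `cap`."""
    return [min(v + bump, cap) for v in values]

bump_percent([90.0, 97.0, 55.0])  # → [95.0, 100.0, 60.0]
```

In Excel itself it's just `=MIN(B2+5, 100)` filled down the column.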
Is there any picture of the guy without his hand up like that?
He looks like an old Chuck E Cheese animatronic. Like someone powered him down and he returned to default rest/storage mode.
Those are his lying/making-up hand gestures. It's the same thing Trump does with his hands when he's lying or exaggerating; he does the weird accordion hands.
"Just a few more trillion dollars bro, then itll be ready..." Like a junkie.
So like, is this whole AI bubble being funded directly by the fossil fuel industry or something? Because the AI training and the instantaneous global adoption of them is using energy like it's going out of style. Which fossil fuels actually are (going out of style, and being used to power these data centers). Could there be a link? Gotta find a way to burn all the rest of the oil and gas we can get out of the ground before laws make it illegal. Makes sense, in their traditional who gives a fuck about the climate and environment sort of way, doesn't it?
It's like crypto: they wanted to make money off VC funds, and that's probably running dry right now, and the investors are probably going to demand returns very soon. Why do you think the massive layoffs started in 2023?
I mean, AI is using something like 1-2% of humanity's energy use, and that's fucking wild.
My takeaway is we need more clean energy generation. Good thing we've got countries like China leading the way in nuclear and renewables!!
All I know is that I'm getting real tired of this Matrix / Idiocracy Mash-up Movie we're living in.
Do you have a source for that? Because given that a ChatGPT query takes a similar amount of energy to running a hair dryer for a few seconds, I find it hard to believe.
a similar amount of energy to running a hair dryer
We see a lot of those kinds of comparisons. Thing is, you run a hair dryer once per day at most. Or it's compared to a google search, often. Again, most people will do a handful of searches each day. A ChatGPT conversation can be hundreds of messages back and forth. A Claude Code session can go for hours and involve millions of tokens. An individual AI inference might be pretty tame but the quantity of them is another level.
If it was so efficient, then they wouldn't be building Manhattan-sized datacenters.
OK, but running a hairdryer for 5 minutes is well up into the hundreds of queries, which is more than the vast majority of people will use in a week. The post I replied to was talking about it being 1-2% of energy usage, so that includes transport, heating, and heavy industry. It just doesn't pass the smell test to me that something where a week's worth of usage is exceeded by a person drying their hair once is comparable with such vast users of energy.
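The back-of-envelope arithmetic behind that comparison, assuming a ~1800 W hair dryer and the widely circulated rough figure of ~0.3 Wh per query (both are assumptions; per-query estimates vary a lot):

```python
dryer_watts = 1800                 # assumed typical hair dryer power draw
dryer_wh = dryer_watts * 5 / 60    # 5 minutes of drying = 150 Wh
wh_per_query = 0.3                 # rough, commonly cited per-query estimate
queries = dryer_wh / wh_per_query
print(round(queries))              # prints 500
```

So one hair-drying session comes out around five hundred queries under these assumptions, which is the "hundreds of queries" ballpark above.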
All the people here chastising LLMs for resource wastage, I swear to god if you aren't vegan...
I mean, they're both bad.
But also, "Throw that burger in the trash I'm not eating it" and "Uninstall that plugin, I'm not querying it" have about the same impact on your gross carbon emissions.
These are supply-side problems in industries that receive enormous state subsidies. Hell, the single biggest improvement to our agriculture policy came when China stopped importing US pork products. So, uh... once again, thank you China for saving the planet.
Wait so the biggest improvement came when there was a massive decline in demand?
The demand didn't decline. The state imposed a strict, high barrier to trade that prevented it from being fulfilled.
So you did not export that much to China, or was there a big "eat more pork" campaign? Because otherwise, where did the demand come from afterwards?
Animal agriculture has significantly better utility and scaling than LLMs, so it's not hypocritical to be opposed to the latter but not the former.