The industry went to shit after non-nerdy people found out there could be a lot of money in tech. It used to be full of other people like me, and I really liked it. Now it’s full of people who are about as enthused about it as they would be about becoming lawyers or doctors.
The industry went to shit after non-nerdy people found out there could be a lot of money in tech.
I started my undergrad in the early 90’s, and ran into multiple students who had never even used a computer, but had heard from someone that there was a lot of money to be made in computers so they decided to make that their major.
Mind you, those students tended not to do terribly well and often changed major after the first two years — but this phenomenon certainly isn’t anything particularly new. Having been both a student and a University instructor (teaching primarily 3rd and 4th year Comp.Sci subjects) I’ve seen this over and over and over again.
By way of advice to any new or upcoming graduates who may be reading this from an old guy who has been around for a long time, used to be a University instructor, and is currently a development manager for a big software company — if you’re looking to get a leg-up on your competition while you look for work, start or contribute to an Open Source project that you are passionate about. Create software you love purely for the love of creating software.
It’s got my foot in the door for several jobs I’ve had — both directly (i.e.: “we want to use your software and are hiring you to help us integrate it as our expert”; IBM even once offered a re-badged version to their customers) and indirectly (one Director I worked under once told me the reason they hired me was because of my knowledge and passion talking about my OSS project). And now as a manager who has to do hiring myself it’s also something that I look for in candidates (mind you, I also look for people who use Linux at home — we use a LOT of Linux in our cloud environments, and one of my easiest filters is to take out candidates who show no curiosity or interest in software outside whatever came installed on their PC or what they had to work with at school).
That's what happens when everyone rushes to do the same qualification - you get too many people for that area of work. More graduates don't magically make more jobs - they just mean more people applying for the same number of jobs.
I graduated with a degree in Computer Science and Software Engineering from the University of Washington in 2020, during the height of Covid.
After over 3000 handcrafted applications (and many more AI-written ones), I have never been offered a job in the field.
I know of multiple CS graduates who have killed themselves, and so many who are living with their parents and working service/retail.
I think the software engineering rush of the early 2000s will be looked back upon like the San Francisco gold rush in 1949.
...the San Francisco gold rush in 1949.
Classic CS major, making an off-by-one(hundred years) error ;)
What CS subfield? I think it really depends on whether you were able to specialize somewhat. At least systems programming and lower-level coding seem to be somewhat in demand once you get into the field. Even given the current economy, we aren't really getting much interest from students.
I'd be happy to review your resume and code samples and provide feedback if you want.
3000? That’s hyperbole, right?
No, I have a spreadsheet with 3200 lines of submitted applications, which includes both entry level positions and internships. Many with customized cover letters.
When you do the math it's not even a strong pace, only about 3/day over 3 years. On a good day I was submitting 12-15.
I even applied to some famous ones, like the time Microsoft opened up 30 entry level positions and received 100,000 applications in 24 hours. It is rumored that they realized they couldn't process 100k apps, so they threw them all away and hired internally.
Whether they actually threw them out or not, that one always sticks with me. Submitting 100k apps is literally a lifetime of human work. All of that wasted effort is a form of social murder in my opinion.
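Out of curiosity I sanity-checked those numbers; here's the arithmetic as a quick sketch. The 3200-over-3-years figure is from the comment above; the 30-minutes-per-application effort estimate is my own assumption:

```python
# Back-of-the-envelope check on the application numbers above.
applications = 3200
years = 3
print(f"{applications / (years * 365):.1f} applications/day")  # ~2.9, i.e. "about 3/day"

# The Microsoft anecdote: 100,000 applications in 24 hours.
apps = 100_000
hours_each = 0.5                 # assumption: ~30 minutes of effort per application
career_hours = 40 * 50 * 40      # a 40-year career at 50 weeks/yr, 40 hrs/wk
print(f"{apps * hours_each / career_hours:.2f} working lifetimes")  # ~0.63
```

At 45 minutes per application the second figure comes out near a full career, so "a lifetime of human work" is roughly right.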
I have twenty years of experience and it took me 300+ applications to get my current job.
Times are changing.
I was in a similar boat. I graduated right around the housing crash. If my wife hadn't been working at the time, we would have been in a terrible spot. It took a good 6 months to get my first job. After that, I haven't had any issues popping into jobs.
Sounds like you got a raw deal. Our industry has many highs and lows when it comes to jobs and work available.
My buddy graduated and took a gap year. That year happened to be the dot com crash. So he kept backpacking for another year then started looking for work. 😁
For me, even graduating in 2022 with an MSc, 6 months would have been a short time to find a job
2008 was a very difficult job market for sure. Even around 2017 when I graduated it was quite different from now. Entry level positions have evaporated in the last 6-7 years.
be willing to move
you’re offering salt in the middle of the Pacific
I fled the Midwest because there were no good jobs outside of the oil and gas industry, and ended up in the Seattle area. Saving up and moving cost 2 years of my life; I'm not sure I could do it again.
...and I did apply to some jobs on the west coast, although most of my apps were around Seattle.
But please tell me, where should I have gone instead of Seattle?
Honestly Seattle is a pretty good place for tech jobs, it's just that the cost of living isn't much better than California or other big tech hubs.
The major saw an unemployment rate of 6.1 percent, just under the hardest-hit majors like physics and anthropology, which had rates of 7.8 and 9.4 percent respectively.
The numbers aren't too high, although they do show the market is no longer starved for grads.
It's important to understand that this is a standard feature of a capitalist economy, where the market is used to determine how many people are needed in a certain field at a given point in time. There is no overarching plan for how many software engineers will be needed over the long term.

The market has to go through a shortage phase, with all its effects on wages, unemployment, and educational institutions, in order to increase the production of software engineers. Then it has to go through an oversupply phase, with the opposite effects on wages, unemployment, and educational institutions, in order to decrease that production. The people caught in these swings are a necessary part of how the market computes the next state of this part of the economy.

This is how it works: it uses real people and resources to do it. The less planning we do, the more people and resources have to go through the meat grinder in order to decide where the economy goes next. We don't have to do it this way, but that's how it's been decided for a while now.
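To make that shortage/oversupply cycle concrete, here's a toy cobweb-style simulation (my own illustration, not from the comment above): enrollment reacts to today's wages, but graduates only arrive years later, and that lag alone produces the oscillation described:

```python
# Toy model: graduate supply chases wages with a multi-year training lag.
# All numbers are illustrative; the point is the built-in oscillation.
LAG = 4                    # years from enrollment to graduation
DEMAND = 100.0             # jobs opening up each year (held constant)
pipeline = [80.0] * LAG    # start in a shortage: small cohorts already in training

for year in range(16):
    graduates = pipeline.pop(0)     # cohort that enrolled LAG years ago
    wage = DEMAND / graduates       # scarce grads -> high wage; glut -> low wage
    pipeline.append(100.0 * wage)   # enrollment responds to *today's* wage
    print(f"year {year:2d}: graduates {graduates:6.1f}, relative wage {wage:.2f}")
```

Running it, graduating cohorts swing between shortage (80 grads, high wages) and glut (125 grads, low wages) indefinitely; nobody in the loop ever has the forward-looking information, which is exactly the computation-by-meat-grinder described above.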
I was doing my CS degree immediately after the 2008 meltdown. At the time there was a massive oversupply of finance graduates who couldn't find work, and this continued for years. I was always shocked that neither the university nor the government projected these things and adjusted the available program sizes, so that kids and their parents didn't end up spending boatloads of money and years of their lives on degrees sold under false promises of prosperity. I didn't have an answer then, and the people around me couldn't explain it either, but many were asking the same question. I wish someone had understood it then the way I do now.
This should be common knowledge. I recall in the 1990s there was a huge push for truck drivers. Everywhere you went: "Be a truck driver! Own your own business! Make six figures!" And only a decade later, employed drivers were struggling to make ends meet.
If you see a huge push for a particular job - you'd better plan your exit.
One eight hundred, five five one, eight nine hundred. Diesel Driving Academy!
I was always shocked that neither the university nor the government projected these things and adjusted the available program sizes, so that kids and their parents didn't end up spending boatloads of money and years of their lives on degrees sold under false promises of prosperity. I didn't have an answer then, and the people around me couldn't explain it either, but many were asking the same question.
You are looking at Universities[0] all wrong. Predicting the markets is not their job or role in society.
The primary purpose of a University is research. That research output comes from three primary sources: the faculty, graduate students, and undergraduate students. Naturally undergrads don’t tend to come into the University knowing how to do proper research, so there is a teaching component involved to bring them up to the necessary standards so they can contribute to research — but ultimately, that’s what they exist for.
What a University is not is a job training centre. That’s not its purpose, nor should it be. A University education is the gold standard in our society, so many corporations and individuals will either prefer or require University training in exchange for employment — but it’s not the Universities that are enforcing that requirement. That’s all on private enterprise to decide what it wants. All the University ultimately cares about is research output.
Hence, if there is valuable research output to be made (and inputs in the form of grants) in the field of “Philosophy of Digital Thanatology” (yes, I’m making that up!), and they have access to faculty to lead suitable research AND they have students that want to study it, they’ll run it as a programme. It makes no difference whether or not there is any industry demand for “Philosophy of Digital Thanatology” — if it results in grants and attracts researchers and students, a University could decide to offer it as a degree programme.
We have a LOT of degree programmes with more graduates than jobs available. Personally, I’m glad for that. If I have some great interest in a subject, why shouldn’t I be allowed to study it? Why should I be allowed to study it only if there is industry demand for that field? If that were the case, we’d have nearly no English language or Philosophy students — and likely a lot fewer Math and Theoretical Physics students as well. But that’s not the point of a University. It never has been, and it never should be.
I’ve been an undergraduate, a graduate, and a University instructor in Computer Science. I’ve seen some argue in the past that the faculty should teach XYZ because it’s what industry needs at a given moment — but that’s not the University’s purpose or role. If industry needs a specific skill, it either needs to teach it itself, or rely on the more practical community colleges and apprenticeship programmes, which are designed around training for work.
[0] — I’m going to use the Canadian terminology here, which differentiates between “Universities” and “Colleges”: the former are degree-granting centres of research and education, while the latter are often primarily trade and skill focussed and mostly offer diploma programmes. American common parlance tends to throw all of the above into the bucket of “College” in one way or another, which makes differentiating between them more complicated.
What you describe might be true for Canada, but it doesn't apply to all universities. Many universities have two primary tasks: research and education. These are two separate tasks with overlap.
I do find it understandable if publicly funded universities place restrictions on how many students they accept per program, as it's their duty to give back to society.
Speaking for the US, major universities may be there for research, but they are a small portion of the mass of schools across the country.
People have mostly been getting degrees to get a good job since at least shortly after WW2. It’s silly to pretend people are going massively in debt without the expectation of a return on that investment.
Nothing against people learning for the joy of learning, but I absolutely hold schools accountable for not making job prospects clear when most of the students are both young and ignorant of the world.
neither the university nor the government projected these things and adjusted the available program sizes
They kinda do, but only the part where they increase program sizes after demand exists and only wind down when the market is saturated. They can't really work too far ahead if they don't know whether something will be in demand, and they don't want to tell students not to do something they offer just because there are too many graduates. Add the four or five years to graduation and you get a system that lags behind reality even if the planning were better.
But a well designed post secondary education means graduates can go into similar or related fields, they aren't limited to what is on their diploma except in their own minds.
This explains why people gave me a hard time for getting an anthropology degree....
It's like a psych degree. I've heard people complaining in person about their psych degrees; yeah, you aren't going anywhere without a graduate degree for these majors. A PsyD or PhD are the only options for that field. I assume that's what they're saying to you? Anthropology might be more difficult; I assume you're only going to be teaching at a university with a grad degree, but faculty positions are super competitive, especially if it's not a really in-demand degree.
It's finally happening, tech jobs are suffering the same unemployment that the trades had been suffering for years if not decades, only this time around it's probably self-inflicted by the AI bubble.
This isn't the first time this has happened, though.
Where I am, nursing is more popular as a college course than compsci due to its greater practicality.
I started out in compsci, but instead got a job fixing PCs. I also taught myself basic carpentry and plumbing, and I'm looking at raising livestock in the near future.
Nursing is huuuuge. My nurse friend with a doctorate just landed a $250k base job with 10 weeks paid vacation and a slew of other benefits. Wild.
Plumbing is huge too. If I ever need one, they're booked out like 3+ months unless you want to pay an emergency fee which is like double or triple.
I, too, am raising some livestock. We'll see where it goes. But at least to me it feels more connected and real.
In the 1970s companies started "Stack Ranking" all their employees and firing the bottom 10%, either replacing them or simply using their wages to pay CEOs more.
Companies used to provide workers a pay-related sense of justice, a career for life.
Now the media will jump past all this to blame anything but the CEOs and the failure of Government to rein in the wage gap via the force of law.
Companies used to provide workers a pay-related sense of justice, a career for life.
.... There was a period from the 1940s to the 1970s when this was more commonplace. But historically this kind of cut-throat wage squeeze was very normal, particularly in the industrialized American north.
One of the driving forces behind improvements in the American capitalist model, wrt pensions and professional job security and a regulated relationship between business and labor, was European Communism. The allure of the revolutionary communist reconstructions (and less revolutionary socialist rebuilds) drove some significant number of Western professionals into the waiting arms of Papa Stalin and a fair number more into large labor unions and socialist political ideologies.
Without the USSR as foil to the capitalist system, there is less urgency among the capitalist class to negotiate with labor and less optimism among American workers to achieve some kind of superior economic position.
That, combined with an absolute tsunami of corporate propaganda to brainwash civilian workers, a swelling pustule of a police state to cow the lumpen proletariat, and a Global War on Whatever to galvanize young liberals and conservatives alike against the phantom menace of foreign invasion, has supplanted any kind of negotiating between capital owners and their wage cuck workers.
The only thing you have to hope for in the modern day is a big enough 401k such that you can live like a parasite rather than the host.
It's called an oversaturated market. And capitalist fucks replacing people with AI
I don't think this is even the big effect we'll be seeing from AI. I think that'll occur over the next 12-24 months, as LLM operationalization happens and the implementations mature.
This.
At my job, AI is just scratching the surface. But they're slowly implementing entire coding bot swarms, so a Product person can report a bug and it gets reviewed by an agent, assessed by an agent, fixed by an agent, and tested by another agent - then PR'd for a dev to review.
This hurts the junior level.
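For readers who haven't seen one of these setups, here's a minimal sketch of the kind of bug-to-PR agent pipeline being described. The stage names and the `call_llm` helper are hypothetical stand-ins, not any real product's API:

```python
# Hypothetical sketch of a bug-report-to-pull-request agent pipeline.
from dataclasses import dataclass, field

def call_llm(role: str, prompt: str) -> str:
    """Stand-in for an LLM call; a real system would wire up an actual client."""
    return f"[{role} output for: {prompt[:40]}...]"

@dataclass
class Ticket:
    report: str                           # bug report filed by a Product person
    history: list[str] = field(default_factory=list)

def run_pipeline(ticket: Ticket) -> str:
    # One narrow agent per stage, mirroring the review -> assess -> fix -> test
    # flow described above.
    for role, prompt in [
        ("reviewer", f"Is this a real, reproducible bug? {ticket.report}"),
        ("assessor", "Estimate severity and locate the affected module."),
        ("fixer",    "Produce a patch for the located module."),
        ("tester",   "Write and run tests against the proposed patch."),
    ]:
        ticket.history.append(call_llm(role, prompt))
    # The only human appears at the very end, as a PR reviewer.
    return f"PR opened for human review after {len(ticket.history)} agent steps"

print(run_pipeline(Ticket("Login button does nothing on mobile Safari")))
```

The human only shows up at the final gate, which is exactly why it's the junior rungs of the ladder that get squeezed.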
An industry vulnerable to a lack of investor money does badly when there is no investor money.
If anyone is interested in APL programming, send me your resume.
Looking for good software engineers and curious folks.
APL, now that's something I haven't heard about in a while.
Similar issues at work with COBOL. Sure, I know it, but I'm literally working to migrate everything off of it.
Not directly related, but do you use an actual APL keyboard or use something with an APL input method, like emacs?
I actually do have APL-printed keycaps because they’re cool :D
Given enough time most people develop a muscle memory for it. There are different methods of entry: separate layers, backtick input, other macros.
You can try it here https://tryapl.org/
They have some shortcuts in lieu of a full keymap.
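As an illustration of how the backtick-style entry mentioned above works: a prefix key puts the editor into a one-shot mode where the next keystroke maps to a glyph. The glyphs below are real APL symbols, but the key assignments are my assumption rather than Dyalog's exact default map:

```python
# Minimal sketch of a backtick prefix input method for APL glyphs.
# Key assignments are illustrative, not the exact Dyalog map.
GLYPHS = {
    "i": "⍳",  # iota
    "r": "⍴",  # rho
    "a": "⍺",  # alpha
    "w": "⍵",  # omega
    "e": "∊",  # epsilon
}

def translate(keystrokes: str) -> str:
    """After a backtick, look the next key up in the glyph table."""
    out, i = [], 0
    while i < len(keystrokes):
        if keystrokes[i] == "`" and i + 1 < len(keystrokes):
            out.append(GLYPHS.get(keystrokes[i + 1], keystrokes[i + 1]))
            i += 2
        else:
            out.append(keystrokes[i])
            i += 1
    return "".join(out)

print(translate("3 3`r`i9"))   # -> "3 3⍴⍳9", a 3x3 matrix of 1..9 on tryapl.org
```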