You are right man
If you want to know how computers work, do electrical engineering. If you want to know how electricity works, do physics. If you want to know how physics works, do mathematics. If you want to know how mathematics works, too bad; the best you can do is think about the fact that it works, in philosophy.
all roads lead to philosophy
Everything is philosophy until it becomes science. Unless it's anything to do with politics then it just remains philosophy forever.
Science is a subdiscipline of philosophy.
If you want to know how philosophy works, do sociology...
It's kind of like a horseshoe with philosophy and math at the ends.
If you want to no longer want to know how anything works, do biochemistry
Too real
tbf all good programmers are good at math. Not classic arithmetic necessarily, but at the very least applied calculus. It's a crime how many people use a mathematical discipline every day but don't think they're "good at math", because of how laser-focused the world is on algebra, geometry, and trig as being all that "math" is.
Serious question: how does calculus apply to programming? I’ve never understood.
PID control is the classic example, but at a high enough level of abstraction any looping algorithm can be argued to be an implementation of the concepts underpinning calculus. If you're ever doing any statistical analysis or anything in game design having to do with motion, those are both calculus too. Data science is pure calculus, ground up and injected into your eyeballs, and any string manipulation or regex is going to be built on lambda calculus (though a fair argument can be made that literally all computer science is built on lambda calculus, so that might be cheating to include it)
Lambda calculus has no relation to calculus calculus, though.
Data science is pure calculus, ground up and injected into your eyeballs
Lol, I like that. I mean, there's more calculus-y things, but it's kind of unusual in that you can't really interpret the non-calculus aspects of a neural net.
Graphics programming is the most obvious one and it uses calculus plenty, but really any application that can be modeled as a series of discrete changes will most likely be using calculus.
Time series data is the most common form of this, where derivatives are the rate of change from one time step to the next and integrals are summing the changes across a range of time.
But it can even be more abstract than that. For example, there's a recent-ish paper on applying signal processing techniques (which use calculus themselves, btw) to databases for the purposes of achieving efficient incremental view maintenance: https://arxiv.org/abs/2203.16684
The idea is that a database is a sequence of transactions that apply a set of changes to said database. Integrating gets you the current state of the database by applying all of the changes.
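To make the time-series point above concrete, here's a minimal Python sketch (the data and names are made up for illustration): np.diff is the discrete derivative, the step-to-step rate of change, and np.cumsum is the discrete integral, summing those changes back up.

```python
import numpy as np

# Hypothetical daily request counts for some service (made-up numbers).
requests = np.array([100.0, 120.0, 150.0, 140.0, 180.0])

# Discrete "derivative": the change from one time step to the next.
deltas = np.diff(requests)              # [ 20.,  30., -10.,  40.]

# Discrete "integral": summing the changes across a range of time
# recovers the series, up to the starting value.
recovered = requests[0] + np.cumsum(deltas)
assert np.allclose(recovered, requests[1:])
```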
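And a toy version of the database idea, just the intuition, not the actual algorithm from the linked DBSP paper: treat each transaction as a set of row deltas, and "integrate" the stream of deltas to get the current state.

```python
from collections import Counter

# Each transaction is a set of deltas: +1 inserts a row, -1 deletes it.
# The row values here are invented for the example.
transactions = [
    {"alice": +1, "bob": +1},   # insert alice and bob
    {"bob": -1, "carol": +1},   # delete bob, insert carol
    {"alice": -1},              # delete alice
]

def integrate(txns):
    """Apply every delta in order; the running sum is the current table."""
    counts = Counter()
    for txn in txns:
        counts.update(txn)      # Counter.update adds the (possibly negative) counts
    return {row for row, n in counts.items() if n > 0}

print(integrate(transactions))  # {'carol'}
```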
that can't be right. maybe they meant lambda calculus? programmers are definitely good at applied logic, graph theory, certain kinds of discrete math etc. but you're not whipping out integrals to write a backend.
Any function that relies on change over a domain is reliant on concepts that are fundamentally calculus. Control systems, statistical analysis, data science, absolutely everything in networking that doesn't involve calling people on the phone to convince them to give you their password; that is all calculus.
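Since PID control keeps coming up as the classic example, here's a minimal sketch of a discrete PID loop in Python. The gains and the toy "plant" are invented for illustration; the point is that the integral term is literally a running sum of the error and the derivative term is literally its step-to-step rate of change.

```python
def pid_step(error, prev_error, error_sum, dt, kp=1.0, ki=0.1, kd=0.05):
    """One step of a discrete PID controller (gains are arbitrary)."""
    error_sum += error * dt                  # integral: running sum of the error
    derivative = (error - prev_error) / dt   # derivative: rate of change of the error
    output = kp * error + ki * error_sum + kd * derivative
    return output, error_sum

# Toy example: nudge a value toward a setpoint of 10.
value, setpoint, dt = 0.0, 10.0, 0.1
prev_error, error_sum = setpoint - value, 0.0
for _ in range(500):
    error = setpoint - value
    control, error_sum = pid_step(error, prev_error, error_sum, dt)
    value += control * dt                    # made-up plant: the output nudges the value
    prev_error = error

print(f"{value:.2f}")  # settles near the setpoint
```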
"Engineer of Information", please 😎
Depends on the context. When my company proposes me to a client for work, I am; but oddly, during my yearly performance review I am just some schmuck who programs.
looks weird without the cleavage
Had a graduate dev who did not have a fucking clue about anything computer-related. How tf he passed his degree, I have no idea.
Basic programming principles? No clue. Data structures? Nope.
We were once having a discussion about the limitations of transistors and dude's like "what's a transistor?" ~_~#
Tbh, as a dev, knowledge of transistors is about as essential as knowledge of screws is for a car driver.
It's common knowledge and in general maybe a little shameful to not know, but it's really not in any way relevant for the task at hand.
Maybe for dev knowledge, but computer science? The science of computers?
What kind of CS degree did you get where you learned about electrical circuits? The closest I've gotten to hardware is logic circuit diagrams and Verilog.
I mean, I graduated over 20 years ago now, but I had to take a number of EE courses for my CS major. Guess that isn't a thing now, or in a lot of places? Just assumed some level of EE knowledge was required for a CS degree this whole time.
I got my BS in CSci about 15 years ago and it was 100% about programming in Java. We didn't learn a fucking thing about hardware, and my roommate was an EE major and we had none of the same classes except for calculus.
By the time I graduated, Java was basically dead. Thanks, state college.
My CS program had virtually no programming outside a couple of courses where C was used to implement concepts. Had one applications type course where mostly Java was used.
CS is and should be a specialized math curriculum IMO. Teaching specific programming languages is time that would be better spent teaching theory that can't be taught by dev docs or code bootcamps, as exemplified by your anecdote. Unfortunately nowadays people tend to see degrees as glorified job training programs.
Yeah, EE and CS had a lot of crossover where I went. At least in undergrad; grad school saw them diverge a lot more, but they still never fully disentangled, and parts of each were important to both. Hell, we had stuff like A+ labs and shit.
Java isn't dead, though
Well, computer science is not the science of computers, is it? It's about using computers (in the sense of programming them), not about making computers. Making computers is electrical engineering.
We all know how great we IT people are at naming things ;)
Computational theory would be a better name, but it overlaps with a more specific subset of what is normally called CS.
We could also just call it Software Engineering. That's at least the job everyone gets with a Computer Science degree.
My BS in CS took its roots down to the CMOS composition of logic gates and basic EE on the hardware side, and down to deriving numbers and arithmetic from Boolean logic / predicate calculus on the philosophy side. Then it tied those together, like a trunk, through the theoretical underpinnings of computation and problem solving, and branched back out into the various mainstream technologies that derived from all that. It obviously depends on the program at the school of choice, I suppose, and I'm sure it's evolved over the years, but it still seems important to have at least some courses that pull back the wizard's curtain and ensure students really see how it's all just an increasingly elaborate, high-tech version of conceptually simple (in function) machinery carrying out the fundamental building blocks of logic.
Anyway, I'm going to go sniff my own cinnamon roll scented farts while gazing in the mirror, now.
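For anyone who never got the "arithmetic out of Boolean logic" tour described above, here's a toy illustration (not any particular course's material): a half adder and full adder built from nothing but AND, OR, and XOR, chained into a ripple-carry adder that adds two integers one bit at a time.

```python
# Boolean "gates" as plain functions over 0/1 bits.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two bits: sum = a XOR b, carry = a AND b."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry bit."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def ripple_add(x, y, width=8):
    """Add two non-negative ints by chaining full adders, least significant bit first."""
    carry, result = 0, 0
    for i in range(width):
        a, b = (x >> i) & 1, (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(ripple_add(42, 27))  # 69, computed one gate at a time
```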
I've met people like that too.
It's called cheating, lots of people do it.
The most worthless dev I've met was a comp sci graduate who couldn't hold a candle to a guy who did a dev boot camp.
The best dev I've met so far didn't have any credentials whatsoever; the second best did a 2-year associate's.
Third place is a tie between an associate's and a 4-year degree.
I was partnered with that guy for one class in grad school. We were working on a master's degree in software engineering, and the assignment was analysis and changes to an actual code base, and this mofo was asking questions and/or blanking on things like what you mention. I can't remember the specifics but it was some basic building block kind of stuff. Like what's an array, or what's a function, or how do we send another number into this function. I think the neurons storing that info got pruned to save me the frustrating memories.
I just remember my internal emotional reaction. It was sort of "are you fucking kidding me," but not in the sense that somebody blew off the assignment, was rude, or was wrong about some basic fact. I have ADHD, and years ago I went through some pretty bad periods with that and with overall mental and physical health. I know the panic of being asked to turn in an assignment you never knew existed, or being asked about some project at work and having no idea whatsoever how to respond.
This was none of those. This was "holy shit, this guy has never done anything, how did he even end up here?"
I’m something of a scientist myself
I mean, nowadays you need to be very smart and educated to google efficiently and avoid all the AI traps, misinformation, Stack Overflow mods tripping, reading Reddit threads on an issue with half the comments deleted because of the APIcalypse, etc... sooo you could argue that you're somewhat of a scientist yourself.
Had a discussion with my 8yo niece the other day… turned out the lesson was, sometimes it can be worse to know the wrong thing than to know nothing at all.
If a C- is enough to pass Analysis of Algorithms, then a Computer Science degree can make me a Computer Scientist. :P
You need C++ for computer science, though!
I literally have no idea what this picture means, and at this point I'm too afraid to ask.
Be me, a computer scientist who still struggles with XOR.
I have been coding since I was 10 years old. I have a CS degree and have been in professional IT for like 30 years. Started as a developer but I’m primarily hardware and architecture now. I have never ever said I was a computer scientist. That just sounds weird.
IT stooge != science. Sorry, fellas.