Even if you are the biggest critic of AI, it's hard to deny that the frontier models are quite good at the sort of thing you learn in school. Write a binary tree in C? Check. Implement radix sort in Python? Check. An A* implementation? Check.
Once upon a time, I had to struggle through these. My code wouldn't run properly because I forgot to free memory or was off by one in a recursive algorithm. But the struggling is what ultimately helped me actually learn the material [2]. If I could just type "build a hash table in C" and then shuffle a few things around to make it look like my own, I'd never have really understood the underlying work.
At the same time, LLMs, while often useful, still fail quite frequently in real-world work. I'm not trusting Cursor to do a database migration in production unless I myself understand and check each line of code it writes.
Now, as a hiring manager, what am I supposed to do with new grads?
[1] which I think it might be to some extent in some companies, by making existing engineers more productive, but that's a different point
[2] to the inevitable responses that say "well I actually learn things better now because the LLM explains it to me", that's great, but what's relevant here is that a large chunk of people learn by struggling
I’ll be blunt: people don’t want to hire Gen Z because of bad past experiences with that generation.
When 1 in 6 companies are "hesitant to hire Gen Z workers," then yeah, obviously unemployment is going to be higher for them. https://finance.yahoo.com/news/1-6-us-companies-reluctant-10...
That’s just one article, but there are plenty more. Do a basic search for "not hiring Gen Z" or something similar and you’ll find tons of examples. It’s easier for people to believe AI is to blame than to take the answer straight from hiring managers’ mouths.
They don’t want to hire Gen Z because they see them as more hassle than they’re worth. I’m not saying whether that’s true or not, but that’s how a lot of managers and business owners feel right now, and you don’t have to look very hard to figure that out.
I was worried at first, but this is an elite journalism product for elites who are facing economic insecurity, not AI.
And for some it is giving away skills.
a give-and-take tool, on a need-to-know basis
The wave of tech layoffs a few years ago was blamed on AI, but it was so obviously attributable to interest rates and changing tax policies that the idea that the proto-AI tools we had at the time were responsible was laughable.
With this shift we at least have better AI tools to blame (though I'd argue they're still not good enough), but we also have a US President who has straight up said that he wants to cause a global recession and has been doing everything in his power to make that happen.
Given that the impact of someone running a bulldozer through the economy like this is well studied, and that we'd predict exactly what we're seeing here, attributing the damage to AI is silly verging on irresponsible. Place the blame where it belongs!
Even those who do get employed tend to be underemployed with low wages.
The old excuse was 'automation' was killing jobs.
The lesser old excuse was offshoring.
Now it's AI?
How about we stop inventing excuses and look at the root cause of the 'recent grad' factor? Perhaps the problem is requiring university degrees that aren't worth anything for jobs that don't need them.