- cross-posted to:
- aicompanions@lemmy.world
Many Gen Z employees say ChatGPT is giving better career advice than their bosses::Nearly half of Gen Z workers say they get better job advice from ChatGPT than their managers, according to a recent survey.
Asking ChatGPT for advice about anything is generally a bad idea, even though it might feel like a good idea at the time. ChatGPT responds with what it thinks you want to hear, just phrased in a way that sounds like actual advice. And since ChatGPT only knows as much as you're willing to tell it, its input is often biased from the start. It's like an r/relationshipadvice or r/AITA thread, but on steroids.
You think it’s good advice because it’s what you wanted to do to begin with, and it’s phrased in a way that makes your decision seem like the wise choice. Really, though, sometimes you just need to hear the ugly truth that you’re making a bad choice, and that’s not something that ChatGPT is able to do.
Anyway, I'm not saying that bosses are good at giving advice, but I definitely don't think ChatGPT is better at it than they are.
I'm not touting the merits of "prompt engineering," but this is a classic case.
Don't ask "how can I be a more attractive employee"; ask "I am a manager at a <thing> company. Describe features and actions of a better candidate/employee."
You will get very different answers.
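If it helps, the reframing above can be sketched as the message payload most chat-style LLM APIs expect. The helper function and the "software company" detail here are just illustrative assumptions, not anything from this thread; the only real point is that the role framing is the sole difference between the two prompts:

```python
def build_messages(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format typical LLM APIs expect."""
    return [{"role": "user", "content": prompt}]

# Framing 1: first-person, tends to invite flattering agreement.
naive = build_messages("How can I be a more attractive employee?")

# Framing 2: a manager's perspective, tends to invite concrete criteria
# instead of reassurance.
reframed = build_messages(
    "I am a manager at a software company. "
    "Describe features and actions of a better candidate/employee."
)
```

Same underlying question, but the second framing asks the model to describe a standard rather than to evaluate you, which sidesteps some of its urge to agree.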
Well, I don't have any experience asking it for career advice, but I have worked with it quite a bit, and it's quite shitty once you get to anything resembling complexity. This is definitely not a tool I'd go to for advice beyond the simplest questions.
Surprisingly, I've had the opposite experience: it has increased my productivity tenfold and has helped with code review, confirming various logic, etc. That said, I wouldn't take what it tells me as gospel when it comes to my career as a whole. I've caught it being wrong numerous times, but the inaccuracies pale in comparison to what it gets right, imo.
Don't get me wrong, it's saved me a ton of time. Just recently I needed some coding help that would probably have taken me hours of searching. That doesn't mean I'd trust it with advice; that's something entirely different from spitting out code that works half the time.
When I tried to use it for code-related questions, it straight up made things up that didn’t exist.
Same here. The most I get out of it might be a pointer to a module that could be a better approach, but the code I get from ChatGPT is usually worthless.
I treat it as my water cooler talk, and maybe I’ll come away with a few new ideas.
Like rubber ducking?
Exactly!