As a fun exercise, replace "AI" with "junior" and "junior" with "mid-level." It holds up pretty well: as a manager you have responsibility for the work your team does, and "make everyone put in more hours for no reason" is dumb. Maybe it comes across as a bit neglectful of the "juniors" (in particular, it doesn't show any desire to figure out ways for AI/"the juniors" to grow their responsibilities in a sustainable way).
Imagine reading that version as someone who doesn't know how big companies work. "But then they'll just fire all the mid-level managers, since they don't do any of the actual work!" Haha, boy would you be wrong.
> You must understand what your AI generated code does
Absolutely yes.
> You must be able to do your job if your AI tooling disappears
Absolutely not.
Look, I'm an alright programmer. Not good, far from great. Interpreted languages work for me; add all that strong typing and compilation and it starts to go beyond what I'm interested in. Nonetheless, pre-AI, I have shipped many very functional, production-grade applications for many companies.
Now, I can write stuff in Go, and Rust, and it's fantastic. So much faster. The AI likes the strong typing, the testability, the predictability; it all makes total sense. I'm using this stuff all the time, but I have not learned any Go; I'm too busy focusing on the parts the AI cannot do for me, like real requirements gathering, architecture, fit and finish, engaging stakeholders, etc. that still require the human touch. Maybe I could have learned some Go with that time, but at the end of the day my employer is paying me for results, not for my edification!
There are now huge parts of my job I cannot do without AI. Sure, it's like 800-1200 bucks a month of extra cost; OK; but for that extra low-5-figs a year I am a much better employee in terms of my capabilities. It's easily delivering ROI for me, and therefore for my employer. Instead of sitting around wishing I had a Go developer to ask for help implementing a simple feature in a Terraform provider, I can just fork it and add what I need, try to submit it upstream for inclusion, etc., and the lack of language-specific skills is no longer holding me back.
Take away the tool and I can't do that part of the job anymore, sorry. I can still do a lot, but slower, and honestly it would feel like going from a car back to walking, now; walking's fun, I do it recreationally for the sheer joy, but when there's hundreds of kilometres to cover in a short amount of time, the car is clearly the correct choice. So too is it with AI: we've invented the car for computers and only a fool would pretend he can do everything the same without it now.
'If you can't build a TODO list app using only punchcards, then you can't do your job...'
Obviously our ambitions expand due to better tools. I now commit to and deliver much more work than before LLMs, and, before then, ditto for frontend frameworks, fourth-generation languages, etc.
There are projects I now start without thinking twice that I never would have considered a few years ago.
That's what productivity looks like, and it makes you more valuable, and your job more secure (up until the ASI kills us all...).
A better analogy would be "the trebuchet for computers".
"but when there's hundreds of kilometres to cover in a short amount of time, the trebuchet is clearly the correct choice."
you point it in the rough direction and distance you want to go, pull the lever, see if you hit your mark, adjust, pull the lever again, etc.
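The fire-observe-adjust loop above is a serviceable description of how people actually steer these tools. A toy sketch of that calibration loop in Python (the "physics" and all the constants here are invented for illustration):

```python
def launch(counterweight: float) -> float:
    """Pretend trebuchet physics: range scales with the counterweight (made up)."""
    return 3.2 * counterweight


def calibrate(target: float, counterweight: float = 10.0) -> float:
    """Fire, measure the miss, nudge the variable, pull the lever again."""
    for _ in range(50):
        distance = launch(counterweight)
        error = target - distance
        if abs(error) < 0.5:  # close enough: the mark is hit
            return counterweight
        counterweight += error * 0.1  # adjust toward the target and retry
    return counterweight
```

Under these made-up numbers the loop converges on a counterweight that lands the rock within half a metre of the target; the point is only that each shot is cheap and the feedback is immediate, which is the whole appeal of the workflow.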
And once you have dialed in the variables for that particular piece of rock that one time, you write it down in a "skill.md" file and announce to everyone on the team "this trebuchet has been carefully calibrated. Trust it with your other rocks too."
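For anyone who hasn't run into one: a skill file in agent tooling is typically just a markdown file with a short frontmatter that the agent reads before acting. A hypothetical sketch in the spirit of the joke (all names and contents invented):

```markdown
---
name: trebuchet-calibration
description: How to aim the team trebuchet. Use when launching rocks.
---

# Trebuchet calibration

1. Point it in the rough direction and distance you want.
2. Pull the lever and note where the rock lands.
3. Adjust the counterweight proportionally to the miss.
4. Repeat until within half a metre, then trust it with other rocks.
```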
> only a fool would pretend he can do everything the same without it now
Unless you're working in a coding sweatshop, I don't see why you need AI to do what people have been doing for decades just fine without breaking a sweat.
Academia is the place with the least coherent policy. In the few institutions whose AI rules I'm aware of, the guidance is usually three lines long and amounts to "we don't promote usage of it," which is a meaningless phrase. So you end up with students who are not supposed to use it, unless they are international master's students who need it because of language barriers, in which case the policy is basically to let them use it however they like, even if that makes a mockery of the rigour of the degree. Lecturers can use it as and when they wish; researchers either use it endlessly or not at all; and upper management uses it instead of using their own brains.
For another type of incoherent policy: don't restrict your employees to 2025 models and then accuse them of being sticks in the mud when they say the models are inadequate.
Spoiler alert: if you can't do the job, you're not going to be doing the job much longer.
What are you working on?
You are obviously unaware of what the Silicon Valley companies are asking for and committing to.
https://dora.dev/capabilities/clear-and-communicated-ai-stan...