Obligatory AI post
I have complicated feelings about LLMs. I use ChatGPT to correct my emails, to get clues when debugging, and to look things up. In some ways, it has made my life easier. But at the same time, I do not share the excitement of 90% of LinkedIn, or of the CEOs who claim that junior engineers will be replaced and that LLMs will somehow develop into AGI.
LLMs are overrated
As someone who has actually spent a lot of time failing to get an LLM to do what I want, I am much more willing to share Yann LeCun’s perspective that LLMs are not the final answer to AI. I think that people in positions of power are being incredibly irresponsible by suggesting otherwise, encouraging billions of dollars to be invested in what may well be a dead end.
LLM-generated code must still be debugged and maintained
While LLMs can be used to accelerate the writing of code, they are unlikely ever to be able to debug or maintain it, especially in larger software projects. Worse, it is harder for a developer to debug and understand code that they didn’t write.
LLMs can be harmful to learning
A significant portion of my job is to learn and apply new technologies. If I don’t do this, I really will be left behind. And, quite literally, the only way to learn anything reasonably well is to practice. With generative AI, we now have the option to delegate our output to a machine. What happens if, instead of trying to write Golang - a language I am still unfamiliar with - I let an LLM auto-complete it for me? I would fail to acquire the new language.
LLMs rob software developers of at least two opportunities to learn:
- In the case of LLM auto-complete, they are robbed of the opportunity to form a question.
- In the case of LLM chat bots, they could be robbed of the opportunity to type their own code if they’re not careful.
Why do I think the second point is a problem? I believe that typing your own code helps you learn, in the same way that trying to use new vocabulary helps you improve your natural language ability.
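To make that concrete, here is the kind of throwaway Go snippet I have in mind (a made-up example, not from any real project). The idioms it leans on - explicit error handling, error wrapping, the if-with-assignment form - only become second nature when you type them out yourself instead of accepting a completion.

```go
package main

import (
	"errors"
	"fmt"
	"os"
)

// readConfig reads a file and wraps any error with context.
// Typing this out by hand is how the error-wrapping idiom sticks.
func readConfig(path string) ([]byte, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, fmt.Errorf("reading %s: %w", path, err)
	}
	return data, nil
}

func main() {
	if _, err := readConfig("config.toml"); err != nil {
		// errors.Is sees through the %w wrapping above.
		if errors.Is(err, os.ErrNotExist) {
			fmt.Println("no config found, using defaults")
			return
		}
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```

None of this is hard, and an LLM would happily generate it. But that is exactly the point: the value for me is in the typing, not the output.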
I’m fine with getting “left behind”
There’s no question that LLMs can and do provide some benefit to society. Whether that benefit is a net positive remains, I believe, an open question.
If I get “left behind” by not using LLM coding assistants, then so be it. I’d rather continue to have fun writing my own code, and keep getting better at it. Who knows, maybe that’s the real way to not get left behind in the long run.