Yes, better AI will make better guesses about what you mean. But when you’re building serious software, you don’t want guesses—even smart ones. You want to know exactly what you’re building.
Mathematics moved, over centuries, from textual description toward symbolic manipulation precisely because natural language is nuanced and imprecise. Believing that an AI can guess what you want from prose alone, resolving the ambiguities in your own words, is a mistake. If you want to build a plane, you get an aeronautics degree; you don’t go to a GPT and ask for something that floats in the air, even if it correctly guesses that you want a plane and roughly outlines what a plane is.
That is why people call these tools “great for prototyping,” which is a polite way of saying “don’t use this for anything real.”