Another Quora transplant:
Absolutely not. It will help them a lot, and will fairly likely reduce the number of devs a given company needs (and thus, even more reliably, the number needed overall), but it (or anything like it) cannot totally replace devs.
Why not? Because it’s only a LANGUAGE model, not a KNOWLEDGE model. It’s essentially just a Markov chain on steroids. It strings together words it has seen together in its training data, even if the context of the question, or of prior parts of its answer, makes that new piece of the answer totally wrong. This is so common there’s even a name for it: “hallucinations”. Worse yet, when challenged it will gladly double down on making up plausible-sounding nonsense, “citing” web pages, studies, and so on that don’t say what ChatGPT claims, or don’t even exist. (Well, at least GPT-3 will; I haven’t heard nearly as many such reports about GPT-4.)
That said, devs (and many other people) can use ChatGPT to create a first draft of a function, or even a complete program. (Or, for those other people, a sales pitch, a poem, etc.; and its image-generating cousins might produce artwork that can at least inspire an artist toward a different, specific image.) It will usually even be decently clean and modular code! But for any fairly complex task, it might not do exactly what you asked for, might make calls to functions that do not exist (see the sketch below), etc. A human will need to look it over, test it, and whip it into shape. That still makes the overall process a lot faster, if only because it isn’t limited by how fast our slow little fingers can punch keys.
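To make that concrete, here’s a minimal, made-up sketch (not actual ChatGPT output; the endpoint and function name are invented for illustration) of the kind of draft you might get back and the human pass that follows: the code reads cleanly, but the original version called a `requests` method that doesn’t exist.

```python
import requests

def get_user_name(user_id):
    """Fetch a user's display name from a (hypothetical) JSON API."""
    # ChatGPT-style draft: plausible-looking, but requests has no fetch_json(),
    # so this line would raise AttributeError the first time you ran it.
    # data = requests.fetch_json(f"https://api.example.com/users/{user_id}")

    # Human-reviewed fix: use calls that actually exist (requests.get + .json())
    # and check the HTTP status before trusting the response.
    resp = requests.get(f"https://api.example.com/users/{user_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()["name"]
```

That review-and-test pass is exactly the part the model can’t do for you: someone has to notice the nonexistent call, run the code, and confirm it does what was actually asked.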