My wife knows how to prompt ChatGPT, but she wouldn't be able to create an app just by putting together what the LLM throws at her. The same could be said of my junior engineer colleague; he knows way more than my wife, sure, but he doesn't know what he doesn't know, and it would take a lot of resources and effort for him to put together a distributed system just by following what an LLM throws at him.
So I see the pool of potential prompters much like the pool of potential software engineers: some are good, some are bad, the good ones will be scarce (as in any other profession), and so wages don't necessarily have to go down.
The size of the pie is nowhere near fixed, IMO. There are many things which would be valuable to program/automate, but are simply unaffordable to address with traditional software engineering at the current cost per unit of functionality.
If AI can create a significant increase in productivity, I can see a path to AI-powered programming being just as valuable as (and a lot less tedious than) today's software engineering.
For a more realistic example: the software side at many companies essentially is the company. It takes products all the way from inception to launch. Yet software engineers tend to get paid less, often much less, than the legal side. The reason is simply that the labor pool for lawyers is much smaller than the one for software engineers.
If there are no significant barriers to entry for prompt engineering, wages will naturally be low.
Software development has a huge barrier to entry, which keeps the labor pool relatively small and wages relatively high. There's going to be a far larger pool of people capable of "prompt engineering," which will send wages proportionally way down.