I think it's not so much that ASICs are too expensive; it's that once you've got the resources, it will always be better to switch to an ASIC.
Not a hardware engineer, but it seems obvious to me that any circuit implemented on an FPGA will be physically bigger, with more "wiring" (more resistance, more energy, more heat), than the equivalent ASIC, so the tolerances will need to be larger and the clock speeds lower.
Basically, at scale an ASIC will always win out over an FPGA, unless your application is literally "give the user an FPGA" (which is circular; unless your users are hardware engineers, that can't be a goal).
Profit is a function of scale. FPGAs make sense when volumes are so small that setting up an ASIC production line costs more than buying a handful of FPGAs. Once volumes are large enough that the ASIC is cheaper to produce, you reap the performance improvements as well.
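To make that concrete, here's a back-of-the-envelope break-even sketch; every dollar figure below is hypothetical, purely for illustration:

    # Hypothetical numbers: ASICs trade a large one-time NRE cost
    # (masks, tooling, verification) for a much lower per-unit cost.
    asic_nre  = 2_000_000   # one-time cost, hypothetical
    asic_unit = 5.0         # per-chip cost at volume, hypothetical
    fpga_unit = 80.0        # off-the-shelf FPGA, hypothetical

    # Below this volume the FPGA is cheaper; above it, the ASIC wins
    # on cost too, on top of the performance advantage.
    break_even = asic_nre / (fpga_unit - asic_unit)
    print(f"break-even at ~{break_even:,.0f} units")  # ~26,667 here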
Think of it this way: an FPGA is itself fixed silicon, i.e. an ASIC whose function is "be configurable". If you implemented an FPGA on top of an FPGA, do you think you'd get the same performance as the underlying FPGA? Of course not (assuming you're not cheating with some "identity" mapping). The same applies to any other circuit you'd otherwise build directly as an ASIC.
Each layer of FPGA abstraction incurs a cost: more silicon/circuitry/resistance/heat/energy and lower clock speeds.
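If you take the ~18x gap from [0] below as a rough per-layer cost, and assume (simplistically) that layers compound multiplicatively, the stacking gets ugly fast:

    # Rough per-layer slowdown, taken from the RISC-V comparison in [0];
    # real gaps vary a lot by design, and multiplicative compounding is
    # an assumption here, not a measurement.
    PER_LAYER_SLOWDOWN = 18
    for layers in range(1, 4):
        print(f"{layers} layer(s): ~{PER_LAYER_SLOWDOWN ** layers}x slower than the ASIC")
    # 1 layer(s): ~18x, 2 layer(s): ~324x, 3 layer(s): ~5832x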
I'll admit I'm not familiar with the processing requirements of base stations, but the prospect of mass-produced FPGA baseband hardware still seems dubious to me, and I can't find conclusive evidence of it being used, only suggestions that it might be useful (going back at least 20 years). Feel free to share more info.
[0] ASIC vs FPGA comparison of a RISC-V processor, showing an 18x slowdown (a ~94.4% reduction), apparently consistent with the "general design performance gap": https://iugrc.journals.ekb.eg/article_302717_7bac60ca6ef9fb9...
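For what it's worth, the two figures in [0] are consistent with each other:

    # An 18x slowdown means the FPGA runs at 1/18 of the ASIC's clock,
    # i.e. a (1 - 1/18) reduction.
    slowdown = 18
    print(f"{1 - 1/slowdown:.1%}")  # -> 94.4%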