The real way, and the way befitting the role of the standards committee, is actually putting effort into standardizing a way to talk to and understand the interfaces and structure of a C++ binary at load time. That's exactly what linking is for. It should be the responsibility of the software using the FFI to move its own code around and adjust it to conform to the information provided by the main program as part of the dynamic linking/loading process... which is already what it's doing. You can mitigate a lot of the edge cases by making any interaction outside of this standard interface undefined behavior.
The canonical way to do your example would be to get the address of std::string::length() and ask how to call it appropriately (how to pass "this", for example).
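A minimal sketch of what such a load-time query could look like. Everything here (cxx_fn_desc, cxx_query_member_function) is hypothetical, made up purely to illustrate asking the loaded module how to call std::string::length() rather than hard-coding a mangled name and a calling convention:

    // Hypothetical sketch only -- no such standard interface exists today.
    #include <cstddef>

    struct cxx_fn_desc {
        void*       address;      // entry point of the member function
        int         abi_tag;      // opaque code describing how `this` is passed
        std::size_t return_size;  // size of the returned value
    };

    // Hypothetically provided by the dynamic loader at load time.
    extern "C" int cxx_query_member_function(void* module,
                                             const char* type_name,
                                             const char* member_name,
                                             cxx_fn_desc* out);

    // Usage sketch: query instead of assume.
    //   cxx_fn_desc desc;
    //   if (cxx_query_member_function(handle, "std::string", "length", &desc) == 0) {
    //       // build the call from `desc` rather than from a fixed ABI
    //   }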
Like, for fuck's sake, we're using red/black trees for hash maps, in std - just because thou shalt not break thy ABI
Are you perhaps thinking of std::unordered_map? That's the hash map of standard C++, and it's the one where people (rightly) complain that it's woefully out of date compared to SOTA hash map implementations. But even there an ABI break wouldn't be enough, because, again, the API guarantees in the Standard (specifically, pointer stability) prevent a truly efficient implementation.
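Concretely, the Standard says rehashing invalidates iterators but not pointers or references to elements, which effectively forces a node-based implementation (each element allocated separately) and rules out the flat, open-addressing designs that make modern hash maps fast. A small self-contained illustration of that guarantee:

    #include <cassert>
    #include <string>
    #include <unordered_map>

    int main() {
        std::unordered_map<int, std::string> m;
        m[1] = "one";

        // Take a pointer to a stored element.
        std::string* p = &m[1];

        // Force growth and rehashing by inserting many more elements.
        for (int i = 2; i < 10000; ++i)
            m[i] = "x";

        // Rehashing invalidates iterators but NOT pointers/references to
        // elements, so `p` must still point at the same string.
        assert(*p == "one");
    }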
https://github.com/boostorg/boost_unordered_benchmarks/tree/... (via a response from the Boost authors in https://www.reddit.com/r/cpp/comments/yikfi4/boost_181_will_...) has more benchmarks depending on your usage pattern.
I don't see the point of standardizing name mangling. Imagine there were a standard: you would then also need to standardize the memory layout of every single class in the standard library. Without that, instead of failing at link time, your hypothetical program would break in ugly ways at run time because, e.g., two functions that call one another have differing opinions about where exactly the length of a std::string lives in memory.
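A hedged illustration of that failure mode, using a made-up string-like class rather than any real standard-library layout:

    #include <cstddef>

    // Hypothetical: two translation units compiled against different
    // "standard" layouts of the same class.

    // TU compiled against layout A:
    struct string_A { char* data; std::size_t size; std::size_t capacity; };

    // TU compiled against layout B (members reordered):
    struct string_B { std::size_t capacity; std::size_t size; char* data; };

    // If one side constructs a string_A and the other reads it through a
    // string_B*, `size` is read from the wrong offset: no linker error,
    // just garbage at run time. Standardized mangling alone can't catch
    // this; the layouts themselves would have to be pinned down too.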