Sizecoding productions are an exception, because they tend to use really slow, memory-hungry unpackers (which can take gigabytes of RAM to do their job, even for a 4k intro!), and they tend to omit any optimization technique that would trade extra code for speed (ex: culling, etc...). And flooding memory with millions of generated objects is no problem for these productions, as long as those objects are not stored in the executable.
But in general, larger binaries are slower to load, simply because there are more bytes to copy from disk to RAM. And although it is not always the case (ex: sizecoding), a larger binary usually means a larger memory footprint, resulting in worse CPU cache efficiency and less memory left for other apps and filesystem caches. Large binaries also tend to be a symptom of inefficient code: too many abstraction layers, poor optimization, etc... Another very common reason for bloated executables is that they bundle all their libraries instead of relying on shared libraries, which forces the OS to keep multiple copies of the same library around, possibly including outdated or poorly optimized versions.
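As a minimal sketch of the shared-library point (assuming GCC and glibc on Linux; exact sizes vary by toolchain), compare the same trivial program linked dynamically and statically. The dynamic build maps the C library shared by every other process, while the static build bundles its own copy into the executable:

```c
/* hello.c - used to compare dynamic vs. static linking.
 *
 * Build both variants (hypothetical output names):
 *   gcc hello.c -o hello-dynamic          // libc stays a shared library
 *   gcc -static hello.c -o hello-static   // libc is copied into the binary
 *
 * On a typical glibc system, hello-dynamic is a few kilobytes, while
 * hello-static is hundreds of kilobytes, because the whole C library is
 * bundled in instead of being shared. `ldd hello-dynamic` lists the shared
 * libraries the loader will map; `ldd hello-static` reports the binary is
 * not dynamically linked.
 */
#include <stdio.h>

int main(void)
{
    puts("hello");
    return 0;
}
```

Applications that bundle many libraries pay this cost for each of them, in both disk size and per-process memory.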