10-bit bytes would give us 5-bit nibbles. That would be 0-9a-v digits, which seems a bit extreme.
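To make that concrete, here's a quick sketch (plain Python, purely illustrative) of a 10-bit byte written as two 5-bit nibbles using the 0-9a-v digit set:

```python
# Render a 10-bit byte as two 5-bit "nibbles", each one base-32 digit (0-9, a-v).
DIGITS = "0123456789abcdefghijklmnopqrstuv"  # 32 symbols
assert len(DIGITS) == 32

def ten_bit_to_base32(value: int) -> str:
    assert 0 <= value < 1024, "a 10-bit byte holds 0..1023"
    high, low = divmod(value, 32)  # split into two 5-bit halves
    return DIGITS[high] + DIGITS[low]

print(ten_bit_to_base32(1023))  # 'vv' -- the all-ones 10-bit byte
print(ten_bit_to_base32(100))   # '34' -- 3 * 32 + 4
```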
GI made 10-bit ROMs so that you wouldn't waste 37.5% of your ROM space storing those 6 reserved bits for every opcode. Storing your instructions in 10-bit ROM instead of 16-bit ROM meant that if you needed to store 16-bit data in your ROM you would have to store it in two parts. They had a special instruction that would handle that.
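As I understand it (the instruction in question was SDBD on the CP1610), the 16-bit value was rebuilt from the low 8 bits of two consecutive 10-bit words, low byte first; the ordering here is from memory, so treat this as a rough sketch rather than an emulator-accurate model:

```python
# Hypothetical sketch: a 16-bit constant stored across two 10-bit ROM words,
# 8 bits per word (low byte first, as I recall), with the top 2 bits of each
# word unused.

def pack_16bit(value: int) -> tuple[int, int]:
    assert 0 <= value <= 0xFFFF
    return value & 0xFF, (value >> 8) & 0xFF   # (low word, high word)

def unpack_16bit(lo_word: int, hi_word: int) -> int:
    return (lo_word & 0xFF) | ((hi_word & 0xFF) << 8)

lo, hi = pack_16bit(0xBEEF)
print(lo, hi)                      # 239 190
print(hex(unpack_16bit(lo, hi)))   # 0xbeef
```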
The Mattel Intellivision used a CP1610 with 10-bit ROM.
The term Intellivision programmers used for a 10-bit quantity was "decle". Half a decle was a "nickel".
When you think end-to-end about a whole system, do a cost-benefit analysis, and find that skipping some letters helps, why wouldn't you do it?
But I'm guessing you have thought of this? Are you making a different argument? Does it survive contact with system-level thinking under a utilitarian calculus?
Designing good codes for people isn't just about reducing transcription errors in the abstract. It can have real-world impacts on businesses and lives.
Safety engineering is often considered boring until your tax money is on the line or it hits close to home (e.g. your sibling's best friend dies in a transportation-related accident). For example, pointing and calling [1] is a simple habit that increases safety with only a small (even insignificant) time loss.
I started off by saying that 0-9a-v digits were "a bit extreme", which was a pretty blatant euphemism: I think that's a terrible idea.
Visually ambiguous symbols are a well-known problem, and choosing your alphabet carefully to avoid ambiguity is a tried and true way to make that sort of thing less terrible. My point was, rather, that the moment you suggest changing the alphabet you're using to avoid ambiguity should also be the moment you wonder whether using such a large number base is a good idea to begin with.
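For what it's worth, the tried-and-true fix looks something like Crockford's Base32, which simply drops the confusable letters from the alphabet. A minimal sketch of that idea (not tied to anything upthread):

```python
# Base-32 alphabet in the style of Crockford's Base32: I, L, O and U are
# left out so 1/l/I and 0/O can't be confused (U goes to avoid accidental
# profanity). A decoder would map i/l -> 1 and o -> 0, but that's omitted here.
ALPHABET = "0123456789ABCDEFGHJKMNPQRSTVWXYZ"
assert len(ALPHABET) == 32 and not set("ILOU") & set(ALPHABET)

def encode(value: int) -> str:
    """Encode a non-negative integer using the unambiguous alphabet."""
    if value == 0:
        return "0"
    digits = []
    while value:
        value, rem = divmod(value, 32)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))

print(encode(1023))        # 'ZZ'
print(encode(123456789))   # a base-32 rendering with no ambiguous letters
```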
In the context of the original discussion around using larger bytes, the fact that we're even having a discussion about skipping ambiguous symbols is an argument against using 10-bit bytes. The ergonomics of actually writing the damned things are just plain poor. Forget skipping o, O, l and I: 5-bit nibbles are just a bad idea no matter what symbols you use, and that alone is a good enough reason to prefer either 9-bit bytes (three octal digits) or 12-bit bytes (four octal or three hex digits).
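A quick back-of-the-envelope illustration of that ergonomics point: octal digits carry 3 bits and hex digits carry 4, so 9- and 12-bit bytes divide evenly while 10-bit bytes never do.

```python
# Octal digits carry 3 bits, hex digits carry 4. A byte width that divides
# evenly by one of those writes out cleanly; 10 divides by neither.
for width in (9, 10, 12):
    print(f"{width:2d}-bit byte: "
          f"{width / 3:.2f} octal digits, {width / 4:.2f} hex digits")
# Prints 3.00/2.25 for 9 bits, 3.33/2.50 for 10 bits, 4.00/3.00 for 12 bits.
```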
Or addressing 1 TB of memory with four bytes, where each byte is the next unit down: the 1st byte is GB, the 2nd is MB, the 3rd is KB, and the 4th is plain bytes.
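Reading GB/MB/KB as the binary 2^30/2^20/2^10 units, that works out because each 10-bit byte indexes exactly 1024 of the next unit down; a tiny sketch:

```python
# Four 10-bit bytes = 40 bits = 1 TiB of address space, and since 2**10 = 1024
# each byte is a direct index in GiB, MiB, KiB and bytes respectively.
def split_address(addr: int) -> tuple[int, int, int, int]:
    assert 0 <= addr < 2**40, "a 40-bit address space covers 1 TiB"
    gib, rest = divmod(addr, 2**30)
    mib, rest = divmod(rest, 2**20)
    kib, b = divmod(rest, 2**10)
    return gib, mib, kib, b

addr = 3 * 2**30 + 5 * 2**20 + 7 * 2**10 + 9
print(split_address(addr))  # (3, 5, 7, 9): 3rd GiB, 5th MiB, 7th KiB, byte 9
```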
What's the point?