One of the lead designers on Liquid Glass is Chan Karunamuni, who's been at Apple since the early 2010s. If you search for more of the names of the design presenters at this WWDC, you'll find a lot of people with similarly long tenure.
So the theory that it's all Gen Z designers with no experience or talent seems pretty weak.
So I'm sure there are three Gen Z folks in a trench coat approving the work of those other Gen Z designers.
All this is just delegating to the domain's flavor of "higher powers" instead of trying to grapple with the complexity of reality.
We just have to wait for Gen Alpha to bring back flat design 10 or so years from today.
https://www.hackerneue.com/item?id=44269225
Edit: this appears to be a hot take, so I challenge others to take a step back and consider other protected classes and anti-discrimination laws. They don't call out one race or sex; they say all are protected, and the very act of discriminating is not allowed during hiring. They don't say "you can't discriminate against white people or men, but others are fine". That's exactly what the ADEA does: it only protects workers 40 and older.
Look at John Romero: he knocked it out of the park with Doom 1 and 2 and some of Quake, but all his projects since have been flops of catastrophic proportions. Look at Jony Ive's late design mistakes at Apple compared to his early successes, which were perfection in every aspect.
Most people can't pull off success after success forever; they bottom out at some point and then decline, some sooner than others, especially in a fast-changing field like tech. So there's a high chance those senior higher-ups at Apple are now dated and out of touch, but still have the big egos and influence from a bygone era. It happens at virtually 100% of companies.
I don't think that characterization is quite right either. I'm a big fan of Brian Eno's "scenius" phrasing:
> A few years ago I came up with a new word. I was fed up with the old art-history idea of genius - the notion that gifted individuals turn up out of nowhere and light the way for all the rest of us dummies to follow. I became (and still am) more and more convinced that the important changes in cultural history were actually the product of very large numbers of people and circumstances conspiring to make something new. I call this ‘scenius’ - it means ‘the intelligence and intuition of a whole cultural scene’.
Extremely successful people benefit from the scenius within which they get to operate. But as that context changes and evolves over time, they fail to recreate their earlier wild successes - not because they lost any of their skills (although that can also happen), but because the skills aren't sufficient, and the deep, layered conditions that enabled those wild successes just aren't there anymore.
Look at the Solvay Conference. That's a lot of lightning in a bottle all at once.
Though it's beyond me to articulate it, perhaps that was also cultural.
And the other guys from id haven't exactly recaptured the same magic either. It's a shame they broke up, it turns out that the team was way stronger together than any of them has been on their own.
As a tangent, HR departments are very often affected by this as well. As soon as you have a large enough HR department, they will start generating ideas about how to waste other teams' time. They have to justify their existence by organizing events, trainings, and activities, even if those actively harm the bottom line.
“First, there will be those who are devoted to the goals of the organization. Examples are dedicated classroom teachers in an educational bureaucracy, many of the engineers and launch technicians and scientists at NASA, even some agricultural scientists and advisors in the former Soviet Union collective farming administration.
Secondly, there will be those dedicated to the organization itself. Examples are many of the administrators in the education system, many professors of education, many teachers union officials, much of the NASA headquarters staff, etc.
The Iron Law states that in every case the second group will gain and keep control of the organization. It will write the rules, and control promotions within the organization.”
But there is no limit to how much additional security you can bring, so they bring all of it. We recently had to get a new Tomcat distribution deployed via Chef, our own package of it, of course. It now runs under two unix users, each owning various parts of Tomcat. The main startup config (options.sh) is owned by root, to which we will never ever get access; all changes have to go through a complex approval and build process via Chef. Servers disconnect you after 2-3 minutes of inactivity, so if you deal with even a small cluster you have literally 16 or so PuTTY sessions open, all constantly trying to log you out. And it's similar stuff everywhere: in all the apps, the laptops, the network, etc.
All this means that previously simple debugging becomes a small circus and a fight with the ecosystem. Deliveries take longer; everything takes longer. Nobody relevant dares to speak up (or even understands the situation), lest they be branded a fool who doesn't want the most security for the bank.
I would be mad if this were my company, but I go there to collect paychecks and fund an actual life for me and my family, so I can handle it. For now, at least.
Alternative approach, also from the financial services world: VMs are created with a DSL on top of qemu/firecracker, and containers with Dockerfiles. Cyber are part of an image review group, alongside other engineers, that validates the base images.
But: no interactive access to any of these VMs at all. There are hypervisors running on bare metal, but the SRE teams have that scripted well enough that a physical server can be added in a day or so. It does mean you have to be serious about logging, monitoring, and health.
This is one instance where we got it right (I think). We do have some legacy servers we’re trying to get rid of. But we’ve learnt we can run even complex vendor apps this way.
Conway’s Law comes to bite us in other ways though! Like I said, it’s a bear.
The poster above is right that if you create a design team, they will want to justify their existence; but it's the controls above and around them that are responsible for keeping them in check.
People will cling to those senior leaders and make themselves visible and important so they're kept around, validated, and enabled.
I remember a time when Microsoft came around the corner with flat design on their phones and the iPhone all of a sudden looked outdated. Apple adopted a flat look shortly after, and they did it pretty well.
Thirdly, and most important: no one does Gaussian blurs and macro and micro transitions better than Apple, and it's a key part of their success. They are taking it one step further now. Even if it doesn't improve the experience for users, it could help them distinguish themselves visually. And there is nothing wrong with that.
I think a lot of folks here would say that there is something wrong with degrading the user experience to achieve a win for branding.
There were parts of Vista that were mostly glass and they still looked fine. The widget picker comes to mind: https://istartedsomething.com/wp-content/uploads/2006/09/gad...
What Apple demonstrated in their first OS demo is not yet finished, and I'm sure they'll add some more frosted glass effects for legibility and such. What they show off in the video looks fine to me, and the explanation that comes with the visuals shows that, at least from a designer's point of view, all of the weird stuff that jumps out in the macOS demo was violating the design principles.
I loved Aero, and once Apple adds the diffuse glass in the places it needs to for legibility, I'm sure this will look great too.
Apple are much further behind with Siri than they realise.
I think Apple realises it way better than you're giving them credit for. They simply haven't been able to do anything about it yet, even though they're clearly trying.
Maybe they just made a bad UI/UX change.
With how badly Apple's VR headset actually sold, I don't think they're going to go for a unified AR-first approach just yet. Then again, Apple did think their VR headset was a good idea, so maybe they're just high on their own supply.
Can we stop blaming Gen Z for everything? This happens with every generation.
Is that because it's a public company, or because Tim Cook is a bottom-line finance guy?
"they can't pay as much."
Why not? I thought Apple had enormous cash reserves.
Does the A-squad include Steve Jobs, who seemed to have been a fan of skeuomorphism:
* https://en.wikipedia.org/wiki/Skeuomorph#Virtual_examples
Does the A-squad include Jony Ive, who gave us butterfly keyboards and the Touch Bar (the initial revision of which, IIRC, did not have a separate physical key for ESC)? Though Ive did get rid of skeuomorphism.
By replacing skeuomorphism with minimalism, Ive's anti-skeu was a cure nearly as bad as the disease. They were right to move away from skeuomorphism, but they did so recklessly, giving us a UX where almost all cues that an element is "clickable" were stripped away.
Ive hasn't done a single impressive thing after Jobs' departure. To the extent that Ive did anything noteworthy, it was with Jobs as visionary, product director and tastemaker. Outside of that relationship, his work has been derivative of prior Apple design success, or embarrassingly wrong-footed. Factoring in the lag time of product cycles, it's astonishing how rapidly Apple improved after Ive's departure.
> it's astonishing how rapidly Apple improved after Ive's departure
Is there another Apple? What improvements are you talking about, let alone astonishing ones?
Not exactly improvements in the traditional sense. More likely cleaning up an intentional mess.
Perhaps people can argue with me: I claim skeuomorphism jumped the shark with the pseudo reel-to-reel playback UI in ... was it the Podcasts app? Or maybe people think it was Notes with the torn edge along the top margin.
Regardless, skeuomorphism seems to have gone too far at some point. Perhaps it became overly cute, overly precious, pretty-pretty.
Skeuomorphism was said to be the thing in early GUI computers, the metaphor of real objects, that helped users new to those interfaces understand them. Dragging a file icon that looked like a dog-eared piece of paper to a trash can icon on the screen (to delete the file) is the most obvious example.
I suspect that by the time the Web came around, users had to get comfortable with being bombarded by all manner of wild UI paradigms, and they learned to more or less cope. Skeuomorphism, like training wheels, was perhaps not really needed as much as it had been a decade earlier.
It has been a downward slope ever since the momentum dissipated after his death.
Turns out, I didn’t like the operating system Apple made. I liked the OS Apple made while being curated and directed by Steve Jobs. His taste matched mine in a lot of important ways.
I have no tastes in common with Alan Dye.
When I read "liquid glass" and saw a thumbnail of it I thought I was going to be impressed. Well, no.
Also that Finder screenshot is hilarious, I'm not even sure it's real.
The A-squad design team left Apple 15 years ago.
The B-squad left 5 years ago.
What remains is a sea of Gen Z designers who weren't yet alive when the foggy glass of Windows Vista seemed like a good idea.
Meanwhile, the talent wars are raging, with every AI company offering 7-figure salaries to the best of Apple's prodigies.
Apple is now the old guard. They're no longer cool, and as a public company their cost controls are too stringent; they can't pay as much. What is Apple to do?
They can give the designers a sense of ownership. It's not a question of how (un)qualified the team is; it's a retention play.
Is the design good? The A and B squads would say no. But this is the best Apple can do these days to keep critical talent engaged.
They'll burn a cycle re-learning fundamental lessons in accessibility, retain talent, and cling to the hope that next year they'll have a midwit Siri that can book a flight with a decent-looking UI.