- This is on the cluster level, while the article is talking about the database level, I believe.
- I mean that when a computer can visually understand a document and reformat and reinterpret it in any imaginable way, who cares how it’s stored? When a png or a pdf or a markdown doc can all be read and reinterpreted into an infographic or a database or an audiobook or an interactive infographic, the original format won’t matter.
- Seeing the Gemini 3 capabilities, I can imagine a near future where file formats are effectively irrelevant.
- This made me do a double-take. Surely you would never do this, right? It seems to be directly counter to the idea of being able to audit changes:
“Event replay: if we want to adjust a past event, for example because it was incorrect, we can just do that and rebuild the app state.”
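The mechanism the quoted line describes can be sketched in a few lines of Python (the event shapes here are invented for illustration): app state is a pure fold over the event log, so editing a past event and replaying from scratch yields a different state, which is exactly what makes it hard to audit.

```python
# Toy event-sourcing sketch (event names/shapes are made up):
# state is rebuilt by folding over the full event log.

def apply(state: int, event: dict) -> int:
    """Apply one event to the running state."""
    if event["type"] == "deposit":
        return state + event["amount"]
    if event["type"] == "withdraw":
        return state - event["amount"]
    return state

def rebuild(events: list) -> int:
    """Replay the whole log from an empty state."""
    state = 0
    for e in events:
        state = apply(state, e)
    return state

events = [
    {"type": "deposit", "amount": 100},
    {"type": "withdraw", "amount": 30},
]
print(rebuild(events))  # 70

# "Adjusting a past event" silently rewrites history:
events[1] = {"type": "withdraw", "amount": 20}
print(rebuild(events))  # 80 -- no trace of the old value remains
```

An append-only correction event (e.g. a compensating "adjustment" entry) would preserve the audit trail instead.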
- Can you write a prompt to optimize prompts?
Seems like an LLM should be able to judge a prompt, and collaboratively work with the user to improve it if necessary.
- This, but for many things.
Paint is ready at the hardware store. Table is ready at the restaurant. Construction is done on a bridge.
All kinds of things that we need a one-time notification for.
- Looking at the trees in the background of the first photo, it’s clear he’s using a longer focal length on the non-iPhone.
He has some good points, maybe, but in general it’s a pretty naive comparison.
- I don’t think control center actually uses the liquid glass elements. They don’t respond to accessibility options like reduce transparency, for one thing.
- Fine-grained reactivity (i.e., Knockout) was a thing well before React. If anything, React was a response to the deficiencies of fine-grained reactivity.
- I’ve done the same with SQLAlchemy in Python and SQLKata in C#.
Sadly the whole idea of composable query builders seems to have fallen out of fashion.
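The idea behind those libraries can be shown with a dependency-free sketch in plain Python (this is not SQLAlchemy's or SQLKata's actual API, just the composition pattern they enable): each helper takes a query and returns a refined one, so reusable fragments stack without knowing about each other.

```python
# Minimal composable query builder sketch (names invented for illustration).
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Query:
    table: str
    wheres: tuple = ()
    order_by: str = ""

    def where(self, cond: str) -> "Query":
        # Immutable: each refinement returns a new Query.
        return replace(self, wheres=self.wheres + (cond,))

    def ordered(self, col: str) -> "Query":
        return replace(self, order_by=col)

    def sql(self) -> str:
        parts = [f"SELECT * FROM {self.table}"]
        if self.wheres:
            parts.append("WHERE " + " AND ".join(self.wheres))
        if self.order_by:
            parts.append(f"ORDER BY {self.order_by}")
        return " ".join(parts)

# Reusable fragments compose freely.
def only_active(q: Query) -> Query:
    return q.where("active = 1")

def recent_first(q: Query) -> Query:
    return q.ordered("created_at DESC")

q = recent_first(only_active(Query("users")))
print(q.sql())  # SELECT * FROM users WHERE active = 1 ORDER BY created_at DESC
```

Because every piece is just a function from Query to Query, filters can be shared across endpoints and combined per request, which is the composability that seems to have fallen out of fashion.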
- I had the same thought, but it sounds like this operates at a much lower level than that kind of thing:
> Then, a physics-based neural network was used to process the images captured by the meta-optics camera. Because the neural network was trained on metasurface physics, it can remove aberrations produced by the camera.
- More than any other Nikon flagship. Title had me thinking they outsold Canon and Sony.
- Software calendars are so poor compared to this. There’s no concept of importance, of impact, of life. Just times and titles, every one equivalent. Digital calendars have hardly evolved since the Palm Pilot.
- Imagine if apps just… worked like this, somehow. Start off with a realtime visualization and point and click commands, and as you learn them you can evolve into a straight CLI…
- Explaining that semi-obscure reference: Barry Lyndon is a Kubrick film that famously has shots lit by only candlelight. This was accomplished by using extremely “fast” lenses created by NASA for (I believe) the Apollo missions.
- Absolutely it is. But the tool itself is a medium for that communication. What is Figma for, if not communicating design decisions?
- The problem with these kinds of pixel-perfect, inspectable design tools is that there's no distinction between important details and unimportant details.
For example, if our app uses a letter-spacing of 1.2 for all the body text, and your Figma design uses a letter-spacing of 1.25, is that important? Or is that a mistake?
In something like Figma, being consistent is difficult for designers. But in code, being consistent is the default — exceptions are hard for developers!
There's a fundamental mismatch that just ends up being painful all around.
"The map is not the territory." Trying to get a design doc to 100% accuracy is often a waste of time. Design tools need a way to specify which details are important, and which are not.
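One way to picture that last point, as a hedged sketch (all property names and the spec shape are invented): let the design spec mark each property as intentional or incidental, and have the checker flag only intentional mismatches.

```python
# Sketch: a design spec that distinguishes deliberate values from
# incidental ones, so only meaningful deviations are flagged.

spec = {
    "body.letter_spacing": {"value": 1.2, "intentional": True},
    "body.line_height":    {"value": 1.5, "intentional": False},
}

def check(implementation: dict, design_spec: dict) -> list:
    """Return mismatches only for properties the designer marked intentional."""
    issues = []
    for prop, rule in design_spec.items():
        actual = implementation.get(prop)
        if rule["intentional"] and actual != rule["value"]:
            issues.append(f"{prop}: expected {rule['value']}, got {actual}")
    return issues

impl = {"body.letter_spacing": 1.25, "body.line_height": 1.4}
print(check(impl, spec))  # only the letter-spacing mismatch is reported
```

The 1.25 letter-spacing is reported because it was declared intentional; the line-height drift is ignored, which is the "map vs. territory" distinction the tools currently lack.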
- Would this be a usable alternative to Plex’s Watch Together? Could I host a live stream of a movie and have friends watch it in sync?