I do wonder in particular about the startup-time / "time to first plot" issue. I last used Julia around 2021 to develop some signal processing code, and restarting the entire application could easily take tens of seconds. Both static precompilation and hot reloading were in early development and did not really work well at the time.

On a 5-year-old i5-8600 with a Samsung PM871b SSD:

  $ time julia -e "exit"
  real    0m0.156s
  user    0m0.096s
  sys     0m0.100s

  $ time julia -e "using Plots"
  real    0m1.219s
  user    0m0.981s
  sys     0m0.408s

  $ time julia -e "using Plots; display(plot(rand(10)))"
  real    0m1.581s
  user    0m1.160s
  sys     0m0.400s

Not a super fair test since everything was already hot in the I/O cache, but it still shows how much things have improved.
That was fixed in 1.9. Indeed, it makes a huge difference now that the first run is quick.
This was absolutely not "fixed" in 1.9; what are you talking about? It was improved in 1.9, but that's it. Startup time is still unacceptably slow - still tens of seconds for large codebases.

Worse, there are still way too many compilation traps. Splat a large collection into a function? The compiler chokes. Accidentally move a value from the value domain to the type domain? You end up with millions of new types, and the compiler chokes. Accidentally pirate a method? Huge latency. Write type-unstable code? Invalidations tank your latency.
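
To make one of those traps concrete, here is a minimal sketch of the value-to-type-domain one (the names `kernel` and `sum_vals` are illustrative, not from any real package): every distinct runtime value that gets lifted into a `Val` type forces a fresh method specialization, so a loop over many values quietly becomes a loop over many compilations.

  # Dispatching on Val{N} moves the integer N into the type domain.
  kernel(::Val{N}) where {N} = N + 1

  function sum_vals(xs)
      s = 0
      for x in xs
          s += kernel(Val(x))   # each new x creates a new Val{x} type to compile for
      end
      return s
  end

  @time sum_vals(1:1_000)   # first run: dominated by ~1000 tiny compilations
  @time sum_vals(1:1_000)   # second run: everything already compiled, fast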

What they did was make the general latency issue concrete by calling it "ttfp" -- time to first plot. Then they optimized that thing, literally the time to get the first plot, through caching and precompilation strategies. What they didn't do was solve the root cause of the latency issue, which is fundamental to the dynamic dispatch strategy that they boast about. So really, they're never going to "fix" it without rethinking the language design.
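
For context on what those caching and precompilation strategies look like from a package author's side: since 1.9, a package can run a representative workload at precompile time and have the resulting native code cached into its package image, so the user's first call skips most of the compilation. A rough sketch using PrecompileTools.jl (the module name `TtfpDemo` and function `render` are hypothetical stand-ins, not anything from Plots.jl's internals):

  module TtfpDemo

  using PrecompileTools   # the mechanism Plots.jl and many other packages use

  # Stand-in for an expensive-to-compile entry point.
  render(xs::Vector{Float64}) = sum(abs2, xs) / length(xs)

  @setup_workload begin
      data = rand(10)          # setup runs at precompile time but isn't cached
      @compile_workload begin
          render(data)         # calls in here are compiled and cached in the pkgimage
      end
  end

  end # module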

The problem comes from Julia trying to be two languages at once -- the dynamic language that's useful for quickly generating plots and prototyping, and the production language that runs the backend server or the HPC simulation on the supercomputer. They've deliberately staked out the middle ground here, which brings the benefit of speed, but the tradeoff is the ttfp latency. You might call it the leak in the multiple-dispatch abstraction: yes, it can feel like magic when it works, but when it doesn't, it manifests as latency spikes and an explosion of complexity.
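
That leak is visible even at the smallest scale (a generic illustration, nothing specific to any package): every new combination of function and argument types pays a compile on first use, which is exactly the "spike" pattern.

  f(x) = x .+ 1

  @time f(rand(3))        # first call: includes JIT compilation for Vector{Float64}
  @time f(rand(3))        # same types: already compiled, runs in microseconds
  @time f(rand(Int, 3))   # new argument type: another compilation spike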

In the end I don't know how big the ttfp issue is for Julia. But they've certainly branded it, and awareness of the problem has spread to people who don't even use the language, which is an issue for community growth. They've also left themselves open to a language coming along that's "Julia but without the ttfp issues".

That's quite the interesting perspective, but I'd say it gives "them" more organization and unified focus than is real. It's an open source language and ecosystem. Folks use it — and gripe about it and contribute to it and improve it — if they like it and find it valuable.

All I can say is that many of "us" live in that tension between high level and low level every day. It's actually going to become more pronounced with `--trim` and the efforts on static compilation in the near term. The fact that Julia can span both is why I'm a part of it.

> what are you talking about?

It's the most annoying thing about HN that people will regularly proclaim something like this as if it's a Nobel-prize-winning discovery when it's actually just an incremental improvement. I have no idea how this works in these people's lives - aren't we all SWEs, where the specifics actually matter? My hypothesis is that these people are just really bad SWEs.

I'm reminded of the story a couple of weeks ago about WebAssembly Python being "fast", when it turned out "fast" meant as slow as Python typically is. https://wasmer.io/posts/python-on-the-edge-powered-by-webass...
On a Mac mini (i.e. fast RAM), time to display:

- Plots.jl, 1.4 seconds (including package loading)

- CairoMakie.jl, 4 seconds (including package loading)

  julia> @time @eval (using Plots; display(plot(rand(3))))
  1.477268 seconds (1.40 M allocations: 89.648 MiB, 2.70% gc time, 7.16% compilation time: 5% of which was recompilation)
