
SatvikBeri
Joined 4,145 karma
satvik.beri@gmail.com

  1. I handle our company's RDS instances, and over the last 8 years I've probably spent closer to 2 hours a year than 2 hours a month on them.

    It's definitely expensive, but it's not time-consuming.

  2. That's why I give the LLM a read-only connection.
  3. This is pretty much it – when we had follow-up interviews with the C++ devs, they had usually only had time to try one or two high-level approaches and then do a bit of profiling & iteration. The Julia devs had time to try several approaches and do much more detailed profiling.
  4. The closest publicly available problem I can think of is the One Billion Row Challenge. It's got a bigger dataset but somewhat simpler statistics – though the core engineering challenges are very similar.

    https://github.com/gunnarmorling/1brc

  5. I posted my CLAUDE.md here; there's really not much to it: https://www.hackerneue.com/item?id=46262674
  6. Are you aware that Julia is a compiled language with a heavy focus on performance? It is not in the same category as NumPy/MATLAB/R
  7. To be clear, the fastest theoretically possible C++ is probably faster than the fastest theoretically possible Julia. But the fastest C++ that Senior Data Engineer candidates would write in ~2 hours was slower than the fastest Julia (though still pretty fast! The benchmark for this problem was 10ms; the fastest C++ answer was 3ms, and the top two Julia answers were 2.3ms and 0.21ms).

    The pipeline was pretty heavily focused on mathematical calculations – something like, given a large set of trading signals, calculate a bunch of stats for those signals. All the best Julia and C++ answers used SIMD.
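
    For a sense of what that looks like, here is a hypothetical Julia kernel in the spirit of those answers – the function name and the specific statistics are illustrative, not the actual interview task. It shows the kind of tight single-precision loop that vectorizes well:

      # Hypothetical sketch: accumulate sum and sum-of-squares over a
      # signal vector in one SIMD-friendly pass, then derive mean and std.
      function signal_stats(signals::Vector{Float32})::Tuple{Float32,Float32}
          s = 0.0f0
          s2 = 0.0f0
          @inbounds @simd for x in signals
              s += x
              s2 += x * x
          end
          n = Float32(length(signals))
          mean = s / n
          var = s2 / n - mean * mean
          return (mean, sqrt(max(var, 0.0f0)))
      end

    On hardware with wide vector units, Julia's compiler will typically turn a loop like this into packed single-precision adds and multiplies, which is where millisecond-scale timings on large inputs come from.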

  8. > Python is sometimes slower (hot loops), but for that you have Numba

    This is a huge understatement. At the hedge fund I work at, I learned Julia by porting a heavily optimized Python pipeline. Hundreds of hours had gone into the Python version – it was essentially entirely glue code over C.

    In about two weeks of learning Julia, I ported the pipeline and got it 14x faster. This was worth multiple senior FTE salaries. With the same amount of effort, my coworkers – who are much better engineers than I am – had not managed to get any significant part of the pipeline onto Numba.

    > And if something is truly performance critical, it should be written or rewritten in C++ anyway.

    Part of our interview process is a take-home where we ask candidates to build the fastest version of a pipeline they possibly can. People usually use C++ or Julia. All of the fastest answers are in Julia.

  9. I'm pretty enthusiastic about LLMs and use them on my 8-year-old codebase with ~500kloc. I work at a hedge fund and can trace most of my work to dollars.
  10. For what it's worth, I've been using a fairly minimal setup (24 lines of CLAUDE.md, no MCPs, skills, or custom slash commands) since 3.7 and I've only noticed Claude Code getting significantly better on each model release.
  11. Mine is 24 lines long. It has a handful of stuff, but does refer to other MD files for more specifics when needed (like an early version of skills).

    This is the meat of it:

      ## Code Style (See JULIA_STYLE.md for details)
      - Always use explicit `return` statements
      - Use Float32 for all numeric computations
      - Annotate function return types with `::`
      - All `using` statements go in Main.jl only
      - Use `error()` not empty returns on failure
      - Functions >20 lines need docstrings
    
      ## Do's and Don'ts
      - Check for existing implementations first
      - Prefer editing existing files
      - Don't add comments unless requested
      - Don't add imports outside Main.jl
      - Don't create documentation unless requested
    
    Since Opus 4.0 this has been enough to get it to write code that generally follows our style, even in Julia, which is a fairly niche language.
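
    For illustration, a function following those rules – a made-up example, not from the codebase – would look like:

      # Hypothetical example of the style above: explicit `return`,
      # Float32 literals, an annotated return type, and `error()` on
      # bad input. Any `using` statements would live in Main.jl.
      function clamp_signal(x::Float32, limit::Float32)::Float32
          limit > 0.0f0 || error("limit must be positive")
          return min(max(x, -limit), limit)
      end
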
  12. Maybe you just don't read many books written after the year 2000? It was a pretty common word even before ChatGPT: https://books.google.com/ngrams/graph?content=delve&year_sta...

    But perhaps the most famous source is Tolkien: "The Dwarves tell no tale; but even as mithril was the foundation of their wealth, so also it was their destruction: they delved too greedily and too deep, and disturbed that from which they fled, Durin's Bane."

  13. > How did I do this? Well I jumped online, using a mix of my early life experience coding emulators and hacking and looked into the x86(-64) architecture manuals to figure out the correct opcodes and format for each instruction. … Just kidding, that’s horrible. I asked ChatGPT

    Ok, but if you do want to play with writing binary code manually, I recommend Casey Muratori's performance course.

  14. It's doable if you pick a very focused topic. In my first year of using Julia, I gave a talk on gradually adding Julia to a large Python codebase. Very few people could give a similar talk because (1) Julia is a fairly niche language, (2) most of the people who understood Julia <> Python interop knew it too well, and had forgotten all the common beginner challenges.
  15. You can still use git worktrees in a colocated repository. jj workspaces are a different but similar capability that provides some extra features at the cost of some others.
  16. "When is jj useful" is a different question from "when are workspaces/git worktrees useful"

    I find jj overall most useful for separating the concept of a machine-level commit history that saves every change from a human-readable commit history. jj has really nice tools for cleaning up your commits for review while still retaining all the mechanical changes in case you need to get more granular. (Of course, there are many other tools to do this, like magit – I just find jj to work best with my brain.)

    Workspaces/worktrees are best when you have long-running tasks where the state of the disk is important. Local "CI" is a good example – kick off a long test run on one workspace while starting a new task in another. Another example these days is stuff with Agentic LLMs, e.g. I might create one workspace and ask Claude Code to do a deep investigation of why our AWS costs went up.

  17. I'm in a similar position: I want to learn both eventually, but chose to start with Rust because it has several really strong-seeming books (like the intro book, or Rust Atomics and Locks), while Zig doesn't have many books yet.
  18. If it's been around for a while, look at the last year's worth of projects and estimate the total delay caused by the specific piece of tech debt. Go through old Jira tickets etc. and figure out which ones were affected.

    You don't need to be anywhere close to exact, it's just helpful to know whether it costs more like 5 hours a year or 5 weeks a year. Then you can prioritize tech debt along with other projects.

  19. I usually run one agent at a time in an interactive, pair-programming way. Occasionally (like once a week) I have some task where it makes sense to have one agent run for a long time. Then I'll create a separate jj workspace (equivalent of git worktree) and let it run.

    I would probably never run a second agent unless I expected the task to take at least two hours; any more than that and the cost of multitasking for my brain is greater than any benefit, even when there are things that I could theoretically run in parallel, like several hypotheses for fixing a bug.

    IIRC Thorsten Ball (Writing an Interpreter in Go, lead engineer on Amp) also said something similar in a podcast – he's a single-tasker, despite some of his coworkers preferring fleets of agents.

  20. My wife is deaf, and we had one kid in 2023 and twins in 2025. There's been a noticeable improvement in baby cry detection! In 2023, the best we could find was a specialized device that cost over $1,000 and had all sorts of flakiness/issues. Today, the built-in detection on her (Android) phone + watch is better than that device, and a lot more convenient.

