ddavis
- When you hit types like that, type aliases come to the rescue; a type alias combined with a good docstring where the alias is used goes a long way.
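A minimal sketch of that pattern in Python, purely for illustration; the alias name, signature, and docstring are invented here, not taken from any particular project:

```python
from typing import Callable

# Type alias: a handler takes a request path and a dict of query parameters
# and returns a response body string. (Names are purely illustrative.)
Handler = Callable[[str, dict[str, str]], str]


def register(route: str, handler: Handler) -> None:
    """Register *handler* for *route*.

    The ``Handler`` alias documents the expected call signature in one
    place, so every function that accepts or returns one can refer to the
    alias instead of repeating the full ``Callable[...]`` type.
    """
    ...
```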
- Literally dealing with this right now. My wife got what appears to be a (very expensive) counterfeit item that is technically non-returnable (not lying down without a fight). Kind of cathartic to see this pop up.
- I don’t think the person quoted is implying that it should be that way, merely pointing out a discovery that builders have made: they _can_ get a symbolic bonus. One can skip building to code: do a quick and bad job, save on cost, and move on to the next paying job more quickly. That “bonus” doesn’t exist if you build to code (and of course it shouldn’t exist, but neither should the bonus that does; your stick should prevent it).
- I have a similar experience. I was a devoted PhD student working long hours taking on a lot of responsibility. It burned me out, hurting my productivity. I have mixed feelings about it; I love the friends I made and the things I learned, but I don’t think I should have had to suffer what I suffered. Simultaneously I’m somewhat glad I experienced it then, because now I work in tech and I’ll _never_ work outside of business hours (I’ll hack on personal projects I consider fun if I feel like it). And I’m more productive than my colleagues that do. There’s something mysterious about the contemporary PhD, not all good and not all bad.
- The organization and formatting of the single .tex file is such that one could almost read the source alone. Really nice. Also, I had no idea that GitHub did such a good job rendering the LaTeX math in Markdown; it's imperfect but definitely good.
- Been waiting to see what Astral would do first (with regards to product). Seems like a mix of Artifactory and conda? Artifactory in that it provides a package server, and conda in that it tries to fix the difficulty that comes from Python packages with compiled components or dependencies. That's mostly solved by wheels, but PyTorch wheels requiring a specific CUDA version can still be a mess that conda fixes.
- I really like Berkeley Mono and I don’t regret my old purchase, but my Emacs and Terminal configs have been rocking Pragmata Pro for a while now. Looking at the version 2 release notes, it appears that Berkeley Mono now has some condensed widths (a feature that I think is what keeps me on Pragmata Pro). Will have to take it for a spin.
- OpenMP is great. I’ve done something similar to your second case (thread-local objects that are filled in parallel and later combined). In the case of “OpenMP off” (pragmas ignored), is it possible to avoid the overhead of the thread-local object essentially getting copied into the final object (since no OpenMP means only a single thread-local object)? I avoided this by implementing a separate code path, but I’m just wondering if there are any tricks I missed that would still allow a single code path.
- Nope, I don't know how to do it at all; that's why I have to ask AI!
- It's something I know how to do after figuring it out myself and discovering the potential sharp edges, so I've made it into a fun game to test the models. I'd argue that it's a great prompt (to keep using consistently over time) to see the evolution of this wildly accelerating field.
- My favorite thing to ask the models designed for programming is: "Using Python write a pure ASGI middleware that intercepts the request body, response headers, and response body, stores that information in a dict, and then JSON encodes it to be sent to an external program using a function called transmit." None of them ever get it right :)
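For reference, here is one sketch of what that prompt is asking for. This is just my reading of it, not a canonical solution: the class name is invented, and `transmit` is assumed to be a synchronous callable provided elsewhere that accepts the JSON string.

```python
import json


class CaptureMiddleware:
    """Pure ASGI middleware that records the request body, response headers,
    and response body, then hands a JSON payload to ``transmit``."""

    def __init__(self, app, transmit):
        self.app = app
        self.transmit = transmit  # assumed: sync callable taking a JSON string

    async def __call__(self, scope, receive, send):
        if scope["type"] != "http":
            await self.app(scope, receive, send)
            return

        captured = {"request_body": b"", "response_headers": [], "response_body": b""}

        async def receive_wrapper():
            # The request body may arrive across several messages
            # (more_body=True); capture each chunk while still passing the
            # message through to the downstream app.
            message = await receive()
            if message["type"] == "http.request":
                captured["request_body"] += message.get("body", b"")
            return message

        async def send_wrapper(message):
            if message["type"] == "http.response.start":
                captured["response_headers"] = [
                    (k.decode("latin-1"), v.decode("latin-1"))
                    for k, v in message.get("headers", [])
                ]
            elif message["type"] == "http.response.body":
                captured["response_body"] += message.get("body", b"")
            await send(message)

        await self.app(scope, receive_wrapper, send_wrapper)

        payload = json.dumps(
            {
                "request_body": captured["request_body"].decode("utf-8", "replace"),
                "response_headers": captured["response_headers"],
                "response_body": captured["response_body"].decode("utf-8", "replace"),
            }
        )
        self.transmit(payload)
```

Wrapping `receive` and `send` rather than relying on any framework hooks is what keeps it "pure ASGI", and the chunked `more_body` bodies are exactly the kind of sharp edge mentioned above.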
- The back door relied on a couple of Linux package management systems (if I’m recalling correctly, it had .deb and .rpm checks, see https://marc.info/?l=openbsd-misc&m=171227941117852&w=2)
- Yeah, there's some nuance here. But one of the beautiful things about Python is the existence of PEPs. One would have to write a PEP to get a new dunder method added to CPython. During the review/discussion of the PEP it would come up that one of the most popular ecosystems in all of Python would be impacted by adding a builtin __array__ dunder method, for example. It just wouldn't happen. It makes sense to me that the biggest packages associated with a language can have some impact on the way the language moves forward, even if they are not part of the core implementation. For example, the impact of PEP 563 on Pydantic (another wildly popular package outside of core) caused it to be rolled back.
- > Keep in mind that you're not meant to invent your own dunder methods. Sometimes you'll see third-party libraries that do invent their own dunder method, but this isn't encouraged and it can be quite confusing for users who run across such methods and assume they're "real" dunder methods.
I don't think this is good advice. Example: the dunder methods implemented and used by the Scientific Python/PyData community (__array__, __array_ufunc__, __array_function__, etc.) are so, so, so important to that ecosystem.
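To illustrate the kind of protocol dunder I mean, here is a minimal sketch (the class and data are invented) of a container that implements NumPy's __array__ so that np.asarray can consume it:

```python
import numpy as np


class Histogram:
    """Toy container that exposes its bin counts to NumPy via __array__."""

    def __init__(self, counts):
        self._counts = list(counts)

    def __array__(self, dtype=None, copy=None):
        # NumPy calls this when the object is passed to np.asarray/np.array.
        # (NumPy 2.x may also pass a ``copy`` keyword; it is ignored here.)
        arr = np.array(self._counts)
        return arr.astype(dtype) if dtype is not None else arr


h = Histogram([1, 4, 9, 16])
a = np.asarray(h)  # np.asarray sees __array__ and returns an ndarray
print(a * 2.0)     # behaves like any other array from here on
```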
- Really only usable on a personal machine. Having to disable SIP is a non-starter for work computers.
- Reminds me a bit of gnuplot.
- 5ish+ years ago Austin had more of the lower-level tech jobs, but definitely not anymore (it's a consistent spectrum now).
- Dan Patrick is a terrible human being. Totally unsurprising behavior from him.
- I’ve never used one. I’ve declared init.el bankruptcy a few times in those 12 years, but I always build from vanilla Emacs; lots of use-package use, though.