nafizh · 4,399 karma
- The BBC has increasingly been anti-semitic. First, the baseless allegations of starving and murdering children in Gaza, and now this. Troubling times.
- Check out CS 336 at Stanford; they cover DPO/GRPO and the relevant parts needed to train LLMs.
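  For context on what that course covers: the DPO objective mentioned above can be sketched in a few lines. This is a minimal illustration of the standard DPO loss for a single preference pair, not code from the course; the function name and the `beta` default are my own choices.

  ```python
  import math

  def dpo_loss(logp_chosen, logp_rejected,
               ref_logp_chosen, ref_logp_rejected, beta=0.1):
      """DPO loss for one preference pair.

      Inputs are the summed token log-probabilities of the chosen and
      rejected responses under the policy being trained and under a
      frozen reference model.
      """
      # Implicit reward margin: how much more the policy prefers the
      # chosen response over the rejected one, relative to the reference.
      margin = ((logp_chosen - ref_logp_chosen)
                - (logp_rejected - ref_logp_rejected))
      # -log(sigmoid(beta * margin)), written as the numerically
      # stable softplus log(1 + exp(-x)).
      return math.log1p(math.exp(-beta * margin))
  ```

  With a zero margin the loss is log 2; it shrinks as the policy assigns relatively more probability to the chosen response, which is the gradient signal DPO trains on.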
- There's no way I'm adopting something as central to my daily work as an IDE from Google, because I know they are going to kill it within 3 years, if it even survives that long. It will probably die when the Windsurf guy jumps ship again.
- How would you onboard a software engineer who doesn't know Zig? Any learning tips?
- I still use Pinboard, which is more than enough for me. But I would happily pay for a Pinboard with better search and more responsive customer service.
- I still haven't got access to GPT-5 (Plus user in the US), and I'm not really looking forward to it, given I would lose access to o3. o3 is a great reasoning and planning model (better than Claude Opus at planning, IMO, and cheaper) that I use in the UI as well as through the API. I don't think OpenAI should force users onto a new model when there is no noticeable difference in capability. But I guess it saves them money? Someone posted on X how giving Plus users access to only GPT-5 and GPT-5 Thinking reduces their overall weekly request quota.
- It works fine, I would say. In the absence of a bigger screen at comparable DPI (the Kindle Scribe's 300 ppi is one of the highest for its screen size), the Kindle Scribe is still one of the better options, IMO.
- Isn't the reMarkable Pro the same size as the Kindle Scribe? Their website says 10.8 x 7.8 inches.
- I wish they would make a bigger Kindle Scribe. I read PDFs all day on my Scribe, and I often wish the screen were bigger so the font could be larger.
- A lot of this can be explained by simple supply and demand. In central New Jersey, the number of people who need a home keeps increasing, but the number of houses being built is extremely low, and even those start at 900k-1M.
- To be honest, when I say it has significantly worsened, I am comparing to when GPT-4 first came out. It really felt like we were on the verge of 'AGI'. In 3 hours, I coded up a complex web app with ChatGPT, which remembered everything we had been doing the whole time. So it's sad that they have decided against the public having access to such strong models (and I do think it's intentional, not a side effect of safety alignment, though that might have contributed to the decision).
- My use of ChatGPT has just organically gone down 90%. It's unable to do any task of non-trivial complexity, e.g. complex coding tasks or writing complex prose that conforms precisely to what was asked. I also hate that it answers everything in bullet points, even when they aren't needed; it has clearly been RLHF-ed into that. At this point, my questions have become the kind you would ask a tool like Perplexity.
- This has been exactly my experience for at least the last 3 months. At this point, I'm wondering whether paying that 20 bucks is even worth it anymore, which is a shame, because when GPT-4 first came out, it remembered everything in a long conversation and corrected itself based on modifications.
- Thanks for the reply. Is the code available by any chance?
- What's the technology behind the blog? It's clean, minimal and beautiful.
- Can 100% confirm this. You have to keep reminding it of what you said just 2-3 hops ago.