wvenable
Joined · 20,414 karma
https://www.codaris.com

  1. I disagree. Satya doesn't give a crap about Windows; he's the cloud guy. Over 40% of Microsoft's revenue is cloud. Another 20% is Office (which is also heading toward cloud). Windows revenue is a measly 9% -- even less than gaming.

    Windows is what it is because it's really not important to Microsoft anymore. It's effectively unmoored from the rest of the organization and left to fight for some kind of financial relevance inside a company that no longer cares about it.

  2. It seems to me you're coming in with negative preconceptions (e.g. "insipid, pandering chatbot"). What part of coding is fun for you? What part is boring? Keep the fun bits and have the LLM do the boring ones.
  3. Maybe the problem with understanding the benefits of AI is that you are relying on other people to use AI properly. As the direct user myself, I don't have that problem.

    I'm using it to make things better rather than just producing more output. Even just putting it in agent mode and saying "look at all my code and tell me where I can make it better" is an interesting exercise. Some suggestions I take, some I don't.

  4. > I'll have to take your word for it, I have yet to see a PR that used AI that wasn't slop.

    How would you know a non-slop PR didn't use AI?

    Why would I accept slop out of the AI? I don't. So I don't have any.

    I don't understand the disconnect here. Some people really want to be extremely negative about this pretty amazing technology while the rest of us have just incorporated it into our workflow.

  5. That question completely misunderstands what AI is for. Why would I just do it myself when the AI did it in less time than I could, and mechanically, in a way that is arguably harder for a human to do? AI is surprisingly good at identifying all the edge cases.
  6. I think half the people who think AI is incredibly dumb, and can't understand why anyone is using it, feel that way because they're using the free samples. This whole thing is so horribly expensive that providers lose money even on people who pay, so the free samples are necessarily as minimal as they can get away with.

    The free samples famously worked to get people to try it initially, though.

    But whenever that free Gemini text pops up in my search, I know why people think it's stupid. But that's not the experience I have with paid options.

  7. I've been posting recently about how I refactored a few different code bases with the help of AI. Faster code, higher quality code, overall smaller. AI is not a hammer, it's a lathe: incredibly powerful, but only if you understand exactly what you're doing; otherwise it will happily make a big mess.
  8. Any AI product I pay for is great. Any AI product I don't pay for is terrible.
  9. Comparing LLMs to NFTs isn't fair. Being able to talk to your computer and have it understand you and even do the things you ask is literally Star Trek technology.

    I've never seen a technology so advanced be so dismissed before.

  10. > I use zero so-called "AI" features in my day to day life. None. Not one.

    I know so many people who made that same argument, if you can call it that, about smartphones.

    I recently listened to a podcast (probably The Verge) talking about how an author was suddenly getting more purchases from his personal website. He attributed it to AI chatbots giving his personal website as the best place to buy rather than Amazon, etc. An AI browser might be a way to take power away from all the big players.

    > And it's not for a lack of trying, the results are just not what I need or want, and traditional browsing (and search engines, etc.) does do what I want.

    I suspect I only Google for about 1/4 of the things I used to (maybe less). Why search, wade through dubious results, etc., when you can just instantly get the result you want in the format you want it?

    While I am a techie and I do use Firefox, that's not a growing niche. I think AI will work spectacularly well for non-techies because it can simply give them what they ask for. LLMs have solved the natural-language query problem.

  11. Raw data isn't copyrightable. You can't copyright the contents of a phone book, for example.
  12. If the headline said "Using Python to obtain one of the rarest license plates" you wouldn't think twice.
  13. > rarely or never looking at the code generated.

    My interpretation is that you can look at the code, but vibe coding means you're ultimately not writing the code, you're just prompting. It would make sense to prompt "I'd like variable name 'bar' to be 'foo' instead" and that would still be vibe coding.

  14. > Reading the code and actually understanding the code is not that the same thing.

    Ok. Let me be more specific then. I'm "understanding" the code since that's the point.

    > I'm unsure the AI can do the high level understanding since I have never gotten it to produce said understanding without explicitly telling it.

    My experience has been the opposite: it often starts by producing a usable high-level description of what the code is doing (sometimes imperfectly) and then proposes refactors that match common patterns -- especially if you give it enough context and let it iterate.

    > "Rewrite x.c, y.c, z.c into C++ buildings abstractions to make it more ergonomic" generally won't recognise the DSL and formalise it in a way that is very easy to do in C++, it will just make it "C++" but the same convoluted structure exists.

    That can happen if you ask for a mechanical translation or if the prompt doesn't encourage redesign. My prompt was literally to make it well-designed, idiomatic C++, and it did that. Inside the LLM's training data is a whole bunch of C++ code, and it seems to be leaning on that.

    I did direct some goals (e.g., separating device-specific code and configuration into separate classes so adding a device means adding a class instead of sprinkling if statements everywhere; see the sketch at the end of this comment). But it also made independent structural improvements: it split out data generation vs file generation into pipeline/stream-like components and did strict separation of dependencies. It's actually well designed for unit testing and mocking even though I didn't tell it I wanted that.

    I'm not claiming it has human-level understanding or that it never makes mistakes -- but "it can't do high-level understanding" doesn't match what I'm seeing in practice. At minimum, it can infer the shape of the application well enough to propose and implement a much more ergonomic architecture, especially with iterative guidance.

    I had to have it introduce some "bugs" for byte-for-byte matching because it had generalized some of the file generation and the original C code generated slightly different file structures for different devices. There's no reason for this difference; it's just different code trying to do the same thing. I'll probably remove these differences when the whole thing is done.
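
    To give a rough idea of the shape it ended up with -- this is a simplified, made-up sketch with hypothetical names, not the actual code:

        #include <cstdint>
        #include <string>
        #include <vector>

        // Each device is its own class; adding a device means adding a
        // class, not another if-statement scattered through shared code.
        class Device {
        public:
            virtual ~Device() = default;
            virtual std::string name() const = 0;
            virtual std::vector<std::uint8_t> generateData() const = 0;
        };

        class Widget2000 : public Device {  // hypothetical device
        public:
            std::string name() const override { return "Widget2000"; }
            std::vector<std::uint8_t> generateData() const override {
                return {0x01, 0x02};  // device-specific details live here only
            }
        };

        // File generation is a separate stage that only sees the interface.
        void writeFile(const Device& dev) {
            auto data = dev.generateData();  // stage 1: generate the data
            // stage 2: serialize 'data' to a file named after dev.name()
            (void)data;
        }

    Mocking falls out naturally: a unit test can hand writeFile a fake Device without touching any real device code.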

  15. I've had similar experiences with compiled languages in one form or another throughout my career as well. I don't think JavaScript is so special that we need to call it out for things like TypeScript, minification, or bundling.
  16. I do a lot of C++ programming and that's really overselling the issues. You don't have to read an entire book on variable initialization to do it correctly. And using STL types is a lot safer than passing pointers around (see the sketch at the end of this comment).

    It's actually far easier for me to tell that the C++ version isn't leaking memory or accessing unallocated data than it is with the C version.

    A simple language just pushes complexity from the language into the code. Being able to represent things in a more high-level way is entirely the point of this exercise because the C version didn't have the tools to express it more cleanly.
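
    A trivial, made-up sketch of the difference (hypothetical names, not code from the project):

        #include <cstddef>
        #include <memory>
        #include <vector>

        // C style: ownership and size are conventions the compiler can't check.
        int* make_buffer_c(std::size_t n);  // who frees this? how big is it?

        // C++ style: the type itself carries ownership and size.
        std::vector<int> make_buffer_cpp(std::size_t n) {
            return std::vector<int>(n);  // freed automatically, size is .size()
        }

        // When you do want a single heap object, ownership is explicit:
        std::unique_ptr<int> value = std::make_unique<int>(42);

    The vector version can't leak and can't lose track of its size; the C version relies on every caller getting both right.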

  17. This seems like an issue, but in all my practical experience it really isn't. TypeScript becomes JavaScript with the types removed. Then you tree-shake, minify, and whatever is executed is nowhere near what you actually wrote -- but at the same time it totally is, because that's no different from any other compilation process.

    Occasionally I have to remember that JavaScript has no types at runtime, but that comes up surprisingly rarely.

  18. I’m not saying individual programmers consciously set out to eliminate jobs, or that every programmer's work directly replaces someone. But the historical and structural reality of the profession is that software development, as a field, has consistently produced automation that reduces the amount of human labor required.

    That pattern is bigger than any one of us and it's not a moral judgment. It's simply part of what technology does and has always done. AI is a continuation of that same trend we've all participated in, whether directly or indirectly. My point is that to stop now and say "look at all these jobs being eliminated by computers" is several decades too late.

  19. I'll be the first to say I've abandoned a chat and started a new one to get the result I want. I don't see that as a net negative though -- that's just how you use it.
  20. Programmers are the last people on earth who should complain about job loss due to automation. Our entire job since the beginning has been automating people out of jobs, and we've done a wonderful job of that for decades. Entire classes of jobs no longer exist. Although I'm not personally responsible for anyone losing their job, I'm certainly responsible for fewer people being hired.

    AI is just the next step, and not even a particularly large leap. We already needed fewer legal secretaries due to advances in technology. We killed most journalism two decades ago. Art and music had Photoshop and Auto-Tune. Now we've actually achieved something we've literally been striving for since the dawn of computing -- the ability to speak natural language to a computer and have it do what we ask. But it's just one more step.
