sp527
1,468 karma

  1. > The hacker, who asked for anonymity because he feared retaliation from the company, said he reported the vulnerability to Doublespeed on October 31

    Lmao. Nice.

  2. It's almost like a lot of our technologies were pretty mature already and an AI trained on 'what has been' has little to offer with respect to 'what could be'.
  3. > If or when that happens, I think the economy would morph into a new thing completely focused on serving the whims of those "owners."

    I think you might be a little behind on economic news, because that's already happening. And it's also rapidly reshaping business models and strategic thinking. The forces of capitalism are happily writing the lower and middle classes out of the narrative.

    https://www.wsj.com/livecoverage/stock-market-today-dow-sp50...

  4. Maybe most men don't want to see other men naked? Does this have to be any more complicated than that? It always struck me as really weird tbh.
  5. I think what you're missing here is that China would be (is?) happy to fund the development, because it's in their national interest and necessary for their companies to stay competitive so long as there are trade restrictions on chips. Another framing is that China and certain other entities (e.g. content distribution channels like Meta and YouTube) have a strong incentive to 'commoditize their [AI] complement' (https://gwern.net/complement).
  6. It's hard to think of someone less qualified to opine on the direction of AI than this guy.
  7. Oh trust me, I know. I worked at Palantir well before it was public and had firsthand experience of Alex Karp. He would draw incomprehensible stick-figure box diagrams on a whiteboard for F100 CEOs, ramble some nonsensical jargon, and somehow close a multimillion-dollar pilot. The guy is better at faking it than high-end escorts. It doesn't surprise me that this has fooled degens around the world, from Wall Street to r/wallstreetbets. Incredibly, even Damodaran has thrown in the towel and opened a position, while still admitting he has no idea what they do.
  8. It's such an egregiously bad error, you almost have to wonder if Altman did it intentionally for publicity (which does seem to be working).
  9. > This is why Python or JavaScript are great languages to start with (even if JavaScript is a horrible language)

    The author was hemorrhaging credibility all along the way, and then this comment really drove home what he is: a bikeshedder who probably deliberately introduces complexity into projects to keep himself employed. If you read between the lines of this post, it is clearly a product of that mindset and motivation.

    'AI is only good at the simple parts that I don't like, but it's bad at the simple parts I do like and that are my personal expertise and keep me employed.'

    Yeah okay buddy.

  10. That's what Anduril is for
  11. I can't even begin to imagine what sort of mind could observe the quality of Big Tech's software output and conclude that there's nothing wrong with their hiring process.
  12. And yet if you flip the paradigm entirely, you pretty much get coding bootcamps, which certainly don't have a great track record either. The answer is probably some more ideal balance between theory and practice, like Waterloo's CS program.
  13. Any green energy project that isn't nuclear is a waste of money and resources. Nuclear is now being pursued in earnest by the tech industry itself. There's no problem here.
  14. You really don't understand subtext...
  15. This is a very primitive, pre-genAI type of thought process.

    People like you can only see copyright infringement when it's blatantly staring you in the face, like Studio Ghibli style AI images. Why is it obvious in that case? Because we have a large enough preexisting sample of Studio Ghibli style frames from anime to make it obvious.

    But move closer to zero-shot generation, which anyone with a modicum of knowledge on this subject understands was the directional impetus for generative AI development, and the monkey brain short-circuits: suddenly creatives who want to protect their art can go fuck themselves, because money.

    You may not find common cause with multi-millionaire artists trying to protect their work now, but you certainly will in hindsight if the only fiscally sustainable engines of creativity left to us in the future are slop-vomiting AI models.

  16. "You could parachute [Sam] into an island full of cannibals and come back in 5 years and he'd be the king."

    http://paulgraham.com/fundraising.html

  17. Literally the founder of Y Combinator all but outright called Sam Altman a conniving dickbag. That’s the consensus view advanced by the very man who made him.
  18. Somebody please explain Dunning-Kruger to Hotz. The secondhand embarrassment reading this is nearly unbearable.

This user hasn’t submitted anything.