machdiamonds (654 karma)
- I don't use my Quest 2 because it's so annoying trying to keep the screen in focus with the Fresnel lenses.
- Pretty funny to read the comments from xAI's initial announcement now.
- Pretty simple explanations for all of those:
- xAI open-sources models with a 6-month lag; look at Grok 1
- No one else stopped development, so why should he?
- He owns Twitter, why wouldn't it be okay for him to train on Tweets?
- Shouldn't be surprising with how hard everyone was pushing remote work.
- He said they were using Bing and Google on the Dwarkesh podcast.
- Do you actually have any examples of FSD v12 hitting emergency vehicles?
- I think he said they would open source models after they have been released for 6 months or something like that.
- It's not too hard to believe it is a coincidence when the most followed person on a platform shows up in your feed, especially if you follow tech accounts.
- It's hard to guess these cards' real performance uplifts. According to Nvidia, H100 is 11x faster than A100, but that's definitely not true in most cases. If Gaudi3 is legitimately 4x faster than Gaudi2, it should be a very good value proposition compared to even the B100. I'm really curious whether Intel will be able to compete with X100 using Falcon Shores or not. Regardless, I don't think Nvidia's margins are sustainable.
- Hi, what differentiates double from Cursor?
- Not sure why people on HN can't understand that companies actually need to make money to survive.
- It's definitely not by accident; it would be very easy to turn off community notes for tweets flagged as ads.
- There are still some good subreddits, like "LocalLLama", but most of them seem to have been ruined by mods. I think this was the case even before the whole API issue, which just accelerated the decline.
- You guys just believe whatever you read, even things that can easily be debunked with common sense. For example, there's a lot of yellow in this factory tour he did:
- Emad (Stability AI) thinks it's a 300B model https://twitter.com/EMostaque/status/1727373950685200674
- "journalism"
- Just like for humans, in terms of copyright, only the output should matter, not the training data.
- It is delusional to think LLMs won't reduce the number of SWEs required. This should have been obvious from the first time you used GPT4.
- There is no reason that AI can't be better than humans at requirements elicitation and communication. AI can have the necessary domain knowledge about the company and could absorb whatever extensive information is provided. It could use analogies to explain concepts at a level comfortable for anyone.
- Also, be born with an extremely high IQ.
- Actual translation: Tesla doesn't want people scalping their cars and is using a technique many other manufacturers use.
- Anthropic doesn't care about consumer products. Their CEO believes that the company with the best LLM by 2026 will be too far ahead for anyone else to catch up.
- It's obvious Twitter was overstaffed, but xAI has fewer than 20 employees, so that's definitely not the case there.
- Oh, the horror, people will be able to read things they can Google!
- Leave it to HN to be negative about a team going from nothing to training a model competitive with a world-class lab like Meta in 4 months.