- It's more like saying pi is approximately "3..14". Easily corrected syntax errors aren't as bad as semantic errors.
- Version numbers for LLMs don't mean anything consistent. They don't even publicly announce at this point which models are built from new base models and which aren't. I'm pretty sure Claude 3.5 was a new set of base models since Claude 3.
What do you mean by "it's a 1.0" and "3rd iteration"? I'm having trouble parsing those in context.
- Not really. Dirac's trick works entirely at a depth of two logs, using sqrt like unary to increment the number. It requires O(n) symbols to represent the number n, i.e. O(2^n) symbols to represent n bits of precision. This thing has arbitrary nesting depth of logs (or exps), and can represent a number to n bits of precision in O(n) symbols.
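A minimal Python sketch of Dirac's side of the contrast (the function name is mine; this is just an illustration): n nested square roots of 2 give 2^(1/2^n), and two base-2 logs recover n, so each extra sqrt symbol increments the value by one.

```python
import math

# Toy sketch of Dirac's trick: after n square roots, x == 2 ** (2 ** -n),
# so two base-2 logs recover n. One extra sqrt symbol per increment of n,
# i.e. O(n) symbols to represent the value n.
def dirac_number(n: int) -> float:
    x = 2.0
    for _ in range(n):
        x = math.sqrt(x)                # after k roots: x == 2 ** (2 ** -k)
    return -math.log2(math.log2(x))     # -log2(2 ** -n) == n, up to float error

print(dirac_number(7))  # ~7.0
```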
- I think it's better phrased as "find the best rule", with a tacit understanding that people mostly agree on what makes a rule decent vs. terrible (maybe not on what makes one great), and a tacit promise that the sequence presented has at least one decent rule and doesn't have several competing ones.
A rule being "good" is largely about simplicity, which is also essentially the trick that deep learning uses to escape no-free-lunch theorems.
- For the individuals shown in the graph, this buys about $6k per American (and after the first year you can't do it again).
- There are two kinds of naturalness principle in physics, sometimes called "technical naturalness" and "Dirac naturalness" respectively.
Dirac naturalness is as you describe: skepticism towards extremely large numbers, end of story. It has the flaw you (and every other person who's ever heard it) point out.
Technical (or 't Hooft) naturalness is different, and specific to quantum field theory.
To cut a long story short, the "effective", observable parameters of the Standard Model, such as the mass of the electron, are really the sums of enormous numbers of contributions from different processes happening in quantum superposition. (Keywords: Feynman diagram, renormalization, effective field theory.) The underlying, "bare" parameters each end up affecting many different observables. You can think of this as a big machine with N knobs and N dials, but where each dial is sensitive to each knob in a complicated way.
Technical naturalness states: the sum of the contributions to e.g. the Higgs boson mass should not be many orders of magnitude smaller than each individual contribution, without good reason.
The Higgs mass that we observe is not technically natural. As far as we can tell, thousands of different effects due to unrelated processes are all cancelling out to dozens of decimal places, for no reason anyone can discern. There's a dial at 0.000000.., and turning any knob by a tiny bit would put it at 3 or -2 or something.
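As a toy numerical illustration of that complaint (made-up magnitudes; nothing here is a real QFT calculation): an observable that is the sum of huge bare contributions cancelling almost exactly is wildly sensitive to any one of them.

```python
import random

# Toy illustration only: many large "bare" contributions that happen to
# cancel down to a tiny observable. Not real physics -- magnitudes invented.
random.seed(0)
terms = [random.uniform(-1e16, 1e16) for _ in range(1000)]
terms.append(-sum(terms) + 125.0)       # rig an almost-total cancellation

print(sum(terms))                       # ~125: ~14 orders below the terms
print(max(abs(t) for t in terms))       # ~1e16

terms[0] *= 1.000001                    # turn one knob by a hair...
print(sum(terms))                       # ...and the dial jumps by ~1e10
```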
There are still critiques to be made here. Maybe the "underlying" parameters aren't really the only fundamental ones, and somehow the effective ones are also fundamental? Maybe there's some reason things cancel out, which we just haven't done the right math to discover? Maybe there's new physics beyond the SM (as we know there eventually has to be)?
But overall it's a situation that, imo, demands an explanation beyond "eh, sometimes numbers are big". If you want to say that physical calculations "explain" anything -- if, for example, you think electromagnetism and thermodynamics can "explain" the approximate light output of a 100W bulb -- then you should care about this.
- She's saying that a different model -- one of the three disagreeing methods for distance ladder measurements -- must be wrong, because they disagree with each other. But if one or more of those models are wrong, then there's not much evidence that the LambdaCDM model is wrong.
Conversely, the hypothesis that LambdaCDM is wrong does nothing to explain why the distance ladder methods disagree.
She clearly isn't saying that any model is infallible; she's just saying that clear flaws in one set of models throw into question some specific accusations that a different model is wrong.
You actually need to pay attention to the details; the physicists certainly are. Glib contrarianism isn't very useful here.
- It's the magnetic field that has the arbitrary sign convention. You can't determine the direction of a magnetic field from observations without using the right hand rule.
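A quick way to see this numerically (a sketch; numpy's cross product bakes in the right-hand rule): the observable Lorentz force F = qv×B is unchanged if you flip the sign convention for B and simultaneously swap handedness, so no measurement can fix either choice alone.

```python
import numpy as np

# np.cross implements the right-hand rule. Negating B's sign convention
# *and* using a left-hand rule (negated cross product) gives the same
# observable force F = q v x B.
q = 1.0
v = np.array([1.0, 0.0, 0.0])
B = np.array([0.0, 0.0, 2.0])

F_right = q * np.cross(v, B)       # right-hand convention
F_left = q * -np.cross(v, -B)      # left-hand rule, opposite B convention
print(F_right, F_left)             # identical vectors
```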
- That definition doesn't work well because you can have changes in entropy even if no energy is transferred, e.g. by exchanging some other conserved quantity.
The side note is wrong in letter and spirit; turning potential energy into heat is one way for something to be irreversible, but neither of those statements is true.
For example, consider an iron ball thrown sideways. It hits a pile of sand and stops. The ball is not affected structurally, but its kinetic energy is converted (almost entirely) to heat. If the ball is instead thrown slightly upwards, potential energy increases, yet the process is still irreversible.
Also, the changes of potential energy in corresponding parts of two Carnot cycles are directionally the same, even if one is ideal (reversible) and one is not (irreversible).
- It's not a dupe.
"I commissioned a professional voice actor to give a full dramatic reading of that blog post."
- R² has extremely nice properties. If you add a bunch of linearly uncorrelated variables together, then compute R² between each variable and the sum, the R² values will sum to 1.
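A quick numpy check (a sketch; the variable names are mine): with uncorrelated columns, each R² equals Var(X_j)/Var(S), so they total 1.

```python
import numpy as np

# For uncorrelated components, Corr(X_j, S)**2 == Var(X_j) / Var(S),
# so the R^2 values across components sum to 1.
rng = np.random.default_rng(0)
n, k = 100_000, 4
X = rng.normal(size=(n, k)) * rng.uniform(0.5, 3.0, size=k)  # independent cols
S = X.sum(axis=1)

r2 = [np.corrcoef(X[:, j], S)[0, 1] ** 2 for j in range(k)]
print(r2)       # individual R^2 values
print(sum(r2))  # ~1.0 up to sampling noise
```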
- I don't think the study authors would endorse the claim that this meaningfully lends credence to Penrose's theories about consciousness. Quantum effects are all over chemistry; that's nowhere near sufficient to provide much evidence of quantum computation happening, which in turn would not necessarily suggest any kind of explanation of consciousness.
Nobody is skeptical about quantum effects (of this sort) happening in biochemistry. All of the skepticism Penrose faces is about the quantum computation and consciousness parts. So this paper won't change anyone's minds about Penrose.
- Quantum computers are generally more powerful in the sense that they're believed to solve a strictly larger set of problems in polynomial time.
i.e. BPP is contained in BQP, but the converse containment is thought to be false.
- That sounds like the opposite of what graycat was suggesting.
- You could say that as well. Gamma is often (usually?) defined as any radiation originating from nuclear state transitions rather than from electron transitions; this tends to be very high energy but can overlap the range of EM radiation from electronic transitions. Th-229 is the extreme outlier.
- How so? Isn't the Markov property just a consequence of basic QM?
- For sufficiently large N, it's impossible to prove Halt_N correct.
(The N required depends on your axioms.)
- This follows fewer of the principles on the "Brutalist Web Design" link than HN does. In particular, it's slower, has less obvious links, breaks my back button, and has (imo) distracting unnecessary decoration.
I would also bet it uses JavaScript for the "hyperlinks" but haven't checked.
- "Ah yes, you wanted to know where the electro-weak mixing angle came from, didn’t you? It’s the sum of the angle of tilt of the Earth’s axis, and the angle of inclination of the Moon’s orbit. Well, not exactly, because the particle experiments are more difficult to do than the astronomical ones, but it’s pretty close."
This is absurd crackpot nonsense.
- Why would a disappointingly simple explanation be a hint that there are gaps in our models? Or am I parsing that part wrong?
I disagree that it's really that simple an explanation, though. Our best theoretical estimates put the density of dark energy dozens of orders of magnitude higher than what we observe. It's not remotely clear why the actual value seems to be so small, especially since it's not exactly zero.
- Yup. On which note, the Higgs field's symmetry breaking should really have changed the vacuum energy density by an amount orders of magnitude larger than the density of dark energy...
- Big rip scenarios are inconsistent with dark energy having a constant value of w=-1, which is what the author is discussing in that paragraph. The article is wrong.
- It makes more sense if you've learned general relativity.
- Both of those are pressureless, so they could be dark matter but not dark energy.
Except that primordial black holes are ruled out (as 100% of dark matter) by various observational bounds, and antimatter would be just as visible as normal matter and not "dark" at all.
- Momentum is just mv, not mv². (Kinetic energy is ½mv², which might be the source of the mix-up.)
- Neither the OP nor the top answer on Reddit mentioned the Higgs boson. You're right, but I'm not sure who you're rebutting.
- General Relativity is a straightforward consequence of basically any form of string theory.