28 degrees Celsius is not 30% warmer than 21 degrees Celsius. The same stat rendered in Fahrenheit would say 70 degrees -> 82 degrees, or 17%. In kelvin it would be 294 -> 301, or 2.3%.
Or we could invent a new measure indexed to Celsius but offset by 20 degrees, and declare a 1 -> 8 change, a whopping 700%.
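A quick Python sketch of that arithmetic (the exact percentages shift a little with rounding, so treat the figures as approximations):

```python
# Percentage "increase" for the same physical warming (21 °C -> 28 °C),
# expressed on different scales. The numbers diverge because each scale
# puts its zero somewhere different.

def pct_increase(start, end):
    return (end - start) / start * 100

c_start, c_end = 21.0, 28.0
scales = {
    "Celsius":                    (c_start, c_end),
    "Fahrenheit":                 (c_start * 9 / 5 + 32, c_end * 9 / 5 + 32),
    "Kelvin":                     (c_start + 273.15, c_end + 273.15),
    "Celsius minus 20 (made up)": (c_start - 20, c_end - 20),
}

for name, (lo, hi) in scales.items():
    print(f"{name:>28}: {lo:6.1f} -> {hi:6.1f}  ({pct_increase(lo, hi):+.1f}%)")
```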
The trouble (of course) is that Celsius is not a proper unit, but a "scale", or a "unit of difference" (equal to the kelvin), or even a torsor [1].
The trouble with the kelvin here is that if you see the 7 kelvin increase as a proportion of the ~294 K starting temperature then you only get a 2% increase. Nobody is going to buy your newspaper if you're putting up weak numbers like that.
[0] https://mathematicalcrap.com/2024/03/05/the-feynman-story/
[1] https://math.ucr.edu/home/baez/torsors.html
At home (Christchurch, NZ) we often get dry cold, which can be pleasant; however, when we do get the occasional vile damp cold I personally call it "London cold", because it made such an impression on me in my 20s.
The only upside is that your lips don’t split in winter.
Prompted by your comment I asked my brother, and he agreed - something like not liking the Auckland 14° days that felt like 5°.
On the other hand, -2C in London is crisp and invigorating and entirely preferable in every possible way.
The difference between 10 and 30 degrees C feels way, way bigger than 37 to 42.
One goes from straight up cold to quite hot, while the other just goes from very hot to even more very hot.
At 42 degrees your body can’t cool down and this causes a lot of deaths or even the “casual” fainting described in the article due to hyperthermia. This goes way beyond your subjective feeling.
I thought this thread was discussing the subjective feelings of temperature.
Talking about relative temperature differences without anchoring to an absolute reference is meaningless. Using percentages is even worse.
It just takes a long time for your body temperature to increase, thus you have a while to find a cooler spot.
Celsius is more logical:
(1) the endpoints of Celsius are the boiling and melting points of water (at standard atmospheric pressure). The lower endpoint of Fahrenheit is the lowest temperature Fahrenheit could achieve using a mixture of water, ice and ammonium chloride. Using the freezing point of pure water is more logical than using the freezing point of an ammonium chloride solution: water is fundamental to all known life, while ammonium chloride solutions don't have the same significance (and why ammonium chloride instead of sodium chloride or potassium chloride? of the salts readily available to Fahrenheit, the ammonium chloride solution had the lowest freezing point)
(2) Fahrenheit initially put 90 degrees between his two “endpoints” (ammonium chloride solution freezing point and human body temperature), then he increased it to 96. Celsius having 100 degrees between its endpoints is more logical than 90 or 96
(3) while for both Celsius and Fahrenheit there is error in the definition of their endpoints (the nominal values are different from the real values, because our ability to measure these things accurately was less developed when each scale was originally devised, and some unintentional error crept in), the magnitude of that error is smaller for Celsius than for Fahrenheit
(4) nowadays, all temperature units are officially defined in terms of the kelvin, and Celsius has a simpler relation to the kelvin than Fahrenheit (purely additive versus requiring both addition and multiplication; see the sketch after this list)
(5) Celsius is the global standard for everyday (non-scientific) applications, not Fahrenheit, and it is more logical to use the global standard than a rarely used alternative whose advantages are highly debatable at best
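To make point (4) concrete, here's a minimal sketch of the two conversions from kelvin (273.15 and 459.67 are the standard offsets):

```python
def kelvin_to_celsius(k):
    # Purely additive: the degree size is identical, only the zero moves.
    return k - 273.15

def kelvin_to_fahrenheit(k):
    # Needs a scale factor *and* an offset.
    return k * 9 / 5 - 459.67

# Water freezing, human body temperature, water boiling.
for k in (273.15, 310.15, 373.15):
    print(f"{k:7.2f} K = {kelvin_to_celsius(k):7.2f} °C = {kelvin_to_fahrenheit(k):7.2f} °F")
```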
No, it is not. Americans say this only because they're used to it. The common arguments are that it is more precise, and that 'you see temperatures from zero to one hundred degrees Fahrenheit throughout the year'.
Firstly, the problem with Fahrenheit is that its realisation is inaccurate by modern standards—which is why every single non-SI unit is now just an exact multiple of a corresponding SI unit with the same dimensions; the mediaeval and early modern definitions having been entirely forgotten. A bowl of salted ice and his own armpit? Truly an 18th-century invention.
Next, the extra precision that a difference of one degree Fahrenheit gives you is frankly useless. Within a single room one can experience a difference of five degrees Celsius or more, depending on what's in the room—a radiator or air conditioner running, or the gas stove in a kitchen, or a high-end PC. Forget rooms. On the human body itself there can be a two to three degree Celsius difference between the extremities and the thorax/torso/head. Any field that requires extreme precision will naturally end up using SI units, i.e. the kelvin (or some related scientific unit). (Excluding the absolutely crazy bunch of American machinists who like using thousandths and ten-thousandths of an inch—at this point the joke writes itself.)
As for climates, there are places that see very little difference in temperature, and definitely not the 'nice 0 – 100' range that Americans claim. Even in the US there are places like southern Louisiana and Florida that have borderline tropical climates, and don't really go below ~15 °C or above 35 °C.
All of this is not really logical either; it all ends up being a manifestation of familiarity.
I’ve grown up with Celsius and never felt the need to use decimals in day to day weather discussion… Many air conditioners let you go up by half a degree C and that’s more than enough precision, more than I’ve ever felt was necessary in everyday conversation.
That's not really the point.
0° F: It's cold outside. 100° F: It's hot outside.
0° C: It's cold outside but not really that cold. 100° C: Dead.
0 K: Dead. 100 K: Dead.
These things are the case for humans regardless of whether you live in a place that actually gets cold or hot outside.
Like people tell me that the US customary system is “more human scale and intuitive”, but I literally cannot picture, say, 15 inches or ten feet - it just means nothing to me unless I mentally convert to centimetres or metres.
So much of these arguments boil down to “I grew up with this system so I can intuitively use it, so it must be superior”
This is essentially every American argument for USC or Imperial units. In fact, there are actually legitimate reasons why some legacy units are superior—for instance the duodecimal or sexagesimal systems which have many more factors than the decimal. But every other argument is a variation of 'it's better because I know it better'.
300K: beach weather.
350K: you're distilling your own moonshine right ?
(worksforme)
> These things are the case for humans
Who says so?
0 °C is very cold by many people's standards. About half the human population lives within the tropics. In fact I'd like to see Americans walk around in the UK wearing just a T-shirt and bermudas when it's barely above freezing, and insist 'it's not really that cold, it's only 32 °F'.
The Fahrenheit scale is how far you are from your own body temperature. It was designed so that 100 is the temperature of a human. (Adjusted later to 98.6 due to inaccuracies.)
0 was designed to be as cold as you can get with ice and salt (also ended up being slightly inaccurate).
> Maybe because I was brought up with centigrade it makes more sense to me.
Yup. People brought up on Fahrenheit think it is superior. For temperature, neither argument is objectively better. (In contrast to imperial distance measurement with its non-powers of 10 and fractions, where there are good arguments against it; with temperature, both scales are ultimately arbitrary.)
I think Celsius is objectively better in that:
(1) its endpoints (the freezing and boiling points of water) are more natural / less arbitrary / more fundamental than Fahrenheit's (the coldest temperature you can reach with salt and ice, up to average human body temperature). Water is a fundamental substance to all known life; the freezing point of pure water is much more fundamental than the freezing point of a water + NaCl mixture (actually, apparently Fahrenheit used ammonium chloride, not sodium chloride, which is arguably even more arbitrary than sodium chloride would be). If you imagine some extraterrestrial civilisation independently inventing a temperature scale, they'd be more likely to come up with something close to Celsius than something close to Fahrenheit
(2) while both scales contain some error in that the nominal value of their endpoints differs from the real value, the error is greater for Fahrenheit
(3) according to Wikipedia, Fahrenheit didn't have 100 degrees between his two endpoints; he originally had 90, then increased it to 96 – given base 10 is the standard number base used by humans, 100 divisions is less arbitrary than 90 or 96
(4) nowadays, all other temperature scales are officially defined in terms of Kelvin – and Celsius has a simpler relationship to Kelvin than Fahrenheit does (for Celsius it is purely an additive offset, for Fahrenheit it involves both addition and multiplication)
(5) conforming to the global standard is objectively better than sticking with an alternative which lacks clear-cut advantages
If anything, Fahrenheit should be less insane here, because at least its artificial zero is likely to stay much further away from the data being quantified, so the percentages stay reasonable.
I would still say that in the Rankine scale percentage increases make sense, and Fahrenheit changes do not.
The thing that matters isn't the slope, but the zero point; "X% farther from absolute zero" is a useful measurement, "X% farther from an arbitrary zero point" is not. Especially when negative or zero temperatures are involved.
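A tiny sketch of that point (Rankine is just Fahrenheit shifted so that 0 sits at absolute zero, i.e. °R = °F + 459.67):

```python
# The same 10-degree warming, but the "percent change" blows up or flips
# sign when the zero is arbitrary and the data sits near or below it.

def pct_change(start, end):
    return float("inf") if start == 0 else (end - start) / start * 100

for start_f in (-5.0, 0.0, 5.0, 50.0):
    end_f = start_f + 10.0                             # identical physical warming
    start_r, end_r = start_f + 459.67, end_f + 459.67  # Rankine equivalents
    print(f"{start_f:5.1f} °F -> {end_f:5.1f} °F: "
          f"Fahrenheit {pct_change(start_f, end_f):+9.1f}%, "
          f"Rankine {pct_change(start_r, end_r):+5.1f}%")
```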
Kelvin is refined measurements used to relate to a wider scale of temperatures. Celsius is a metric human scale subset of Kelvin.
Edit: this comment was deeply stupid for obvious reasons and I regret trying to interact with other people when I should be asleep.
The equivalent would be saying that going from 600 K to 700 K was a 100% increase... compared to 500 K.
It's not completely meaningless, to be fair. Saying 10°C to 20°C is a 100% increase has the meaning of "it's twice as far from freezing", which isn't totally meaningless (kind of like saying Everest is twice as high as Mont Blanc, which really means "its summit is twice as far from sea level").
If your "whatever" target instead was 50k, is the argument that going from 100k to 200k would be 400%?
If the argument is that there's something special about 100% corresponding to a quantity of 100, then... no? I don't really know where to go from there; what I said still holds with a 100k target, but I'm not going to be able to give 'another' example where the quantity of 100 is meaningful, because it isn't for degrees either. It's the freezing point at 0 that makes it work better for centigrade than Fahrenheit, imo.
== The Victoria Line average temperature in August last year was 60% higher in temperature than the average external temperature that month, measured at 19.5 degrees. ==
Certainly for January it must have been hundreds of percent higher.
And what would the numbers be for e.g., the Moscow metro in winter months where the average outside temperature is negative?
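To put rough numbers on that (only the ~19.5 °C / ~60% August pair comes from the quoted figure; the January and Moscow values are invented for illustration):

```python
# Hypothetical numbers to show how the "X% higher" framing degrades
# once the outside baseline gets small or goes negative.
cases = {
    "London, August (quoted)":   (19.5, 31.0),   # outside °C, inside °C
    "London, January (made up)": (5.0, 25.0),
    "Moscow, winter (made up)":  (-10.0, 18.0),
}

for name, (outside, inside) in cases.items():
    pct = (inside - outside) / outside * 100
    print(f"{name:28s}: inside is {pct:+7.1f}% 'higher' than outside")
```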
I apologise.
This is done when people rate the efficiency of home heating: SCOP is a function of the heat pump's ability to hit a particular temperature, for instance.
I'd guess the baseline temperature on the tube should be 21C maximum. Percentages don't make sense here, but 7C over the target temperature (for instance) is pretty bad in those terms. I'd be surprised if TfL hadn't set that somewhere.
Also worthy of note is that it sounds like the tube is a prime source of heat for a district heating system. Win win, perhaps.
If you go from freezing water to boiling, it's only 37% hotter!
It’s a huge increase, if not for the reasons they describe.
Everyone knows where the zero is in Celsius-using countries anyway, and days in the negative are so rare in the UK you can discount them (plus there are none inside the tube).
Conversely, the increase in the average annual temperatures across all Underground lines from 2013 to 2024 was merely *seven percent*, placing Victoria’s temperature rise vastly above that.
Using percentages to talk about changes in non-Kelvin temperatures is crazy.