
"The average temperatures on the Victoria line have risen by almost seven degrees since 2013 – nearly a *30%* increase.

Conversely, the increase in the average annual temperatures across all Underground lines from 2013 to 2024 was merely *seven percent*, placing Victoria’s temperature rise vastly above that."

Using percentages to talk about changes in non-Kelvin temperatures is crazy.


Yep. That 30% is a bad use of statistics.

28 degrees Celsius is not 30% warmer than 21 degrees Celsius. This same stat rendered in Fahrenheit would say 70 degrees -> 82 degrees, or about 17%. In kelvin it would be 294 -> 301, or about 2.4%.

Or we could invent a new measure indexed to Celsius but offset by 20 degrees, and declare a 1 -> 8 change, a whopping 700%.
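
To make that concrete, here's a quick Python sketch (my own, nothing from the article beyond the rounded 21C -> 28C figures) running the same 7-degree rise through each scale:

    # Percentage "increase" of the same 7-degree rise on different scales.
    def pct_change(old, new):
        return 100 * (new - old) / old

    c_old, c_new = 21.0, 28.0  # roughly the article's before/after, in Celsius
    scales = {
        "Celsius":      (c_old, c_new),
        "Fahrenheit":   (c_old * 9 / 5 + 32, c_new * 9 / 5 + 32),
        "kelvin":       (c_old + 273.15, c_new + 273.15),
        "offset-by-20": (c_old - 20, c_new - 20),  # the made-up scale above
    }
    for name, (old, new) in scales.items():
        print(f"{name}: {pct_change(old, new):.1f}%")
    # Celsius ~33%, Fahrenheit ~18% (the 17% above comes from rounding to
    # whole degrees F first), kelvin ~2.4%, offset-by-20 exactly 700%.

Same physical change, four different headlines.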

You see the opposite effect with reporting on the DJIA, where a 500 point move is treated like a big event even though it is not nearly as significant today as it was 30 years ago. They ignore the more relevant percentage change in favor of the more sensational representation.
That's not statistics as much as sensationalism. Every morning on WSJ at about 7am you'll see a 'live' update about how the markets and futures are in a 'selloff' or 'surging' even if the percentages are like 0.3% in either direction.
Even Kelvin is the wrong model. What we need is temperature, humidity and air speed to infer anything meaningful. ISO 7730 is even more precise. Any meaningful discussion should use the models from there.
But 28C or 82F is swelteringly hot in Britain, so it kinda makes sense even if it's incorrect.
Yeah. It's obviously incorrect in the sense that Celsius has no meaning as an absolute temperature scale, so it's not 30% more of anything, but in terms of colloquial meaning the average Brit probably does see it as ~30% further along an indoor temperature scale from "someone put the heating on please" to "crikey, it's sweltering in here"
This is not okay. If it makes sense to you, your sense of percentages is wrong.
Feynman was complaining about this error appearing in textbooks back in the sixties[0].

The trouble (of course) is that Celsius, properly speaking, is not a unit, but a "scale", or a "unit of difference" (equal to the kelvin), or even a torsor[1].

The trouble with the kelvin here is that if you see the 7 kelvin increase as a proportion of the 295K starting temperature then you only get roughly a 2% increase. Nobody is going to buy your newspaper if you're putting up weak numbers like that.

[0] https://mathematicalcrap.com/2024/03/05/the-feynman-story/
[1] https://math.ucr.edu/home/baez/torsors.html

To make matters worse, not all ranges and percentages on that scale are perceived equally, even when they are the same in absolute or relative terms. Humans have a narrow relevant "operational" temperature band. The 20 degrees between 10-30C feel like nothing compared to the 5 degrees between 37-42C.
Not to mention that wet bulb temperature, measuring the effect of humidity, is actually the most important measure in those temperature ranges.
Humidity in London is also awful as the temperature gets closer to freezing. I found the damp cold in London to be common over the year and truly horrid (a reason to never live there).

At home (Christchurch, NZ) we often get dry cold which can be pleasant: however when we do get the occasional vile damp cold I personally call it "London cold" because it made such an impression on me in my 20s.

Humidity in summer is just as bad. At 30 degrees C London feels stiflingly hot because of the humidity.
‘London cold’ can be found a bit north, in Auckland.

The only upside is that your lips don’t split in winter.

I haven't experienced that (but I've only spent months in Auckland or north of Auckland).

Prompted by your comment I asked my brother, and he agreed - something like he didn't like the Auckland 14° days that felt like 5°.

Your upper band of 30C is making this Englishman sweat just thinking about it. Which I think proves your point about the narrow operational temperature band even further.
You’re right in principle but that’s probably the worst example you could have given. So bad an example that I think it could easily be argued to disprove your point.
For those of us not living in Arabia or South Texas, the difference between 37 and 42 Celsius is indeed quite important. 37 feels pretty hot yet livable, while 42 (and even 40 for me) means that nothing non-urgent should bring me out of the house.
And, conversely, the five degree drop from, say, 3C to -2C means water can and will freeze, which is another massive change in livability.
Indeed. At 3C in London, the humidity seeps into every pore and settles into your bones. Riding a bike at 3C, unless you're wearing a balaclava and a ski mask, is an exercise in pure pain, as the wetness evaporating off your face has approximately the same effect on your facial nerves as being flayed.

On the other hand, -2C in London is crisp and invigorating and entirely preferable in every possible way.

I moved countries just so I can experience 20C weather instead of 30C weather. It is very noticeable haha
Where I live, the current daily temperature starts at 33 in the morning and hits forty-something midday, so I have very recent experience with those 5 degrees, and I completely disagree with your assessment.

The difference between 10 and 30 degrees C feels way, way bigger than 37 to 42.

One goes from straight up cold to quite hot, while the other just goes from very hot to even more very hot.

I bet you’re casually discounting the fact that you spend most of that time with 40+C temps in an air conditioned space.

At 42 degrees your body can’t cool down and this causes a lot of deaths or even the “casual” fainting described in the article due to hyperthermia. This goes way beyond your subjective feeling.

You're right in the sense that I rarely spend more than 10 minutes at either 37 or 42 degrees at a time.

I thought this thread was discussing the subjective feelings of temperature.

I think the point of the whole thread is that degrees - on the Celsius scale in particular, but on any scale in general - have different impacts at different points on the scale, especially within the range supporting human life. One degree can mean nothing or be the difference between being alive or dead, depending on where that degree sits on the scale and maybe on other compounding factors like humidity.

Talking about relative temperature differences without anchoring to an absolute reference is meaningless. Using percentages is even worse.

I'm with you: very hot to more very hot. I think the real issue in the fainting story is that he/she had to take their coat off. 42 is just not life threatening if one knows when to take one's coat off (and drink water).
42C is eventually life-threatening even naked if the humidity is high enough, especially if you're in the sun.

It just takes a long time for your body temperature to increase, thus you have a while to find a cooler spot.

Then just don't use percentages, and rely on people realizing that a 7 degree difference is big!
There are few or no human-scale situations where percentages of absolute temperature are meaningful: absolute zero is too far away, and we live in too narrow a range of temperatures. Unless you're in a scientific context, just don't use percentages on absolute temperature, only on rates.
People who complain about Fahrenheit vs. Celsius are correct to the degree (sorry) that the Celsius degree unit of difference is the standard in a lot of engineering calculations. But Celsius as a temperature scale is no more logical than Fahrenheit, which is arguably more practical for day to day use--and Kelvin is more likely required for a lot of engineering and chemical calculations anyway.
> But Celsius as a temperature scale is no more logical than Fahrenheit

Celsius is more logical:

(1) the endpoints of Celsius are the boiling and melting points of water (at standard atmospheric pressure). The lower endpoint of Fahrenheit is the lowest temperature Fahrenheit could achieve using a mixture of water, ice and ammonium chloride. Using the freezing point of pure water is more logical than using the freezing point of an ammonium chloride solution: water is fundamental to all known life, while ammonium chloride solutions don't have the same significance (and why ammonium chloride instead of sodium chloride or potassium chloride? Of the salts readily available to Fahrenheit, the ammonium chloride solution had the lowest freezing point)

(2) Fahrenheit initially put 90 degrees between his two “endpoints” (ammonium chloride solution freezing point and human body temperature), then he increased it to 96. Celsius having 100 degrees between its endpoints is more logical than 90 or 96

(3) while for both Celsius and Fahrenheit there is error in the definition of their endpoints (the nominal values differ from the real values, because our ability to measure these things accurately was less developed when each scale was originally devised, and some unintentional error crept in), the magnitude of that error is smaller for Celsius than for Fahrenheit

(4) nowadays, all temperature units are officially defined in terms of Kelvin, and Celsius has a simpler relation to Kelvin than Fahrenheit (purely additive versus requiring both addition and multiplication; the exact formulas are below this list)

(5) Celsius is the global standard for everyday (non-scientific) applications, not Fahrenheit, and it is more logical to use the global standard than a rarely used alternative whose advantages are highly debatable at best
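
(For concreteness, the relations behind point (4): °C = K - 273.15, while °F = K × 9/5 - 459.67, so Fahrenheit needs a multiplication as well as an offset.)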

> which is arguably more practical for day to day use

No, it is not. Americans say this only because they're used to it. The common arguments are that it is more precise, and that 'you see temperatures from zero to one hundred degrees Fahrenheit throughout the year'.

Firstly, the problem with Fahrenheit is that its realisation is inaccurate by modern standards—which is why every single non-SI unit is now just an exact multiple of a corresponding SI unit with the same dimensions; the mediaeval and early modern definitions having been entirely forgotten. A bowl of salted ice and his own armpit? Truly an 18th-century invention.

Next, the extra precision that a difference of one degree Fahrenheit gives you is frankly useless. Within a single room one can experience a difference of five degrees Celsius or more, depending on what's in the room—a radiator or air conditioner running, or the gas stove in a kitchen, or a high-end PC. Forget rooms. On the human body itself there can be a two to three degree Celsius difference between the extremities and the thorax/torso/head. Any field that requires extreme precision will naturally end up using SI units, so kelvin (or some related scientific unit). (Excluding the absolutely crazy bunch of American machinists who like using thousandths and ten-thousandths of an inch—at this point the joke writes itself).

As for climates, there are places that see very little difference in temperature, and definitely not the 'nice 0 – 100' range that Americans claim. Even in the US there are places like southern Louisiana and Florida that have borderline tropical climates, and don't really go below ~15 °C or above 35 °C.

None of this is really logical either; it all ends up being a manifestation of familiarity.

I'm not generally a Fahrenheit defender, but I think it's silly to deny the user-friendliness of having more integer values in the day-to-day temperature range, without going too far out of the two-digit measurement range. It lets you have a little more precision without being much effort to do casual math on. Milli-kelvin are far too small, a single kelvin is too big a perceptual range, and decimals are too annoying when we just want to talk about the weather.
I’m legitimately surprised by this idea - surely people in countries that use Fahrenheit don’t go around saying things like “Oh I thought it was going to be 54 degrees but it’s actually 55, so much different!”

I’ve grown up with Celsius and never felt the need to use decimals in day to day weather discussion… Many air conditioners let you go up by half a degree C and that’s more than enough precision, more than I’ve ever felt was necessary in everyday conversation.

> As for climates, there are places that see very little difference in temperature, and definitely not the 'nice 0 – 100' range that Americans claim.

That's not really the point.

0° F: It's cold outside. 100° F: It's hot outside.

0° C: It's cold outside but not really that cold. 100° C: Dead.

0 K: Dead. 100 K: Dead.

These things are the case for humans regardless of whether you live in a place that actually gets cold or hot outside.

It literally doesn’t matter at all what’s 0 and 100 though. Honestly if you’re familiar with one system, the scale feels intuitive to you, and if you’re familiar with the other then the other one does.

Like people tell me that the US customary system is “more human scale and intuitive” but I literally cannot picture, say, 15 inches or ten feet - it just means nothing to me unless I mentally convert to centimetres or meters.

So much of these arguments boil down to “I grew up with this system so I can intuitively use it, so it must be superior”

> I grew up with this system so I can intuitively use it, so it must be superior

This is essentially every American argument for USC or Imperial units. In fact, there are actually legitimate reasons why some legacy units are superior—for instance the duodecimal or sexagesimal systems which have many more factors than the decimal. But every other argument is a variation of 'it's better because I know it better'.

0°C tells you the very practical information that it's freezing outside and that you must be careful, or you can expect snow. In Fahrenheit you have to know the value (32°F) by heart.
250K: quite cold.

300K: beach weather.

350K: you're distilling your own moonshine, right?

(worksforme)

> 0° C: It's cold outside but not really that cold.

> These things are the case for humans

Who says so?

0 °C is very cold by many people's standards. About half the human population lives within the tropics. In fact I'd like to see Americans walk around in the UK wearing just a T-shirt and bermudas when it's barely above freezing, and insist 'it's not really that cold, it's only 32 °F'.

It would definitely be crazy in Fahrenheit, but in centigrade I think it makes some sort of intuitive (if not scientific) sense. (Together with the sea-level assumption we always make in casual temperature discussion anyway.)
It makes just as much intuitive sense in Fahrenheit as it does in centigrade.
Maybe because I was brought up with centigrade it makes more sense to me. The centigrade number is how far you are from water freezing. If it goes up 100% then you are twice as far away. I'm not aware that doubling the Fahrenheit number has a similarly easy-to-understand meaning?
> The centigrade number is how far you are from water freezing

The Fahrenheit scale is how far you are from your own body temperature. It was designed so that 100 is the temperature of a human. (Adjusted later to 98.6 due to inaccuracies.)

0 was designed to be as cold as you can get with ice and salt (also ended up being slightly inaccurate).

> Maybe because I was brought up with centigrade it makes more sense to me.

Yup. People brought up on Fahrenheit think it is superior. For temperature, neither argument is objectively better. (In contrast to imperial distance measurement, with its non-powers of 10 and fractions, where there are good arguments against it; with temperature both scales are ultimately arbitrary.)

> For temperature neither argument is objectively better.

I think Celsius is objectively better in that:

(1) its endpoints (the freezing and boiling points of water) are more natural, less arbitrary and more fundamental than Fahrenheit's (the coldest temperature you can reach with salt and ice, up to average human body temperature). Water is a fundamental substance to all known life; the freezing point of pure water is much more fundamental than the freezing point of a water + NaCl mixture (actually, apparently Fahrenheit used ammonium chloride, not sodium chloride, which is arguably even more arbitrary than sodium chloride would be). If you imagine some extraterrestrial civilisation independently inventing a temperature scale, they'd be more likely to come up with something close to Celsius than something close to Fahrenheit

(2) while both scales contain some error in that the nominal value of their endpoints differs from the real value, the error is greater for Fahrenheit

(3) according to Wikipedia, Fahrenheit didn't have 100 degrees between his two endpoints: he originally had 90, then increased it to 96. Given that base 10 is the standard number base used by humans, 100 divisions is less arbitrary than 90 or 96

(4) nowadays, all other temperature scales are officially defined in terms of Kelvin – and Celsius has a simpler relationship to Kelvin than Fahrenheit does (for Celsius it is purely an additive offset, for Fahrenheit it involves both addition and multiplication)

(5) conforming to the global standard is objectively better than sticking with an alternative which lacks clear-cut advantages

A Fahrenheit degree is about half the size of a Celsius degree, which gives finer increments without decimals or fractions and which, having grown up with it, seems useful. Yeah, I kinda have to remember that 32 degrees is freezing, but that doesn't seem a huge cognitive load. Knowing sub-zero is fricking cold is a decent benchmark, as is knowing that >100 degrees is fricking hot. Yeah, 212 degrees for boiling is a bit weird, but you don't really need that much, and that's only at standard pressure anyway.
Reading this quote made me finally realize why the name centigrade exists. It’s a gradient scale of 100.
Reading this comment about the previous quote made me finally realize why the name centigrade exists.
Why? The slope of the Fahrenheit scale is different from that of the Celsius and Kelvin scales, but the slopes of the latter two do match.
The slope of the scales has no bearing on whether percentages are meaningful here. The problem with both systems when it comes to percentages is that neither has its 0 set to a natural zero. This gives you an entirely arbitrary point on the scale near which percentage changes blow up, and across which they flip sign.

If anything, Fahrenheit should be less insane, because at least the artificial 0 is likely to stay much further away from the data being quantified, so the percentages stay reasonable.

Ah right, okay, that makes sense.
The slope of the Fahrenheit scale matches that of the Rankine scale.

I would still say that in the Rankine scale percentage increases make sense, and Fahrenheit changes do not.

The thing that matters isn't the slope, but the zero point; "X% farther from absolute zero" is a useful measurement, "X% farther from an arbitrary zero point" is not. Especially when negative or zero temperatures are involved.

Ok. Then please explain by what % the temperature rises when going from 0 Celsius to 5 Celsius!
Or -1 to 1 Celsius, for that matter.
Obviously it's -200%, which means that going from -1C to 1C is a drastic decrease in warmness!
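(For anyone checking: with the usual formula (new - old) / old, the -1C to 1C jump gives (1 - (-1)) / (-1) = -200%, and 0C to 5C divides by zero. That's what an arbitrary zero does to percentage changes.)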
Early measurements were done by individuals and they were idiosyncratic to the process of discovery/calibration.

Kelvin is the refined measurement used to relate to a wider range of temperatures. Celsius is a metric, human-scale subset of Kelvin.

In both it makes a sort of intuitive sense. 7% of the way from freezing to boiling is a meaningful way to visualise temperature; 7% of the way from ice melting in a bath of salt to slightly above Mrs Fahrenheit's armpit temperature is also meaningful, although perhaps a little idiosyncratic.

Edit: this comment was deeply stupid for obvious reasons and I regret trying to interact with other people when I should be asleep.

The issue is that a percentage of a Celsius value is not that. For example, an increase from 1°C to 2°C is a "100% increase", but is only 1 percentage point of the way from freezing to boiling.
You could say things like that with anything in percentages? A 100% increase in your pension from 100k to 200k is only a 10 percentage point move (from 10% to 20%) towards your target of 1M, or whatever.
100k to 200k is a 100% increase in absolute terms, but a 10 percentage point increase relative to your target of 1M. The difference between the example you give and the one in the article is that 0 in the case of your pension meaningfully refers to its emptiness, but in the case of Celsius it has no "emptiness" interpretation.

The equivalent would be saying that going from 600k to 700k was a 100% increase... compared to 500k.

It's not completely meaningless, to be fair. Saying 10°C to 20°C is a 100% increase has the meaning of "it's twice as far from freezing", which isn't totally meaningless (kind of like saying Everest is twice as high as Mont Blanc, which really means "its summit is twice as far from sea level").

But in your example, the 10% has nothing at all to do with the increase of 100%.

If your ”whatever” target instead was 50k, is the argument that going from 100k to 200k would be 400%?

Yes? I didn't see any link in the comment I replied to between 100% and 100 degrees besides it happening to be the same number - I already changed that to 1M, and changing it to 50k makes no difference either.

If the argument was that there's something special about 100% being a quantity of 100, then... no? I don't really know where to go from there. What I said still holds with a 100k target, but I'm not going to be able to give 'another' example where the quantity 100 is meaningful, because it isn't for degrees either. It's the freezing point at 0 that makes it work better for centigrade than Fahrenheit, imo.

And they manage to make it even more crazy by also comparing it to average external temperatures.

== The Victoria Line average temperature in August last year was 60% higher in temperature than the average external temperature that month, measured at 19.5 degrees. ==

Certainly for January it must have been hundreds of percent higher.

And what would the numbers be for e.g., the Moscow metro in winter months where the average outside temperature is negative?

Yikes. I posted this, and I missed that, something I realised early in my first-year physics degree lab. I learnt more there than just dipping calculators in liquid nitrogen for fun.

I apologise.

Maybe the right thing to do is to measure from ideal room temperature? Measuring from zero doesn't make any sense, but setting an anchor temperature does.

This is done when people rate the efficiency of home heating: SCOP is a function of the heat pump's ability to hit a particular temperature, for instance.

I'd guess the baseline temperature on the tube should be 21C maximum. Percentages don't make sense here, but 7C over the target temperature (for instance) is pretty bad in those terms. I'd be surprised if TfL hadn't set that somewhere.

Also worthy of note is that it sounds like the tube is a prime source of heat for a district heating system. Win win, perhaps.

And only in specific contexts with Kelvin.

If you go from freezing water to boiling, it's only 37% hotter!
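
(Checking: 273.15 K -> 373.15 K is an increase of 100 / 273.15 ≈ 36.6%, so roughly 37%.)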

I don’t know if I’m worried about it. While the measurement makes little scientific sense it makes intuitive sense, and, importantly, the intuitive implications are the scientific implications.

It’s a huge increase, if not for the reasons they describe.

So how many decibels louder is it now?
Is it? I think it puts the Victoria rise in perspective against the other lines quite effectively.

Everyone knows where the zero is in Celsius-using countries anyway, and days in the negative are so rare in the UK you can discount them (plus there are none inside the tube).

I came here to say this. It rankled me no end. Good to see that this is the top comment because we’re a scientifically literate crowd.
I logged in just to give this an upvote :-)
