You only get fire risks when the things they touch are themselves tiny (like dust), so they're unable to absorb and spread the heat.
A similar thing happens when you bake with tinfoil. The foil will be at like 350 F, but you can still touch it basically immediately if you're willing to gamble that nothing with thermal mass is stuck to it where you can't see. It just doesn't have enough thermal mass on its own to burn you, but if there's a good-sized glob of cheese or water or something on the other side you can really be in for a nasty surprise.
"The thermal conductivity of aluminum is 237 W/mK, and that of tin is only 66.6 W/mK, so the thermal conductivity of aluminum foil is much better than that of tin foil. Due to its high thermal conductivity, aluminum foil is often used in cooking, for example, to wrap food to promote even heating and grilling, and to make heat sinks to facilitate rapid heat conduction and cooling."
You're not weaponizing Gell-Mann amnesia against us are you?
Both because you probably shouldn't breathe that shit in, and also magnesium and titanium dust are very enthusiastic about combusting. Everyone knows about magnesium but nobody knows titanium is almost as surly.
Iron dust too. Make sure to keep it away from your pre-lit candles:
Almost ANY small particle in a light-density air suspension (dust cloud) will ignite. Certainly anything that oxidizes is prone to going WHOOF! around flames.
This includes non-dairy creamers, paint spray, insecticide sprays (canned or pumped), and sawdust tossed over a fire.
My next band will be named Velveeta Disfigurement. The stuff never unmelts.
I'm sure that would lead to other issues (ejecting mass would push you around, though you could always eject it opposite to the direction you want to go, which is how spaceships work in the first place), but what if you had super-cooled ice in a thermos-like enclosure? As you needed to cool down, you'd pull some out, let it melt, vaporize it, superheat the steam, then vent that out the back.
I'm not sure you can practically superheat the ballast without just causing more heat that you have to deal with. Maybe a heat pump works? Something about that feels vaguely wrong.
Then there's the fact that heat is very difficult to get rid of in space. The ISS carries huge dedicated radiators alongside its solar panels. If you wanted a very-long-EVA spacesuit you'd have to have radiators much bigger than your body hanging off of it. Short EVAs are handled by starting the EVA with cold liquids in the suit and letting them heat up.
All of the mockups of starships going to Mars mostly fail to represent where they're going to put the radiators to get rid of all the excess heat.
I was curious about this! The Extravehicular Mobility Units on the ISS have 8 hours of life support running on 1.42 kg of LiOH. That releases ~2 kJ per gram used, so roughly 99 watts averaged over the EVA.
The 390 Wh battery puts out an average of 50 watts.
And the human is putting out at minimum 100 watts with bursts of 200+.
Long term it's probably reasonable to need at least 200 watts of heat rejection. That's about a square meter of typical radiator, but it needs to face away from the station. You could put zones on the front and back and swap them depending on direction, as long as you aren't inside an enclosed but evacuated area, like between the Hubble and the Shuttle. The human body has a surface area of roughly 2 m^2, so it's definitely not enough to handle it: half of that area is on your arms or between your legs and will just radiate onto itself.
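Back-of-envelope on that square meter (my numbers, not from any suit spec): assume a radiator near skin temperature with one face to deep space and emissivity ~0.9:

```python
# Rough check of the "~1 m^2 per 200 W" radiator claim.
# Assumptions are mine: radiator at ~300 K, emissivity 0.9,
# one radiating face, deep space as a ~3 K sink.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(power_w, t_rad=300.0, t_sink=3.0, emissivity=0.9):
    """Area needed to reject power_w by thermal radiation from one face."""
    flux = emissivity * SIGMA * (t_rad**4 - t_sink**4)  # W/m^2
    return power_w / flux

area = radiator_area(200)
print(f"{area:.2f} m^2")  # ~0.48 m^2 under ideal conditions
```

Sun load and view-factor losses would push that up toward a full square meter in practice.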
It's also not very feasible to have a sail-sized radiator floating around you. You'd definitely need a more effective radiator- something that absorbs all your heat and glows red hot to dump all that energy.
EDIT: Apparently the Apollo suits did this. An interesting detail is that they used sublimation (ice turning directly into vapor), because I suppose that's a lot more practical for exchanging the heat.
That also makes nuclear totally infeasible- since turbines are inefficient you'd need 2.5x as many radiators to reject waste heat. Solar would be much lighter.
https://en.wikipedia.org/wiki/Spacecraft_thermal_control#Rad...
(How hot? I won't quote a number, but space nuclear reactors are generally engineered around molten metals).
The S6W reactor in the Seawolf submarines runs at ~300 C and produces 177 MW of waste heat for 43 MWe. If the radiators are 12 kg/m^2 and reject 16x as much heat (call it 3600 W/m^2) then you can produce 875 watts of electricity per m^2, and 290 watts at the same weight as the solar panels. Water coolant at 300 C also needs to be pressurized to 2000+ PSI, which would require a much heavier radiator. Add the weight of the reactor, shielding, turbines, and coolant, and it's very hard to believe it could ever beat solar panels, but it isn't infeasible.
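Sanity-checking those numbers (my arithmetic, using the figures above: 43 MWe out, 177 MW of waste heat, and T^4 scaling of radiated flux):

```python
# Radiated flux scales as T^4 (Stefan-Boltzmann), so hotter radiators
# reject far more heat per unit area. Reference temperatures are my guesses.

def flux_ratio(t_hot_c, t_ref_c):
    """How much more heat a radiator rejects at t_hot vs t_ref (T^4 scaling)."""
    return ((t_hot_c + 273.15) / (t_ref_c + 273.15)) ** 4

# ~300 C coolant vs a ~15 C reference radiator: roughly the "16x" above
print(flux_ratio(300, 15))   # ~15.7

# Each m^2 rejecting 3600 W of waste heat supports this much electricity:
electric_per_m2 = 3600 * 43 / 177
print(electric_per_m2)       # ~875 W/m^2

# Liquid metal at ~600 C vs 300 C: the "5x" figure mentioned below
print(flux_ratio(600, 300))  # ~5.4
```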
Plus, liquid metal reactors can run at ~600 C and reject 5x as much heat per unit area. They have their own problems: it would be extremely difficult to re-liquify a lead-bismuth mix if the reactor is ever shut off. I'm also not particularly convinced that radiators running at higher temperatures wouldn't be far heavier, but for a sufficiently large station it would be an obvious choice.
Also, radiated heat from the Sun won't have much effect if the radiator panels are edge-on to the Sun, with both faces pointing out to deep space to radiate away the heat.
I know it is much hotter, but that's way way hotter and they only find it at a "wall" way farther out.
This is more the temperature of the solar wind, which dwarfs the steady-state temperature you'd reach from photonic solar radiation at any distance. The Sun's blackbody spectrum corresponds to something like 5000-7000 K, and you won't see objects in the solar system passively heated above that, even near the Sun's surface with full reflectors filling the rest of the field of view with more Sun. The only extra comes from a tiny amount of stellar wind, tidal friction, or nuclear radiation from the object's own material, I don't think.
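The "at any distance" part follows from the standard radiative-balance formula for a blackbody sphere at distance d from the Sun (my back-of-envelope, standard constants):

```python
import math

# Equilibrium temperature of a blackbody sphere heated by the Sun:
# absorbs on its cross-section disk, emits from its whole surface.
T_SUN = 5772.0      # K, effective solar surface temperature
R_SUN = 6.957e8     # m, solar radius
AU = 1.496e11       # m, Earth-Sun distance

def t_equilibrium(d_m):
    """Blackbody equilibrium temperature at distance d_m from the Sun."""
    return T_SUN * math.sqrt(R_SUN / (2.0 * d_m))

print(t_equilibrium(AU))     # ~278 K at Earth's distance
print(t_equilibrium(R_SUN))  # ~4080 K skimming the photosphere, still below T_SUN
```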
Yes! The tiny number of particles are moving really fast, but there are very few of them. We are talking about vacuum that is less than 10^-17 torr. A thermos is about 10^-4 torr. The LHC only gets down to 10^-10 torr. At those pressures you can lower the temperature of a kilometer cube by 10 thousand kelvin by raising the temperature of a cubic centimeter of water by 1 kelvin. There is very little thermal mass in such a vacuum which is why temperature can swing to such wild levels.
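Rough check of that kilometer-cube claim (my arithmetic, treating the residual gas as ideal):

```python
# Total translational kinetic energy of an ideal gas is (3/2) P V,
# independent of how hot the individual particles are.

K_B = 1.380649e-23      # Boltzmann constant, J/K
TORR = 133.322          # Pa per torr

p = 1e-17 * TORR        # deep-space pressure from the comment, in Pa
v = 1000.0 ** 3         # one cubic kilometer, in m^3

gas_energy = 1.5 * p * v
print(gas_energy)        # ~2e-6 J in the entire cubic kilometer

water_energy = 4.186     # J to warm 1 cm^3 (1 g) of water by 1 K
print(water_energy / gas_energy)  # water side wins by ~2 million
```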
This is also why spacecraft have to reject heat purely by radiation. Typically you heat up a panel with a lot of surface area using a pumped coolant loop and dump the energy into space as infrared. Some cooling paints on roofing do this at night, which is kind of neat.
Suits are insulating for a reason. You want to prevent heating on the sun side and prevent too much cooling on the space side. Your body is essentially encapsulated in a giant thermos.
Cooling is achieved using a recirculating cold-water system that is good for a few hours of body heat. The water is initially chilled by the spacecraft's primary life support system before an EVA, so it starts off cold and slowly comes up to body temperature over time. Recent designs use evaporative cooling to re-cool the water.
Life support systems are so cool.
Temperature is just the heat of particles moving. In the extreme case of a handful of N2 molecules moving at 1% the speed of light, they have a temperature of something like 9 billion Kelvin. But they're not going to heat you up if they hit you.
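That figure checks out to within rounding (my arithmetic, treating the speed as an rms thermal speed, so (3/2) k T = (1/2) m v^2):

```python
# Temperature of N2 molecules moving at 1% of the speed of light,
# from equipartition: T = m v^2 / (3 k).

K_B = 1.380649e-23          # Boltzmann constant, J/K
M_N2 = 28 * 1.66054e-27     # kg, mass of one N2 molecule
C = 2.998e8                 # m/s, speed of light

v = 0.01 * C
temperature = M_N2 * v**2 / (3 * K_B)
print(f"{temperature:.2e} K")  # ~1e10 K, same ballpark as the 9 billion above
```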
I didn't like the Avatar films except for the starships, which are among the more physically realistic in construction including massive radiators. They'd probably need to be even bigger though IRL if you're talking about something loony like an antimatter rocket.
I think you’re missing the key point - heat transfer. The reason we feel hot at the beach is not solely because of heat we absorb directly from solar energy. Some of the heat we feel is the lack of cooling because the surrounding air is warm, and our bodies cannot reject heat into it as easily as we can into air that is cool. And some is from heat reflecting up from the sand.
There's a heat wave across much of the US right now. Even when the sun goes down it will still be hot. People will still be sweating, doing nothing, sitting on their porches, because the air and the surrounding environment have absorbed the sun's heat all day and are storing it.
That’s what you’re neglecting in your analysis of space.
https://en.wikipedia.org/wiki/Atmospheric_window
https://en.wikipedia.org/wiki/Passive_daytime_radiative_cool...
for PDRC there are a couple good videos about it from NightHawkInLight https://youtu.be/N3bJnKmeNJY?t=19s, https://youtu.be/KDRnEm-B3AI and Tech Ingredients https://www.youtube.com/watch?v=5zW9_ztTiw8 https://www.youtube.com/watch?v=dNs_kNilSjk
> An absorption refrigerator is a refrigerator that uses a heat source to provide the energy needed to drive the cooling process. Solar energy, burning a fossil fuel, waste heat from factories, and district heating systems are examples of heat sources that can be used. An absorption refrigerator uses two coolants: the first coolant performs evaporative cooling and then is absorbed into the second coolant; heat is needed to reset the two coolants to their initial states.
https://www.scientificamerican.com/article/solar-refrigerati...
> Fishermen in the village of Maruata, which is located on the Mexican Pacific coast 18 degrees north of the equator, have no electricity. But for the past 16 years they have been able to store their fish on ice: Seven ice makers, powered by nothing but the scorching sun, churn out a half ton of ice every day.
There is no physical process that turns energy into cold. All "cooling" processes are just a way of extracting heat from a closed space and rejecting it to a different space. You cannot destroy heat, only move it. That's fundamental to the universe. You cannot destroy energy, only transform it.
Neither link is a rebuttal of that. An absorption refrigerator still has to reject the pumped heat somewhere else. Those people making ice with solar energy are still rejecting at minimum the ~334 kJ/kg heat of fusion to the environment.
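To put a number on the Maruata example (my estimate, taking "half a ton" as 500 kg/day and counting only the latent heat of fusion, ignoring inefficiency and pre-chilling):

```python
# Minimum continuous heat rejection needed to freeze 500 kg of water per day.

LATENT_FUSION = 334e3    # J/kg, latent heat of fusion of water
mass_per_day = 500.0     # kg ("half a ton" of ice)
seconds_per_day = 86400.0

min_rejection_w = mass_per_day * LATENT_FUSION / seconds_per_day
print(f"{min_rejection_w:.0f} W")  # ~1900 W continuous, as an absolute floor
```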
An absorption refrigerator isn't called that because it absorbs heat; it's called that because you're taking advantage of the energy configurations that arise when one working fluid absorbs the other. The act of pumping heat is the same.
Then there are things like fusion reactors where the temperature is in the millions of degrees and the whole point of the design is to keep the heat in.
Edit: although interestingly in an electric arc, often the electrons have a higher kinetic energy (temperature) than the heavier ions and atoms in the plasma. It's a highly non-equilibrium situation. That plays into your "high temperature, slow transfer" thing quite nicely: even the atoms within the plasma don't reach the full temperature of the electrons.
If it were really that hot we'd never observe the CMB at a balmy 2.7K.
The Parker Solar Probe encounters a similar situation: it has to handle high amounts of direct radiation, but the ambient environment is full of incredibly hot particles at very low density, which means it isn't that hard to make the probe survive it.
Temperature, it would seem, is an idea that would only have developed at the bottom of a gravity well.