It... isn't. That would change wildly depending on which sea/ocean you get your saltwater from (more salt = colder freezing point).
It really is defined relative to a very specific brine mixture, in the most scientifically generous origin story (some say he literally just measured the coldest winter day he could). Well, except it isn't anymore, because like all US customary units nowadays it's defined against metric units (namely the kelvin, just as 0°C is actually defined to be 273.15 K).
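For the curious, the modern chain of definitions fits in a few lines. A minimal sketch in Python (the constants are the defined ones, not measurements; the function names are just mine):

```python
# Kelvin is the SI base unit; Celsius is a fixed offset from it,
# and Fahrenheit is derived exactly from Celsius.
def kelvin_to_celsius(k: float) -> float:
    return k - 273.15            # 0 °C is defined as exactly 273.15 K

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32        # exact by definition, no brine involved

print(celsius_to_fahrenheit(kelvin_to_celsius(273.15)))  # 32.0, water freezes
print(celsius_to_fahrenheit(kelvin_to_celsius(373.15)))  # 212.0, water boils (roughly, at sea level)
```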
I live in the United States, and although I grew up here using Fahrenheit, I switched to Celsius almost 10 years ago. Part of my reason for switching was that the rest of the world uses Celsius, and every time someone mentioned a temperature I had no clue whether it was very hot or just right, so I kept having to convert. Since there were not many countries still using Fahrenheit, I switched. I still know what the comfortable range is in Fahrenheit, but now I also know it in Celsius, since I use it every day. Also, I no longer come across as an old curmudgeon who is resistant to using the system the rest of the world already uses.
I once heard it said that Fahrenheit's best feature is that you can go "0-100, from 'sheesh, that's really cold!' to 'oof, that's pretty hot!'", and yeah, while I was in the US, where most temperatures (RIP Florida) change all the time, that sure was convenient.
However, living in a country that always stays in the 80-100 °F range, the 'oh fuck, the water's freezing' to 'oh fuck, the heat death of the sun is upon us' range is a MUCH more useful scale for knowing whether we've been struck by some sort of apocalyptic event today.
The range for livable temperatures follows a more reasonable scale: hot is really high numbers, cold is low. Exact temperatures also come out more precise, because the same physical range is spread over more whole-number degrees.
Celsius is fine for scientists but for the regular person Fahrenheit has a better range.
Also I'm biased.
As someone who moved to the US later in life, I learned to use Fahrenheit because there's no way to talk to anyone about the weather or cooking otherwise.
If you need to do the same one day, don't bother trying to convert in your head. Just learn the numbers conversationally: familiarize yourself with how the weather feels alongside the number the weather app shows.
I can't convert at all but I can use both C and F in conversation because one rarely needs exact numbers anyway. You learn the ballparks pretty quick.
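If it helps, the ballparks are easy to generate once and keep handy. A throwaway sketch (the -10 to 40 range is just an assumption about what weather you'd ever care about):

```python
# Print a rough Celsius-to-Fahrenheit cheat sheet for everyday weather.
for c in range(-10, 41, 5):
    f = c * 9 / 5 + 32
    print(f"{c:>4} C ~ {f:>3.0f} F")
```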
See, that's the problem with these "Fahrenheit is more intuitive" arguments. They are catered to a very specific country with a very specific climate. For me, 25-30 °C is an average late spring day.
FWIW Fahrenheit has more precision for the temperatures you most commonly feel. Day-to-day you're likely to feel temps between 10-32°C (a span of 22 C degrees), which is 50-90°F (a span of 40 F degrees). It might not seem like a big deal, but I can tell a difference in my house when setting my thermostat from 68°F to 69°F; conversely, if I turn my thermostat to C mode, both values get rounded to 20.
But yes, as an American, I think of CPU temps in terms of C, I know water freezes at 0°C/32°F, I know water boils at 100°C but have never committed to memory what it is in F, and in chem classes we always use C/K.
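To put the thermostat point above in numbers, here's a quick sketch (it assumes the display truncates to whole degrees in C mode, which is what mine appears to do; yours may round differently):

```python
import math

def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

# One Fahrenheit step is only 5/9 of a Celsius degree.
print(f_to_c(69) - f_to_c(68))   # 0.555...

# With a display that truncates to whole degrees,
# 68 °F and 69 °F become indistinguishable in C mode.
print(math.floor(f_to_c(68)))    # 20
print(math.floor(f_to_c(69)))    # 20
```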
I like to refer to them as Freedom units and Communist units (in jest, obviously). I will say, though, that Fahrenheit feels like a more precise scale for measuring temperature even if the units are goofy.
I don't get the precision argument. It really doesn't matter for personal use because you wouldn't feel the difference anyway, and if you really needed it to be as precise as possible (for... I don't know, science) you'd use decimals. And if you're sciencing, you'd use the system that allows easy conversions, which is metric.
What additional arguments besides personal experience would you give to back this precision claim?
Temperature scales are arbitrary by nature, and the criteria behind their definitions can be more or less useful. Fahrenheit's aren't that useful compared to Celsius's or Kelvin's.
I'm not arguing on Fahrenheit's behalf or saying it IS more precise. I just said it "feels" more precise because you get finer increments in whole numbers: 70 degrees F is about 21 degrees C, while 90 degrees F is about 32 degrees C. That's 20 degrees of increment in F versus about 11 in C, which feels more precise. It's the same way metric length measurements feel more precise because there are whole-number millimeters rather than fractional inches.
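Spelled out, in case anyone wants to check the arithmetic (standard conversion formula, nothing exotic):

```python
def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

print(f_to_c(70))               # ~21.1
print(f_to_c(90))               # ~32.2
print(f_to_c(90) - f_to_c(70))  # ~11.1 C degrees spanning 20 F degrees
```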
I have no strong opinion one way or the other, other than I feel everyone should endeavor to be comfortable converting between the various systems of measurement.
F is kinda nice for weather as a 0-to-100 scale from feeling really cold to feeling really hot. But for anything scientific or calibration-related, C is great.
Disagree. Celsius is super helpful for determining if it's gonna snow or not, a key weather thing where I live. Humid and cold and below 0? Snow. Humid and cold and above 0? Rain or freezing rain.
Also helps with plants. Below 0? Frost.
I'd argue you can't get more intuitive than 0 is cold and below 0 is very cold. Celsius also plays nice with round numbers: every 5 or 10 degrees is a change in feeling. 0 is cold, 5 is cooler, 10 is cool, 15 is moderate, 20 is comfortable, 25 is warm, 30 is hot, 35+ is very hot. In steps of ten you get the big changes: 0 is frozen, 10 is cool, 20 is comfortable, 30 is hot. 32 being frozen just doesn't feel as intuitive.
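If you want that mapping made concrete, it's basically a step function. A toy sketch (the bands are just my own feel above, not any standard):

```python
# Map a Celsius temperature to the rough "feel" bands above.
def c_feel(c: float) -> str:
    bands = [(0, "cold"), (5, "cooler"), (10, "cool"), (15, "moderate"),
             (20, "comfortable"), (25, "warm"), (30, "hot"), (35, "very hot")]
    feel = "very cold"           # anything below 0
    for threshold, label in bands:
        if c >= threshold:
            feel = label
    return feel

print(c_feel(-5))   # very cold
print(c_feel(22))   # comfortable
```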
Fahrenheit is better for weather, and I'll fight anyone about it.
We use Celsius in the lab because it makes math easier, it's great.
But Fahrenheit is basically a 0-100 scale of how hot it is outside and that makes perfect sense for describing outside conditions relative to human sensory perception.