Fever is not 100F. A fever is defined as 100.4F. Why 100.4, when 100 is a much easier number to remember and work with? Because fever in humans is defined as 38C, and that converts to 100.4F.
That’s a sigfig error. A fever is 38C, which is 2 significant figures. Converting to 100°F goes up an order of magnitude, so you get a free sigfig, but unless the original number was 38.0C you don’t get that 0.4; you’re implying precision that the original measurement never gave you.
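For anyone who wants to see the arithmetic spelled out, here’s a minimal Python sketch (the function names are just mine, not from any standard): it does the exact 38°C → °F conversion, then rounds the result back to the two significant figures the Celsius definition actually carries.

```python
from math import floor, log10


def c_to_f(celsius: float) -> float:
    """Exact Celsius -> Fahrenheit conversion."""
    return celsius * 9 / 5 + 32


def round_sigfigs(value: float, sigfigs: int) -> float:
    """Round a value to the given number of significant figures."""
    if value == 0:
        return 0.0
    exponent = floor(log10(abs(value)))
    return round(value, sigfigs - 1 - exponent)


fever_c = 38                            # the Celsius definition: 2 sigfigs
exact_f = c_to_f(fever_c)               # 100.4 -- the "extra" digit
rounded_f = round_sigfigs(exact_f, 2)   # 100.0 -- the 0.4 isn't supported

print(exact_f, rounded_f)  # 100.4 100.0
```

Run it and you get 100.4 from the exact conversion, but 100.0 once you keep only the precision the original 38C gave you, which is the whole point being made above.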
Who defines it like that? I’m asking because I wouldn’t be surprised if the definition differs between orgs
It’s actually an irrational number, but for most purposes 100.4159F is a perfectly reasonable approximation.