168 comments
  • I don't really care how many representations a number has, so long as those representations make sense. 2 = 02 = 2.0 = 1+1 = -1+3 = 8/4 = 2x/x. That's all fine, we can use the basic rules of decimal notation to understand the first three, basic arithmetic to understand the next three, and basic algebra for the last one.

    0.999... = 1 requires more advanced algebra in a pointed argument, or limits and infinite series to resolve, as well as disagreeing with the result of basic decimal notation. It's steeped in misdirection and illusion like a magic trick or a phishing email.
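    For readers who haven't seen them, the two arguments being contrasted here look roughly like this in standard notation (the usual textbook presentations, nothing specific to this thread):

    ```latex
    % The "pointed" algebraic argument:
    \begin{align*}
    x &= 0.999\ldots \\
    10x &= 9.999\ldots \\
    10x - x &= 9.999\ldots - 0.999\ldots = 9 \\
    9x &= 9 \quad\Longrightarrow\quad x = 1
    \end{align*}
    % The limits-and-series resolution: the notation 0.999... is shorthand
    % for a convergent geometric series, whose sum is exactly 1:
    \[
    0.999\ldots \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
    \;=\; \frac{9/10}{1 - 1/10} \;=\; 1.
    \]
    ```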

    I'm not blaming mathematicians for this, I am blaming teachers (and popular culture) for teaching that tools are inflexible, instead of the limits of those systems.

    In this whole thread, I have never disagreed with the math, only with how it is perceived as a system, yet I have several people arguing about the math with me. It's as if all math must be regarded as infinitely perfect, and any unbelievers must be cast onto the pyre of harsh correction. It's that dogmatic rejection I take issue with.

    • 0.999… = 1 requires more advanced algebra in a pointed argument,

      You're used to one but not the other. You've convinced yourself that because one is new or unfamiliar it is hard, while the rest is not. The rule I mentioned is certainly easier than 2x/x; that's actual algebra right there.

      It’s as if all math must be regarded as infinitely perfect, and any unbelievers must be cast out to the pyre of harsh correction

      Why, yes. I can totally see your point about decimal notation being awkward in places, though I doubt there's a notation that isn't awkward in some area or another, and decimal is good enough. We're also used to it, and familiarity plays a big role in whether something is judged convenient.

      On the other hand 0.9999... must be equal to 1. Because otherwise the system would be wrong: For the system to be acceptable, for it to be infinitely perfect in its consistency with everything else, it must work like that.

      And that's what everyone's saying when they're throwing "1/3 = 0.333.... now multiply both by three" at you: That 1 = 0.9999... is necessary. That it must be that way. And because it must be like that, it is like that. Because the integrity of the system trumps your own understanding of what the rules of decimal notation are, it trumps your maths teacher, it trumps all the Fields medallists. That integrity is primal, it's always semantics first, then figure out some syntax to support it (unless you're into substructural logics, different topic). It's why you see mathematicians use the term "abuse of notation" but never "abuse of semantics".
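      A minimal sketch of that "1/3 = 0.333..., now multiply by three" necessity argument, using Python's exact rational arithmetic (the function name `gap_after` is just for illustration, not from the thread):

      ```python
      from fractions import Fraction

      # Three thirds is exactly one in exact arithmetic: if 0.333... is to
      # denote 1/3, then 0.999... must denote 3 * (1/3) = 1, not "almost" 1.
      third = Fraction(1, 3)
      assert 3 * third == 1

      def gap_after(n: int) -> Fraction:
          """Return 1 - 0.99...9 (n nines) as an exact fraction."""
          nines = Fraction(10**n - 1, 10**n)  # e.g. n=3 -> 999/1000
          return 1 - nines

      # The gap after n nines is exactly 1/10^n; it shrinks below any
      # positive threshold, which is what "0.999... = 1" encodes as a limit.
      assert gap_after(3) == Fraction(1, 1000)
      assert all(gap_after(n) == Fraction(1, 10**n) for n in range(1, 20))
      ```

      The point of using `Fraction` rather than floats is that nothing here is rounded: the system's consistency, not numerical approximation, is what forces the identity.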
