User:Larryv/0.999...
An organized account of the argument I had over whether 0.999… did indeed equal 1. Here are some other "discussions."
An article that debunks all false proofs that 0.999... = 1
https://www.filesanywhere.com/fs/v.aspx?v=8b69658b5d62707cb3a5
Scroll down and click on
Proof that 0.999 not equal 1.pdf
0.999... ≠ 1 (??)
Look, an integer is not a decimal. A decimal is not an integer. Mathematic proofs are irrelevant and irrespective to reality. 0.999... will always be 0.000...1 off from 1. Your revert of my edit marks you as pretentious, pedantic, and altogether unpleasant. In short, there is a good chance no one but those who hate you will come to your funeral, and many, many people will attend it.--ttogreh 23:54, 25 October 2006 (UTC)
- If mathematics isn't representative of reality, what is? And further: A decimal need not have a fractional part. A decimal number is any number in the base-10 system. It may or may not have a fractional part; integers are perfectly acceptable decimals. Such "integers," in other bases, are not decimals (e.g., 5 in hex). In addition, numbers like 5.12 in other bases are not decimals, either. Larry V (talk | contribs) 00:02, 26 October 2006 (UTC)
Reality is representative of reality. Everything else is an abstraction; an allegory meant to aid in our movement through reality. Integers, decimals, binary, octal, base 10, hex... none of these truly exist. They are thought constructs meant to help real problems get real solutions. As such, the intellectual concept of an integer is not the same as the intellectual concept of a decimal. Do you understand? Mathematics does not dictate reality; reality dictates mathematics. --ttogreh 00:10, 26 October 2006 (UTC)
- ttogreh: I can only assume that you are actually just using your account to troll, and aren't thinking clearly about what you are saying. 1/9 = .11111..., 8/9 = .88888.... 1/9 + 8/9 = 9/9 = 1. We know that this final statement is true for any reasonable definition of addition of fractions. That said, for .9999... not to equal 1 would have to be a failure of decimal notation itself. -- mmt
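mmt's fraction arithmetic can be checked mechanically with exact rationals. A minimal Python sketch (the language and the `fractions` module are my choices for illustration, not part of the original thread):

```python
from fractions import Fraction

# Exact rational arithmetic -- no decimal notation involved at all.
ninth = Fraction(1, 9)          # the quantity usually written 0.111...
eight_ninths = Fraction(8, 9)   # the quantity usually written 0.888...

total = ninth + eight_ninths    # 9/9
assert total == 1
```

Because `Fraction` never rounds, the sum is exactly 1; the decimal strings are only one way of writing the same quantities.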
I am not a troll. I truly believe what I say to be true, and as such, I find it very hard to take your pedantry seriously. However, that said; you are making my argument for me. All recursive decimals from 0.111... to 0.999... are failures. They are an exposure of a human flaw in abstract thought. 1/9 != 0.111... because recursive decimals do not end, while integers do. "Might as well be" is not "exactly equal", and it will always be this way. I sincerely do not understand why you cannot concede this.--ttogreh 22:05, 26 October 2006 (UTC)
- Okay, let's go:
- Look, an integer is not a decimal. A decimal is not an integer.
- The set of rational integers consists of the positive natural numbers (1, 2, 3, …), their negatives (-1, -2, -3, …), and zero. A decimal number is any number expressed in the base-10 system (i.e., as the sum of multiples of powers of ten). Now, I take it that you are using "decimal" to mean a base-10 number with a fractional component (e.g., 8.123). Indeed, a decimal fraction might not be an integer. However, any integer in the base-10 system is, by virtue of being in base 10, a decimal. I see that you have a problem with infinite decimal expansions. Well, all numbers are infinite decimal expansions. "4" is shorthand for "4.000…"; "9.452" is short for "9.452000…"; etc. Any number that is not expressed as an infinite decimal expansion is just a shortcut, since physically expressing the entire expansion is, naturally, impossible.
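The point that trailing zeros are mere shorthand can be illustrated with Python's `decimal` module (an illustrative choice on my part, not something from the original discussion):

```python
from decimal import Decimal

# Trailing zeros change the notation, not the quantity.
assert Decimal("4") == Decimal("4.000")
assert Decimal("9.452") == Decimal("9.452000")
```

`Decimal` compares numeric value, so "4" and "4.000" are the same number written at different lengths.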
- Mathematic proofs are irrelevant and irrespective to reality.
- No, mathematical proofs reflect reality. A proof, by definition, establishes that a mathematical statement is true. How are proofs irrelevant to reality? The goal of mathematics is to model reality! If proofs conflicted with reality, then all of mathematics would have no foundation in anything, and this is not the case.
- 0.999... will always be 0.000...1 off from 1.
- Wrong. "0.999…" has an infinite number of decimal places following its decimal point. "0.000…1" does not; the expression is meaningless. The ellipsis is supposed to signify that the zeros continue forever, but then there would be no end to attach the "1" to. If there is an end, the number has a finite decimal expansion, in contrast to the infinite expansion of "0.999…". And if you add 0.999… to such a finite number, the result is 1.000…999… (finitely many zeros, then nines forever), which is greater than 1, not equal to it. Here's another way to look at it. If two numbers are different, you can always find another number that lies strictly between them. For example, 1.5 lies between 1 and 2. You cannot find a number that lies between 0.999… and 1: it would have to have a "1" at the end of an infinite line of zeros, and an infinite line of zeros has no end. No such number exists, so 0.999… and 1 cannot be different.
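The density argument can be sketched with exact arithmetic: every *finite* truncation 0.999…9 leaves room for a midpoint below 1, while the gap shrinks below any positive bound. A hedged Python sketch (the helper name `truncation` is my own):

```python
from fractions import Fraction

def truncation(n):
    """0.999...9 with exactly n nines, as an exact rational: 1 - 10**-n."""
    return 1 - Fraction(1, 10 ** n)

# Every finite truncation really is less than 1, so a midpoint fits
# between it and 1 ...
for n in (1, 5, 20):
    t = truncation(n)
    mid = (t + 1) / 2
    assert t < mid < 1

# ... but the gap is exactly 10**-n, which shrinks below any positive
# bound as n grows; no fixed positive number fits under every gap.
assert 1 - truncation(50) == Fraction(1, 10 ** 50)
```

Only the finite truncations differ from 1; the full expansion leaves no gap for a number in between.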
- Your revert of my edit marks you as pretentious, pedantic, and altogether unpleasant. In short, there is a good chance no one but those who hate you will come to your funeral, and many, many people will attend it.
- No response. "If you go in for argument, take care of your temper. Your logic, if you have any, will take care of itself." –Joseph Farrell
- Reality is representative of reality.
- I thought reality was reality, but okay. No arguments here.
- Everything else is an abstraction; an allegory meant to aid in our movement through reality. Integers, decimals, binary, octal, base 10, hex... none of these truly exist. They are thought constructs meant to help real problems get real solutions. … Mathematics does not dictate reality; reality dictates mathematics.
- I have no qualms with your assertion that pure mathematics is not real in and of itself. "1" does not exist on its own; "binary" is not something that can be experienced. Mathematics is indeed a tool to model reality.
- As such, the intellectual concept of an integer is not the same as the intellectual concept of a decimal.
- What is an "intellectual concept"? Concepts, by default, relate to the intellect; what would a "nonintellectual concept" be? I'm assuming you mean something like "The popular concept of an integer is not the same as the popular concept of a decimal" – concepts as held by most people. Well, most people have a flawed concept of "decimal": when most people think "decimal," they are really thinking "decimal fraction." Thus, the problem is not the concepts themselves but the fact that most people hold the wrong concept. What's more, taken at face value, your statement is correct: integers and decimals are not the same thing. However, decimals can be integers, and integers can be decimals (and are, if they are written in base 10).
- All recursive decimals from 0.111... to 0.999... are failures.
- No, because – again – all numbers are actually infinite decimal expansions, albeit sometimes expressed in shorthand. And what is a "failure"? Infinite decimal expansions are nothing more than sums. For instance:
- 0.999… = 9/10 + 9/100 + 9/1000 + 9/10000 + …
- Continue this infinitely. What is illegitimate about this expression? This isn't a "failure" of the base-10 system; it is just a sum with an infinite number of terms but a finite value.
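The partial sums of such a series can be computed exactly; a small Python sketch (illustrative, not from the original thread):

```python
from fractions import Fraction

# Partial sums of 9/10 + 9/100 + 9/1000 + ...
total = Fraction(0)
for k in range(1, 21):
    total += Fraction(9, 10 ** k)
    # After k terms the sum is exactly 1 - 10**-k.
    assert total == 1 - Fraction(1, 10 ** k)
```

Each partial sum is a perfectly ordinary rational number; nothing about the notation breaks down as the terms accumulate.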
- They are an exposure of a human flaw in abstract thought.
- No, they are simply proof of a fact of reality, which is that no base system can give every number a terminating expansion. Hexadecimal has even more recurring expansions than base 10, because 16 has only one prime factor, as opposed to 10, which has two. There is no way to express all numbers without recursion or infinite expansion in a single base system. And see the previous point: what is "flawed" about an infinite sum? Sure, you could say that you can't "really" keep adding forever, but there is nothing inherently wrong with it.
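The divisibility fact behind this point – a reduced fraction terminates in base b exactly when every prime factor of its denominator divides b – can be sketched in Python (the helper `terminates` is my invention for illustration):

```python
import math
from fractions import Fraction

def terminates(q, base):
    """True iff q has a finite (terminating) expansion in the given base.

    Strip out of the denominator every prime factor it shares with the
    base; the expansion terminates exactly when nothing is left over.
    """
    d = Fraction(q).denominator
    g = math.gcd(d, base)
    while g > 1:
        d //= g
        g = math.gcd(d, base)
    return d == 1

# 1/5 terminates in base 10 (0.2) but recurs in base 16 (0.333...);
# 1/3 recurs in both; 3/8 terminates in hex (0.6).
assert terminates(Fraction(1, 5), 10) and not terminates(Fraction(1, 5), 16)
assert not terminates(Fraction(1, 3), 10) and not terminates(Fraction(1, 3), 16)
assert terminates(Fraction(3, 8), 16)
```

Since 16 = 2⁴ has only the prime factor 2, while 10 = 2 × 5 has two, strictly fewer fractions terminate in hexadecimal than in base 10.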
- 1/9 != 0.111... because recursive decimals do not end, while integers do.
- Indeed, the expansions of recursive decimal fractions have no end. However, how can you say that the quantity itself doesn't end? It's one number. It is a finite quantity. Let's say that 0.999… "doesn't end." Then you'd be saying that 0.999… goes on forever to infinity. But it doesn't; it's clearly greater than, say, 0.5, and less than, say, 2. A number expressed as an infinite expansion need not be equal to infinity.
- It might go against one's intuition to say that 0.999… = 1. This doesn't matter. Intuition counts for nothing. Without proof or evidence, assertions are worthless. If 0.999… ≠ 1, then one should be able to mathematically prove it. The fact of the matter is, there are no rigorous proofs for this. None. And you can't just say that "in reality," 0.999… ≠ 1. This is not a proof.
You hit all of my points, save one; that "might as well be" is not "exactly equal". I find this quite telling. Mathematic proofs are irrespective to reality, and I can use 0.999... to represent 1 when calculating something, but that does not change the fact that recursive decimals are no where near as elegant as integers, and that 0.999... and 1 are two different things. Elegance matters. --ttogreh 04:53, 27 October 2006 (UTC)
- How are mathematical proofs irrespective to reality? They represent reality. If they did not, then they would not be proofs, since they would advocate falsehoods.
- Elegance has nothing to do with whether something is true. That 0.999… might be "less elegant" than 1 has absolutely no bearing on whether the two are equal.
- Again, you claim that 0.999… and 1 are two different things. You're right, they are two different things: two different expressions for the exact same concept. They are two different ways to express a unit; others include 3/3, 5/5, pi/pi, and so forth. Since they represent the same thing, they are exactly equal.
- For the above reason, it is incorrect to say that 0.999… "might as well be" 1, because this implies that they are not quite equal. For practical purposes, pi might as well be 3.14159 26535 89793 23846 26433 83279 50288 41971 69399 37510; e might as well be 2.71828 18284 59045 23536; at low θ, sin θ might as well be θ. "Might as well be" implies that the compared values are not equal but are very, very close. However, 0.999… is equal to 1.
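The contrast between "approximately" and "exactly" can be made concrete with the standard small-angle example; a brief Python sketch (my choice of illustration):

```python
import math

theta = 0.01  # radians

# "Might as well be": sin(theta) is close to theta, but the error is nonzero.
error = abs(math.sin(theta) - theta)
assert 0 < error < 1e-6   # roughly theta**3 / 6

# 0.999... = 1 is a different kind of statement: the difference is zero,
# not merely small.
```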
- -Larry V (talk | contribs) 06:23, 27 October 2006 (UTC)
You know, I have to say, if my beliefs and musings were obviously untrue and without merit, you would ignore me. I am just text on a screen to you. Yet, you continue to defend a cognitively dissonant mathematical assertion that has no substantive relevance on reality against the musings of a complete and total stranger. This leads me to suspect that one of these two things is true about you; your grasp of reality is dependent on mathematic theory, or you simply cannot let even the smallest thing go, marking you as an irretrievable pedant. Perhaps you should ask yourself why my assertions bother you so. That said, 1/3 + 1/3 + 1/3 = 1 != 0.333... + 0.333... + 0.333... = 0.999..., always, and forever.--ttogreh 06:58, 27 October 2006 (UTC)
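The arithmetic in the post above can be tested directly by separating exact thirds from truncated ones; a hedged Python sketch (the library choices are mine, not part of the thread):

```python
from decimal import Decimal, getcontext
from fractions import Fraction

# Exact thirds: 1/3 + 1/3 + 1/3 is exactly 1.
assert Fraction(1, 3) + Fraction(1, 3) + Fraction(1, 3) == 1

# A finite truncation of 1/3 is a different object: three copies of
# 0.333...3 (30 digits) make 0.999...9 (30 nines), short of 1 by
# exactly 10**-30. The shortfall belongs to the truncation, not to
# the infinite expansion.
getcontext().prec = 30
third = Decimal(1) / Decimal(3)
assert third * 3 < 1
assert Decimal(1) - third * 3 == Decimal(10) ** -30
```

Any finite truncation falls short by a definite power of ten; only cutting the expansion off creates a discrepancy.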
- I had continued to argue this because I initially believed that it would result in some sort of legitimate debate. However, I see that you are failing to hold a cohesive discussion and are resorting to unfounded insults and judgments about my character, thus proving to me that your argument on the main topic is specious beyond the faintest shadow of a doubt. Insulting me does not make you correct. That said, you are certainly welcome to your own opinion – even while refusing to listen to clear and accurate reasoning – but please refrain from further discussion of this topic, in which your knowledge is clearly lacking; it would benefit all who are actually interested, and would have the nice side effect of not making you look like an ignoramus. Thank you for your time. Adieu for now. -Larry V (talk | contribs) 07:47, 27 October 2006 (UTC)
Forgive this one trespass. Consider two iron bars one meter away from one another. Halve the distance between them. Halve it again, and again, and again. According to mathematics, the two iron bars will never meet because the distance between them can always, and forever, be halved again. In reality, the iron atoms of the two bars will eventually become so close that atomic force will physically prevent both bars from getting any closer than they already are. Think about that. There will always be another nine. Even if the difference between 1 and 0.999... is infinitesimal, the difference can never be removed! Humans are flawed; why should our math be perfect?--ttogreh 09:25, 27 October 2006 (UTC)
- No, according to intuition, the distance will never be eliminated. What mathematics really says is that one can take the initial distance as 1, and then:
- 1/2 + 1/4 + 1/8 + 1/16 + … = 1
- which is equal to the initial distance. "Atomic forces" have nothing to do with the math; they are an external influence. They must be accounted for, surely, but they have no bearing on the ideal, theoretical logic. If anything, what prohibits this infinite halving from actually happening is the quantization of space, as well as the subtle fact that one cannot keep doing something for eternity. And this, I think, is the issue here. You are thinking about 0.999… in the sense of someone sitting down and writing it out. No matter how many nines he writes down, the number he has will always be 10^-n away from 1, and there will always be more nines to write. But that is a finite expansion. If you cut 0.999… off at any decimal place, the number up to there will indeed be less than 1 … but then there are infinitely more decimal places to go. This process of expansion approaches but never reaches 1. The process is not the same as its final result. The final result is a finite number, and finite numbers cannot "approach" anything. Three does not approach four. Ten does not approach 11. And 0.999… does not approach 1; it is 1. And again, if two numbers differ, one should be able to find an infinite number of real numbers lying between them. If 0.999… and 1 were distinct, one should be able to find at least one number between them. But this is impossible. The supposed "0.000…1" is illegitimate, because it expresses an infinite decimal expansion… with an end. The difference is 0.000…, which is simply zero. And if the difference between two numbers is zero, then they are the same: if x - y = 0, then x = y. If this were not true, there should be a proof of it; anything that is true should be provable. And yet there are no proofs that 0.999… ≠ 1.
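The halving series can be checked at any finite stage with exact rationals; a brief Python sketch (illustrative only, not part of the original exchange):

```python
from fractions import Fraction

# Halve the unit distance again and again: 1/2 + 1/4 + 1/8 + ...
covered = Fraction(0)
for k in range(1, 61):
    covered += Fraction(1, 2 ** k)

# After n halvings the covered distance is exactly 1 - 2**-n: every
# finite stage falls short of 1, and only the completed infinite sum
# equals 1 -- the distinction between the process and its final value.
assert 1 - covered == Fraction(1, 2 ** 60)
```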
- Whether mathematics is perfect or not has nothing to do with whether humans are perfect or not. Mathematics transcends humanity. The bases of mathematics would exist whether humans existed or not. There would still be "ones" of things, even if we weren't here. The "mathematics" of extraterrestrial life, if it existed, would parallel ours. It would certainly not have the same symbols and conventions, but the ratio of a circle's circumference to its diameter would be the same. They would have the concept of "one," even if they didn't call it "one" or "1". No, mathematics must be perfect because it models an idealized universe. One can never actually express 0.999…, because any attempt would really result in 0.999…9, which is not an infinite expansion. But the complete number 0.999… still equals one. The fact that no one can actually write it out does not change this.
- I don't know about you, but I am losing the drive and time to continue this. I have made my assertion and presented my arguments; I will bother you no further about the matter. Thank you for playing, and au revoir.
- Larry V - Your arguments are seriously flawed. About the only one that deserves a response is:
"And again, if there is any difference between two numbers, one should be able to find an infinite number of real numbers lying between them."
True, but then both the mathematical objects must be rational numbers. 0.999... is a quasi-number masquerading as a rational number. In fact, irrational numbers do not exist, only incommensurable magnitudes. 173.228.7.80 (talk) 17:31, 3 March 2013 (UTC)