Did I have a stroke?
And my argument is that 3 ≠ 0.333…
After reading this, I have decided that I am no longer going to provide a formal proof for my other point, because odds are that you wouldn’t understand it and I’m now reasonably confident that anyone who would already understands the fact the proof would’ve supported.
Wait, the mandelbrot set is two-dimensional.
OG-Wan Kenobi.
You’re welcome.
Whether or not copyright law has been violated is not a question of morality.
This assertion dismisses the ethical considerations often intertwined with legal principles.
No, that’s stupid. Copyright is a purely legal framework. That’s it, end of story. If you still don’t understand, reread the entire discussion.
At the risk of being pedantic, I should point out that morality doesn’t come into the question. Copyright is a matter of law, and nothing else. Personally, I don’t consider it a legitimate institution; the immorality is how companies wield it like a cudgel to entrench their control over culture.
That happened before Trump’s first term, and Biden beat Trump before 1/6. Take all that into context.
If I remember, I’ll give a formal proof when I have time, so long as no one else has done so before me. Simply put, we’re not dealing with floats, and there are algorithms to add infinite decimals together from the ones place down using back-propagation of carries. Disproving my statement is as simple as providing a pair of real numbers for which doing this is impossible.
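To sketch the kind of carry-resolving algorithm I have in mind (this is a toy illustration, not the promised formal proof; all names are mine): working from the ones place down, a digit-pair sum of 9 is ambiguous until a later pair resolves the carry, so runs of 9s are buffered until a blocker appears.

```python
from itertools import chain, cycle

def add_fractional(a_digits, b_digits, n):
    """Stream the first n digits (most significant first) of the sum of
    two fractional expansions 0.a1a2... and 0.b1b2... given as digit
    iterators.  Returns (carry_into_ones_place, digits).

    A digit-pair sum <= 8 blocks any carry from the right, a sum >= 10
    sends a carry leftward, and a sum of exactly 9 passes a carry
    through, so runs of 9s must be buffered until resolved."""
    out = []           # resolved output digits
    pending = None     # last carry-blocking digit, awaiting a possible +1
    nines = 0          # length of the current unresolved run of 9s
    carry_out = 0      # carry that escapes into the ones place
    for a, b in zip(a_digits, b_digits):
        s = a + b
        if s <= 8:                        # blocks carries: flush buffer
            if pending is not None:
                out.append(pending)
            out.extend([9] * nines)
            pending, nines = s, 0
        elif s == 9:                      # carry (if any) passes through
            nines += 1
        else:                             # s >= 10: carry moves left
            if pending is not None:
                out.append(pending + 1)
            else:
                carry_out = 1             # carry reaches the ones place
            out.extend([0] * nines)       # the buffered 9s roll over
            pending, nines = s - 10, 0
        if len(out) >= n:
            break
    return carry_out, out[:n]

# 1/6 + 1/6: 0.1666... + 0.1666... = 0.333...
sixth = lambda: chain([1], cycle([6]))
carry, digits = add_fractional(sixth(), sixth(), 5)
print(carry, digits)  # 0 [3, 3, 3, 3, 3]
```

Note the revealing edge case: for 1/3 + 2/3 every digit-pair sum is exactly 9, the buffer never resolves, and the function never emits a digit. That unresolvable run of 9s is precisely the 0.9~ versus 1.0~ ambiguity under discussion.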
Now I’m sad because I remember wishing Bernie had won.
She’s not Trump. People would vote for her.
Wait, .45 seems like a really low amount of water in your blood. Are we not counting the cytoplasm in the blood cells?
Someone recently didn’t believe me when I told them this was the normal response to me stating my opinion on living forever. Thank you for providing an example.
By definition, mathematics isn’t witchcraft (most witches I know are pretty bad at math). Also, I think you need to look more deeply into Occam’s razor.
I can’t help but notice you didn’t answer the question.
each digit-wise operation must be performed in order
I’m sure I don’t know what you mean by digit-wise operation, because my conceptualization of it renders this statement obviously false. For example, we could apply digit-wise modular addition base 10 to any pair of real numbers, and the order we choose to perform this operation in won’t matter. I’m pretty sure you’re also not including standard multiplication and addition in your definition of “digit-wise”, because we can construct algorithms that address many different orders of digits, meaning this statement would also then be false. In fact, as I lay here having just woken up, I’m having a difficult time figuring out an operation where the order that you address the digits in actually matters.
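To make the modular example concrete (a toy sketch with names of my own choosing): because each output digit depends only on the matching pair of input digits, with no carries, the processing order provably cannot matter.

```python
def digitwise_mod_add(a, b):
    """Digit-wise addition mod 10: each output digit is a function of
    one input digit pair only, so there is no carry to propagate and
    no ordering constraint between positions."""
    return [(x + y) % 10 for x, y in zip(a, b)]

a, b = [2, 7, 9, 5], [8, 6, 4, 9]

forward = digitwise_mod_add(a, b)
# Process the positions right-to-left instead: same result.
backward = list(reversed(digitwise_mod_add(a[::-1], b[::-1])))

print(forward)            # [0, 3, 3, 4]
assert forward == backward
```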
Later, you bring up “incrementing” which has no natural definition in a densely populated set. It seems to me that you came up with a function that relies on the notation we’re using (the decimal-increment function, let’s call it) rather than the emergent properties of the objects we’re working with, noticed that the function doesn’t cover the desired domain, and have decided that means the notation is somehow improper. Or maybe you’re saying that the reason it’s improper is because the advanced techniques for interacting with the system are dissimilar from the understanding imparted by the simple techniques.
Fair, but that still uses logic; it’s just using false premises. Also, what I’d be taking seriously there is less the argument than the threat of imminent violence.
It depends on the convention that you use, but in my experience yes; for any equivalence relation, and any metric of “approximate” within the context of that relation, A=B implies A≈B.
People generally find it odd and unintuitive that it’s possible to use decimal notation to represent 1 as .9~ and so this particular thing will never go away. When I was in HS I wowed some of my teachers by doing proofs on the subject, and every so often I see it online. This will continue to be an interesting fact for as long as decimal is used as a canonical notation.
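For anyone curious, one standard proof along those lines (the algebraic version; not necessarily the exact one I used in HS) goes:

```latex
\begin{align*}
\text{Let } x &= 0.\overline{9}. \\
10x &= 9.\overline{9} \\
10x - x &= 9.\overline{9} - 0.\overline{9} = 9 \\
9x &= 9 \\
x &= 1
\end{align*}
```

The rigorous version replaces the digit-shifting step with the geometric series $\sum_{k=1}^{\infty} 9 \cdot 10^{-k} = 1$, which is what the decimal notation denotes in the first place.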
don’t support infinite decimals properly
Please explain this in a way that makes sense to me (I’m an algebraist). I don’t know what it would mean for infinite decimals to be supported “properly” or “improperly”. Furthermore, I’m not aware of any arguments worth taking seriously that don’t use logic, so I’m wondering why that’s a criticism of the notation.
I hope people who aren’t powered by bitterness live forever.
Your opinion is incorrect as a question of definition.
You had in the previous paragraph.
Yes, however the problem is that you are speaking on matters of which you are clearly ignorant. This isn’t a question of different axioms, where we can show clearly how two models are incompatible but resolve that both are correct in their own contexts; this is a case where you are entirely, irredeemably wrong, and are simply refusing to correct yourself. I am an algebraist; understanding how two systems differ and compare is my specialty. We know that infinite decimals are capable of representing real numbers because we do so all the time. There. You’re wrong and I’ve shown it via proof by demonstration. QED.
They are just symbols we use to represent abstract concepts; the same way I can inscribe a “1” to represent 3-2={ {} } I can inscribe “.9~” to do the same. The fact that our convention is occasionally confusing is irrelevant to the question; we could have a system whereby each number gets its own unique glyph when it’s used and it’d still be a valid way to communicate the ideas. The level of weirdness you can do and still have a valid notational convention goes so far beyond the meager oddities you’ve been hung up on here. Don’t believe me? Look up lambda calculus.
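In case the lambda calculus pointer seems abstract, here’s a small sketch (my own encoding choices) of Church numerals, where each natural number is denoted by a higher-order function; far weirder than decimal, and still a perfectly valid notation for the same objects.

```python
# Church numerals: the number n is "the function that applies f n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

one = succ(zero)
two = succ(one)
three = succ(two)

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

print(to_int(three))  # 3
```

The glyph “3”, the numeral `succ(succ(succ(zero)))`, and the decimal string “2.9~” all denote the same abstract object; the notation is just a communication convention.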