“What I’m talking about” is that they are not easy to conceptualize intuitively.
If I were a skeptic of real numbers, I’d tell you that talking about an infinite decimal expansion that never terminates and contains no repeating pattern is nonsense. I’d say such a thing doesn’t exist, because you can’t specify a single example by writing down its decimal expansion — by definition. So if that’s the only idea you have to convince a skeptic, you’ve already failed and are out of the game. To convince the skeptic, you’d have to develop a more sophisticated method of exhibiting, indirectly, an example of a real number that is not rational (for instance, by proving that sqrt(2), should it exist, cannot be rational).
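For reference, the indirect argument alluded to here is the classic parity proof, sketched in LaTeX (this is the standard argument, not anything specific to this thread):

```latex
% Suppose sqrt(2) = p/q with p/q in lowest terms. Then p^2 = 2 q^2,
% so p is even; write p = 2m. Substituting gives q^2 = 2 m^2, so q is
% even too, contradicting that p/q was in lowest terms.
\sqrt{2} = \frac{p}{q}
\;\Rightarrow\; p^2 = 2q^2
\;\Rightarrow\; 2 \mid p
\;\Rightarrow\; p = 2m
\;\Rightarrow\; q^2 = 2m^2
\;\Rightarrow\; 2 \mid q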
I guess we are talking about different things. It seems to me that it's trivial to imagine them conceptually. They go on forever and most of them never repeat? Sounds good to me. Sqrt(2) never repeats? sure, whatever. I never found the proofs of this stuff very interesting.
Now, I am a skeptic of their use in physics / science. But that's a different question, and more about pedagogy than the raw content of the theories.
With that approach, all anyone has to say is that you'd have to provide infinite information to specify an example and that the way these objects interact is completely undefined; therefore you haven't defined or done anything at all. You are indeed simply imagining something -- and nothing more. You can imagine whatever you want, but nobody else is inclined to believe that what you imagine exists or behaves in the intended manner.
Beyond that, if a skeptic were inclined to accept the existence of objects with "infinite information content" by definition, they could then ask you to simply add two of them together. That would most likely be the end of it -- addition of infinite non-repeating decimal expansions does not behave intuitively. To answer this kind of question in general, you would eventually discover that you have to define the sum x+y of these numbers as a limit, x+y = lim_{k -> infinity} (x_k + y_k), where x_k and y_k are the rational numbers obtained by truncating x and y after k digits. You must prove this limit always exists and is unique and well-defined -- which amounts to proving that the set of all infinite decimal expansions, if we grant its existence, has a property called completeness. And even having done all that work, you still couldn't give a single example of one of these numbers without additional nontrivial work, so a skeptic could still easily reject all of this.
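The truncation idea above is easy to play with numerically. Here is a minimal sketch in Python, using sqrt(2) and sqrt(3) purely as stand-in examples of non-repeating expansions (the names and digit counts are illustrative choices, not anything from the discussion):

```python
from decimal import Decimal, getcontext

# Approximate x + y by adding the rationals x_k and y_k obtained by
# truncating the decimal expansions of x and y after k digits.
getcontext().prec = 50  # work with 50 significant digits

x = Decimal(2).sqrt()  # stand-in non-repeating expansion
y = Decimal(3).sqrt()  # another one

def truncate(d, k):
    """Keep only the first k digits after the decimal point."""
    q = Decimal(10) ** -k
    return d.quantize(q, rounding="ROUND_DOWN")

for k in (1, 5, 10, 20):
    print(k, truncate(x, k) + truncate(y, k))

# The partial sums stabilize digit by digit. That stabilization is
# exactly the claim lim_{k -> infinity} (x_k + y_k) exists -- the
# thing that has to be *proved*, not merely observed numerically.
```

Of course, seeing the digits settle down on a screen is evidence, not a proof; the completeness argument in the comment is what actually justifies the limit.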
This is far beyond what you could reasonably expect the typical middle school student or even general member of the adult population to follow and far more difficult than simply defining complex numbers as having the form x+iy.
yes, I am describing imagining something. Imagine taking decimals and letting them go on without ending. That is conceptualizing them intuitively. It is easy.
I don't really know what you're arguing about. You are describing the sorts of things that have to be solved to construct them rigorously. But I don't know why. No one is talking about that.
I was talking about that, specifically, the relative difficulty of defining reals from rationals vs complex numbers from reals. You replied to me. :)
Moreover, I disagree that you have imagined real numbers. I don’t think you’ve imagined a single real number at all in the manner you describe. Why should I believe you've even described anything that isn't rational to begin with? For instance, 0.999... is the same as 1. Why should I not think that whatever decimal expansion you're imagining is, similarly, equivalent to a rational number we already know about? Occam's razor would reasonably suggest you're just imagining different representations of objects already accounted for in the rationals. After all, an infinite amount of precision captured by an infinite non-repeating string of digits could easily just converge back to a number we already know.
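To spell out the 0.999... = 1 point mentioned above, it is just the geometric series computation:

```latex
0.\overline{9} \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
\;=\; 9 \cdot \frac{1/10}{1 - 1/10} \;=\; 1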
I am very confused why you are continually talking about rationals as if they are not real. Every rational number is also a real number, in the usual conception of things, is it not? Perhaps you are distinguishing the two? Like regarding 1.000 as an equivalence class of Cauchy sequences as not the same thing as 1.000 as the equivalence class of a/a?
because when I picture 1.000 I am clearly imagining a real number. Likewise if I imagine pi, as defined any way you like.
My language was sloppy, but I'll admit I thought it was pretty obvious that we were talking about defining the rest of the reals starting from the rationals -- obvious enough that it didn't need clarification. I can't edit my prior comment, but you may imagine it has been amended in the obvious way with that clarity made explicit rather than implicit and reply to it again if you're interested in continuing the conversation.