
I have an MS in math and came to the conclusion that C is not any more "imaginary" than R. Both are convenient abstractions; neither is particularly "natural".




How do you feel about N?

It’s only “natural” up to a point. I’ve never seen 10^100 of something, so there’s that.

The number of ways to shuffle a deck with 70 cards, approximately?
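
(It is, just barely: 70! comes out a shade over 10^100. A quick order-of-magnitude check in Python, using only the standard library:)

    import math

    # Number of orderings of a 70-card deck.
    perms = math.factorial(70)
    print(perms > 10**100)      # True
    print(math.log10(perms))    # ~100.08, i.e. just over a googol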

"The number of x" is not "natural".

This is caveman logic and I support it.


I'm not the person you're asking, but I also have an MS in math and the same opinions.

Most mathematicians see N as fundamental -- something any alien race would certainly stumble on and use as a building block for more intricate processes. I think that position is probably right, but it isn't guaranteed.

N itself is already a strange beast. It arises as some sort of "completion" [0] -- an abstraction that isn't practically useful or instantiable, existing only to make logic and computations nice. The seeming simplicity yet unpredictability of the primes is a weird artifact for an object supposedly designed for counting. Most subsets of N can't even be named or described in any language in finite space. Weirder still, there are uncountable objects behaving like N for all practical purposes (see the nonstandard models of first-order Peano arithmetic).
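
To spell out the counting argument behind that "most subsets can't be described" claim: finite descriptions over a finite alphabet can be enumerated, and a Cantor-style diagonal set escapes any such enumeration. A toy sketch in Python -- the listed sets are just illustrative stand-ins for "everything describable":

    # Given ANY enumeration of subsets of N (as membership tests),
    # the diagonal set disagrees with the i-th set at the number i,
    # so the enumeration can't have covered every subset.
    def is_even(n):   return n % 2 == 0
    def is_odd(n):    return n % 2 == 1
    def is_square(n): return int(n ** 0.5) ** 2 == n

    described = [is_even, is_odd, is_square]  # stand-in for "all describable sets"

    def diagonal(n):
        # n is in the diagonal set iff n is NOT in the n-th described set.
        return not described[n](n) if n < len(described) else False

    for i, s in enumerate(described):
        assert diagonal(i) != s(i)  # differs from set i at the number i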

My position would then be something along the lines of counting being fundamental but N being a convenient, messy abstraction. It's a computational tool like any other.

Even that, though, isn't a given. Who says counting is the thing an alien race would develop first, or that they wouldn't abandon it for something better suited to their understanding of reality once they advanced enough to notice its problems? As candidate alternative substrates for building mathematics, consider:

C: This is untested (probably untestable), but perhaps C showing up everywhere in quantum mechanics isn't as strange as we think. Maybe the universe is fundamentally wavelike, and discreteness is what we perceive when waves interfere. N then crops up as a projection of C onto simple boundary conditions: not a fundamental property of the universe itself, but an approximate way of describing some part of it some of the time.

Computation: Humans are input/output machines. It doesn't make sense to talk about numbers we'll physically never be able to talk about. If naturals are fundamental, why do they have so many encodings? Why do you have to specify which encoding you're using when doing proofs using N? Primes being hard to analyze makes perfect sense when you view N as a residue of some computation; you're asking how the grammatical structure of a computer program changes under multiplication of _programs_. The other paradoxes and strange behaviors of N only crop up when you start building nontrivial computations, which also makes perfect sense; of course complicated programs are complicated.
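
On the "many encodings" point, here's a small Python sketch of two standard ways to represent the same natural number (unary Peano successors vs. Church numerals); the helper names are mine, nothing canonical:

    # Peano-style: zero and successor, a nested-tuple unary encoding.
    ZERO = ()
    def succ(n): return (n,)

    def peano_to_int(n):
        count = 0
        while n != ():
            n, count = n[0], count + 1
        return count

    # Church-style: the number k is "apply f, k times".
    def church(k):
        return lambda f: lambda x: x if k == 0 else f(church(k - 1)(f)(x))

    def church_to_int(c):
        return c(lambda m: m + 1)(0)

    three_peano  = succ(succ(succ(ZERO)))
    three_church = church(3)
    print(peano_to_int(three_peano), church_to_int(three_church))  # 3 3

Proofs and programs routinely have to pin down which of these (or binary strings, or von Neumann ordinals, ...) they mean before anything else can be said.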

</rant>

My actual position is closer to the idea that none of it is natural, including N. It's the Russian roulette of tooling: 99 chambers loaded to fire forward at almost any problem you care about, and 1 jammed in pointing straight down at your foot for when you look too closely at second-order implications and how everything ties together. Mathematical structures are real patterns in logical space, but "fundamental" is a category error. There's no objective hierarchy, just different computational/conceptual trade-offs depending on what you're trying to do.

[0] When people talk about N being fundamental, they often mean that counting and discrete objects are fundamental. You don't need N for that, though; you need the first hundred, thousand, however many things. You only need N when talking about arbitrary counting processes -- a set big enough to describe every possible way a person might count. You could probably get away with naturals up to 10^1000 or so as an arbitrary, finite primitive sufficient for talking about any physical, discrete process, but we've instead gone for the abstraction of a "completion", conjuring up a limiting set covering all possible discrete sets.


N pretty much is "arbitrary-length information theory". As soon as you allow arbitrarily large finite objects, you end up with N. I'm not convinced that any alien civilization could get very far mathematically or computationally without reinventing N somewhere, even if unintentionally (e.g., how would one even state the halting problem?).
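
A minimal Python sketch of that identification, pairing every finite bit string with exactly one natural number and back (the "prepend a 1" trick is just one standard choice):

    def string_to_nat(bits: str) -> int:
        # "" -> 0, "0" -> 1, "1" -> 2, "00" -> 3, ... (leading zeros survive)
        return int("1" + bits, 2) - 1

    def nat_to_string(n: int) -> str:
        return bin(n + 1)[3:]  # drop "0b" and the leading 1

    for s in ["", "0", "1", "0010", "110101"]:
        assert nat_to_string(string_to_nat(s)) == s

Any "program" or "proof" is a finite string like this, which is how statements such as the halting problem drag N in through the back door.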


