N pretty much is "arbitrary-length information theory". As soon as you leave the realm of the finite, you end up with N. I'm not convinced that any alien civilization could get very far mathematically or computationally without reinventing N somewhere, even if unintentionally (e.g., how would one even state the halting problem without it?).
The idea is we can't actually prove a non-computable real number exists without purposefully having axioms that allow for deriving non-computable things. (We can't prove they don't exist either, without making some strong assumptions).
> The idea is we can't actually prove a non-computable real number exists without purposefully having axioms that allow for deriving non-computable things.
I am talking about constructivism, but that's not entirely the same as saying the reals are not uncountable. One of the harder things to wrap one's head around in logic is that there is a difference between, so to speak, what a theory thinks is true vs. what is actually true in a model of that theory. It is entirely possible to have a countable model of a theory that thinks it is uncountable - this is Skolem's paradox. (In fact, by the downward Löwenheim-Skolem theorem, every countable first-order theory with an infinite model has a countable model, though the theorem in full generality requires the Axiom of Choice.)
I think that what matters here (and what I think is the natural interpretation of "not every real number is computable") is what the theory thinks is true. That is, we're working with internal notions of everything.
I'd agree with that for practical purposes, but sometimes the external perspective can be enlightening philosophically.
In this case, to actually prove the statement internally that "not every real number is computable", you'd need some non-constructive principle (usually added to the logical system rather than the theory itself). But, the absence of that proof doesn't make its negation provable either ("every real number is computable"). While some schools of constructivism want the negation, others prefer to live in the ambiguity.
I hold that the discovery of computation was as significant as the set theory paradoxes and should have produced a similar shift in practice. No one does naive set theory anymore. The same should have happened with classical mathematics, but no one wanted to give up excluded middle, leading to the current situation. Computable reals are the ones that actually exist. Non-computable reals (or any other non-computable mathematical object) exist in the same way Russell's paradoxical set exists, as a string of formal symbols.
Formal reasoning is so powerful you can pretend these things actually exist, but they don’t!
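To make "computable real" concrete, here's a minimal Haskell sketch (the rational-approximation representation is just one standard choice among several, and the names are my own):

```haskell
-- One standard presentation of a computable real: a function that,
-- given a precision n >= 0, returns a rational within 2^-n of the value.
type CReal = Int -> Rational

-- sqrt 2 as a computable real, via Newton's method; n+2 iterations
-- from a starting guess of 2 converge quadratically, which more than
-- suffices for 2^-n accuracy.
sqrt2 :: CReal
sqrt2 n = iterate step 2 !! (n + 2)
  where step x = (x + 2 / x) / 2
```

Every value of type `CReal` you can actually write down is computable by construction; a non-computable real is precisely one with no such representative.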
I see you are already familiar with subcountability so you know the rest.
What do you really mean by "exists"? Maybe you mean it has something to do with a calculation in physics, or that we can possibly map it onto some physical experience?
Doesn't that formal string of symbols exist?
Seems like allowing formal strings of symbols that don't necessarily "exist" (or aren't useful for physics) can still lead you to something computable at the end of the day?
Like a meta version of what happens in programming - people often start with "infinite" objects eg `cycle [0,1] = [0,1,0,1...]` but then extract something finite out of it.
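In Haskell terms, a minimal version of that pattern:

```haskell
-- Laziness lets us name an "infinite" list...
bits :: [Int]
bits = cycle [0, 1]          -- conceptually [0,1,0,1,0,1,...]

-- ...provided we only ever demand a finite prefix of it.
firstTen :: [Int]
firstTen = take 10 bits      -- [0,1,0,1,0,1,0,1,0,1]
```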
They don’t exist as concepts. "A rational number whose square is 2" is (convenient prose for) a formal symbol purporting to describe some object; it happens that it describes no object at all. I am claiming that many of the objects introduced during the explosion of mathematics that came from putting calculus on a firmer foundation (to resolve infinitesimals) likewise do not exist.
List functions like that need to be handled carefully to ensure termination. Summations of infinite series are a better example: consider adding up a geometric series. You need to add "all" the terms to get the correct result.
Of course you don’t actually add all the terms, you use algebra to determine a value.
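Concretely, the standard algebraic move for a geometric series with ratio |r| < 1:

```latex
\begin{aligned}
S      &= a + ar + ar^2 + ar^3 + \cdots \\
rS     &= ar + ar^2 + ar^3 + \cdots \\
S - rS &= a \quad\Longrightarrow\quad S = \frac{a}{1-r} \qquad (|r| < 1)
\end{aligned}
```

No infinite summation is ever performed; the algebra sidesteps it entirely.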
You can go further and say that you can't even construct real numbers without strong enough axioms. Theories of first-order arithmetic, like Peano arithmetic, can talk about computable reals but not reals in general.
The author mentioned that the theory of the complex field is categorical, but I didn't see them directly mention that the theory of the real field isn't - for every infinite cardinal there are many models of the real field of that size. My own, far less qualified, interpretation is that even if the complex field is just a convenient tool for organizing information, for algebraic purposes it is as safe an abstraction as we could really hope for - and actually much more so than the real field.
The real field is categorically characterized (in second-order logic) as the unique complete ordered field, proved by Huntington in 1903. The complex field is categorically characterized as the unique algebraic closure of the real field, and also as the unique algebraically closed field of characteristic 0 and size continuum. I believe that you are speaking of the model-theoretic first-order notion of categoricity-in-a-cardinal, which is different than the categoricity remarks made in the essay.
I went into this expecting it to be a criticism in the context of low-level systems, and was a bit surprised when it ended up being about distributed systems. The mismatch the author is describing, I think, is not really about functional programmers so much as about everyone not working directly on modern distributed or web software. For being upfront about all the ways distributed programming is different, this is actually one of the best intros I have seen for the non-distributed programmer on what distributed programming is really like.
I still want to believe that future programming languages will be capable of tackling these issues, but exploring the design space will require being cognizant of those issues in the first place.
As far as I'm aware, Rust's trait system is more closely related to Haskell's type class system than to actual object-oriented programming. As a type class system, it is fine; it is a different mindset than classic OOP. Rust happens to also use this same system for something more closely resembling traditional objects, but this is much more restricted than either.
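For a sense of the correspondence, a minimal Haskell sketch (the `Pretty` class here is hypothetical; the Rust analogue would be a `trait Pretty` with `impl` blocks):

```haskell
-- A type class: an interface dispatched on the type, not on an object.
class Pretty a where
  pretty :: a -> String

-- Instances attach the interface to existing types after the fact,
-- much as Rust lets you `impl` your own trait for a type you didn't define.
instance Pretty Bool where
  pretty True  = "yes"
  pretty False = "no"

instance Pretty a => Pretty [a] where
  pretty xs = unwords (map pretty xs)
```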
The only conclusion I can draw from this is that some engineers are not great at arguing the merits and challenges of a programming language. GC vs non-GC should be one of the first and most straightforward decisions made when picking a language. It's hard to tell in this situation given that there are no concrete examples of what the arguments were, but if one is seriously considering Go for a domain, then that domain doesn't actually need the complexity a non-GC language brings.
If anything, maybe this says there is room for a Rust-like GC'd language.
The only conclusion I can draw from your comment is that you wanted to provide an example of the sort of flawed reasoning that the article is in fact discussing.
Sometimes, and for some people, the right language can actually simplify problems and hide complexity that isn't immediately necessary to the problem at hand. The whole idea is to abstract away the models and worry about them separately. This is one of the motivations for the various synthetic theories that have been researched over the last several years - the one mentioned here happens to be for Category Theory.
This doesn't work for everyone though; some people find it easier to stay grounded in specific models.
Skepticism of a particular ZFC axiom could just be about its status as part of the standard foundation. I don't think anyone disputes what ZFC in a given logic does or doesn't imply, but people can get into philosophical questions about whether it is the right foundation. There are also purely mathematical reasons to care - an extra axiom may allow you to produce more useful math, but it also potentially blocks you from other interesting math by keeping you out of models where, e.g., Choice is false.
Location: USA
Remote: Yes
Willing to relocate: No
Technologies: C#, Rust, Haskell, C, Functional Programming, SQL, Parquet, Git, OData, some C++, limited Python. Started playing with DataFusion.
Resume/CV: Upon request
Email: v5d72jvtu at mozmail dot com
LinkedIn: https://www.linkedin.com/in/eric-gorelik-45662312b/ (preferred)
Hi HN! I'm Eric, a senior software engineer who has worked at the intersection of the database and compiler spaces.
I have nearly 10 years of experience at Microsoft as a developer on a partially-lazy functional query language.
Before that, I concentrated heavily on compilers, programming language theory, and formal logic in college (with some OS kernel and computer algebra stuff on top), and I remain passionate about them. In the years since, I've gained knowledge of modern OLAP database internals.
Recently I've worked on Parquet integration and optimization, both in this query language and in Microsoft more broadly.
On the side I experiment with programming language design, especially with applying math-inspired ideas to low-level programming.
I'm looking for a fully-remote position. If any of this sounds interesting, please reach out.