Nice! There is a Japanese feel to the lead graphic, with its prevalence of cartoon imagery, that one might not recognize without having traveled in Japan.

Is the design debate public? I'd imagine it would make great reading.


The top right character definitely looks like Matz!


In grad school around 1980 I took a cab home from a midnight showing of the reggae film "The Harder They Come". The cab driver asked me out of the blue, "Is it true you can't tell the difference between +i and -i?"

Cambridge, MA but still ... unexpected.

If someone hands you a blank board representing the complex numbers, and offers to tell you either the sum or the product of any two places you put your fingers, you can work out most of the board rather quickly. What remains is which way to flip the board: which way is up? +i and -i both square to -1.

This symmetry is the camel's nose under the tent of Galois theory, described in 1831 by Évariste Galois before he died in a duel at age twenty. This is one of the most amazing confluences of ideas in mathematics. It explains, for example, why we have the quadratic formula, and formulas solving degree 3 and 4 polynomials, but no general formula for degree 5. The symmetry of the complex plane is a toggle switch that corresponds to a square root. The symmetries of degree 3 and 4 polynomials are more involved, but can all again be translated to various square roots, cube roots... Degree 5 can exhibit an alien group of symmetries that defies such a translation.

The Greeks couldn't trisect an angle using an unmarked straightedge and compass. Turns out the quantity they needed exists, but couldn't be described in their notation.

Integrating a bell curve from statistics doesn't have a closed form in the notation we study in calculus, but the function exists. Statisticians just said "oh, that function" and gave it a new name.

Roots of a degree 5 polynomial exist, but again can't be described in the primitive notation of square roots, cube roots... One needs to make peace with the new "simple group" that Galois found.

This is arguably the most mind blowing thing one learns in an undergraduate math education.


The original post is written by AI, so I'll only skim it, but your comment is fascinating. I got through undergrad math by brute force memorization and taking the C. Or sometimes the C-. The underlying concepts were never really clear to me. I did take a good online calculus class later that helped.

However, I have questions: "Turns out the quantity they needed exists, but couldn't be described in their notation" What is this about? Sounds interesting.

"Statisticians just said "oh, that function" and gave it a new name." What is this?

I never understood that there is a relationship between quadratic equations and some kind of underlying mathematical, geometric symmetry. Is there a good intro to this? I only memorized how to solve them.

And the existential question. Is there a good way to teach this stuff?


> I never understood that there is a relationship between quadratic equations and some kind of underlying mathematical, geometric symmetry.

In a polynomial equation, the coefficients can be written as symmetric functions of the roots: https://en.wikipedia.org/wiki/Vieta%27s_formulas - symmetric means it doesn't matter how you label the roots, because it would not make sense if you could say "r1 is 3, r2 is 7" and get a different set of coefficients compared to "r1 is 7, r2 is 3".

Since the coefficients are symmetric functions of the roots, you can't write the roots as a function of the coefficients: there's no way to break that symmetry. This is where root extraction comes in - it's not a function. A function has to return one answer for a given input, but root extraction gives you n answers for the nth root of a given input. So that's how we're able to "choose" roots.

Consider the expression (r1 - r2) for a quadratic equation. That's not symmetric (the answer depends on which root we label as r1 and which as r2), so we can't write that expression as a function of the coefficients. But what about (r1 - r2)^2? That expression IS symmetric - you get the same answer regardless of how you label the roots. If we expand it out we get r1^2 - 2r1r2 + r2^2, which is symmetric, so we can write it as a function of the coefficients. So we've come up with an expression whose square root depends on the way we've labeled the roots. Using Vieta's formulas you can show it's b^2 - 4c for the monic quadratic x^2 + bx + c, which you might recognize from the quadratic formula.
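
To make that concrete, a quick sanity check in sympy (just a toy sketch): for the monic quadratic x^2 + bx + c with roots r1 and r2, the symmetric expression (r1 - r2)^2 really does come out to b^2 - 4c.

  from sympy import symbols, expand, simplify

  r1, r2 = symbols('r1 r2')
  b = -(r1 + r2)                # Vieta: x^2 + bx + c = (x - r1)(x - r2)
  c = r1 * r2
  disc = expand((r1 - r2)**2)   # r1**2 - 2*r1*r2 + r2**2, symmetric in the roots
  print(simplify(disc - (b**2 - 4*c)))   # 0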

Galois theory is used to show that root extraction can only break certain types of symmetries, and that fifth degree polynomials can exhibit root symmetries that are not breakable by radicals.


> What is this [Greek notation] about?

The Greeks' "notation" was a diagram full of points labeled by letters (Α, Β, Γ, ...) with various lines connecting them, and a list of steps to perform with an unmarked ruler and a compass, some of which added new points. But those tools alone can't be used to describe cube roots of arbitrary numbers (or, equivalently, trisections of arbitrary angles).

> What is this [statisticians' function]?

The integral of the bell curve (normal distribution) is called its cumulative distribution function (CDF). The CDF of the normal distribution is closely related to a special function called the "error function" erf(x).
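
Concretely, a minimal sketch (using Python's math.erf, which ships in the standard library): the standard normal CDF Φ can be written directly in terms of erf.

  import math

  def norm_cdf(x):
      # Phi(x) = (1 + erf(x / sqrt(2))) / 2 for the standard normal distribution
      return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

  print(norm_cdf(0.0))   # 0.5
  print(norm_cdf(1.0))   # ~0.8413, the familiar one-sigma percentile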

> Is there a good intro to [the symmetry interpretation of the quadratic formula]?

There's some discussion at Wikipedia's article about the quadratic formula: https://en.wikipedia.org/wiki/Quadratic_formula#By_Lagrange_...


What is so unbelievably frustrating about math education is that these interesting questions are not even hinted at until far, far down the line (and before anyone assumes I was educated outside the US: I wasn't).

I avoided math like the plague until my PhD program. Real analysis was a program requirement so I had to quickly teach myself calculus and get up to speed—and I found I really, really liked it. These high level questions are just so interesting and beyond the rote calculation I thought math was.

I hope I can give my daughter a glimpse of the interesting parts before the school system manages to kill her interest altogether (and I would welcome tips to that end if anyone has them).


Reminds me of "A Mathematician's Lament" [0]. I'd prefer to link a non-PDF copy, but nothing came up with a casual search.

[0] https://en.wikipedia.org/wiki/A_Mathematician's_Lament


Geometric proofs are really accessible. You don't need any algebra to prove Pythagoras' theorem, or that the sum of the inner angles of a triangle is 180 degrees, for example. Compass and straight-edge construction of simple figures is also fun.


> However, I have questions: "Turns out the quantity they needed exists, but couldn't be described in their notation" What is this about? Sounds interesting.

There are hierarchies of numbers (quantities) in mathematics, just as there are hierarchies of patterns (formal languages) in computer science, based on how difficult these objects are to describe. The most widely accepted hierarchy is actually the same in math and CS: rational, algebraic, transcendental.

In math, a rational number is one that can be described by dividing two integers. In CS, a rational pattern is one that can be described by a regular expression (regex). This is still "division": Even when we can't do 1-x or 1/x, we can recognize the pattern 1/(1-x) = 1 + x + x^2 + x^3... as "zero or more occurrences of x", written in a regex as x*.
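
A throwaway illustration of the regex side (just a sketch):

  import re

  pattern = re.compile(r"x*")    # "zero or more occurrences of x"
  for s in ["", "x", "xxx", "xy"]:
      print(repr(s), bool(pattern.fullmatch(s)))
  # '', 'x', 'xxx' match; 'xy' does not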

In math, an algebraic number is one that can be found as a root of a polynomial with integer coefficients. The square root of 2 is the poster child, solving x^2 - 2 = 0, and "baby's first proof" in mathematics is showing that it is not a fraction of two integers.

In CS, an algebraic pattern is one that can be described using a stack machine. Correctly nested parentheses (()(())) is the poster child here; we throw plates on a stack to keep track of how deep we are. The grammars of most programming languages are algebraic: If the square root of math is like nested parentheses, then roots of higher degree polynomials are like more complicated nested expressions such as "if then else" statements. One needs lots of colors of plates, but same idea.
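
A minimal sketch of the plate-stacking idea, using a single color of plate so a counter suffices for the stack:

  def balanced(s):
      depth = 0                  # how many plates are on the stack
      for ch in s:
          if ch == '(':
              depth += 1
          elif ch == ')':
              depth -= 1
              if depth < 0:      # tried to close a parenthesis that was never opened
                  return False
      return depth == 0          # every opened parenthesis was closed

  print(balanced("(()(()))"))    # True
  print(balanced("(()"))         # False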

In math, everything else (e, Pi, ...) is called transcendental. CS has more grades of eggs, but same idea.

One way to organize this is to take a number x and look at all expressions combining powers of x. If x^3 = 2, or more generally if x is the root of any polynomial, then the list of powers wraps around on itself, and one is looking at a finite dimensional space of expressions. If x is transcendental, then the space of expressions is infinite.
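
For example, if x^3 = 2, every higher power of x folds back into the span of 1, x, x^2. A quick sympy check (a toy illustration):

  from sympy import symbols, rem

  x = symbols('x')
  for k in range(3, 7):
      print(k, rem(x**k, x**3 - 2, x))
  # x^3 -> 2, x^4 -> 2*x, x^5 -> 2*x^2, x^6 -> 4: a 3-dimensional space of expressions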

So where were the Greeks in all this? Figuring out where two lines meet is linear algebra, but figuring out where a line meets a circle uses the quadratic formula, square roots. It turns out that their methods could reach some but not all algebraic numbers. They knew how to repeatedly double the dimension of the space of expressions they were looking at, but, for example, they couldn't triple this space. The cube root of 2 is one of the simplest numbers beyond their reach. And "squaring the circle"? Yup, Pi is transcendental. Way out of their reach.
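
To see the cube root of 2 slip through their fingers in modern terms (again just an illustration): each ruler-and-compass step can only double the dimension of the expression space, so a constructible number lives in a space whose dimension is a power of 2, while the cube root of 2 needs dimension 3.

  from sympy import minimal_polynomial, Rational, symbols

  x = symbols('x')
  print(minimal_polynomial(2**Rational(1, 2), x))   # x**2 - 2: dimension 2, reachable
  print(minimal_polynomial(2**Rational(1, 3), x))   # x**3 - 2: dimension 3, not a power of 2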

When you have a hammer you see nails. When you have a circle you see doubling.

Yes, this is all Galois theory.


Can you tell if you flip -1 and 1? Or is just the i and -i axis special?


You can: the equation x^2 = x holds for 1, but not for -1, so you can separate them. There is no way to write an equation without mentioning i (excluding cheating with Im, which again can't be defined without knowing i) that holds for i, but not -i.


Yes. I was surprised there was no mention of "false sharing".

https://en.wikipedia.org/wiki/False_sharing

Rather than incrementing each counter by one, dither the counters to reduce cache conflicts? So what if the dequeue becomes a bit fuzzy. Make the queue a bit longer, so everyone survives at least as long as they would have survived before.

Or simply use a prime-length queue, and change what +1 means, so one's stride is longer than the cache-conflict concern. Any stride will generate the full cyclic group, for a prime.
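
A quick check of that cyclic-group claim (a toy sketch with a made-up queue length, not the article's code):

  p = 127                        # hypothetical prime queue length
  for stride in (1, 8, 64):      # e.g. a stride big enough to skip past a shared cache line
      visited = {(i * stride) % p for i in range(p)}
      print(stride, len(visited) == p)   # True: every nonzero stride visits all p slots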


My understanding is that there is no global ordering anyway: allocation of a node has a total order, but actually writing data to a node can be arbitrarily delayed. At this point might as well use separate queues for each writer.

edit: the author claims the algorithm is linearizable, so I'll keep reading the article.

edit2: Right, the writer acquires a ticket and, when done, attempts to set the done flag on the node. The reader acquires a ticket and attempts to consume the node. On failure the writer will retry. Even if you have one producer and one consumer, might you not, theoretically, end up in a scenario where the writer never makes progress? I guess since technically no node was produced, this is still technically lock-free, but still... ... oh, I should have read further: the single-writer scenario (even in the dynamic case) is special-cased! It will still produce the node.

Nice, but it seems exceedingly complex.


Using AI to write good code faster is hard work.

I once toured a dairy farm that had been a pioneer test site for Lasix. Like all good hippies, everyone I knew shunned additives. This farmer claimed that Lasix wasn't a cheat because it only worked on really healthy cows. Best practices, and then add Lasix.

I nearly dropped out of Harvard's mathematics PhD program. Sticking around and finishing a thesis was the hardest thing I've ever done. It didn't take smarts. It took being the kind of person who doesn't die on a mountain.

There's a legendary Philadelphia cook who does pop-up meals, and keeps talking about the restaurant he plans to open. Professional chefs roll their eyes; being a good cook is a small part of the enterprise of engineering a successful restaurant.

(These are three stool legs. Neurodivergents have an advantage using AI. A stool is more stable when its legs are further apart. AI is an association engine. Humans find my sense of analogy tedious, but spreading out analogies defines more accurate planes in AI's association space. One doesn't simply "tell AI what to do".)

Learning how to use AI effectively was the hardest thing I've done recently, many brutal months of experiment, test projects with a dozen languages. One maintains several levels of planning, as if a corporate CTO. One tears apart all code in many iterations of code review. Just as a genius manager makes best use of flawed human talent, one learns to make best use of flawed AI talent.

My guess is that programmers who write bad code with AI were already writing bad code before AI.

Best practices, and then add AI.


When I was a senior at Swarthmore College, Herb Wilf came over from U Penn to teach a course in combinatorial algorithms. I was encouraged to attend.

He claimed that choosing a subset of k integers at random from {1..n} should have a log in its complexity, because one needs to sort to detect duplicates. I realized that if one divided {1..n} into k bins, one could detect duplicates within each bin, for a linear algorithm. I chose bubble sort because the average occupancy was 1, so bubble sort gave the best constant.
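
In modern terms the idea is roughly this (a hedged Python sketch, not the original FORTRAN): throw each draw into one of k bins and check for duplicates only within a bin, whose expected occupancy is about 1.

  import random

  def sample_distinct(n, k):
      # choose k distinct integers from 1..n; the bins act as a crude hash table
      bins = [[] for _ in range(k)]
      chosen = 0
      while chosen < k:
          x = random.randint(1, n)
          b = (x - 1) * k // n           # which bin this value lands in
          if x not in bins[b]:           # duplicate check scans ~1 element on average
              bins[b].append(x)
              chosen += 1
      return [x for bin_ in bins for x in bin_]

  print(sorted(sample_distinct(100, 10)))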

I described this algorithm to him around 5pm, end of his office hours as he was facing horrendous traffic home. I looked like George Harrison post-Beatles, and probably smelled of pot smoke. Understandably, he didn't recognize a future mathematician.

Around 10pm the dorm hall phone rang, one of my professors relaying an apology for brushing me off. He got it, and credited me with many ideas in the next edition of his book.

Of course, I eventually found all of this and more in Knuth's books. I was disillusioned, imagining that adults read everything. Later I came to understand that this was unrealistic.


Love anecdotes like this! But admittedly I feel a bit lost, so please forgive my ignorance when I ask: why does choosing a subset of k integers at random require deduplication? My naive intuition is that sampling without replacement can be done in linear time (hash table to track chosen elements?). I’m probably not understanding the problem formulation here.


your random number function might return the same number multiple times? So to choose k random but unique numbers you may have to call the random number function more than k times?

Of course my intuition would be that you can do a random shuffle and then take the first k, which is O(N). So I might be misunderstanding.


Is there an O(n) shuffling algorithm? In place, I don't think so.


Um, the "Knuth Shuffle" aka "Fisher-Yates" ?

https://en.wikipedia.org/wiki/Fisher%E2%80%93Yates_shuffle
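
The whole thing fits in a few lines (a quick sketch):

  import random

  def fisher_yates(a):
      # in-place O(n) shuffle: swap each position with a uniformly chosen earlier-or-equal one
      for i in range(len(a) - 1, 0, -1):
          j = random.randint(0, i)
          a[i], a[j] = a[j], a[i]
      return a

  print(fisher_yates(list(range(10))))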


You can do that for O(N), but the problem can be solved in O(k).
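
One way to get O(k) (a sketch; not necessarily what the parent had in mind): run only the first k steps of a Fisher-Yates shuffle, keeping the displaced entries in a dict instead of materializing the whole array.

  import random

  def sample_k(n, k):
      # first k steps of Fisher-Yates over 0..n-1: O(k) time and space
      swapped = {}                        # sparse view of the partially shuffled array
      out = []
      for i in range(k):
          j = random.randint(i, n - 1)
          out.append(swapped.get(j, j))   # the value currently sitting at position j
          swapped[j] = swapped.get(i, i)  # move position i's value into slot j
      return out

  print(sample_k(1_000_000, 5))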


Interesting, when was this? Because that seems obvious today - basically a hash set, right?


1977. And I didn't know what a hash table was, though I can't explain now why they didn't think of using a hash table. I was effectively using a dumb hash function.

Their 1978 second edition works in exactly the memory needed to store the answer, by simulating my algorithm in a first pass but only saving the occupancy counts.

Oh, and thanks (I guess). I really didn't expect to ever be reading FORTRAN code again. One learned to program at Swarthmore that year by punching cards, crashing our IBM 1130, and bringing the printout to the shift I supervised. I'd find the square brackets and explain how you'd overwritten your array. I even helped an economics professor, Frederic Pryor (the grad student in the "Bridge of Spies" cold war spy swap), this way, when I made an ill-advised visit to the computer center on a Saturday night. Apparently I could still find square brackets.


I always book directly with hotels, airlines, and auto rental agencies. One gains privileges by cutting out the middleman.

For example, one can generally check out early. We had followed a hotel recommendation from locals in Nagoya, and found ourselves in a stodgy "classic" hotel. We were able to pivot to possibly the nicest corner suite in the entire city, at a steep last-minute discount.

I did get trapped once, not realizing that my Hiroshima hotel became nonrefundable several days before check-in. With a phone call they moved my reservation to a few days later as a courtesy. The web page then let me cancel.


My gut, reading the HN title, was "half the time". I had to read the article, to see how many words piled up before he said that.

Of course, considering finite fields of prime power order, one might leap to the conclusion "a quarter of the time". One can adjoin "i" for prime powers p^n for half the primes and odd n.

Alas, this be wrong, for an amusing reason.


Why is that? My guess would be that you could adjoin an i all the time to the p^n field and get the p^2n field, as long as you had p = 4k + 3. But that's admittedly based on approximately zero thinking.

EDIT: Looking things up indicates that if n is even, there's already a square root of -1 in the field, so we can't add another. So now I believe the 1/4 of the time thing you mentioned, and can't see how that's wrong.


Spitballing here, but I suspect it's a density thing. If you are considering all prime powers up to some bound N, then the density of prime powers (edit: of size p^n with n > 1) approaches 0 as N tends to infinity. So rather than things being 1/4 like our intuition says, it should unintuitively be 1/2. I haven't given this much thought, but I suspect this based on checking some examples in Sage.


Oh, so just a probability density thing where we sample q and check if it's p^n (retrying if not) rather than sampling p and n separately and computing q = p^n? I guess that's probably what they were going for, yeah.


exactly


"Other than returning the four parts for a refund (which we did) and documenting this behavior here, our only other recourse was to guarantee that these four specific parts were never sold as new again:"

Alas, one can completely remove Sharpie writing from metal with 99% isopropyl alcohol. Did they make a better choice? This looks like Sharpie writing to me.


You can also remove Sharpie writing using Sharpies and a wet wipe - write over and wipe while it's still wet. The dry pigment will dissolve in the solvent in the fresh ink.


My favourite trick is to write over Sharpie with a dry erase marker and erase it all.


They can do it but they likely won't bother at scale.



One spray of brake cleaner and it's gone. Cheaper to use a carbide scribe or simply get a shard of glass or ceramic and physically scribe into the drive. Now they're permanently used drives, no debate is possible.


But are they genuine and unused?!


That’s your takeaway?


Yep. They should've engraved it.


The classic is two guys on a moped in Marseilles. The passenger cuts a pedestrian's purse strap (or iPhone strap) and they vanish.

One could embed an invisible security cable, but then...


To decide on a language for a math research project, I implemented a toy problem in many languages: What is the distribution of cycle counts for signed permutations on n letters? Use all cores in parallel.

  language  score     time
  C++         100   19.57s
  Rust         96   20.40s
  F#           95   20.52s
  Nim          75   26.04s
  Julia        64   30.40s
  OCaml        48   41.07s
  Haskell      41   47.64s
  Chez         39   49.53s
  Swift        33   58.46s
  Lean          7  278.88s

  Tarjan, n = 10
  Nyx - Apple M4 Max - 12 performance and 4 efficiency cores
  n! * 2^n = 3,715,891,200 signed permutations
  score = gops normalized so best language averages 100
  time = running time in seconds
This had me briefly smitten with F#, till I realized the extent to which rusty .NET bed springs were poking through. Same as the JVM and Clojure, or Erlang and Elixir. The F# JIT compiler is nevertheless pretty amazing.

I nearly settled on OCaml. After AI warned me that proper work-stealing parallelism is a massive, sophisticated project to code properly, the 40 lines of OCaml I wrote that beat the available libraries are my favorite code file in years.

Nevertheless, once one understands lazy evaluation in Haskell, it's hard to use any other language. The log slowdown for conventional use of a functional data structure becomes a linear speedup once one exploits persistence.


F# is a nice language, but it depends on .NET, so a direct comparison to OCaml is not possible. The .NET runtime dependency often makes things unnecessarily complex and fragile. There must be a reason why Golang (no VM runtime) is successful despite entering the market very late after Java and C# had already established clear dominance.


Easier deployment helped, but nowadays it's common to put a single Go executable in a container anyway. .NET Core doesn't have much, if any, disadvantage there.

Go was made by Google: flawed, but with clarity of purpose and consistently usable libraries.

.NET spent years forked into Framework and Core, with an unclear open-source future.

Java struggled to scale down, and Oracle caused distrust similar to Microsoft's.


While I agree with most points, .NET or JRE introduce additional complexity that may not be welcome in every setting. I've worked for several companies where upgrading to a newer version of .NET was always a major pain point, especially for applications that weren't very actively developed. There is limited security support for each major .NET version, so at some point, you and your customers need to upgrade. Such problems are non-existent if you don't have a separate VM runtime dependency.


> I've worked for several companies where upgrading to a newer version of .NET was always a major pain point,

This was so difficult for the teams I work with that the person who became the team specialist in MSBuild, dotnet, and C# basically punched his ticket and received a couple of promotions. The work is so miserable and challenging that basically no one else will take it.

I've struggled to understand why .NET upgrades are so painful and laborious despite the maturity of the languages and ecosystems. I had assumed that, as with PowerShell, C# and .NET behind the scenes are undergoing enormous changes as Microsoft attempts to improve or add support for historically neglected platforms and hardware, and so I had assumed also that these were temporary problems, just badly managed growing pains on a path to major improvement, but I gather from your comment that the problem has a long history and is systemic. Awful.

MSBuild has been a huge source of the suffering I've experienced while working on these projects. The documentation is, in my experience, a nightmare: exhaustingly verbose, full of jargon, and uncanny in its ability to talk around an issue, concept, or functionality without clarifying anything conceptually or practically. I find this throughout Microsoft docs.


Upgrading a modern .NET (Core) version often means changing just the versions in the project file. Upgrading from the old .NET Framework is a bit more challenging, but we have also successfully upgraded a couple of Framework projects to modern .NET without too many issues.


Huh, maybe I misunderstood the nature of the .net work the guy I mentioned was doing. All I know directly is the pain of MSBuild itself. Thanks for the correction.


I don’t think that has anything to do with VMs; the same happens with C++ runtime libraries.

The difference is Go really focused on continuity and compatibility, while .NET didn’t.


> ...the same happens with C++ runtime libraries

I don't think so. Upgrading to a newer major version of the C++ runtime essentially involves recompiling, unless you're dealing with an application that's 15 years old or more. You can even instruct the compiler to use an older C++ standard if your code won't compile with the newer one. There is also the option to compile the runtime statically, though it is obviously not recommended.

.NET is a different story [1]

[1]: https://learn.microsoft.com/en-us/dotnet/core/compatibility/...


I would really like to read more about this!


F# would be awesome but for the dogs*t .NET backend that made the tiniest error insoluble.


What about D? Can you post any of the 10 source codes?


Interesting, would be a good blog post to read.


No Fortran?

