Lesser known but possibly more relevant to most HN readers are Feynman's lectures on computation - https://theswissbay.ch/pdf/Gentoomen%20Library/Extra/Richard... . There are some really great explanations in there of computability, information theory, entropy, thermodynamics, and more. Very little of it is outdated now.
“For our first seminar he invited John Hopfield, a friend of his from CalTech, to give us a talk on his scheme for building neural networks. In 1983, studying neural networks was about as fashionable as studying ESP, so some people considered John Hopfield a little bit crazy. Richard was certain he would fit right in at Thinking Machines Corporation.”
Interesting, he also talks about quantum computing (a first?): p. 191, "We now go on to consider how such a computer can also be built using the laws of quantum mechanics. We are going to write a Hamiltonian, for a system of interacting parts, which will behave in the same way as a large system in serving as a universal computer."
p. 196: "In general, in quantum mechanics, the outgoing state at time t is
eⁱᴴᵗ Ψᵢₙ where Ψᵢₙ is the input state, for a system with Hamiltonian H. To try to find, for a given special time t, the Hamiltonian which will produce M = eⁱᴴᵗ when M is such a product of non-commuting matrices, from some simple property of the matrices themselves, appears to be very difficult.
We realize, however, that at any particular time, if we expand eⁱᴴᵗ out (as 1 + iHt − H²t²⁄2 + …) we'll find the operator H operating an innumerable arbitrary number of times — once, twice, three times, and so forth — and the total state is generated by a superposition of these possibilities. This suggests that we can solve this problem of the composition of these A’s in the following way..."
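A quick numerical sanity check of that expansion (my own toy example using numpy/scipy, nothing from the book): partial sums of (1 + iHt − H²t²/2 + …) applied to an input state converge to eⁱᴴᵗ Ψᵢₙ.

    import numpy as np
    from scipy.linalg import expm

    H = np.array([[0.0, 1.0], [1.0, 0.0]])        # toy 2x2 Hamiltonian (Pauli X)
    t = 0.7
    psi_in = np.array([1.0, 0.0], dtype=complex)  # input state

    exact = expm(1j * H * t) @ psi_in             # e^{iHt} applied directly

    # Accumulate the series term by term: the k-th term is (iHt)^k / k! |psi_in>
    series = np.zeros(2, dtype=complex)
    term = psi_in.copy()
    for k in range(30):
        series += term
        term = (1j * t / (k + 1)) * (H @ term)

    print(np.allclose(exact, series))             # True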
Feynman is indeed often quoted among the first people to propose the idea of a quantum computer! This talk he gave in ‘81 is among the earliest discussions of why simulating a quantum universe requires a quantum computer [1]:
> Can a quantum system be probabilistically simulated by a classical (probabilistic, I'd assume) universal computer? In other words, a computer which will give the same probabilities as the quantum system does. If you take the computer to be the classical kind I've described so far, (not the quantum kind described in the last section) and there're no changes in any laws, and there's no hocus-pocus, the answer is certainly, No! This is called the hidden-variable problem: it is impossible to represent the results of quantum mechanics with a classical universal device.
Another unique lecture is a 1959 one [2] about the potential of nanotechnology (not yet a real field back then). He speaks of directly manipulating atoms and building angstrom-scale engines and microscopes from a highly unusual perspective, which is extremely fascinating for anyone curious about these things and their history. Even by Feynman’s standards, this was a unique mix of topics and terminology. For context, the structure of DNA had been discovered only about six years prior, and the first instruments capable of atomic imaging and manipulation didn’t arrive until the 1980s.
If you’re as captivated by this last one as I was, I can also recommend Greg Bear’s novel “Blood Music”. It doesn’t explore the nanotechnology side much, but the main hook is biological cells as computers. It gets very crazy from there.
If you're into atomic physics and getting a feel for the intricate structure of the basic processes, the best find I had recently is this MIT course by Wolfgang Ketterle. The first lecture is an informal overview in which he gives vivid, detailed descriptions of the phenomena they can now create and control: why we see different kinds of things happening at very low temperatures (the atoms move past each other so slowly that their wavefunctions have time to overlap and interact), how intersecting lasers create arrays of dimples in the electromagnetic field that draw in and hold single atoms, and so on. It gives a more tangible insight into the quantum aspects of matter, which can otherwise seem inscrutable.
The quote is not suggesting a quantum computer can’t be simulated classically. It can be, in fact, just slowly: you keep track of the full quantum state, where n qubits means 2^n complex amplitudes.
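To make that concrete, here is a minimal brute-force sketch (my own, assuming numpy, not from the talk) of such a state-vector simulation. Every gate is a small matrix swept across a vector of 2^n amplitudes, so each extra qubit doubles the memory and work:

    import numpy as np

    def apply_single_qubit_gate(state, gate, target, n):
        # Reshape the 2^n vector into n binary axes, contract the 2x2 gate
        # with the target axis, then restore the original axis order.
        psi = state.reshape([2] * n)
        psi = np.tensordot(gate, psi, axes=([1], [target]))
        psi = np.moveaxis(psi, 0, target)
        return psi.reshape(2 ** n)

    n = 20                                         # already 2^20 ~ 10^6 amplitudes
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                                 # |00...0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    for q in range(n):                             # uniform superposition over 2^n states
        state = apply_single_qubit_gate(state, H, q, n)
    print(abs(state[0]) ** 2)                      # each amplitude has probability 2^-n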
It relates more to the Bell results: there doesn’t exist a local hidden-variable theory that’s equivalent to QM.
“There’s Plenty of Room at the Bottom” only really took off in popularity decades later. Feynman’s accomplishments are undeniable, Nobel Prize and all, but his celebrity status owes more to other aspects of his personality. No living Feynman equivalent comes to mind today. Perhaps Geoffrey Hinton and his views on the risk of AGI? He’s far from the only one, of course.
The Feynman lectures are obviously brilliant, but I think the computation lectures are probably a better display of Feynman's brilliance. It's quite stunning how up to date they are.
That being said, the rough outline of a field is usually worked out almost immediately after a consensus forms that it's "real", so to speak.
The theory of computation hasn't changed a whole lot since then, and Feynman explains it very well to a lay audience (which is what makes it great; it's not filled with jargon).
I feel like the section on primality testing with Fermat's test should at least give a shout-out to Carmichael numbers, where the probability of a false positive is 1 for every base coprime to the input.
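For the curious, here's a quick demo (mine, not from the book) using the smallest Carmichael number, 561 = 3 * 11 * 17: every base coprime to it passes the Fermat test.

    import math

    def fermat_passes(n, a):
        # One round of the Fermat test: True means "no evidence n is composite".
        return pow(a, n - 1, n) == 1

    n = 561  # composite (3 * 11 * 17), yet a Fermat pseudoprime to every coprime base
    coprime_bases = [a for a in range(2, n - 1) if math.gcd(a, n) == 1]
    print(all(fermat_passes(n, a) for a in coprime_bases))  # True: they all lie

The only bases that expose 561 share a factor with it, and for large Carmichael numbers such bases are vanishingly rare, which is why stronger tests like Miller-Rabin are preferred.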