
Doesn't that imply our theories are "good enough" for all practical purposes, if they're impossible to empirically disprove?



Yes, for all practical purposes. This is the position of physicist Sean Carroll and probably others. We may not know what is happening in the middle of a black hole, or very close to the big bang, but here on Earth we do.

"in the specific regime covering the particles and forces that make up human beings and their environments, we have good reason to think that all of the ingredients and their dynamics are understood to extremely high precision"[0]

0: https://philpapers.org/archive/CARCAT-33


ER=EPR says something completely shocking about the nature of the universe. If there is anything to it, we have almost no clue about how it works or what its consequences are.

Sean Carroll's own favorite topics (emergent gravity, and the many worlds interpretation) are also things that we don't have any clue about.

Yes there is stuff we can calculate to very high precision. Being able to calculate it, and understanding it, are not necessarily the same thing.


Typically whenever you look closely at an object with complex behavior, there is a system inside made of smaller, simpler objects interacting to produce the complexity.

You'd expect that at the bottom, the smallest objects would be extremely simple and would follow some single physical law.

But the smallest objects we know of still have pretty complex behavior! So there's probably another layer underneath that we don't know about yet, maybe more than one.


I agree, and I think that your claim is compatible with the comment that you are responding to. Indeed, perhaps it's turtles all the way down and there is systematic complexity upon systematic complexity governing our universe that humanity has been just too limited to experience.

For a historical analogy, classical physics was and is sufficient for most practical purposes, and we didn't need relativity or quantum mechanics until we had instruments that could manipulate them, or that at least experienced them. While I guess there were still macroscopic quantum phenomena, perhaps they could have been treated simply as empirical material properties, without a systematic universal theory accounting for them, so long as instruments were not precise enough to explore and exploit the predictions of a systematic theory.


The experiments that led to the invention of quantum theory are relatively simple and involve objects you can touch with your bare hands without damaging them. Some are done in high school, e.g. the photoelectric effect.
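To make the high-school example concrete, here is a quick sketch of the photoelectric threshold; the sodium work function below is a typical quoted value, used here as an illustrative assumption:

```python
# Photoelectric effect: a photon ejects an electron only if its energy
# E = h*c/lambda exceeds the metal's work function, regardless of intensity.
HC_EV_NM = 1239.84      # h*c in eV*nm (standard value)
phi_sodium = 2.28       # work function of sodium, eV (typical quoted value)

threshold_nm = HC_EV_NM / phi_sodium
print(f"threshold wavelength: {threshold_nm:.0f} nm")
# ~544 nm: green light ejects electrons from sodium, while red light (~650 nm)
# never does, no matter how bright -- the classically surprising part.
```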

While I did hedge my point regarding macroscopic quantum phenomena, I think that the quantum nature of the photoelectric effect would have been harder to discern without modern access to single-wavelength lighting. You could still rely on precise optics to separate mixed light, I suppose, but without even those optics it would be harder still.

All the 19th century experiments that required monochromatic light, including those that characterized the photoelectric effect, used dispersive prisms, which separated the light of the Sun or of a candle into its monochromatic components. These are simple components, easily available.

This allowed experiments where the frequency of light was varied continuously, by rotating the prism.

Moreover, already during the first half of the 19th century, it became known that using gas-discharge lamps with various gases or by heating certain substances in a flame you can obtain monochromatic light corresponding to certain spectral lines specific to each substance. This allowed experiments where the wavelength of the light used in them was known with high accuracy.

Already in 1827, Jacques Babinet proposed replacing the platinum meter standard with the wavelength of some spectral line as the base for the unit of length. This proposal was later developed and refined by Maxwell, who in 1870 proposed using both the wavelength and the period of some spectral line for the units of length and time. Babinet's proposal was adopted into the SI in 1960, 133 years later, while Maxwell's was adopted in 1983, 113 years later.
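For a concrete sense of the 1960 definition, a one-line sketch; the wavelength follows directly from the defined count:

```python
# The 1960 SI definition realized Babinet's idea: one meter was defined as
# exactly 1,650,763.73 vacuum wavelengths of an orange Kr-86 spectral line.
N_WAVELENGTHS = 1650763.73
wavelength_m = 1.0 / N_WAVELENGTHS       # the line's wavelength in meters
print(f"Kr-86 standard line: {wavelength_m * 1e9:.2f} nm")   # ~605.78 nm
```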

So there were no serious difficulties in the 19th century in using monochromatic light. The most important difficulty was that the sources of monochromatic light had very low intensities in comparison with the lasers available today. The low-intensity problem was aggravated when coherent light was needed, as that could be obtained only by splitting the already weak light beam that was available. Lasers provide not only high intensity but also coherent light, so they greatly simplify experiments.


> You'd expect that at the bottom, the smallest objects would be extremely simple and would follow some single physical law.

That presupposes that there's a bottom, and that each subsequent layer gets simpler. Neither proposition is guaranteed; indeed, the latter seems incorrect, since the quantum chromodynamics governing the internal structure of the proton is much more complex than the interactions governing its external behavior.


Yeah that's the outcome theorized by Gödel.

Incompleteness is inherent to our understanding as the universe is too vast and endless for us to ever capture a holistic model of all the variables.

Gödel says something specific about human axiomatic systems, akin to a "special relativity" of the idea, but it generalizes to physical reality too: a written system is made physical by writing it out, and is never complete. That demonstrates that our grasp of physical systems themselves is always incomplete.


Gödel’s incompleteness says almost nothing about this. I wish people wouldn’t try to apply it in ways that it very clearly is not applicable to.

An environment living in Conway’s Game of Life could be quite capable of hypothesizing that it is implemented in Conway’s Game of Life.


That's not what they were saying.

Systems can hypothesize about themselves but they cannot determine why the rules they can learn exist in the first place. Prior states are no longer observable so there is always incomplete history.

Conway's Game of Life can't explain its own origins, only itself, because the origins are no longer observable after they occur.

What are the origins of our universe? We can only guess without the specificity of direct observation. Understanding is incomplete with only simulation and theory.

So the comment is right. We would expect to be able to define what is now but not completely know what came before.


If they had a correct point, the appeal to Gödel’s results did not help to justify it.

https://news.ycombinator.com/item?id=46955821

"The universe is not required to appeal to your aesthetic tastes."


Indeed, as I think I commented before here, this kind of self-reference is exactly what makes Gödel's proof work.

Now the question is: are we in Conway's Game of Life?

The fundamental theories are good enough in that we can't find a counterexample, but they're only useful up to a certain scale before the computational power needed is infeasible. We're still hoping to find higher-level emergent theories to describe larger systems. By analogy, in principle you could use Newton's laws of motion (1687) to predict what a gas in a room is going to do, or how fluid will flow in a pipe, but in practice it's intractable and we prefer to use the higher-level language of fluid mechanics: the ideal gas law, the Navier-Stokes equations, etc.
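As a sketch of why the higher-level language wins (the room dimensions here are made up):

```python
# Ideal gas law PV = nRT: one line of algebra replaces tracking ~1e27
# Newtonian trajectories for the air in an ordinary room.
R = 8.314                  # gas constant, J/(mol*K)
N_A = 6.02214076e23        # Avogadro's number
P, V, T = 101325.0, 50.0, 293.0   # 1 atm, a 50 m^3 room, ~20 C

n = P * V / (R * T)        # amount of gas in moles
molecules = n * N_A
print(f"{n:.0f} mol of air, ~{molecules:.2e} molecules")
# Integrating Newton's equations for each molecule is hopelessly intractable;
# the emergent law answers the macroscopic question directly.
```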

If I had to guess, we are at a pre-Copernicus stage in particle physics.

We are finding local maxima (induction), but the establishment cannot handle deduction.

Everything is an overly complex band-aid. At some point someone will find something elegant that predicts 70% as well, and at some point we will realize: 'Oh, that's great, the sun actually is at the center of the solar system; Copernicus was only slightly wrong in thinking the planets move in circles. We just needed to use ellipses!'

But with particles.


The sun is not at the center of the solar system. The intellectual leap was not to replace the earth with the sun; the earth does not simply "revolve around the sun". The intellectual leap was to realize that the situation is somewhat symmetric: the two bodies attract each other, and they orbit around their common center of gravity (which, yes, is inside the sun, but not because the sun is the center).

This sounds like a distinction without consequence, but I think that's wrong. The sun is not special; it just has a lot of mass. If somebody learns "the earth orbits the sun", they can't understand how two black holes can orbit each other. If somebody learns "the sun and the earth orbit their common center of mass", they will be able to understand that.
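A quick numerical check of the center-of-mass picture, using standard values:

```python
# Both the Sun and the Earth orbit their common center of mass (barycenter).
m_sun = 1.989e30       # kg
m_earth = 5.972e24     # kg
d = 1.496e11           # mean Sun-Earth distance, m
r_sun = 6.957e8        # solar radius, m

# Distance of the barycenter from the Sun's center:
r_bary = d * m_earth / (m_sun + m_earth)
print(f"barycenter: {r_bary/1e3:.0f} km from the Sun's center")   # ~449 km
print(f"inside the Sun: {r_bary < r_sun}")                        # True
# The same formula handles two equal-mass black holes: the barycenter then
# sits midway between them, and neither "orbits" the other.
```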


Classical physics was indeed "good enough for all practical purposes" at the time as well... but those purposes didn't include electronics, nuclear power, almost all basic understanding of materials, chemistry, and a tremendous number of other things.

The point being, it's not at all clear what we might be missing without these impractical little mysteries that so far are very distant from everyday life.


The point is not to make better predictions of the things we already know how to predict. The point is to determine what abstractions link the things we don't presently understand--because these abstractions tend to open many new doors in other directions. This has been the story of physics over and over: relativity, quantum theory, etc., not only answered the questions they were designed to answer but opened thousands of new doors in other directions.

Maybe? We seem to be able to characterize all the stuff we have access to. That doesn't mean we couldn't, say, produce new and interesting materials with new knowledge. Before we knew about nuclear fission, we couldn't have predicted that anything would happen with a big chunk of uranium, or foreseen its useful applications. New physics might be quite subtle or specific but still useful.

All the stuff we have access to?

There isn't even a general physical theory of window glass -- i.e. of how to resolve the Kauzmann paradox and define the nature of the glass transition. Glass is one of man's oldest materials, and yet it's still not understood.

There's also, famously, no general theory for superconducting materials, so superconductors are found via alchemical trial-and-error processes. (Quite famously a couple of years ago, if you remember that circus.)

Solid-state physics has a lot of big holes.


The existing theories are extremely far from being good enough for practical purposes.

There exists a huge number of fundamental quantities that should be calculated from the parameters of the "standard model", but we cannot compute them, we can only measure them experimentally.

For instance, the masses and magnetic moments of the proton, of the neutron and of all other hadrons, the masses and magnetic moments of the nuclei, the energy spectra of nuclei, of atoms, of ions, of molecules, and so on.

The "standard model" can compute only things of negligible practical importance, like the statistical properties of the particle collisions performed at the LHC.

It cannot compute anything of value for practical engineering. Semiconductor devices, lasers, and all other devices where quantum physics matters are not designed using any consistent theory of quantum physics. They are designed using models based on a great number of empirical parameters determined by measurement, for which quantum physics is only an inspiration for how the model should look, not a base from which the model can be derived rigorously.


This depends very much on what "practical purposes" are. For almost all conceivable technology, relativistic quantum mechanics for electrons and light, i.e. QED, is a sufficient fundamental theory. This is unlike the situation before quantum mechanics, when we basically didn't have fundamental laws for chemistry and solid-state physics.

The vast majority of useful things cannot be computed with QED from fundamental principles. You cannot compute even simple atomic energy spectra.

The fundamental laws of chemistry have not been changed much by quantum physics, they just became better understood and less mysterious. Quantum mechanics has explained various cases of unusual chemical bonds that appeared to contradict the simpler rules that were believed to be true before the development of quantum physics, but not much else has practical importance.

Solid-state physics is a much better example, because little of it existed before quantum physics.

Nevertheless, solid-state physics is also the most obvious example that the current quantum physics cannot be used to compute anything of practical value from first principles.

All of solid-state physics is based on experimentally measured parameters, which cannot be computed. All the mathematical models used in solid-state physics are based on guesses about how the solutions could behave, e.g. by introducing various fictitious averaged potentials into equations like the Schroedinger equation. They are not derived from primary laws; the only justification for the guesses is that once the model is completed with the experimentally measured values of its parameters, it makes reasonably accurate predictions.

Using empirical mathematical models of semiconductor materials, e.g. for designing transistors, is perfectly fine and entire industries have been developed with such empirical models.
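For a concrete example of such an empirical model, here is a sketch of the Shockley diode equation; the saturation current I_s and ideality factor n below are illustrative values that in practice come from measurement, not from first principles:

```python
import math

k_B = 1.380649e-23         # Boltzmann constant, J/K
q = 1.602176634e-19        # elementary charge, C
T = 300.0                  # temperature, K
V_T = k_B * T / q          # thermal voltage, ~25.85 mV at 300 K

def diode_current(V, I_s=1e-12, n=1.5):
    """Shockley diode equation; I_s and n are fitted to the device, not derived."""
    return I_s * (math.exp(V / (n * V_T)) - 1.0)

print(f"I(0.6 V) = {diode_current(0.6) * 1e6:.1f} uA")
```

The functional form is inspired by quantum statistics, but the parameters are calibrated per device family, which is exactly the pattern described above.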

However, the fact that one must develop custom empirical models for every kind of application, instead of being able to derive them from what are believed to be the universal laws of quantum physics, demonstrates that these are not good enough.

We can live and progress very well with what we have. But if someone discovered a better theory, or a mathematical strategy for obtaining solutions, that could compute the parameters we must now measure and could model everything we need with guarantees that the model is adequate, that would be a great advance in physics.


You seem to be familiar with the field, yet this is a very strange view? I work on exactly this slice of solid state physics and semiconductor devices. I’m not sure what you mean here.

The way we construct Hamiltonians is indeed somewhat ad hoc sometimes, but that’s not because of lack of fundamental knowledge. In fact, the only things you need are the mass of the electron/proton and the quantum of charge. Everything else is fully derived and justified, as far as I can think of. There’s really nothing other than the extremely low energy limit of QED in solid state devices, then it’s about scaling it up to many body systems which are computationally intractable but fully justified.

We don’t even use relativistic QM 95% of the time. Spin-orbit terms require it, but once you’ve derived the right coefficients (only needed once) you can drop the Dirac equation and go back to Schrödinger. The need for empirical models has nothing to do with fundamental physics, and everything to do with the exorbitant complexity of many-body systems. We don’t use QFT and the standard model just because, as far as I can tell, the computation would never scale. Not really a fault of the standard model.


> The fundamental laws of chemistry have not been changed much by quantum physics, they just became better understood and less mysterious. Quantum mechanics has explained various cases of unusual chemical bonds that appeared to contradict the simpler rules that were believed to be true before the development of quantum physics, but not much else has practical importance.

Um, false? The fundamentals of chemistry are about electron orbitals (especially the valence ones) and their interactions between atoms to form molecules. All of my college chemistry courses delved somewhat into quantum mechanics, with the biggest helping in organic chemistry. And modern computational chemistry is basically modeling QED as applied to atoms.


What are you talking about? The spectrum of hydrogen is very well understood and is a textbook example for students to calculate.

We use spectra to test QED calculations to something like 14 digits.


The hydrogenoid atoms and ions, with a single electron, are the exception that proves the rule, because anything more complex cannot be computed accurately.

The spectrum of hydrogen (ignoring the fine structure) could be computed with the empirical rules of Rydberg before the existence of quantum physics. Quantum physics has just explained it in terms of simpler assumptions.
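Those pre-quantum empirical rules are simple enough to sketch directly, using the standard Rydberg constant:

```python
# Rydberg's empirical formula (pre-quantum): 1/lambda = R_H * (1/n1^2 - 1/n2^2)
R_H = 1.0967758e7   # Rydberg constant for hydrogen, 1/m

def wavelength_nm(n1, n2):
    """Wavelength of the hydrogen line for the transition n2 -> n1."""
    return 1e9 / (R_H * (1.0 / n1**2 - 1.0 / n2**2))

# Balmer series (n1 = 2): the visible hydrogen lines
for n2 in (3, 4, 5):
    print(f"{n2} -> 2: {wavelength_nm(2, n2):.1f} nm")
# H-alpha ~656 nm, H-beta ~486 nm, H-gamma ~434 nm
```

Quantum mechanics later derived this formula from first principles, but the formula itself predates it.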

Quantum physics explains a great number of features of the atomic spectra, but it is unable to compute anything for complex atoms with an accuracy comparable with the experimental measurements.

The QED calculations with "14 digits" of precision are for things that are far simpler than atomic spectra, e.g. for the gyromagnetic ratio of the electron, and even for such things the computations are extremely difficult and error-prone.


> The hydrogenoid atoms and ions, with a single electron, are the exception that proves the rule, because anything more complex cannot be computed accurately.

Rather: there is no known closed-form solution (and there likely won't be any).


If you let the computer run for long enough, it will compute any atomic spectrum to arbitrary accuracy. Only QFT has divergent perturbation series, so at least in theory we expect these calculations to converge.

There’s an intrinsic physical limit to which you can resolve a spectrum, so arbitrarily many digits of precision aren’t exactly a worthy pursuit anyway.


Lattice QCD can by now actually calculate the masses of the proton and neutron from first principles pretty accurately.

This is of course a brute-force approach. We currently lack, in all fields, a theory of emergent properties, and the mass of the proton is definitely such a property.


There have been claims about this, starting with "Ab Initio Determination of Light Hadron Masses" (Science, 2008).

Nevertheless, until now I have not seen anything that qualifies as "computing the masses".

Research papers like that do not contain any information that would allow someone to verify their claims. Moreover, such papers are much more accurately described as "fitting the parameters of the Standard Model, such as quark masses, to approximately match the measured masses", and not as actually computing the masses.

The published results for hadron masses are not much more accurate than you could compute mentally, without using any QCD (much less lattice QCD), by estimating approximate quark masses from the quark composition of the hadrons and summing them. What complicates the mass computations is that while the heavy quarks have masses that do not vary much, the effective masses of the light quarks (especially u and d, which compose the protons and neutrons) vary a lot between different particles. Because of this, there is a very long way between a vague estimate of the mass and an accurate value.
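The mental estimate described above can be sketched explicitly; the constituent quark masses below are common textbook figures, not fundamental parameters:

```python
# "Constituent quark" estimate: sum effective quark masses per hadron.
m_u = 336.0   # effective mass of the u quark, MeV (textbook figure)
m_d = 340.0   # effective mass of the d quark, MeV (textbook figure)

proton_est = 2 * m_u + m_d    # proton = uud
neutron_est = m_u + 2 * m_d   # neutron = udd
print(f"proton:  ~{proton_est:.0f} MeV (measured 938.3 MeV)")
print(f"neutron: ~{neutron_est:.0f} MeV (measured 939.6 MeV)")
# Within ~8% with no QCD at all; closing the rest of the gap is the hard part.
```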


The theories don't answer all the questions we can ask, namely questions about how gravity behaves at the quantum scale. (These questions pop up when exploring extremely dense regions of space - the very early universe and black holes).

I think the problem is that GR and QFT are at odds with each other? (I am not quite versed in the subject and this is my high-level understanding of the “problem”)

They require space to be two different things. And we kind of expect to be able to quantize gravity, but none of the approaches that worked for the three other interactions work here.

Absolutely not. Newtonian physics was 'good enough' until we disproved it. Imagine where we would be if all we had was Newtonian physics.

You would still make it to the moon (so I've heard). Maybe you wouldn't have GPS systems?

Newtonian physics is good enough for almost everything that humans do. It's not good for predicting the shit we see in telescopes, and apparently it's not good for GPS, although honestly I think without general relativity, GPS would still get made but there'd be a fudge factor that people just shrug about.

For just about anything else, Newton has us covered.
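The GPS "fudge factor" is easy to estimate with standard constants; the orbital radius of ~26,560 km is an assumed typical value:

```python
import math

GM = 3.986004418e14     # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8        # speed of light, m/s
R_earth = 6.371e6       # mean Earth radius, m
r_sat = 2.656e7         # GPS orbital radius, m (assumed)
day = 86400.0

v = math.sqrt(GM / r_sat)                         # orbital speed, ~3.87 km/s
sr = -(v**2) / (2 * c**2) * day                   # time dilation: clock runs slow
gr = (GM / c**2) * (1/R_earth - 1/r_sat) * day    # gravitational blueshift: runs fast
net = sr + gr                                     # ~ +38 microseconds per day
print(f"SR {sr*1e6:+.1f} us/day, GR {gr*1e6:+.1f} us/day, net {net*1e6:+.1f} us/day")
print(f"uncorrected ranging error: ~{c * net / 1e3:.0f} km per day")
```

A drift on the order of 10 km per day is a big fudge factor to shrug about, which is why the satellite clocks are adjusted to compensate.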


Oh sure, nothing major. Just transistors, lasers, MRI, GPS, nuclear power, photovoltaics, LEDs, x-rays, and pretty much anything requiring Maxwell's equations.

Nothing major.


Microchips? A lot of quantum physics is applied there, off the top of my head.

Quantum mechanics is relevant to humanity because we build things which are very small. General relativity is not, because we're more or less incapable of actually doing things on a scale where it matters.

General relativity is pretty relevant to GPS satellites.

Quantum mechanics (also very much not Newtonian) is much more important to our day-to-day lives.

This kind of distinction is quite stupid in general, as plenty of things that we rely on for day-to-day activities, such as our houses, desks, chairs, beds, shoes, clothes, etc., are all based on Newtonian/classical mechanics. Basically everything we use that existed pre-transistor strictly speaking only required classical physics.

I mean sure, but the transistor is pretty important to the way I live my life now!

I'd argue so is the bed you sleep in every night, and the roof over your head. Best not to take those for granted, as I don't think the transistor would last so long if it wasn't sheltered from the environment.

The argument is that these kind of distinctions between how "classical" and "quantum" physics affects our lives is just a pointless endeavor that even academics don't waste their time with.


Is it?

Flash memory (quantum tunneling), lasers (stimulated emission), transistors (band theory), MRI machines (nuclear spin), GPS (atomic transitions), LEDs (band gaps), digital cameras (photoelectric effect)... the list does, in fact, go on, and on, and on.

Did you intentionally list things that are clearly not essential to day-to-day life?

I'd argue flash memory and transistors certainly are.

There are still huge gaps in our understanding: quantum gravity, dark matter, what happens before the Planck time, the thermodynamics of life, and many others.

Part of the problem is that building bigger colliders, telescopes, and gravitational wave detectors requires huge resources and very powerful computers to store and crunch all the data.

We're cutting research instead of funding it right now and sending our brightest researchers to Europe and China...



