
Dumb question: with all we know about theoretical quantum physics, couldn't we compute what happens in these extreme conditions? Or is it still computationally intractable? If so, why is that?


Only hydrogen has an analytic solution. Even helium requires approximations because its electrons interact. Approximation requires an understanding of the structure of the wavefunction, which gets increasingly complex as you add more particles that interact and entangle with one another, like the electrons in an atom. And that's just for single atoms that aren't interacting strongly with their environment. Here we're talking about an ensemble of nitrogen atoms interacting with each other in a fairly extreme environment.

https://en.wikipedia.org/wiki/Hartree%E2%80%93Fock_method
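To make "only hydrogen has an analytic solution" concrete: hydrogen's bound-state energies come out in closed form, E_n = -13.6 eV / n^2, and nothing comparable exists once electrons interact. A toy sketch in Python (the constant is the standard Rydberg energy; the function name is my own):

```python
# Hydrogen is the one atom with a closed-form spectrum: E_n = -Ry / n^2.
RYDBERG_EV = 13.605693  # Rydberg energy in eV (standard textbook value)

def hydrogen_level(n: int) -> float:
    """Energy of the n-th bound state of hydrogen, in eV."""
    return -RYDBERG_EV / n**2

for n in range(1, 4):
    print(n, round(hydrogen_level(n), 3))
```

For helium and beyond there is no such formula, which is where Hartree-Fock and its successors come in.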


All of the cool computer science problems are in the other sciences. This one in particular looks like a challenging engineering problem to scale up.


According to 'Devs', you just need a few quantum computers and lots of gold tinfoil.


Don't forget the vacuum sealed quantum levitating elevator!


This is one of the things that makes me think we are not living in a computer simulation: the physics is really hard to simulate. Unless the simulation is just fooling us into thinking that ;)


It may just be rendering what we observe.


I am far from a quantum physicist, but my understanding is that solving the Schrodinger equation exactly for anything more complex than hydrogen is more or less impossible.

So instead we go the other way: use "close enough" experimental data and perturbation theory to approximate the results for more complex systems.


> solving the Schrodinger equation for anything more complex than hydrogen is more or less impossible.

Solving in the sense of an analytic, closed-form solution to the non-relativistic Schrodinger equation (itself an approximation to QFT) in terms of commonly accepted elementary functions. Even this requires actually evaluating the elementary functions to some degree of numerical accuracy if you want digits.

I often see this claim that only hydrogen is solvable, but I find it misleading. We have algorithms that will give solutions to the non-relativistic Schrodinger equation for helium to whatever degree of decimal precision you like (see FCI QMC methods) — how long these algorithms take to run is a different matter (see the fermion sign problem), but for helium it’s not too bad. The algorithms are unbiased, which I consider to be an exact solution.

You can think of it in the same way that we have Monte Carlo solutions to the rendering equation for global illumination. They are not closed form in terms of arbitrary elementary functions, but they converge to the exact solution over time.
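The rendering-equation analogy can be made concrete with the simplest possible Monte Carlo estimator: sample-average an integrand and watch the error shrink like 1/sqrt(N) toward the exact answer. A sketch (the toy integral and names are mine, not FCIQMC):

```python
import random

def mc_integral(f, n_samples, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [0, 1]."""
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n_samples)) / n_samples

# Integral of x^2 over [0, 1] is exactly 1/3; the estimator is unbiased
# and converges to it, with statistical error falling like 1/sqrt(N).
exact = 1.0 / 3.0
for n in (100, 10_000, 1_000_000):
    est = mc_integral(lambda x: x * x, n)
    print(n, est, abs(est - exact))
```

Same story as path-traced global illumination: no closed form, but an unbiased estimator whose error bars you can shrink with more samples.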


I _am_ a computational physicist. The main problem I'm interested in is of a slightly different nature: just compute properties of protons, neutrons, and light nuclei from QCD, the theory of quarks and gluons. However, because the "operating system" of quantum mechanics is the same, many of the computational difficulties are shared.

I described the method we use and some of the computational difficulties in some previous threads

https://news.ycombinator.com/item?id=15780514

https://news.ycombinator.com/item?id=12048170

But, if you're comfortable with an argument from authority, suffice it to say that much of the computational power in the largest supercomputing centers across the world is dedicated to quantum many-body problems.

If you want equilibrium properties at temperature T [non-equilibrium properties, like real-time dynamics, have additional exponentially bad computational intractabilities, known as the sign problem], you want to evaluate a partition function

    Z = tr[ exp( - H / T ) ]
where H is the Hamiltonian that, given an eigenstate, has the state's energy as its eigenvalue. The space H acts on grows exponentially with the number of particles in your simulation [simulations to determine crystal structures are typically in the canonical ensemble; if that doesn't mean anything to you, that's OK, but it's something you can read further on; basically it means you try to take the number of particles to be large, rather than, for example, fixing a density].
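To see both pieces of this at once, Z as a trace and the exponentially growing space, here's a toy sketch: exact diagonalization of a tiny Ising-like spin chain (my own toy model, nothing to do with the QCD problem above), where the matrix dimension is already 2^N:

```python
import numpy as np

def partition_function(H, T):
    """Z = tr[exp(-H/T)], evaluated via the eigenvalues of H.
    Only viable for tiny systems: H is a 2^N x 2^N matrix."""
    energies = np.linalg.eigvalsh(H)
    return np.sum(np.exp(-energies / T))

def ising_zz(N, J=1.0):
    """H = -J * sum_i sz_i sz_{i+1} on N spin-1/2 sites, built as a dense matrix."""
    sz = np.array([[1.0, 0.0], [0.0, -1.0]])
    I = np.eye(2)
    H = np.zeros((2**N, 2**N))
    for i in range(N - 1):
        ops = [sz if j in (i, i + 1) else I for j in range(N)]
        term = ops[0]
        for op in ops[1:]:
            term = np.kron(term, op)  # tensor product over all sites
        H += -J * term
    return H

H = ising_zz(6)
print(H.shape)  # (64, 64): the 2^N wall, and this is only six spins
print(partition_function(H, T=1.0))
```

Brute-force diagonalization tops out around N ~ 20-30 spins even on big machines; that exponential wall is exactly why Monte Carlo methods take over.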

We have importance-sampling Markov Chain Monte Carlo techniques for evaluating Z. For some problems this works with flying colors, although demonstrating that you're in the thermodynamic [infinite number of particles and infinite volume] limit requires calculating at a variety of volumes and particle numbers, for extrapolation. If you just do random updates to your state vector, most updates will be rejected; instead you use HMC to update the whole state at once [HMC = Hybrid Monte Carlo or, strangely, Hamiltonian Monte Carlo if you're not a computational physicist].
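Stripped of all the physics, the accept/reject logic is the Metropolis rule below, here on a one-dimensional toy "state" with energy E(x) = x^2/2 (my stand-in; real HMC proposes whole-configuration moves by integrating Hamiltonian dynamics, which is what keeps the rejection rate manageable):

```python
import math, random

def metropolis(energy, n_steps, step=1.0, seed=1):
    """Sample states with weight exp(-energy(x)) via Metropolis accept/reject."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n_steps):
        proposal = x + rng.uniform(-step, step)
        # Accept with probability min(1, exp(-dE)); otherwise keep the old state.
        if rng.random() < math.exp(min(0.0, energy(x) - energy(proposal))):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis(lambda x: 0.5 * x * x, 50_000)
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples)
print(round(mean, 1), round(var, 1))  # near 0 and 1 for the exp(-x^2/2) weight
```

The naive version above proposes one small move at a time; on a lattice with millions of degrees of freedom that's hopeless, which is the motivation for the collective HMC updates.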

This approach is not naively parallelizable, and the machines where HMC is performed require extremely high-performance interconnects, think 10-100 times faster than what's available in a standard datacenter.

The method is inherently statistical: give me more computing time and I can shrink the error bars. But if the problem has certain technical difficulties (or is in real-time, for example, rather than equilibrium), you encounter the sign problem, where the partition function you want to evaluate now becomes

    Z = tr[ exp( i H / T ) ]
and you need an exponentially large sample to resolve the intricate cancellations that arise from summing just a bunch of phases.
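You can watch those cancellations numerically: the average of N random unit phases decays like 1/sqrt(N), whereas an average of positive weights stays O(1), so any small real signal buried in the phases needs enormous statistics to resolve. A toy sketch (names mine):

```python
import cmath, math, random

rng = random.Random(2)

def phase_average(n):
    """|mean| of n random unit phases exp(i*theta): 'summing a bunch of phases'."""
    total = sum(cmath.exp(1j * rng.uniform(0, 2 * math.pi)) for _ in range(n))
    return abs(total / n)

# Positive weights average to O(1); random phases cancel down to ~1/sqrt(n).
# To resolve a signal of size exp(-c*V) you need ~exp(2*c*V) samples: the sign problem.
for n in (100, 10_000, 1_000_000):
    print(n, phase_average(n))
```

A million samples buys you roughly three digits of cancellation, and the signal you're after shrinks exponentially with system size; that mismatch is the whole problem.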


This went way over my head but just wanted to thank you for taking the time to make this interesting post


Unfortunately, the effective equations you solve in practice for anything with more than one electron (Hartree-Fock and its relatives) are non-linear. This makes finding stable states for anything other than hydrogen very hard computationally; the problem can be compared to predicting weather patterns a month in advance.



