Thursday, October 4, 2012

How Cosmological Supercomputers Evolve the Universe All Over Again

I'm not a cosmologist, but I am an astronomer. Most of the questions you ask are answered in the papers associated with Bolshoi, but science writers tend to leave them out because the numbers are so huge and hard to relate to. I'm going to use megaparsecs for distances: 1 megaparsec = 1 million parsecs = 3.26 million light years = roughly 200 billion astronomical units. 1 astronomical unit is ~93 million miles, the distance from the Earth to the Sun.
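If you want to sanity-check those conversions yourself, here's a quick back-of-the-envelope Python sketch (the constants are rounded approximations, nothing more):

# Minimal sketch of the unit conversions above, using rounded constants.
LY_PER_PARSEC = 3.26           # light years in one parsec (approx.)
AU_PER_PARSEC = 206_265        # astronomical units in one parsec (approx.)
MILES_PER_AU = 93_000_000      # miles in one astronomical unit (approx.)

def megaparsecs_to_light_years(mpc):
    """Convert megaparsecs to light years."""
    return mpc * 1_000_000 * LY_PER_PARSEC

def megaparsecs_to_au(mpc):
    """Convert megaparsecs to astronomical units."""
    return mpc * 1_000_000 * AU_PER_PARSEC

print(megaparsecs_to_light_years(1))  # ~3.26 million light years
print(megaparsecs_to_au(1))           # ~2.06e11 AU, i.e. roughly 200 billion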

First off, "entire evolution of the universe" should obviously be qualified with "on cosmological scales", unless they've built the Matrix. That said, how big is the domain? Is it just set to match the observable universe? 2048 grid points across the entire universe (or just the observable universe) seems rather... low-res. TFA mentions an adaptive grid, but fails to mention by what factor that can increase the local resolution.

As you point out, the 'entire evolution...' phrase is a bad way of saying that the simulated volume and mass are large enough to be statistically representative of the large-scale structure and evolution of the entire universe. It's 2048^3 particles total, which is a heck of a lot: 8,589,934,592 particles, each pushing and pulling on every other one simultaneously. It's an enormous computational problem.

The particles are put into a box ~250 megaparsecs on a side; for comparison, the Milky Way is ~0.03 megaparsecs in diameter, and it's ~0.8 megaparsecs from here to the Andromeda galaxy, our nearest large galaxy. 250 megaparsecs is a huge slice, more than enough to ensure that local variations (individual galaxies) won't dominate the statistics.

The ART code starts with a grid of 256^3 cells, but it can subdivide a cell whenever some threshold is passed, up to 10 levels of refinement if I remember correctly, giving a resolution limit of around 0.001 megaparsecs. My memory is hazy, and the distances are scaled according to the Hubble constant at any given point, but I think those numbers are in the right ballpark.
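To get a feel for what those grid numbers mean, here's a rough Python sketch of the effective cell size at each refinement level, using the box size and base grid described above (it ignores the comoving/physical scaling, so treat it as an order-of-magnitude estimate):

# Rough sketch: effective cell size of an adaptive mesh like ART,
# given a base grid and a number of refinement levels.
BOX_SIZE_MPC = 250.0      # simulation box, megaparsecs on a side
BASE_GRID = 256           # base grid is 256^3 cells
MAX_REFINEMENT = 10       # each refinement level halves the cell size

N_PARTICLES = 2048 ** 3
print(f"particles: {N_PARTICLES:,}")          # 8,589,934,592

for level in range(MAX_REFINEMENT + 1):
    cells_per_side = BASE_GRID * 2 ** level
    cell_size = BOX_SIZE_MPC / cells_per_side
    print(f"level {level:2d}: {cells_per_side:7d} cells/side, "
          f"cell ~{cell_size:.5f} Mpc")
# At level 10 the cell size comes out to ~0.001 Mpc, matching the estimate above.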

Also, how exactly do we model dark matter when we don't really know WTF it is beyond the fact that it has gravitational mass? Does it work because gravitational effects are the only thing that really matters on cosmological scales?

Essentially, yes: gravity absolutely dominates over all the other forces at these scales. The feedback from stars and galaxies into their environment as they form (and as they evolve) changes lots of important things, but simulations like Bolshoi seek to simulate the largest-scale structures in the universe. Smaller subsections of the simulation can be picked out to run detailed N-body simulations of Milky Way-type galaxies, or to statistically match the dark matter clumps (which will form galaxies) to huge databases like the Sloan Digital Sky Survey. Both of those are pretty active things to do in cosmology right now.
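If it helps to see what a gravity-only particle simulation boils down to, here's a toy direct-summation N-body step in Python. This is emphatically not how ART or Bolshoi work (real cosmological codes use adaptive meshes or trees precisely because O(N^2) summation is hopeless for billions of particles); the softening length and units here are made up for the example:

import numpy as np

# Toy gravity-only N-body step by direct summation (illustration only).
G = 1.0           # gravitational constant in arbitrary code units
SOFTENING = 0.01  # softening length to avoid divergent forces (made up)

def accelerations(positions, masses):
    """Pairwise gravitational accelerations on each particle."""
    diff = positions[None, :, :] - positions[:, None, :]   # (N, N, 3), r_j - r_i
    dist2 = (diff ** 2).sum(axis=-1) + SOFTENING ** 2       # softened distances^2
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                           # no self-force
    return G * (diff * (masses[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

def leapfrog_step(positions, velocities, masses, dt):
    """Advance one kick-drift-kick leapfrog step."""
    velocities = velocities + 0.5 * dt * accelerations(positions, masses)
    positions = positions + dt * velocities
    velocities = velocities + 0.5 * dt * accelerations(positions, masses)
    return positions, velocities

rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, size=(100, 3))   # 100 particles, not 2048^3...
vel = np.zeros((100, 3))
m = np.ones(100)
pos, vel = leapfrog_step(pos, vel, m, dt=0.01)

The whole trick of codes like ART is replacing that all-pairs force sum with something that scales to billions of particles while keeping the same underlying physics.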

Source: http://rss.slashdot.org/~r/Slashdot/slashdotScience/~3/4OUPjz7WeX0/how-cosmological-supercomputers-evolve-the-universe-all-over-again

