Sunday, August 12, 2007

Galaxies Dark and Distant


On June 19 this blog reported on the discovery of a Dark Galaxy. In addition to hosting the Carnival of Space, Dr. Pamela Gay has just written about Finding Dark Galaxies. In the 1930s Fritz Zwicky suspected that unseen "dark matter" dominated galaxy clusters. Vera Rubin in the 1970s deduced its existence in galaxies from their rotation curves. Now it appears that an unknown number of galaxies are made almost entirely of the stuff.

Since DM emits no light and can be located only through gravity, we should consider Black Holes. Since nearly every large galaxy examined appears to contain a central Black Hole, it is reasonable to surmise that dark galaxies contain them too. The dark halo surrounding a galaxy could contain billions of Black Holes. The "voids" in Space could hold billions of dark galaxies. The 71.62% of the Universe ascribed to "dark energy" could be hidden in these voids.

Atop Mauna Kea, the submillimeter telescopes are grouped together in "Millimeter Valley." Using the submillimeter spectrum, astronomers have discovered extremely bright young galaxies in the process of forming stars. Located 12 billion light-years away, they are seen as they were when the Universe was less than 2 billion years old. They are the most luminous and massive galaxies ever seen at that distance.

The AzTEC submillimeter camera first detected a very bright submillimeter source (left). Next the Smithsonian Submillimeter Array localised the source (centre). The Hubble Space Telescope database revealed a tiny point of light at the edge of detectability (right). This shows that the source is very bright but extremely distant, nearly hidden by dust. The bright submillimeter signature, powered by dust-reprocessed light from young stars, indicates that the galaxies are in the midst of star formation, producing new suns at a rate 1000 times faster than our galaxy.

Astronomers have no idea how large galaxies could form so soon after the Big Bang. One way to form them would be around massive Primordial Black Holes. PBHs were formed from quantum fluctuations shortly after the Big Bang. Previously it was thought they would be tiny because of a finite speed of light. Discovery of massive dark galaxies is one more sign of a "c change" in physics.


16 Comments:

Anonymous Anonymous said...

Have you seen the paper astro-ph/0610520
Void Ellipticity Distribution as a Probe of Cosmology
by Daeseong Park and Jounghun Lee ?
They say:
"... Voids are likely to be severely disturbed by the tidal effect from the surrounding dark matter. ... a recent systematic analysis of simulation data has
revealed that the shapes of voids are in fact far from
spherical symmetry ...
we ... use the spatial distribution of the void galaxies as a measure of the non-spherical shape of a void. If a void has a spherical shape, then the spatial distribution of its galaxies will show a more or less isotropic pattern. If it has a nonspherical
shape, the void galaxy spatial distribution will also deviate from the isotropic pattern. In this practical way, the nonsphericity of the shape of a given void may be quantified by the ellipticity of the spatial distribution of the galaxies that it contains. ...
we assume a LAMBDA-dominated cold dark matter cosmology ... and consider ... OMEGAm (matter density parameter) ... h (dimensionless Hubble parameter) ...
for ... low OMEGAm and high h, voids tend to have more spherical shapes ...
The void ellipticity distribution is an outcome of the counterbalance between the tidal effect of the dark matter and the expansion of the universe: The tidal effect disturbs the void shapes from spherical symmetry, while the expansion of the universe resists such disturbance ...
we select only those voids which contain more than 5 galaxies ...
we test .. against ... OMEGAm = 0.25 ... h = 0.73 ...
the ... results are in excellent quantitative agreements for all cases ...".

From their figure 3, it seems that their ellipticity results are around 0.4 (using smoothing scales from 4 to 6.4 h^(-1) Mpc).
From their figure 1, it seems that ellipticity 0.4 corresponds to
OMEGAm = 0.32 and h = 0.68 for a 5 h^(-1) Mpc smoothing scale
and
OMEGAm greater than 0.35 and h less than 0.65 for an 8 h^(-1) Mpc smoothing scale.

I don't fully understand all the details of their calculations, but it seems to me that if you extrapolate their lower diagram up to ellipticity 0.4 you might get something like OMEGAm = 0.46 and h = 0.57
and
if you extrapolated to smoothing scale 11 h^(-1) Mpc you might get something like OMEGAm = 0.57 and h = 0.49.

Since the actual maximum effective radius of their voids varies from 9.04 to 21.53 h^(-1) Mpc,
I wonder whether their analysis, even though it is stated in LAMBDA-cold dark matter conventional terms, might in fact be consistent with your picture of Dark Mass (primordial black holes) in voids.

Tony Smith

6:26 PM  
Blogger nige said...

"The 71.62% of the Universe ascribed to "dark energy" could be hidden in these voids." - Louise

This 71.62% is based on the value of a cosmological constant, Lambda, that needs to be inserted into Einstein's field equation to make it fit the distant supernova and gamma ray burster (high redshift) recession data.

That data isn't really accurate to four significant figures. See, for example, the large scatter of the data plots in http://cosmicvariance.com/2006/01/11/evolving-dark-energy/

The whole concept of a fixed amount of "dark energy" is put in doubt by the fact that the best fit to the data shows an evolution in the "cosmological constant". Explaining "the small positive cosmological constant" is one difficulty for the mainstream, but explaining why it is not a constant, conserved quantity is even worse for them.

Instead of building up an epicycle-type model where more and more "corrections" (fiddles) are "discovered" (invented) and added to the "theory" (speculative guesswork model) to overcome "difficulties" (disproofs), it's maybe worth while considering whether Einstein's field equation including cosmological constant is a good model for cosmology in the quantum gravity context.

The problem is that the universe is expanding, which should weaken the gravity coupling constant G over large distances, where the exchange radiation (gravitons) is severely red-shifted due to recession of the gravitational charges (masses) over long distances.

Hence, general relativity needs to be supplied with a quantum gravity correction which makes G decrease towards zero for situations where there are large redshifts involved between the masses in question.

For calculating the gravitational slow-down of a distant receding galaxy or gamma-ray burster, general relativity without a cosmological constant (Lambda = 0) gives similar results to Newtonian gravity.

Using Newtonian gravity, the effective centre of mass of the universe for any observer is that observer's location, and so a distant supernova is slowed down by gravity like a bullet fired upwards; just change the mass of the Earth to the mass of the universe.

Supernova and gamma-ray burster data show that there is less gravitational slow-down than expected from this model at extreme distances.

The mainstream ad hoc "explanation" is that there is a repulsive force from a small positive cosmological constant which is trivial over small distances but over great distances (high redshifts) cancels out the attractive force of gravity.

A more objective explanation for the data exists, produced in 1996, before the observations confirmed it. For example, the gravitational constant G will decrease at high red-shifts due to the loss of graviton energy E = hf caused by the redshift of the gravitons which reduces their frequency f. Gravitons which lose energy when redshifted cause weaker gravitational interactions than those from nearby (non-receding) masses.

This is totally separate from various geometric considerations (the inverse-square law is a geometric divergence effect).

11:56 PM  
Blogger davinci said...

This may be totally off the subject, but I rarely get to talk to anyone with your background. I wonder how astronomers deal with the differing amounts of time involved in viewing distant events. We look at the heavens and seldom realize that we are viewing history, especially the galaxies billions of light years away. We have no idea what is happening now, only what happened billions of years ago. Does the fact that we can't really see what's going on 'at this moment' present any challenges to astronomers?

2:54 AM  
Anonymous Anonymous said...

Nige quoted Louise as saying:
"The 71.62% of the Universe ascribed to "dark energy" could be hidden in these voids.".

In addition to the possibility of Dark Mass being Dark Matter (maybe black holes) in the Voids (see my comment above about astro-ph/0610520),
it is interesting to speculate that maybe the Voids contain SomethingElse. If the density of SomethingElse is roughly that of the universe outside the Voids, then the percentage of volume of the universe that is in Voids might be relevant.

In astro-ph/0610689 Tikhonov says:
"... we have identified 732 voids with a radius of the seed sphere Rseed greater than 4.0 h^(−1) Mpc in a volume-limited sample of galaxies from the southern part of the 2dFGRS survey.
110 voids with Rseed greater than 9.0 h^(−1) Mpc have a positive significance.
The mean volume of such voids is 19 x 10^3 h^(−3) Mpc^3.
Voids with Rseed greater than 9.0 h^(−1) Mpc occupy 55 per cent of the sample volume. ...".

If the large Voids contain 55 per cent of total volume of the universe, then maybe the smaller Voids might contain the 20 percent needed to get to the 75 percent commonly thought of as the Dark Energy proportion.

A guess about the SomethingElse commonly thought of as Dark Energy might be that the Voids contain free conformal graviphotons that vary c,
unlike
the Ordinary-Matter-Dominated regions such as where we live in which graviphotons carrying the conformal c-varying degrees of freedom are frozen and suppressed.

Whether or not my guess has some seeds of truth,
it is interesting that the percentage of volume of our universe in Voids is similar to the WMAP Dark Energy proportion of about 75 per cent.

Tony Smith

10:31 AM  
Blogger L. Riofrio said...

For Tony: I just speed-read through the Park-Lee paper. The insidious thing about "dark energy" models is that the parameters can be manipulated to fit observations, as was done with WMAP data. You'll see from my June 1 post that Black Holes are equally common in voids, which comprise at least 50% of the volume.

For nige: I agree that adding epicycles can make models fit the data.

For davinci: By looking at high-redshift data astronomers know they are looking into the past. That doesn't mean they read the data correctly; it has led a few to think that the Universe is accelerating!

12:12 PM  
Blogger nige said...

"If the large Voids contain 55 per cent of total volume of the universe, then maybe the smaller Voids might contain the 20 percent needed to get to the 75 percent commonly thought of as the Dark Energy proportion.

"A guess about the SomethingElse commonly thought of as Dark Energy might be that the Voids contain free conformal graviphotons that vary c,
unlike
the Ordinary-Matter-Dominated regions such as where we live in which graviphotons carrying the conformal c-varying degrees of freedom are frozen and suppressed.

"Whether or not my guess has some seeds of truth,
it is interesting that the percentage of volume of our universe in Voids is similar to the WMAP Dark Energy proportion of about 75 per cent." - Tony Smith

I wonder if people agree on what is meant physically by "dark energy"? If dark energy just means gauge boson exchange radiation energy, i.e. energy of gravitons, then that's more physical and more reasonable. The confusion is illustrated by Lee Smolin writing in "The Trouble with Physics" (U.S. ed., page 209) that the acceleration of the universe due to "dark energy" is (c^2)/R:

"... c^2 /R. This turns out to be an acceleration. It is in fact the acceleration by which the rate of expansion of the universe is increasing - that is, the acceleration produced by the cosmological constant ... it is a very tiny acceleration: 10^-8 centimetres per second."

Obviously, Smolin or the publisher's editor gets the units wrong (acceleration is centimetres per second^2). But there is a far deeper error.

Take Hubble's law known in 1929: v=HR.

Acceleration is then:

a = dv/dt = d(HR)/dt = Hv.

For the scale of the universe, v = c and H = 1/t = c/R, so

a = Hv = (c/R)c = (c^2)/R.

Hence, we have obtained Smolin's acceleration for the universe from Hubble's law, by a trivial but physical calculation. The fact that velocity varies with distance in spacetime automatically implies an effective outward acceleration. That's present in Hubble's law which is built into the Friedmann-Robertson-Walker metric of general relativity.
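The arithmetic above is easy to check numerically. A minimal sketch, assuming the round value H ~ 70 km/s/Mpc (not stated in the comment) and taking R = c/H as the Hubble radius:

```python
# Numeric check of the a = c^2/R identity derived above, assuming the
# round value H ~ 70 km/s/Mpc and taking R = c/H as the Hubble radius.
c = 2.998e8            # speed of light, m/s
H = 70e3 / 3.086e22    # Hubble parameter, 1/s (70 km/s/Mpc)
R = c / H              # Hubble radius, m
a = c**2 / R           # Smolin's acceleration, m/s^2; equals c*H
print(f"R = {R:.2e} m, a = {a:.1e} m/s^2")  # a comes out near 7e-10 m/s^2
```

The same order of magnitude, 6*10^{-10} ms^{-2}, appears again later in this thread.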

So there is a massive "coincidence" that the real acceleration of the universe, given by the fact that Hubble's v = HR implies a = (c^2)/R, is identical to the acceleration allegedly offsetting gravitational deceleration at large redshifts!

Maybe this can be explained if we can reinterpret the cosmological constant and dark energy to the gauge boson exchange radiation energy which is being exchanged between masses. Gravitational attraction occurs as a shadowing effect (causing an anisotropy and hence a net force towards the mass which is shielding you), whereas the isotropic graviton pressure causes gravitational contraction effects (the (1/3)MG/c^2 = 1.5 mm shrinkage of Earth's radius which Feynman deduces from GR in his Lectures on Physics), and also the expansion of the finite-sized universe (the impacts of gravitons being exchanged between a finite number of atoms in the universe cause it to expand).

I also agree that dark matter exists. What is at issue here is how much there is and what evidence there is for it. There is dark matter around in neutrinos which have mass. I've seen papers showing that, when galaxies merge, their central black holes can sometimes be catapulted out of their galaxy in the chaos and end up in a void of space.

But this paper astro-ph/0610520 is exceedingly speculative and builds on the mainstream guesswork general relativity model, which doesn't contain quantum gravity mechanism corrections for general relativity on large (cosmological) scales.

The actual nature of "dark matter" can be determined simply by working out the correct quantum gravity theory and correcting general relativity accordingly for exchange radiation (graviton) effects: if it turns out that the quantum gravity theory differs from Lambda-CDM in dispensing with 90% of the currently-presumed quantity of dark matter, then we know that the amounts of dark matter present in the universe are relatively small and can be explained using known physics.

The density of the universe in the Lambda-CDM mainstream model of cosmology is approximately the critical density in the Friedmann-Robertson-Walker model,

Rho = (3/8)*(H^2)/(Pi*G).

This is the estimate of total density which is about 10-20 times higher than the observed density of visible stars in the universe. Hence this is the key formula which leads to the quantitative "prediction" (a very non-falsifiable prediction, well in the "not even wrong" category) that 90-95% of the mass of the universe is invisible dark matter.

However, some calculations based on a quantum gravity mechanism suggest that when quantum effects are taken into account, the correct density prediction is different, being almost exactly a factor of ten smaller:

Rho = (3/4)*(H^2)/(Pi*G*e^3)

where e is base of natural logs, and comes into this from an integral necessary to evaluate the effect of the changing density of the universe in spacetime (the density increases with observable distance, because of looking back to a more compressed era of the universe) on graviton exchange.

This implies that the correct density of the universe may be around 10 times less than the critical density given by general relativity (which is wrong for neglecting quantum gravity dynamics, like G falling with the redshift of gravitons exchanged between receding masses over long distances in the universe, the variation in density of the universe in spacetime, where gravitons coming from great distances come from more compressed eras of the universe, etc.).

So, instead of there being as much as 10-20 times as much dark matter as mass in the visible stars, the total mass of dark matter is probably at most only similar to the amount of visible matter.
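As a numeric sanity check on the two formulas quoted above, here is a short sketch assuming H ~ 70 km/s/Mpc. The e^3 "quantum gravity" variant is the commenter's own proposal; it is reproduced here only to verify the claimed factor-of-ten ratio against the standard critical density:

```python
import math

# Compare the standard critical density with the commenter's proposed
# quantum-gravity-corrected density, assuming H ~ 70 km/s/Mpc.
G = 6.674e-11                # gravitational constant, m^3 kg^-1 s^-2
H = 70e3 / 3.086e22          # Hubble parameter, 1/s
rho_crit = (3 / 8) * H**2 / (math.pi * G)            # standard critical density
rho_qg = (3 / 4) * H**2 / (math.pi * G * math.e**3)  # commenter's proposed density
print(f"rho_crit = {rho_crit:.2e} kg/m^3")            # roughly 9e-27 kg/m^3
print(f"rho_crit / rho_qg = {rho_crit / rho_qg:.2f}") # = e^3 / 2, about 10
```

The ratio is e^3/2, so the "almost exactly a factor of ten" claim is at least internally consistent, whatever one thinks of the underlying mechanism.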

This probably means that escaped black holes and neutrinos can account for dark matter, which means that by studying quantum gravity effects, it is possible to determine the nature of dark matter (simply because you know the correct abundance). Of course, orthodoxy insists (falsely) that general relativity only needs correction for quantum gravity effects on small distances (high energy physics), not over massive distances.

But physically any form of boson, including a graviton, should be affected by recession when being exchanged between two receding gravitational charges (masses). The redshift of the graviton received should weaken the gravity coupling and thus the effective value of G for gravitational interactions between receding (highly redshifted) masses.

I was disappointed when Stanley G. Brown, editor of PRL, rejected my paper on this when I was studying at Gloucestershire university:

Sent: 02/01/03 17:47
Subject: Your_manuscript LZ8276 Cook

Physical Review Letters does not, in general, publish papers on alternatives to currently accepted theories. ...

Yours sincerely,
Stanley G. Brown,
Editor, Physical Review Letters


I didn't seriously expect to have the paper published in PRL, but I did hope for some scientific reaction. After several exchanges of emails, Stanley G. Brown resorted to sending me an email saying that an associate editor had read the paper and determined that it wasn't pertinent to string theory or any other mainstream idea. I then responded that it obviously wasn't intended to be. Stanley G. Brown forwarded a final response from his associate editor claiming that my calculation was just a theory "based on various assumptions". Actually, it was based on various facts determined by observations. E.g., Hubble's law v = HR implies acceleration a = dv/dt = H*dR/dt = H*v = H*HR = 6*10^{-10} ms^{-2} for the matter receding at the greatest distances. This implies an outward force of that matter of F = ma = m(H^2)R, and by Newton's 3rd law you have an equal inward force (by elimination of possibilities, this inward force is carried by gauge boson radiation such as gravitons), which gives a mechanism for gravity by masses shadowing the inward-directed force of graviton exchange radiation.

Maybe the focus with black holes could be on trying to understand existing experimentally verified facts? Instead of imaginatively filling the voids of the vacuum with speculative black holes based on dark matter estimates made by discrepancies between somewhat speculative or wrong models, it might be more productive to consider what the consequences would be if fundamental particles were black holes. The radius of a black hole electron is far smaller than the Planck length. The Hawking radiation emission from small black holes is massive; perhaps it is the gauge boson exchange radiation that causes force-fields? At least you can easily check that kind of theory just by calculating all the consequences. The Hawking black body radiating temperature depends on the mass of the black hole, and the radiant power of Hawking radiation is then dependent on that temperature and the black hole event horizon radius (2GM/c^2) which provides the radiating surface. Hence you can predict the rate of emission of Hawking radiation from a black hole of electron mass. It's immense, but that's what you need to drive the physical dynamics of gauge boson exchange radiation; the cross-section for capture of the radiation by other fundamental particle masses is very small (their cross-sections are the area of a circle of radius equal to event horizon 2GM/c^2), so you need an immense radiant power to produce the observed forces.
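The "immense" radiant power can be estimated from the standard Hawking formulas the comment describes. A rough sketch (whether fundamental particles really are black holes is of course the commenter's speculation, not established physics):

```python
import math

# Standard Hawking temperature and black-body radiant power evaluated
# for a hypothetical black hole of electron mass.
hbar = 1.0546e-34   # reduced Planck constant, J s
c = 2.998e8         # speed of light, m/s
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23     # Boltzmann constant, J/K
sigma = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
M = 9.109e-31       # electron mass, kg

T = hbar * c**3 / (8 * math.pi * G * M * k_B)  # Hawking temperature, K
r = 2 * G * M / c**2                           # event horizon radius, m
P = sigma * (4 * math.pi * r**2) * T**4        # radiant power, W
print(f"T ~ {T:.1e} K, r ~ {r:.1e} m, P ~ {P:.1e} W")
```

The horizon radius comes out some 20 orders of magnitude below the Planck length, and the power is near 10^92 W, which matches the scale of emission the comment invokes.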

Obviously the Yang-Mills theory is physically real, and the exchange radiation is normally in some sort of equilibrium: the energy gravitons (and other force-mediating radiation) falling into black hole-sized fundamental particles gets re-emitted as Hawking radiation (behaving as gravitons). So there is an exchange of gravitons between masses at the velocity of light.

Undoubtedly believers in spin-2 gravitons can raise objections about this being unorthodox, but spin-2 gravitons haven't actually been observed.

Nigel

2:09 PM  
Blogger Kea said...

Physical Review Letters does not, in general, publish papers on alternatives to currently accepted theories...

LOL! Let's compare PRL rejections! Mine are usually along the lines, "that belongs in a maths journal", to which the objection, "but it's about particle masses" is usually met with something along the lines of, "it belongs in a maths journal."

4:01 PM  
Anonymous Anonymous said...

Louise said "... The insidious thing about "dark energy" models is that the parameters can be manipulated to fit observations ...".

A good example of that is the new Park-Lee paper at arxiv.org/abs/0704.0881

The older Park-Lee paper at astro-ph/0610520 has enough basic charts, calculations, etc., that you can roughly extrapolate to see that Louise's varying-c model might be consistent with observations etc,
but
in the new paper at arxiv.org/abs/0704.0881 they carefully restrict their charts, calculations, etc., to the LAMBDA-CDM range of parameters, so it appears that their work supports LAMBDA-CDM (even though the older paper, with rough extrapolations, indicated that their work is ALSO consistent with Louise's model).
Specifically,
charts in their new paper show only void scale at 4 h^(-1) Mpc
and
omit the charts showing void scale at 8 h^(-1) Mpc that were shown in the older paper,
which larger scale charts were the basis for my rough extrapolations indicating that Louise's model might be consistent with data and calculations.

That is indeed a very "insidious" way that people present their work so that it appears to be totally supportive of the establishment consensus,
when in fact the data and calculations really are consistent with both establishment and Louise.
Such stuff makes it easy for the establishment to ridicule and dismiss outsiders like Louise.

Three examples of similar "insidious thing[s]" are:

consensus dismissal of Halton Arp's astronomical observations;

independent analyses of neutrino observatory data that show that reasonable non-consensus interpretations of background are consistent with proton decay of SU(5) GUT models; and

non-consensus analyses of Fermilab data consistent with the Tquark not having only a single state, but having 3 states with the Higgs being a T-Tbar condensate.

Unfortunately, physics/astrophysics is riddled with such smug-consensus insidious foolishness.
The progress of science is being held back today by the Clerics of Physics
very much as it was being held back by the Clerics of the Church a few hundred years ago.

Superstring may be "Not Even Wrong" (as Woit says) but it is not "THE Trouble With Physics" (as Smolin says).
The REAL Trouble With Physics is the smug attitude of the CLERICS OF THE CONSENSUS who are so arrogantly dismissive of ANY alternative ideas to the CONSENSUS,
and
such CLERICS are found not only in the superstring theory community, but also in the communities of Loop Quantum Gravity, LAMBDA-CDM, anti-SU(5) GUT, etc ...

Tony Smith

PS - I am NOT saying that LAMBDA-CDM is wrong.
I AM saying that it is NOT the only game in town that is consistent with data,
and that it is dangerous to arrogantly rule out other models (such as Louise's) that are ALSO consistent with today's data.
If the Science Community were to be playing by real Scientific Rules,
then ALL models consistent with data would be taught and discussed,
and experiments would be devised to distinguish among them.
However,
nowadays the CLERICS OF THE CONSENSUS make premature declarations as to which model "wins",
and real Science (and human understanding of Nature) is the loser.

4:05 PM  
Anonymous Alex said...

The idea that black holes are the dark matter has passed away. It seems that something was wrong with the number of baryons in the background radiation. An interesting new theory, which is also outside the mainstream, is TeVeS. But it is too new to be considered seriously.

8:52 PM  
Anonymous mendo said...

Hi Louise,

Do you have any prediction for the mass spectrum of PBHs? I'm wondering if that would give some very interesting tests for your model. For instance, microlensing in the MW, LMC and Andromeda could (and perhaps already does?) put limits on the number of PBHs in galactic halos. Mind you, that would only be in a relatively small mass range - IIRC from thousandths of a solar mass to around a few tens of solar masses. At smaller masses I think people have considered using GLAST to look for Hawking radiation from the MW halo.

On larger scales do you think weak lensing results like the bullet cluster could provide measurements that would distinguish BH dark matter from 'standard' (for want of a much better word) CDM?

Cheers,

Mendo.

1:20 PM  
Blogger L. Riofrio said...

Fortunately Black Holes radiate a characteristic blackbody (surprise!) spectrum dependent upon temperature.

Weak lensing is still in its infancy, but so far it shows that dark mass emits no radiation but is detectable by gravity, just like a halo of Black Holes.

1:49 PM  
Blogger nige said...

"The REAL Trouble With Physics is the smug attitude of the CLERICS OF THE CONSENSUS who are so arrogantly dismissive of ANY alternative ideas to the CONSENSUS,
and
such CLERICS are found not only in the superstring theory community, but also in the communities of Loop Quantum Gravity, LAMBDA-CDM, anti-SU(5) GUT, etc ... " - Tony Smith

Tony, if you own the U.S. edition of Dr Lee Smolin's "The Trouble with Physics" (Houghton Mifflin Company, Boston & New York, 2006), see Endnote number 9 on page 370:

"I have here to again emphasize that I am talking only about people with a good training all the way through to a PhD. This is not a discussion about quacks or people who misunderstand what science is."

Better still, the PhD physicists who get the most attention paid to them are those who get paid to do research. The fact that someone is being paid to research or publicise new developments in science, makes their enthusiastic claims for that thing completely unbiased and unprejudiced.

That makes it quick and easy for everyone to judge scientific papers based on the credentials of the author, without bothering to check them objectively first. Examples of the consequences are Dr Blondlot's famous "N-rays", Drs. Fleischmann and Pons' famous "cold fusion", etc.

"Fortunately Black Holes radiate a characteristic blackbody (surprise!) spectrum dependent upon temperature." - Louise

Pair production has to occur at the event horizon, R = 2GM/c^2, of the black hole in order for the black hole to emit radiation by Hawking's mechanism. In order for this pair production to occur at this distance, the electric field at the event horizon must be above Schwinger's threshold for pair production, which is 1.3*10^18 V/m. Therefore, QFT seems to suggest that Hawking radiation requires the black hole to have a net electric charge. If it does have such an electric charge, this will affect the Hawking radiation mechanism. Hawking's theory ignores the effect of electric charge, so that when pair creation (for instance an electron and a positron) occurs just outside the event horizon, one charge at random falls in and the other escapes. This means that on average you get as many positrons as electrons escaping, which annihilate into gamma radiation. But when you include the fact that an electric charge seems to be required in order that pair production can occur around the event horizon radius of the black hole, the electric charge then affects which of the pair is likely to fall into the black hole.

A positively charged black hole is likely to attract negative charge and repel positive charge. This affects the whole mechanism for Hawking radiation because you may end up with the charge in the black hole being neutralized, and then pair production (and Hawking radiation which is dependent on pair production near the event horizon) stops.

At this time, the black hole core is neutral but the event horizon is surrounded by a cloud of similar charges which are of exactly the same sign and exactly the same quantity as the charge in the core of the black hole at the beginning.

Hence, seen from a long distance the black hole retains exactly the same electric charge but just ceases to radiate any Hawking radiation. The pair-production mechanism is only capable of radiating a total quantity of Hawking radiation equal to the E=mc^2 where m is the mass of the net charge initially present in the black hole. The pair-production and annihilation mechanism for Hawking radiation in effect transfers the charge inside the black hole to outside the black hole, without the charge physically escaping from the event horizon.
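The size of the charge this argument requires can be sketched with Coulomb's law. The solar-mass example below is an illustrative assumption (no mass is specified in the comment), and the link between the Schwinger threshold and Hawking radiation is the commenter's conjecture, not standard theory:

```python
import math

# Minimum charge a black hole would need for the Coulomb field at its
# event horizon to reach the Schwinger pair-production threshold,
# evaluated for an illustrative solar-mass black hole.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
eps0 = 8.854e-12     # vacuum permittivity, F/m
E_schwinger = 1.3e18 # Schwinger pair-production threshold, V/m

M = 1.989e30                      # one solar mass, kg
r = 2 * G * M / c**2              # event horizon radius, m (about 3 km)
Q_min = 4 * math.pi * eps0 * r**2 * E_schwinger  # from E = Q/(4 pi eps0 r^2)
print(f"r = {r:.0f} m, Q_min ~ {Q_min:.1e} C")
```

The required charge is enormous (of order 10^15 coulombs for a solar-mass hole), which illustrates why astrophysical black holes are usually assumed to be very nearly neutral.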

7:05 AM  
Blogger nige said...

The charges produced by pair production around black holes in the previous comment are fermion charges, not charged bosons like massive or massless W gauge bosons (which are charged force-mediating exchange radiation).

7:15 AM  
Anonymous mendo said...

Fortunately Black Holes radiate a characteristic blackbody
(surprise!) spectrum dependent upon temperature.


This reminded me of another way to search for, or at least place limits on, black holes as dark matter. If black holes do radiate then they contribute to the overall cosmic background radiation (not just the microwave background from last scattering, but all wavelengths from all possible sources). Since there are good measurements and upper limits from radio to gamma-ray wavelengths (see for instance Figure 1 in Overduin and Wesson, astro-ph/0407207) you can compare these with predictions from models for PBHs. The Overduin and Wesson paper considers PBH contributions to background light in some detail in Section 9 - although it's for a standard GR-based cosmology. Nevertheless, it shouldn't be that hard to recalculate these numbers for other cosmologies.


As Nige pointed out, black holes might not radiate, or might radiate much more weakly than currently thought. If that's the case, cosmic background light wouldn't be a good test. However, galactic microlensing searches would still be able to check for their presence in the Milky Way halo.

Cheers,

Mendo.

12:55 PM  
Anonymous Anonymous said...

In response to my comment:
"The REAL Trouble With Physics is the smug attitude of the CLERICS OF THE CONSENSUS who are so arrogantly dismissive of ANY alternative ideas to the CONSENSUS,
and
such CLERICS are found not only in the superstring theory community, but also in the communities of Loop Quantum Gravity, LAMBDA-CDM, anti-SU(5) GUT, etc ... "

Nige said
"... Tony, if you own the U.S. edition of Dr Lee Smolin's "The Trouble with Physics" (Houghton Mifflin Company, Boston & New York, 2006), see Endnote number 9 on page 370:
"I have here to again emphasize that I am talking only about people with a good training all the way through to a PhD. This is not a discussion about quacks or people who misunderstand what science is." ...".

Lee Smolin's Endnote was (see page 319) about
"... tak[ing] seriously ... people ... who are seers but not lucky enough to have made substantial contributions first ...".

It is interesting that Lee Smolin requires "a PhD" for people to be "taken seriously",
especially since on page 278 he says
"... QED ... was the triumph of Richard Feynman, Freeman Dyson, and their generation ..."
and
Freeman Dyson did NOT have a PhD.

There is a stark contrast between the Cleric attitude of Lee Smolin
and
what Yuval Neeman once told me:
"If your model is good, then whether or not you have a PhD should be irrelevant.
If your model is no good, then no matter how many PhDs you might have, it is still no good."

Tony Smith

PS - From the www.sns.ias.edu web site tilde Dyson:
"... He graduated from Cambridge University in 1945 with a BA degree in mathematics. He went on to Cornell University as a graduate student in 1947 and worked with Hans Bethe and Richard Feynman. His most useful contribution to science was the unification of the three versions of quantum electrodynamics invented by Feynman, Schwinger and Tomonaga. Cornell University made him a professor without bothering about his lack of Ph.D. ...".

Unfortunately, Cornell has degenerated since then by blacklisting people from the arXiv. A particularly horrific example was that Cornell blacklisted from arXiv an article (written by another blacklisted person, not me) that was endorsed by Bethe himself (claiming that Bethe was unqualified).

2:16 PM  
Blogger L. Riofrio said...

For nige: I agree that Black Hole radiation may not follow the standard formula, but it is a phenomenon that needs investigation.

For Tony: The university that was also home to Carl Sagan has gone downhill. You can see my March 20 and March 21 posts to see how low they have sunk. Before anyone thinks they have won, I just counted and have given 5 talks for the one that was cancelled.

These efforts have successfully resulted in a programme shift, of which we will hear more soon.

4:29 PM  
