Hubble's Law vs Cosmological Distances
In the local universe, we often use Hubble's Law (V = H0 Dp) to get distances to galaxies from their observed recession velocities. But we've seen that distances take on different definitions on cosmological scales. So how far out in redshift can we go using Hubble's Law before the errors introduced by ignoring cosmological effects become substantial? Let's imagine observing a Milky Way-like galaxy with an absolute magnitude of MV = -21 and a size of Rg = 20 kpc in a vanilla ΛCDM cosmology with H0 = 70 km/s/Mpc, Ωm0 = 0.3, and ΩΛ0 = 0.7.
Make plots of the following quantities as a function of redshift [log(1+z) from z = 0.01 to 1.0 works well] in two cases: (i) using Hubble's Law and (ii) using the correct cosmological quantity:
- a) The apparent magnitude of the galaxy. At what redshift does using Hubble's Law introduce a magnitude error of 0.1 magnitudes? At what redshift is the magnitude wrong by 0.5 magnitudes?
- b) The apparent size of the galaxy [log(apparent size), in arcsec]. At what redshift does Hubble's Law introduce a size error of 10%? At what redshift does Hubble's Law introduce a size error of 50%?
- c) The mean surface brightness of the galaxy. At what redshift have cosmological effects dimmed the surface brightness by 0.5 mag/arcsec^2?
Put the results for each part on the same graph so you can see the difference.
You'll end up with 3 plots (for a, b, and c), each with two lines.
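For concreteness, here is one possible way to set the calculation up: a minimal Python sketch assuming numpy, scipy, and matplotlib are available. The variable names and the mean-surface-brightness convention (flux spread over the projected area π·θ²) are illustrative choices, not requirements.

```python
import numpy as np
from scipy.integrate import quad
import matplotlib.pyplot as plt

# --- parameters from the problem statement ---
H0 = 70.0              # Hubble constant, km/s/Mpc
Om, OL = 0.3, 0.7      # flat LCDM density parameters
c = 2.99792458e5       # speed of light, km/s
M_V = -21.0            # absolute magnitude of the galaxy
R_g = 20.0e-3          # galaxy radius, Mpc (20 kpc)

D_H = c / H0           # Hubble distance, Mpc

def E(z):
    """Dimensionless Hubble parameter E(z) = H(z)/H0 for flat LCDM."""
    return np.sqrt(Om * (1.0 + z)**3 + OL)

def d_comoving(z):
    """Line-of-sight comoving distance in Mpc."""
    integral, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    return D_H * integral

zs = np.logspace(np.log10(0.01), 0.0, 200)        # z = 0.01 to 1.0

# Distances: Hubble's Law (d = cz/H0) vs the proper cosmological measures
d_hub = c * zs / H0                               # Mpc
d_C   = np.array([d_comoving(z) for z in zs])     # comoving distance, Mpc
d_L   = (1.0 + zs) * d_C                          # luminosity distance
d_A   = d_C / (1.0 + zs)                          # angular-diameter distance

def app_mag(d_mpc):
    """Apparent magnitude from the distance modulus, distance in Mpc."""
    return M_V + 5.0 * np.log10(d_mpc) + 25.0

def app_size_arcsec(d_mpc):
    """Angular radius in arcsec subtended by R_g at distance d."""
    return np.degrees(R_g / d_mpc) * 3600.0

# (a) apparent magnitude, (b) apparent size, (c) mean surface brightness
m_hub,  m_cos  = app_mag(d_hub),         app_mag(d_L)
th_hub, th_cos = app_size_arcsec(d_hub), app_size_arcsec(d_A)
# mean SB = m + 2.5 log10(solid angle); here the projected area pi*theta^2
sb_hub = m_hub + 2.5 * np.log10(np.pi * th_hub**2)
sb_cos = m_cos + 2.5 * np.log10(np.pi * th_cos**2)

x = np.log10(1.0 + zs)
panels = [(m_hub, m_cos, "apparent magnitude"),
          (np.log10(th_hub), np.log10(th_cos), "log(apparent size) [arcsec]"),
          (sb_hub, sb_cos, "mean SB [mag/arcsec^2]")]

fig, axes = plt.subplots(1, 3, figsize=(14, 4))
for ax, (y_hub, y_cos, label) in zip(axes, panels):
    ax.plot(x, y_hub, label="Hubble's Law")
    ax.plot(x, y_cos, label="cosmological")
    ax.set_xlabel("log(1+z)")
    ax.set_ylabel(label)
    ax.legend()
plt.tight_layout()
plt.show()
```

The Hubble's-Law curves use d = cz/H0 throughout, while the cosmological curves use the luminosity distance for the magnitude and the angular-diameter distance for the size.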
Note that 0.1 magnitudes is a large error for modern photometry (differences of < 0.01 mag can be measured with care), so these effects are readily discernible.
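If you want to read the threshold redshifts off numerically rather than just by eye, one possible approach (a suggested workflow, not a required one, reusing the zs, m_hub, and m_cos arrays from the sketch above) is to interpolate the error curve at the requested levels:

```python
import numpy as np

# Magnitude error from using Hubble's Law instead of the luminosity distance.
dm = np.abs(m_cos - m_hub)
for err in (0.1, 0.5):
    # dm grows monotonically with z over this range, so np.interp is safe here
    z_err = np.interp(err, dm, zs)
    print(f"Hubble's-Law magnitude error reaches {err} mag near z ~ {z_err:.2f}")
```

The same interpolation works for the fractional size error in part (b) and the surface-brightness dimming in part (c).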