Excerpt from The Sky This Week for 10/23/2008, a weekly astronomy column run by David Oesper:
If you turned off the Sun and all it illuminates, how much visible light would illuminate the Earth? That visible light would, of course, come from the combined light of all the stars and nebulae in the universe. In 2004, Abdul Ahad (1968-) determined that the sum total of all the visible light reaching the Earth from beyond the solar system would be equivalent to one star having an apparent visual magnitude of -6.5. Brighter than Venus, but dimmer than the brightest Iridium satellite flares. And almost certainly dimmer than your neighbor's dusk-to-dawn insecurity light. This total integrated brightness of the universe is now known as "Ahad's Constant".
How can we check Abdul's calculation? First download the most comprehensive star catalog you can find that includes apparent visual magnitude mv. Then write a computer program to process every one of the stars through the following equation:

m_total = -2.5 log10( Σ_i 10^(-0.4 m_i) )

where m_i is the apparent visual magnitude of the ith star, and the sum runs over every star in the catalog
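The summation can be sketched in a few lines of Python. This is just an illustration of the arithmetic, assuming you have already read your catalog's magnitudes into a list; the catalog-reading step is omitted.

```python
import math

def combined_magnitude(mags):
    """Combine individual apparent magnitudes into one total magnitude.

    Each magnitude m is converted to a relative flux, 10**(-0.4 * m),
    the fluxes are summed, and the sum is converted back to a magnitude:
    m_total = -2.5 * log10(sum of fluxes).
    """
    total_flux = sum(10 ** (-0.4 * m) for m in mags)
    return -2.5 * math.log10(total_flux)

# Sanity check: two identical magnitude-0.0 stars have twice the flux
# of one, so they combine to 2.5*log10(2) ≈ 0.75 magnitudes brighter.
print(round(combined_magnitude([0.0, 0.0]), 2))  # → -0.75
```

Summing fluxes rather than magnitudes is essential here: magnitudes are logarithmic, so they cannot simply be added.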
Of course, this wouldn't include nebulae and all the stars that are too faint to be included in the catalog, but it should get us asymptotically close to the correct value.
Ahad's Constant is not really a constant in the strictest sense, as its value would differ (at least slightly) depending on where (and when) in the universe you are making the measurement. So, I prefer to call this Ahad's Magnitude.
How far from the Sun would you have to be before the sum total of all the light in the universe just exceeds the brightness of the Sun? That distance is called the Ahad Radius and is 11,500 AU (0.18 ly) from the Sun. If some unimaginably cold and lonely object were in a circular orbit around the Sun at the Ahad Radius, it would take 1.2 million years to complete a single orbit. That's far out!
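We can check both numbers ourselves. The sketch below assumes the Sun's apparent magnitude from Earth is -26.74 (a standard textbook value not quoted in the column), uses the inverse-square law to find the distance at which the Sun fades to magnitude -6.5, and applies Kepler's third law (P in years = a^1.5 for a in AU around one solar mass) for the orbital period.

```python
SUN_APPARENT_MAG = -26.74   # Sun's apparent magnitude at 1 AU (assumed value)
AHAD_MAG = -6.5             # integrated brightness of everything beyond the solar system

# Moving from 1 AU out to d AU dims the Sun by 5*log10(d) magnitudes.
# Solve SUN_APPARENT_MAG + 5*log10(d) = AHAD_MAG for d:
d_au = 10 ** ((AHAD_MAG - SUN_APPARENT_MAG) / 5.0)
print(f"Ahad radius: {d_au:,.0f} AU")

# Kepler's third law for a circular orbit about a 1-solar-mass primary:
period_yr = d_au ** 1.5
print(f"Orbital period: {period_yr:,.0f} years")
```

Both results land near the column's figures: roughly 11,000 AU for the radius and about 1.2 million years for the period, with the small differences traceable to rounding in the adopted solar magnitude.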