Before I start, I'll admit that I've criticized the question based on its improbability; however, I've been persuaded otherwise. I'm going to try to do the calculations based on completely different formulas than I think have been used; I hope you'll stay with me as I work it out.
Let's imagine that Lucifer becomes a main-sequence star - in fact, let's call it a low-mass red dwarf. Main-sequence stars follow the mass-luminosity relation:
$$\frac{L}{L_\odot} = \left(\frac{M}{M_\odot}\right)^a$$
Where $L$ and $M$ are the star's luminosity and mass, and $L_\odot$ and $M_\odot$ are the luminosity and mass of the Sun. For stars with $M < 0.43M_\odot$, $a$ takes the value 2.3. Now we can plug Jupiter's mass ($1.8986 \times 10^{27}$ kg) into the formula, along with the Sun's mass ($1.98855 \times 10^{30}$ kg) and luminosity ($3.846 \times 10^{26}$ watts), and we get
$$\frac{L}{3.846 \times 10^{26}} = \left(\frac{1.8986 \times 10^{27}}{1.98855 \times 10^{30}}\right)^{2.3}$$
This becomes $$L = \left(\frac{1.8986 \times 10^{27}}{1.98855 \times 10^{30}}\right)^{2.3} \times 3.846 \times 10^{26}$$
which then becomes
$$L = 4.35 \times 10^{19}$$ watts.
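If you'd like to check that arithmetic yourself, here's a quick Python sketch (my addition, not part of the original derivation; the variable names are just placeholders):

```python
# Mass-luminosity estimate for Lucifer, using the values quoted above.
M_sun = 1.98855e30   # solar mass in kg
L_sun = 3.846e26     # solar luminosity in watts
M_jup = 1.8986e27    # Jupiter's mass in kg
a = 2.3              # exponent for stars with M < 0.43 M_sun

L_lucifer = L_sun * (M_jup / M_sun) ** a
print(f"L = {L_lucifer:.2e} W")   # prints roughly 4.35e+19 W
```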
Now we can work out the apparent brightness of Lucifer, as seen from Earth. For that, we need the formula
$$m = m_\odot - 2.5 \log \left(\frac{L}{L_\odot}\left(\frac{d_\odot}{d}\right)^2\right)$$
where $m$ is the apparent magnitude of the star, $m_\odot$ is the apparent magnitude of the Sun, $d_\odot$ is the distance to the Sun, and $d$ is the distance to the star. Now, $m_\odot = -26.73$ and $d_\odot$ is 1 (in astronomical units). $d$ varies. Jupiter is about 5.2 AU from the Sun, so at its closest to Earth, it would be ~4.2 AU away. We plug these numbers into the formula and find
$$m = -6.25$$
which is far dimmer than the Sun. Now, when Jupiter is farthest from Earth, on the opposite side of the Sun, it is ~6.2 AU away. We plug that into the formula and find
$$m = -5.40$$
which is dimmer still - although, of course, Lucifer would then be hidden behind the Sun. Still, to find the apparent magnitude of Lucifer at some distance $d$ (in AU) from Earth, we can rewrite the above formula as
$$m = -26.73 - 2.5 \log \left(\frac{4.35 \times 10^{19}}{3.846 \times 10^{26}}\left(\frac{1}{d}\right)^2\right)$$
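In code form, the same magnitude calculation looks like this (again just a sketch of the formula above, with placeholder names):

```python
import math

L_sun = 3.846e26       # solar luminosity in watts
L_lucifer = 4.35e19    # Lucifer's luminosity in watts, from the step above
m_sun = -26.73         # apparent magnitude of the Sun
d_sun = 1.0            # distance from Earth to the Sun in AU

def apparent_magnitude(d):
    """Apparent magnitude of Lucifer at a distance d (in AU) from Earth."""
    return m_sun - 2.5 * math.log10((L_lucifer / L_sun) * (d_sun / d) ** 2)

print(apparent_magnitude(4.2))   # about -6.25 (closest approach)
print(apparent_magnitude(6.2))   # about -5.40 (farthest, behind the Sun)
```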
By comparison, the full Moon has an average apparent magnitude of about -12.74 - much brighter than Lucifer. The apparent magnitude of both bodies can, of course, change - Lucifer's by transits of its moons, for example - but these are the optimal values.
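To put that difference in magnitudes into perspective (a rough figure I'm adding here, using the standard rule that 5 magnitudes correspond to a factor of 100 in flux), the full Moon would outshine Lucifer at its brightest by roughly

$$\frac{F_\text{Moon}}{F_\text{Lucifer}} = 10^{(-6.25 - (-12.74))/2.5} \approx 4 \times 10^{2},$$

that is, a factor of about 400.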
While the above calculations really don't answer most parts of your question, I hope they help a bit. And please, correct me if I made a mistake somewhere - LaTeX is by no means my native language, and I could have gotten something wrong.
Edit
The combined brightness of Lucifer and the Sun would depend on the angle between the Sun's rays and Lucifer's rays. Remember how we have different seasons because of the tilt of the Earth's axis? Well, the added heat would have to do with how Earth is tilted relative to each of the two. I can't give you a numerical result, but I can add that I hope it wouldn't be too much hotter than it is now, as I'm writing this!
Second Edit
As I said in a comment somewhere on this page, the mass-luminosity relation really only works for main-sequence stars. If Lucifer were not on the main sequence... well, then none of my calculations would be right.