Entropy increase from a star's thermal radiation


A given star is a sphere with a radius of 5.81 × 10⁸ m and an average surface temperature of 6970 K. Determine the amount by which the star's thermal radiation increases the entropy of the universe each second. Assume that the star is a perfect blackbody, and that the average temperature of the rest of the universe is 2.73 K. Do not consider thermal radiation absorbed by the star from the rest of the universe.
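The page does not include a worked answer, so here is a minimal sketch of the standard approach, assuming the Stefan-Boltzmann law for the radiated power. The star emits P = σ(4πR²)T⁴ joules per second as heat leaving at T_star = 6970 K and arriving in surroundings at T_univ = 2.73 K; each second the star's entropy drops by P/T_star while the universe's rises by P/T_univ, so the net production rate is dS/dt = P(1/T_univ − 1/T_star). All numbers below come from the problem statement; σ is the Stefan-Boltzmann constant.

```python
import math

# Given quantities from the problem statement.
R = 5.81e8        # star radius, m
T_star = 6970.0   # star surface temperature, K
T_univ = 2.73     # average temperature of the rest of the universe, K
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Power radiated by a spherical blackbody: P = sigma * A * T^4.
A = 4.0 * math.pi * R**2
P = SIGMA * A * T_star**4  # heat leaving the star each second, W

# Each second the star loses entropy P/T_star and the colder universe
# gains P/T_univ, so the net entropy production rate is:
dS_dt = P * (1.0 / T_univ - 1.0 / T_star)

print(f"Radiated power:        {P:.3e} W")
print(f"Entropy increase rate: {dS_dt:.3e} J/(K*s)")
```

With the given numbers this evaluates to P ≈ 5.68 × 10²⁶ W and dS/dt ≈ 2.1 × 10²⁶ J/(K·s); note that the 1/T_star term is negligible next to 1/T_univ, so nearly all of the entropy production happens in the cold surroundings.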
