We treated the gravitational field of the earth as a constant (9.80 m/s2), but we know that the gravitational field of the earth should fall off as 1/r2. How far away from the surface of the earth would we have to go for the field to decrease in the last decimal place, which is the 0.1% level? In other words, over what distance from the surface of the earth is the field essentially constant to the 0.1% level? Treat the earth as a uniform sphere of mass with radius 6370 km.
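One way to estimate the scale involved, using only the 1/r^2 law stated in the problem (a sketch, not necessarily the method the course expects):

```latex
g(r) = \frac{GM}{r^{2}}
\quad\Longrightarrow\quad
\frac{dg}{g} = -2\,\frac{dr}{r},
```

so a fractional decrease of 0.001 in g corresponds to a fractional increase of about 0.001/2 = 0.0005 in r, i.e. roughly 0.0005 x 6370 km, or about 3.2 km above the surface.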
I used the equation g = GM/r^2.
I multiplied 9.8 x 0.999 to get g = 9.7902 m/s^2.
Mass of the earth: M = 5.97e24 kg
Radius of the earth: R = 6370000 m
I plugged these values into g = GM/r^2 and solved for r.
Then I subtracted the radius of the earth from r to get 9799.25 m (the distance above the surface where the pull is 9.7902 m/s^2).
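A quick numerical check of these steps (a sketch, using an assumed value G = 6.674e-11 and the M and R quoted above). One likely source of trouble: GM/R^2 with these numbers gives about 9.82 m/s^2 at the surface, not exactly 9.80, so mixing the measured 9.8 with GM/r^2 bakes that mismatch into the answer. Taking 0.999 of the *computed* surface value keeps everything self-consistent:

```python
import math

# Values quoted in the post; G is an assumed textbook value.
G = 6.674e-11   # gravitational constant, N m^2 / kg^2
M = 5.97e24     # mass of the earth, kg
R = 6.37e6      # radius of the earth, m

# Surface field from GM/R^2 -- note this comes out ~9.82, not 9.80.
g_surface = G * M / R**2

# Target field: 0.1% below the computed surface value.
g_target = 0.999 * g_surface

# Solve g_target = GM/r^2 for r, then take the height above the surface.
r = math.sqrt(G * M / g_target)
h = r - R

print(g_surface)   # ~9.82 m/s^2
print(h)           # ~3187 m, i.e. about 0.0005 * R
```

Because the g values appear as a ratio, G and M cancel and h = R(1/sqrt(0.999) - 1), about 3.2 km. Plugging the literal 9.7902 into GM/r^2 instead gives a distance a few times larger, which may be where the 9799 m figure comes from.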
I've also tried multiplying 9.8 by 0.001 and working it that way, but to no avail... maybe I'm just not understanding the question?