If you look at the regression equation TVHOURS = 1.86 + 0.02(AGE), the slope is 0.02, which is practically a flat horizontal line. Although a test might show that this slope is statistically significant, 0.02 is so small that it adds very little to the overall result, so it has no practical significance.
For example, consider the difference in predicted hours between a 20-year-old and a 30-year-old. Using this equation, the predictions are 1.86 + 20(0.02) = 2.26 hours and 1.86 + 30(0.02) = 2.46 hours, a difference of just 12 minutes. Even for a wide age gap, from 20 to 70, the equation adds only 1 extra hour over that 50-year difference. Each additional year of age adds only 1.2 extra minutes of TV time, showing no practical significance.
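The arithmetic above can be sketched in a few lines of Python; the function name is hypothetical, and the intercept and slope come from the regression equation in the text:

```python
def predicted_tv_hours(age, intercept=1.86, slope=0.02):
    """Predicted daily TV hours from the equation TVHOURS = 1.86 + 0.02(AGE)."""
    return intercept + slope * age

# A 10-year age gap (20 vs. 30) changes the prediction by only 12 minutes.
diff_10yr = predicted_tv_hours(30) - predicted_tv_hours(20)
print(f"20 vs 30: {diff_10yr:.2f} hours = {diff_10yr * 60:.0f} minutes")

# Even a 50-year gap (20 vs. 70) adds just one hour.
diff_50yr = predicted_tv_hours(70) - predicted_tv_hours(20)
print(f"20 vs 70: {diff_50yr:.2f} hours = {diff_50yr * 60:.0f} minutes")
```

Running this confirms the values worked out above: 2.26 hours at age 20, 2.46 at age 30, and a one-hour gap across 50 years of age.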