Answer: Hi!
We know that the program has an average rating of 5.9 points.
The executives estimate that a 0.2-point drop in rating means a loss of $4.7 million in revenue.
This means that, if we define f(p) as the revenue as a function of the rating p:
f(5.9) = x
f(5.7) = x - $4.7 million
If we assume that the relationship is linear, then:
0.2 * k = $4.7 million
k = $4.7 million / 0.2
k = $23.5 million
This means that for every point of rating lost, they lose $23.5 million.
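As a quick sanity check on this arithmetic, here is a minimal sketch in Python (the names rating_drop and revenue_drop are just illustrative, not from the problem statement):

rating_drop = 0.2             # drop in rating points
revenue_drop = 4.7            # corresponding loss, in millions of dollars
k = revenue_drop / rating_drop
print(k)                      # 23.5 -> $23.5 million lost per rating point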
Now we can describe our function as:
f(5.9 - j) = x - ($23.5 million) * j
where j is the change with respect to the mean rating of 5.9 points; if j is positive, the rating is decreasing and therefore the revenue is decreasing.
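To make the model concrete, here is a minimal sketch of this linear relationship in Python; the baseline revenue x is unknown, so a hypothetical value is used, and the function name revenue is just illustrative:

def revenue(j, x, k=23.5):
    # Revenue (in millions) at a rating of 5.9 - j, given baseline revenue x at rating 5.9.
    return x - k * j

baseline = 100.0                            # hypothetical baseline revenue x, in millions
print(baseline - revenue(0.2, baseline))    # prints (approximately) 4.7, matching the given 0.2-point drop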
Now if we differentiate this with respect to j, we obtain:
d/dj f(5.9 - j) = -$23.5 million
This means that the rate of change of the revenue with respect to a change in the rating is -$23.5 million per point: if the TV program lost one full point of rating, they would lose $23.5 million.
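As a numerical confirmation of this derivative, here is a minimal sketch that redefines the hypothetical revenue function from above (the $100 million baseline is an assumed placeholder):

def revenue(j, x=100.0, k=23.5):
    # Hypothetical linear model: revenue in millions at a rating of 5.9 - j.
    return x - k * j

h = 1e-6
slope = (revenue(0.5 + h) - revenue(0.5)) / h    # finite-difference approximation of d/dj at j = 0.5
print(round(slope, 3))                           # -23.5: one rating point lost costs about $23.5 million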