In the year 2000, the average cost of a computer could be modeled by the equation C = -5t^2 + 750, where t is the number of years since 2000. By the year 2008 the average cost had changed, so it could be modeled by the equation C = -10t^2 + 500. Find the difference in the average costs for a computer between 2008 and 2000.

Answer:

[tex]\Delta C=-5t^2-250[/tex]

Step-by-step explanation:

Given:

The average cost of a computer in the year 2000 is given as:

[tex]C=-5t^2+750[/tex]

The average cost of a computer in the year 2008 is given as:

[tex]C=-10t^2+500[/tex]

Now, the difference in average cost between the years 2008 and 2000 can be calculated by subtracting the average cost in 2000 from the average cost in 2008.

Writing this in equation form, the difference in average cost (ΔC) is:

[tex]\Delta C=C_{2008}-C_{2000}\\\\\Delta C=(-10t^2+500)-(-5t^2+750)\\\\\textrm{Distributing the negative sign through the second polynomial, we get:}\\\\\Delta C=-10t^2+500+5t^2-750\\\\\textrm{Grouping like terms, we get:}\\\\\Delta C=(-10t^2+5t^2)+(500-750)\\\\\Delta C=-5t^2-250[/tex]

Therefore, the difference in the average costs for a computer between 2008 and 2000 is [tex]\Delta C=-5t^2-250[/tex].
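
As a quick sanity check (not part of the original question), evaluating both models at a sample value such as t = 2 gives the same difference as the simplified expression:

[tex]C_{2008}=-10(2)^2+500=460,\quad C_{2000}=-5(2)^2+750=730\\\\C_{2008}-C_{2000}=460-730=-270\\\\\Delta C=-5(2)^2-250=-270[/tex]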
