Answer:
40 miles
Step-by-step explanation:
Let x be the number of miles driven and t the number of hours the drive took at 40 miles per hour.
Since distance = speed × time, we know that 40t is equal to x.
At 30 miles per hour the same trip would take 20 minutes longer, and 20 minutes is 1/3 of an hour, so the same distance x is also equal to 30(t + 1/3). Therefore 40t is equal to 30(t + 1/3).
Solve for t:
40t = 30(t+1/3)
40t = 30t + 10
Subtract 30t from both sides:
10t = 10
Divide both sides by 10:
t = 1
So the distance is x = 40t = 40 × 1 = 40 miles.
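If you want to double-check the algebra, here is a minimal sketch using sympy (assuming it is installed); it solves the same equation 40t = 30(t + 1/3) and confirms t = 1 hour and a distance of 40 miles:

```python
from sympy import Eq, Rational, solve, symbols

t = symbols('t', positive=True)

# 40t = 30(t + 1/3): same distance at 40 mph and at 30 mph with 20 extra minutes
eq = Eq(40 * t, 30 * (t + Rational(1, 3)))

t_val = solve(eq, t)[0]   # 1 (hour)
distance = 40 * t_val     # 40 (miles)
print(t_val, distance)    # 1 40
```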
Answer:
40
Step-by-step explanation:
Let $d$ be the distance to the beach, in miles. Then the time it took to drive to the beach, at 40 miles per hour, is $d/40$ (in hours).
If I had driven at 30 miles per hour instead, then it would take me $d/30$ hours. Note that 1 hour is equivalent to 60 minutes, so 20 minutes is equivalent to $20/60 = 1/3$ of an hour. Therefore,
\[\frac{d}{40} = \frac{d}{30} - \frac{1}{3}.\]Multiplying both sides by 120 to get rid of the fractions, we get
\[3d = 4d - 40,\]so $d = \boxed{40}$.
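As a quick check, substituting $d = 40$ back into the equation gives
\[\frac{40}{40} = 1 \qquad\text{and}\qquad \frac{40}{30} - \frac{1}{3} = \frac{4}{3} - \frac{1}{3} = 1,\]so both sides match: the 40-mile trip takes 1 hour at 40 miles per hour and $4/3$ hours (20 minutes more) at 30 miles per hour.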