"MATH!"
Background: Another nice one from the archives of pending questions from e-mails =P
To: <questions@stupidquestionsanswered.com>
From: "C. Daniel *Last name removed*" <*E-mail address removed*>
Subject: I have a stupid question!
Date: Thu, 7 Nov 2002 11:50:19 -0500
-----
Imagine a figure-8 race track, each oval exactly 1 mile in length, so that one complete circuit of both ovals is exactly 2 miles. Place a car on the track at the cross-over point where the two ovals meet.
Have the car run one of the loops at an average speed of 30 miles per hour (no more, no less). As the car crosses into the second loop it accelerates to an average speed of 90 mph, so that when it completes the second loop, it has averaged 60 mph for the entire circuit. Simple, right?
Now, re-do the first paragraph above in your mind.
Now, re-do the second paragraph above in your mind.
NOW, how fast does the car have to travel along the second loop to
average a mile-a-minute? (Remember, 60 mph is a mile-a-minute.)
Answer: The car cannot go fast enough under any circumstances to achieve an overall speed of a mile-a-minute, because if the car averages 30 mph around the first oval it has already taken 2 minutes to cover that 1 mile, and there is still 1 mile yet to go.
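That time-budget argument can be checked with a few lines of Python (just a sketch of the arithmetic; "mile-a-minute" is taken as exactly 60 mph):

```python
# A 60 mph average over 2 miles allows a total budget of 2 minutes.
budget_min = 2 / 60 * 60           # 2 miles at 60 mph -> 2.0 minutes
lap1_min = 1 / 30 * 60             # 1 mile at 30 mph -> 2.0 minutes
remaining = budget_min - lap1_min  # time left for the second mile
print(remaining)  # 0.0 -- no time left, so no finite speed can do it
```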
Now my question. If, in the first part, the car CAN average 60 mph, why CAN'T it average a mile-a-minute in the second part? This has bugged me for about 10 years. I have asked mathematicians and engineers, and no one has been able to give me an answer. HELP!
Aladdin is starting to get angry with us 'cause we are giving the hard ones to him. Oh well, I guess when we are on the phone with the professor for one we will have to ask this one as well =P Well, here goes nothing:
The reason why is that speed is based on distance per unit time, right? Like miles per hour. It depends on both distance and time. Now, when we have the car going 30 mph, it takes him 2 minutes (60 min/hr ÷ 30 mph = 2 min per mile) for the first lap. When he travels at 90 mph for the second lap, it takes him 2/3 of a minute (60/90), or 40 seconds. So, he's traveled 2 miles in 2:40, which is an average of 45 mph.
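The calculation above can be sketched in Python (a sketch only; the lap length and the two speeds come straight from the question):

```python
# Average speed over the full figure-8: total distance / total time.
LAP_MILES = 1.0

def average_speed_mph(speed1_mph, speed2_mph):
    """Average mph over two equal laps run at two different speeds."""
    time1_hr = LAP_MILES / speed1_mph  # hours spent on lap 1
    time2_hr = LAP_MILES / speed2_mph  # hours spent on lap 2
    return (2 * LAP_MILES) / (time1_hr + time2_hr)

print(average_speed_mph(30, 90))  # 45.0 -- not 60!
```

Note the average is the harmonic mean of the two speeds, not the ordinary mean, because each lap covers equal distance rather than equal time.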
Info: Which makes sense when you think about it... the driver spent more time going 30 mph than he did going 90 mph. He didn't cover more "distance" at 30 than at 90, but more of his time in the car was spent at 30 mph. As the second-lap speed grows, the overall average approaches 60 mph asymptotically, so 60 can never actually be reached. The person I talked to also believes that at speeds just under the speed of light you might be able to slow time down enough to reach a 60 mph average. The answer to this question came from an unlikely source: a CPA and web surfer who e-mailed me. Just goes to show, to get hoards of useless knowledge you have to look all over the place.
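To see that asymptote for yourself, try cranking up the second-lap speed (a sketch using the same total-distance-over-total-time formula; the sample speeds are just illustrative):

```python
# The first lap at 30 mph already spent the whole 2-minute budget,
# so the overall average creeps toward 60 mph but never reaches it.
for speed2 in (90, 900, 9000, 9_000_000):
    avg = 2.0 / (1.0 / 30 + 1.0 / speed2)  # 2 miles / total hours
    print(f"{speed2:>9} mph second lap -> {avg:.4f} mph average")
```

Even at nine million mph the average only reaches about 59.9998 mph.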