Let me simplify, then complicate the question.
Fred: I'm specifically considering drag racing with this question. On a road course the issue is considerably more complicated, and optimal shifting for power purposes really only applies to long straights.
First off, I'm an engineer but not a pilot or experienced drag racer. I believe Tom's empirical evidence that short shifting produces better 1/4 mile times -- I just want to understand *why*. Here's what bothers me:
I think we can all agree that you want to shift when you can accelerate faster in the gear you're shifting INTO than in the gear you're currently in. Now, acceleration is the driving force at the rear wheels (F), minus drag and rolling friction, divided by mass. Restated, we have the familiar equation a = F' / M, where F' is the net of driving force minus drag & friction. For the purposes of finding optimal shift points, mass is constant and can be ignored. Drag and rolling friction depend on road speed (and tire inflation), not on which gear you're in, so at any given speed they're the same in either gear and can be ignored too. So the problem reduces to: maximize F, the driving force at the rear wheels.
F is simply the engine's torque multiplied by the overall gear ratio between the engine and the rear wheels. (Strictly speaking that gives wheel torque in ft-lbs, and you'd divide by tire radius to get force -- but tire radius is constant, so the comparison is unaffected.) These multipliers are fixed for a given gear, so we can enumerate them:
m1 = 2.66 (1st) * 3.07 (diff) = 8.166
m2 = 1.78 * 3.07 = 5.46
m3 = 1.3 * 3.07 = 3.99
m4 = 1.0 * 3.07 = 3.07
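If it helps, here's the same bookkeeping as a small Python sketch. The ratios are the ones listed above; the function names and the 400 ft-lbs example input are just mine for illustration:

    # Overall torque multiplication from crank to rear wheels, per gear.
    # Ratios are the ones listed above (my car); swap in your own.
    GEARS = {1: 2.66, 2: 1.78, 3: 1.30, 4: 1.00}
    DIFF = 3.07

    def overall_ratio(gear):
        """Gearbox ratio times final drive."""
        return GEARS[gear] * DIFF

    def wheel_torque(engine_torque_ftlbs, gear):
        """Net torque at the rear wheels, ft-lbs (divide by tire radius for force)."""
        return engine_torque_ftlbs * overall_ratio(gear)

    for g in GEARS:
        # 400 ft-lbs is just an example input, not a dyno figure
        print(f"m{g} = {overall_ratio(g):.3f}; "
              f"400 ft-lbs at the crank -> {wheel_torque(400, g):.0f} ft-lbs at the wheels")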
So clearly, optimal shift points occur when the rear wheel driving force (F) in your current gear drops below the driving force obtainable in the next gear. Let's compare shifting at 6,000 RPM with shifting at the HP peak @ 5,600 RPM. I'll use my car's stock dyno graph for these figures, but the basic idea is the same for any car. Note that because I'm using chassis dyno graphs, HP/torque figures are net of drivetrain losses.
At 5,600 RPM in 1st, net engine torque is 360 ft-lbs, which means that driving force is (360 * 8.166 =) 2940 ft-lbs at the rear wheels. Shifting at this point will drop RPMs to 3745, at which point torque is 440 ft-lbs and driving force is (440 * 5.46 =) 2402 ft-lbs.
At 6,000 RPM in 1st, engine torque is 330 ft-lbs and F = 2695 ft-lbs. The shift to 2nd drops RPMs to 4010, at which point torque is also 440 ft-lbs and driving force is also 2402 ft-lbs.
The net of all this math is that the Viper is still accelerating faster in 1st at 6,000 RPM than it will be once shifted into 2nd. It is thus advantageous to stay in 1st as long as possible, i.e. until redline. I won't repeat the calculations, but the results are similar for 2nd and 3rd gear, except that the optimal shift point is 6,100 RPM in 2nd and 6,000 RPM in 3rd.
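Same comparison in sketch form, reusing the rounded multipliers m1 and m2, so the results land within a few RPM / ft-lbs of the figures above:

    # Self-contained sketch of the 1st -> 2nd comparison above.
    M1, M2 = 8.166, 5.46    # overall ratios for 1st and 2nd, from the list earlier

    def rpm_after_shift(rpm):
        # Road speed doesn't change across an (idealized) instantaneous shift,
        # so engine speed drops by the ratio of the two overall ratios.
        return rpm * M2 / M1

    # (shift RPM, net engine torque in 1st, net engine torque in 2nd after the drop)
    # -- dyno figures quoted above; read these off your own chart.
    for shift_rpm, tq1, tq2 in [(5600, 360, 440), (6000, 330, 440)]:
        f1 = tq1 * M1     # wheel torque (ft-lbs) if you stay in 1st
        f2 = tq2 * M2     # wheel torque (ft-lbs) right after the shift to 2nd
        print(f"shift @ {shift_rpm}: 1st = {f1:.0f} ft-lbs, "
              f"2nd (at ~{rpm_after_shift(shift_rpm):.0f} RPM) = {f2:.0f} ft-lbs")

Run it and 1st comes out ahead at both shift points, which is exactly the puzzle.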
So... what's wrong with this logic? Why is it faster to shift early? I can only think of one explanation: the power delivery curve in lower gears looks quite different from the curve in 4th. For instance, in lower gears the rotational inertia of the engine & flywheel is much more of a factor, because RPMs are climbing more quickly and thus more of the engine's power is being consumed spinning up its own internals. Also, driveline frictional losses might look different in lower gears -- torque in the driveline is higher, so the gears will consume more power. And only 4th gear is 1:1, so tranny losses will be higher in the lower gears.
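For what it's worth, here's a back-of-the-envelope sketch of that first effect (in metric, just because the formula is cleaner that way). The inertia, tire radius, and mass below are round numbers I've made up purely for illustration, NOT measured Viper figures -- the point is only that a torque-only analysis flatters the low gears, because the spinning engine "adds" more effective mass there:

    # How heavy does the engine/flywheel "look" in each gear?
    # All three constants below are assumed round numbers, not Viper specs.
    I_ENGINE = 0.5      # kg*m^2, engine + flywheel + clutch (assumed)
    R_TIRE   = 0.34     # m, rear tire rolling radius (assumed)
    MASS     = 1600     # kg, car + driver (assumed)

    MULT = {1: 8.166, 2: 5.46, 3: 3.99, 4: 3.07}   # overall ratios from above

    for gear, m in MULT.items():
        # The crank turns (m / R_TIRE) radians per metre of road travel, so the
        # engine's kinetic energy behaves like an extra I * (m / R_TIRE)^2 of mass.
        m_equiv = I_ENGINE * (m / R_TIRE) ** 2
        print(f"gear {gear}: spinning engine ~ {m_equiv:.0f} kg extra "
              f"({100 * m_equiv / MASS:.1f}% of the car)")

With those made-up numbers the penalty is roughly 18% of the car's mass in 1st versus about 2.5% in 4th. That narrows the 1st-gear advantage calculated above but doesn't obviously erase it, which is part of why I'd like to see real per-gear charts.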
Two questions for the board, then:
(1) can anyone point out the flaw in my reasoning above?
(2) does anyone have dyno charts for the lower three gears?