Yeah, agreed... at terminal velocity the margins of error get multiplied.
Force only goes as the square of speed, though, and force is really the measure you're after.
Plus rolling resistance (RR), a large non-aero factor, weighs in.
What I'm seeing in my measurements is an RR force of about 30 lbs and an aero force at 60 mph of about 50 lbs... actually, one should be able to estimate the margin of error from this:
Say ±1 mph on the speedo reading; since aero force goes as speed squared, that works out to roughly +2 / -2 lbs of aero force at 60 mph.
A Cd change of ±0.01 on my VX yields ±2 lbs as well.
So a Cd change of ±0.02 is about the best I could hope to measure.
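A quick back-of-envelope check of those sensitivities in Python. The VX numbers here are my assumptions (Cd ~0.32, frontal area ~1.9 m²), so treat the absolute force as ballpark; the error terms are the interesting part:

```python
# Drag force F = 0.5 * rho * Cd * A * v^2, converted to lbf.
# Cd and frontal area below are assumed VX figures, not measured.
RHO = 1.225          # air density, kg/m^3
CD = 0.32            # assumed drag coefficient
AREA = 1.9           # assumed frontal area, m^2
MPH_TO_MS = 0.44704
N_TO_LBF = 0.224809

def drag_lbf(v_mph, cd=CD):
    v = v_mph * MPH_TO_MS
    return 0.5 * RHO * cd * AREA * v * v * N_TO_LBF

f60 = drag_lbf(60)                       # ~60 lbf with these assumptions
err_speed = drag_lbf(61) - f60           # ~2 lbf per +1 mph speedo error
err_cd = drag_lbf(60, CD + 0.01) - f60   # ~2 lbf per +0.01 Cd
print(f"F@60mph = {f60:.1f} lbf, +1 mph -> +{err_speed:.1f} lbf, "
      f"+0.01 Cd -> +{err_cd:.1f} lbf")
```

With these assumed numbers the baseline force lands a bit above the 50 lbs I'm measuring, but the sensitivities (~2 lbs per 1 mph, ~2 lbs per 0.01 Cd) hold either way.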
That's also ignoring wind variation... and I'm not convinced RR is a constant either.
Either way this gives an idea of what is possible.
I've been playing with coast-down testing a lot, using a video camera aimed at the speedo and a clock with a seconds readout. On review, I mark the timestamp at each 5 mph tick, then calculate the deltas between them in 10 mph increments. I average this over 5 or so runs. I think I can show that the margin of error is smaller, for a couple of reasons:
the deltas are calculated and relative: if I read one tick high or low, the next delta absorbs that error.
the averages should take wind and road grade out of the equation (although 10 runs might be better than 5).
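A sketch of the bookkeeping I'm describing. The tick timestamps below are made up for illustration; the point is that each 10 mph delta depends on only two tick readings, so a single misread tick only perturbs the spans touching it, and averaging across runs smooths wind and grade:

```python
# Per run: video timestamp (seconds) at each 5 mph speedo tick.
# Values are invented example data, not real coast-down measurements.
runs = [
    {60: 0.0, 55: 4.1, 50: 8.6, 45: 13.6, 40: 19.2},
    {60: 0.0, 55: 4.3, 50: 8.9, 45: 13.9, 40: 19.6},
    {60: 0.0, 55: 4.0, 50: 8.5, 45: 13.4, 40: 18.9},
]

def deltas_10mph(ticks):
    """Seconds to shed each overlapping 10 mph span (60->50, 55->45, ...)."""
    return {(hi, hi - 10): ticks[hi - 10] - ticks[hi]
            for hi in sorted(ticks, reverse=True) if hi - 10 in ticks}

# Average each span across runs to knock down run-to-run wind/grade noise.
spans = deltas_10mph(runs[0]).keys()
avg = {s: sum(deltas_10mph(r)[s] for r in runs) / len(runs) for s in spans}
for (hi, lo), t in avg.items():
    print(f"{hi}->{lo} mph: {t:.2f} s average")
```

From the averaged span times you can back out average deceleration per span, and from that the force at each speed band.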