As many of you know, I've been chasing this elusive bug for some 6+ years now. Something just didn't seem right, and controller precision was lost ever since the introduction of R5. Well, R5 brought so many changes, and the "bug" was so subtle, I figured the cause could be anything from input lag to frame rate management to too many hooks or too much crap in the game loop...
Anyhow, long story short: while playing around with madpeople & Imago's formula in wintrek.cpp (Trac ticket #88), I found that if I changed

Code:
yaw = pht->GetMaxTurnRate(c_axisYaw)

to

Code:
yaw = pht->GetMaxTurnRate(c_axisPitch)

using the pitch max turn rate for both the pitch and yaw values, the deadzone feels correct and more round. The precision at very slight input returns to normal.
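To illustrate what I think is going on (a minimal standalone sketch, not the real wintrek.cpp code; the function name, rate values, and deadzone shape are made up): when each stick axis gets scaled by a different max turn rate, equal deflections on the two axes produce unequal turn commands, so the response just outside the deadzone feels lopsided. Feed the same rate to both axes and it comes out symmetric, which would match the "more round" feel.

Code:
// Standalone sketch of per-axis turn-rate scaling around a radial
// deadzone; applyDeadzone and the rate values are illustrative only.
#include <cmath>
#include <cstdio>

struct Stick { float x, y; };   // raw joystick input in [-1, 1]

// Zero out input inside the deadzone, then ramp from 0 at its edge
// to the full per-axis turn rate at full deflection.
Stick applyDeadzone(Stick in, float deadzone,
                    float maxYawRate, float maxPitchRate)
{
    float mag = std::sqrt(in.x * in.x + in.y * in.y);
    if (mag < deadzone)
        return {0.0f, 0.0f};

    float scale = (mag - deadzone) / (1.0f - deadzone) / mag;
    return { in.x * scale * maxYawRate,     // yaw command
             in.y * scale * maxPitchRate }; // pitch command
}

int main()
{
    Stick in = {0.3f, 0.3f};  // equal deflection on both axes

    // Different per-axis rates: unequal commands for a diagonal input.
    Stick a = applyDeadzone(in, 0.1f, 2.0f, 1.5f);
    // Same (pitch) rate on both axes: symmetric commands.
    Stick b = applyDeadzone(in, 0.1f, 1.5f, 1.5f);

    std::printf("mixed rates:     yaw=%.3f pitch=%.3f\n", a.x, a.y);
    std::printf("pitch rate both: yaw=%.3f pitch=%.3f\n", b.x, b.y);
}

With mixed rates the same diagonal deflection prints different yaw and pitch commands; with the pitch rate on both axes they come out identical.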
btw, Rix still behaves the same as before with this change in place.
My question is this: given that an IC Int turns at the same speed regardless of its angle of pitch/bank, how is it that "yaw = pht->GetMaxTurnRate(c_axisYaw)" returns a different value than "yaw = pht->GetMaxTurnRate(c_axisPitch)" used in its place? It makes no sense to me.
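For what it's worth, a per-axis lookup like GetMaxTurnRate() usually just reads a stat straight out of the ship's data, so the two calls can return different numbers whenever the data lists different yaw and pitch rates, no matter how the ship actually feels in flight. Here's a hypothetical sketch of that pattern (HullType, maxTurnRates, and the values are invented; the real IGC code may work differently):

Code:
// Hypothetical per-axis turn-rate lookup; not the real IGC code.
#include <cstdio>

enum Axis { c_axisYaw = 0, c_axisPitch = 1, c_axisRoll = 2 };

struct HullType {
    // Per-axis rates typically come straight from the ship data file,
    // so yaw and pitch can differ even if the ship feels like it
    // turns at one speed in every direction.
    float maxTurnRates[3];
    float GetMaxTurnRate(Axis axis) const { return maxTurnRates[axis]; }
};

int main()
{
    HullType hull = {{1.2f, 1.5f, 2.0f}};  // made-up numbers, not real IC Int stats
    std::printf("yaw   = %.2f\n", hull.GetMaxTurnRate(c_axisYaw));
    std::printf("pitch = %.2f\n", hull.GetMaxTurnRate(c_axisPitch));
}

If the IC Int's data actually gives yaw and pitch different numbers, that alone would explain the two calls disagreeing.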
If only I knew how to code... I can find bugs, I just don't know how to fix them.