The received wisdom is that a lower brace height shoots an arrow faster than a higher one, on account of the longer power stroke.
I'm not arguing with the principle, but why do folks who have opinions on brace height and cast never quantify this gain or loss?
Since I am more interested in having my set-up and brace height optimised for accuracy, I began to wonder how much difference a change in brace height really makes to the cast.
I know from extensive record keeping and testing that I derive a benefit from optimising my set-up for accuracy.
And I had an idea that any benefit in cast might be small, so I put a few arrows through a chronograph, then lowered the brace height by one inch and repeated the exercise, and guess what?
I came up with a mean gain in cast of about 2 fps.
Seems to me that it might be worth running larger samples to find out more; I've had a rough go at sizing one after the questions below.
A question for Dave, aka Woodbear (or anyone else mathematically inclined): what does the math predict?
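While we wait for the proper treatment, here is my own back-of-the-envelope attempt. It's a minimal sketch assuming a straight-line force-draw curve, and every number in it (40 lb at 28 in, 7 in brace, 12 lb preload at brace, 170 fps) is a placeholder assumption rather than a measurement from my bow:

```python
# Back-of-the-envelope estimate of the speed gain from lowering brace
# height, assuming a linear force-draw curve. All the numbers here
# (40 lb bow, 28 in draw, 7 in brace, 12 lb preload, 170 fps) are
# placeholder assumptions, not measurements.

import math

peak_weight = 40.0    # lb, draw weight at full draw (assumed)
preload     = 12.0    # lb, string force at brace height (assumed)
draw_length = 28.0    # in (assumed)
brace       = 7.0     # in (assumed)
drop        = 1.0     # in, how far the brace height is lowered
speed       = 170.0   # fps, arrow speed at the old brace height (assumed)

stroke = draw_length - brace                      # power stroke, in
energy = 0.5 * (preload + peak_weight) * stroke   # stored energy, in·lb
                                                  # (area under a linear
                                                  # force-draw curve)

# The extra inch of stroke sits at the bottom of the force curve,
# so the added energy is roughly the preload force times the drop.
extra_energy = preload * drop

# Speed scales with sqrt(energy) at fixed arrow mass and efficiency.
new_speed = speed * math.sqrt(1.0 + extra_energy / energy)
print(f"predicted gain: {new_speed - speed:.1f} fps")
```

With those assumed numbers it predicts a gain of about 1.9 fps, the same order as the roughly 2 fps I measured, though the model is far too crude to lean on the agreement.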
And has anyone else run chrono samples?
If so, how large a sample, and with how much variation?
And honestly, is your normal range of variation in cast greater than or less than the likely change in cast?
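For anyone tempted to run those larger samples, here's a quick way to size them, using the standard two-sample formula for detecting a difference in means. It assumes the shot-to-shot spread is roughly normal, and the 3 fps standard deviation is an assumed figure; plug in whatever your own chrono strings show:

```python
# Rough sample-size check: how many shots per brace height do you need
# before a ~2 fps mean difference stands out from normal shot-to-shot
# spread? Standard two-sample formula; the 3 fps spread is an assumed
# figure, not a measurement.

import math

sigma   = 3.0   # fps, assumed shot-to-shot standard deviation
delta   = 2.0   # fps, the mean difference we're trying to detect
z_alpha = 1.96  # two-sided 5% significance
z_beta  = 0.84  # 80% power

n = 2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
print(f"about {math.ceil(n)} shots at each brace height")
```

Note that the required sample grows with the square of the spread: at 6 fps of scatter instead of 3 you would need four times the shots, which is exactly what my last question is driving at.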