It'd be good to alert users to common reasons for discrepancies between computed and reported p-values, either in the text or with a link to something. For example, a discrepancy can arise when authors report a rounded test statistic: I ran across one example that reported an F of .81, and I suspect the p-value was flagged as discrepant because the real underlying F was 0.8053.
I know the algorithm does some kind of rounding check (only for one-way tests?), and it would be great to know where to go to read the details of that.