
Long Data Sets not re-calculating correctly #3

@Zanidean

Description


There is an issue with runs not being recalculated correctly on long data sets when using xmR(). When the data set is longer than about 20-30 points and has high variance, the function will sometimes miss a run or recalculate the bounds at a different place than what's expected.
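
For reference, this is a minimal sketch of the kind of call that surfaces the problem, assuming the usual data frame + measure-column interface and the recalc argument as documented (argument names may differ by version); the data here are simulated rather than a real MHC measure:

```r
library(xmrr)

set.seed(42)

# Simulate a long (35-point), high-variance series with a sustained shift
# partway through; this is the situation where the recalculated bounds
# sometimes land in the wrong place.
Year <- 1990:2024
Measure <- c(rnorm(20, mean = 50, sd = 15),  # noisy baseline
             rnorm(15, mean = 80, sd = 15))  # sustained shift (a run)
df <- data.frame(Year, Measure)

# recalc = TRUE asks xmR() to recompute the central line and limits when a
# run is detected; with this much noise the recalculation can settle in the
# wrong place or miss the run entirely.
xmr_data <- xmR(df, "Measure", recalc = TRUE)
```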

I think this is caused by the following scenario:

When the bounds are initially calculated, there may be a high-variance run in the data that the bounds are recalculated to envelop. Inside that envelope, a lower short run is 'created', a bit like an inverted U shape. The high variance in the initial run means the function incorporates some of the high-variance points into the second recalculation, which then 'creates' a new run, and so on, recalculating bounds until it reaches a kind of equilibrium.

I've tried using a conditional while loop to test for runs, but couldn't get the exit condition right to guarantee it wouldn't loop forever. Instead, I've opted to repeat some of the sub-functions inside xmR() to catch as many runs as possible.
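
For illustration, one way to keep such a loop from running forever is to pair a convergence check with a hard cap on iterations. This is only a sketch of the idea, not the package internals; recalculate_bounds() below is a hypothetical stand-in for the sub-functions that recompute the central line and limits.

```r
# Hypothetical sketch only: repeat the bound recalculation until the bounds
# stop moving, but never more than max_iter times, so a pathological series
# cannot send the loop off forever. recalculate_bounds() is a stand-in for
# the real sub-functions inside xmR().
recalc_until_stable <- function(df, measure, max_iter = 10, tol = 1e-8) {
  bounds <- recalculate_bounds(df, measure)              # hypothetical helper
  for (i in seq_len(max_iter)) {
    new_bounds <- recalculate_bounds(df, measure, bounds)
    if (max(abs(new_bounds - bounds)) < tol) break       # reached equilibrium
    bounds <- new_bounds
  }
  bounds
}
```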

Another solution may be to add an argument for the number of repetitions, but since MHC has a mostly stable use case for xmR(), I won't do this unless the bug actually crops up in practice.
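
If that ever becomes necessary, the interface could look something like this. Again, purely a sketch: compute_bounds() and recalculate_bounds() are hypothetical stand-ins for the existing sub-functions, and repetitions is the proposed new argument.

```r
# Hypothetical sketch of a repetitions argument: expose the number of
# recalculation passes instead of hard-coding repeated sub-function calls.
xmR_with_reps <- function(df, measure, recalc = FALSE, repetitions = 3) {
  bounds <- compute_bounds(df, measure)                    # hypothetical helper
  if (recalc) {
    for (i in seq_len(repetitions)) {
      bounds <- recalculate_bounds(df, measure, bounds)    # hypothetical helper
    }
  }
  bounds
}
```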
