
Consider using "compressed" over synthetic data #522

@remicousin

Description

Over the past few days, I came to think that instead of using synthetic data, we could use "compressed" data. That is to say: take 10 years of coarse daily ENACTS data, compute its SVD or Fourier transform, and keep only enough modes or harmonics to retain sufficient variance (possibly keep fewer and add random noise when recomposing the data). Save that in GitHub, then recompose the data from the stored modes/harmonics (adding random noise) and interpolate to a finer grid if need be.
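For example, here is a rough sketch of what the SVD route could look like with NumPy; the `Y`/`X` dimension names, the default mode count, and the noise/clipping choices are all placeholders, not a settled design:

```python
import numpy as np
import xarray as xr

def compress(da: xr.DataArray, n_modes: int = 50):
    # Flatten the spatial dims into one "space" axis so the cube
    # becomes a 2-D (time, space) matrix suitable for SVD.
    # Assumes spatial dims are named "Y" and "X".
    flat = da.stack(space=("Y", "X"))
    U, s, Vt = np.linalg.svd(flat.values, full_matrices=False)
    # Explained variance per mode, if we'd rather pick n_modes
    # by a variance threshold than a fixed count:
    #   np.cumsum(s**2) / np.sum(s**2)
    # Keep only the leading n_modes singular triplets; this is
    # what would get saved to GitHub.
    return U[:, :n_modes], s[:n_modes], Vt[:n_modes, :]

def recompose(U, s, Vt, noise_scale=0.0, seed=None):
    # Rebuild the (time, space) matrix from the stored modes.
    data = (U * s) @ Vt
    if noise_scale > 0:
        rng = np.random.default_rng(seed)
        data = data + rng.normal(scale=noise_scale, size=data.shape)
    # If the variable is precipitation, clip the negatives that
    # truncation and noise can introduce.
    return np.clip(data, 0, None)
```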

This may make for more robust fake data. My experiments with the entirely synthetic data are borderline unrealistic, and while they still allow computing onset, cessation and CSC (which surprises me a bit), the next fancy thing we need to calculate might blow up...

The question is how big the compressed data would be... And I assume xarray/SciPy/NumPy/etc. provide what's needed to compute either the SVD or the Fourier transform, or something that serves the same purpose, or at least enough building blocks to write them...
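A back-of-the-envelope estimate (grid size and mode count entirely made up) suggests the truncated SVD would be a small fraction of the raw cube:

```python
T, NY, NX, k = 3650, 200, 150, 50          # 10 years daily, 200x150 grid, 50 modes
full = T * NY * NX                          # values in the raw cube
compressed = k * (T + NY * NX) + k          # U + Vt + singular values
print(f"full: {full:,} values, compressed: {compressed:,} "
      f"({100 * compressed / full:.1f}% of original)")
# -> full: 109,500,000 values, compressed: 1,682,550 (1.5% of original)
```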

Originally posted by @remicousin in #513 (comment)
