Description
Over the past few days, I came to thinking that instead of using synthetic data, we could use "compressed" data. That is to say: take 10 years of coarse daily ENACTS data, compute its SVD or Fourier transform, and keep only enough modes or harmonics to retain most of the variance (possibly keep fewer and add random noise when recomposing the data). Save that in GitHub, then recompose the data from the stored modes/harmonics (adding random noise) and interpolate to a finer grid if need be. A sketch of what the SVD route might look like follows below.
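As a minimal sketch of the SVD route, assuming an xarray DataArray `da` with dims `("time", "lat", "lon")` holding ~10 years of daily ENACTS-like data; the 95% variance target, function names and `noise_std` below are illustrative choices, not decisions:

```python
import numpy as np
import xarray as xr

def compress_svd(da, var_target=0.95):
    """Truncated SVD of the (time, space) anomaly matrix, keeping just
    enough modes to explain `var_target` of the total variance."""
    stacked = da.stack(space=("lat", "lon"))        # (time, lat*lon)
    mean = stacked.mean("time").values              # climatology to add back
    anom = stacked.values - mean
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    explained = np.cumsum(s ** 2) / np.sum(s ** 2)
    k = int(np.searchsorted(explained, var_target)) + 1
    # u, s, vt (truncated) plus the coords are what would be stored on GitHub.
    return u[:, :k], s[:k], vt[:k, :], mean

def recompose(u, s, vt, mean, time, lat, lon, noise_std=0.0):
    """Rebuild a (time, lat, lon) field from the kept modes, optionally
    adding noise to stand in for the discarded variance."""
    recon = (u * s) @ vt + mean                     # (time, lat*lon)
    if noise_std > 0:
        recon = recon + np.random.normal(0.0, noise_std, recon.shape)
    return xr.DataArray(
        recon.reshape(len(time), len(lat), len(lon)),
        coords={"time": time, "lat": lat, "lon": lon},
        dims=("time", "lat", "lon"),
    )
```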
This may make for more robust fake data. My playing with entirely synthetic data is borderline unrealistic, and while it still allows computing onset, cessation and CSC (which surprises me a bit), the next fancy thing it will need to calculate might blow up...
The question is how big the compressed data would be... And I assume there is what's needed in xarray/scipy/numpy/etc. to compute either SVD or Fourier, or something that can serve the same purpose, or enough to write them...
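On the tooling side, numpy alone covers both routes (`np.linalg.svd`, `np.fft.rfft`/`irfft`), and scipy/xarray offer equivalents, so nothing exotic would be needed. On size, a rough back-of-the-envelope with purely illustrative numbers: 10 years of daily data on a 100×100 grid in float32 is about 3653 × 100 × 100 × 4 bytes ≈ 146 MB, whereas storing, say, 50 SVD modes (u: 3653×50, s: 50, vt: 50×10000) is roughly 2.7 MB. The Fourier variant could look like the sketch below, again only a sketch with made-up parameter names:

```python
import numpy as np

def compress_fft(values, n_harmonics=20):
    """values: ndarray (time, lat, lon). Keep only the lowest harmonics
    of the real FFT along time."""
    spec = np.fft.rfft(values, axis=0)              # (freq, lat, lon), complex
    return spec[:n_harmonics], values.shape[0]      # truncated spectrum + length

def recompose_fft(spec_trunc, n_time, noise_std=0.0):
    """Zero-pad the truncated spectrum, invert back to daily values,
    and optionally add noise after inversion."""
    full = np.zeros((n_time // 2 + 1,) + spec_trunc.shape[1:], dtype=complex)
    full[: spec_trunc.shape[0]] = spec_trunc
    recon = np.fft.irfft(full, n=n_time, axis=0)
    if noise_std > 0:
        recon = recon + np.random.normal(0.0, noise_std, recon.shape)
    return recon
```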
Originally posted by @remicousin in #513 (comment)