Edit: Hijacking this issue as a more general question about the computational cost of running ModGP.
How much disk space is required to run the whole thing? Adding CAPFITOGEN may increase the total somewhat, but I think ModGP and the BioClim data are the heaviest consumers of data and disk space.
What are the total storage requirements for the bioclimatic data download? I'm trying to make a similar download for testing CAPFITOGEN locally, and so far I have two files for 2m temperature that seem to cover less than a year of data. Each file is ~23GB. I guess there is some further processing that reduces their size and stores them as NetCDF, so maybe the end result takes up less space.
So for the current example of ModGP with data from 1985--2015, it may be too much to store locally. I can run shorter tests, so it's OK for now, but I think we should spell out how storage-hungry this process is. @trossi or @MichalTorma, do you know how much space these data use (on LUMI or elsewhere)?
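For a rough sense of scale, a back-of-envelope estimate based on the two files observed so far (assumptions: ~23GB per file, two files per year of one variable, no compression; the real per-year count and post-processing savings are unknown):

```python
# Back-of-envelope estimate of the raw 1985-2015 download size.
# All numbers are assumptions from the files seen so far, not measured totals.
gb_per_file = 23          # observed size of one 2m-temperature file
files_per_year = 2        # observed so far; could be an underestimate
years = 2015 - 1985 + 1   # 1985-2015 inclusive

raw_total_gb = gb_per_file * files_per_year * years
print(f"~{raw_total_gb} GB (~{raw_total_gb / 1024:.1f} TB) before any NetCDF conversion")
# i.e. on the order of a terabyte or more, before any size reduction
```

So even if the NetCDF step shrinks things considerably, the intermediate download alone looks like terabyte-scale storage.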
It would be nice to add an explicit warning in the code, e.g. in the ModGP master script where the download happens.
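Such a warning could be as simple as a free-space check before the download starts. A minimal Python sketch (assuming the master script is Python, which I haven't verified, and with the ~1.5TB threshold being only a guess extrapolated from the file sizes above):

```python
import shutil

def check_disk_space(path: str, required_gb: float) -> bool:
    """Return True if `path` has at least `required_gb` gigabytes free."""
    free_gb = shutil.disk_usage(path).free / 1024**3
    return free_gb >= required_gb

# Warn (or abort) before kicking off the download; the threshold is a
# placeholder until we know the real storage footprint.
if not check_disk_space(".", 1500):
    print("WARNING: less than ~1.5 TB free; the 1985-2015 download may not fit.")
```

The same check could equally be done in whatever language the master script uses; the point is just to fail early with a clear message instead of filling the disk mid-download.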