The Dynamically Consistent ENsemble of Temperature (DCENT) is a surface data set for monitoring global and regional temperatures since 1850. This product provides combined land surface air temperature (LSAT) and sea surface temperature (SST) estimates at monthly 5°×5° resolution. A major feature of DCENT, compared with other widely used products, is the harmonization of land and ocean temperatures throughout the historical period.
This repository contains code for reproducing DCENT from scratch (Chan et al., 2024a). The DCENT data set is available at https://doi.org/10.7910/DVN/NU4UGW.
Reference: Chan D., Gebbie G., Huybers P., & Kent E. (2024). An Ensemble of Earth Surface Temperature Change since 1850 with Dynamically Consistent Land and Ocean Evolution. Scientific Data.
The Dynamically Consistent ENsemble of Temperature (DCENT) is an ensemble of historical Earth surface temperature estimates featuring sophisticated bias adjustments and comprehensive uncertainty quantification. It provides a cleaner basis for quantifying historical climate change at both global and regional scales, and for understanding climate variability and the dynamics behind it.
The development of DCENT builds upon a series of published papers and can be summarized into five steps, as indicated in the schematic below:
- (1) Homogenizing land station temperatures (Chan et al., 2024b);
- (2) Inferring near-coast SSTs from homogenized coastal station temperatures (Chan et al., 2023);
- (3) Groupwise SST comparisons (Chan and Huybers, 2019; 2021);
- (4) Correcting groupwise homogenized SSTs using inferred near-coast SSTs (Chan et al., 2024a);
- (5) Combining the land and ocean components.
All scripts for generating DCENT have been organized in this stand-alone repository, with the following structure:
- 📁 root_directory/ - The main folder for all resources
  - 📁 DCENT_code/ - Directory of this repository
    - 📁 SATH_V3/ - Pairwise land station temperature homogenization
    - 📁 ICOADS3_preprocess_NC/ - Preprocessing for ICOADS3 in NetCDF format
    - 📁 SST_Intercomparison/ - Sea surface temperature comparisons
    - 📁 Air_Ocean_Intercomparison/ - Air and ocean comparisons
    - 📁 others/ - Downloading and processing other estimates
    - 📁 m_map/ - Dependent functions
    - 📁 CD_Computation/ - Dependent functions
    - 📁 CD_Figures/ - Dependent functions
  - 📁 Data/ - Main data storage directory
    - 📁 DCENT/ - The DCENT ensemble
    - 📁 DCLAT/ - Land station temperatures
    - 📁 GWSST/ - Groupwise homogenized SSTs
    - 📁 ICOADS/ - International Comprehensive Ocean-Atmosphere Data Set
    - 📁 others/ - Other existing estimates
    - 📁 log/ - Output logs of computation
Note that all data, including raw, intermediate, and final outputs, are provided here.
- Git clone this repository.
- Change the directories in `dir_list.txt` for the individual data sub-folders.
- Change the information in `DCENT_config.sh` according to your system. A list of parameters and their meanings is given below.
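The exact format of `dir_list.txt` is defined by this repository; as a purely hypothetical illustration (paths invented), it would map each data sub-folder to a storage location on your system, for example:

```
/scratch/<user>/DCENT/Data/DCENT
/scratch/<user>/DCENT/Data/DCLAT
/scratch/<user>/DCENT/Data/GWSST
/scratch/<user>/DCENT/Data/ICOADS
/scratch/<user>/DCENT/Data/others
/scratch/<user>/DCENT/Data/log
```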
We recommend reproducing the analysis in the following order. Scripts for submitting jobs are provided as `.sh` files. Although we use namelists to avoid modifying these scripts where possible, some modification may be required to match your specific SLURM setup. When submitting jobs, make sure you are in the correct code sub-folder.
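The submission scripts draw on the settings in `DCENT_config.sh`. As a hypothetical sketch of how such parameters could feed a SLURM command (the values, the partition mapping, and the script name are assumptions; the function only assembles and prints the command, so nothing is submitted):

```shell
#!/bin/sh
# Assumed values mirroring DCENT_config.sh; adapt to your cluster.
cluster_account=soton
cluster_use=batch
cluster_time=3599

# Dry run: build the sbatch command line for inspection instead of
# submitting it. Arguments: job script, memory in MB.
build_sbatch() {
  script=$1; mem=$2
  echo "sbatch --account=$cluster_account --partition=$cluster_use" \
       "--time=$cluster_time --mem=${mem}MB $script"
}

build_sbatch my_step.sh 50000
```

Replacing `echo` with direct execution would turn the dry run into an actual submission.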
This step is in `ICOADS3_preprocess_NC` and has four substeps:
- [Downloading data] In the command line, use the following to download the data:

```shell
nohup ./download_ds548.0_1800s.csh &
nohup ./download_ds548.0_1900s.csh &
nohup ./download_ds548.0_2000s.csh &
```
- [Preprocessing data] This includes finding the nation and SST-method information, as well as calculating the winsorized mean of sea surface temperature (SST) and marine air temperature (MAT) in each 5-day, 1° grid box. In the command line, run:

```shell
./submit_Step_01_02.sh
```
- [Calculating Neighbor Std] This step calculates the standard error required for the buddy checks.

```shell
./ICOADS_NC_Step_03_Neighbor_std
```
- [Performing Buddy Check] This step finalizes the quality control of the ICOADS data.

```shell
./submit_Step_04.sh
```
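Two of the quantities in this step, the winsorized bin mean and the neighbour-based buddy check, can be sketched as below. The function names and data are hypothetical, and this is an illustration rather than the repository's implementation:

```shell
#!/bin/sh
# Winsorized mean: the lowest and highest fraction p of values are
# clamped to the nearest retained value before averaging, limiting
# the influence of outliers within a bin.
winsorized_mean() {
  p=$1; shift
  printf '%s\n' "$@" | sort -n | awk -v p="$p" '
    { v[NR] = $1 }
    END {
      k = int(p * NR)                 # values clamped in each tail
      lo = v[k + 1]; hi = v[NR - k]   # clamp limits
      for (i = 1; i <= NR; i++) {
        x = v[i]
        if (x < lo) x = lo
        if (x > hi) x = hi
        s += x
      }
      printf "%.4f\n", s / NR
    }'
}

# Buddy check: flag an observation that departs from the mean of its
# neighbours ("buddies") by more than nsigma neighbour standard
# deviations.
buddy_check() {
  obs=$1; nsigma=$2; shift 2
  printf '%s\n' "$@" | awk -v obs="$obs" -v n="$nsigma" '
    { s += $1; ss += $1 * $1; c++ }
    END {
      mean = s / c
      sd = sqrt(ss / c - mean * mean)
      d = obs - mean
      if (d < 0) d = -d
      if (sd > 0 && d > n * sd) print "FAIL"; else print "PASS"
    }'
}

winsorized_mean 0.1 1 2 3 4 5 6 7 8 9 100    # outlier 100 is clamped to 9
buddy_check 25.0 3 20.1 20.3 19.8 20.0 20.2  # 25.0 is far from its buddies
```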
This step is in `SST_Intercomparison`. The analysis has been wrapped into a single shell script:

```shell
./ICOADS_monthly_update.sh
```
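The essence of a groupwise comparison can be illustrated with a toy calculation: collocated SST measurements from two nation- or method-based groups are differenced, and the mean difference estimates a relative offset between the groups. The values below are invented, and the actual analysis (Chan and Huybers, 2019; 2021) uses a linear-mixed-effects framework over many such pairs:

```shell
#!/bin/sh
# Toy estimate of a relative offset between two measurement groups
# from collocated pairs; data are invented for illustration.
group_offset() {
  # stdin: "groupA_value groupB_value" per collocated pair
  awk '{ s += $1 - $2; n++ } END { printf "%.2f\n", s / n }'
}

printf '18.5 18.2\n20.1 19.9\n15.3 15.0\n' | group_offset  # mean A-minus-B offset
```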
This step is in `SATH_V3` and has three substeps:
- [Downloading data] Update to get the most recent version of GHCNmV4:

```shell
operational.sh
```
- [Homogenization] Perform the pairwise homogenization:

```shell
./Run_SATH_multiple_CPUs.sh GHCN auto 50 100    # The first 100 members
./Run_SATH_multiple_CPUs.sh GHCN GAPL 50 1000   # The second 100 members
```
- [Post-processing] Grid the station-based temperatures and calculate anomalies relative to the 1982–2014 climatology:

```shell
./submit_GHCN3R_post.sh GHCN auto 50
./submit_GHCN3R_post.sh GHCN GAPL 50
```
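The pairwise idea behind the homogenization can be illustrated with a toy difference series: subtracting a neighbouring station removes the shared climate signal, so a step change in the target-minus-neighbour series marks a candidate inhomogeneity. The sketch below (hypothetical function and data; the split index is supplied rather than detected, whereas the repository detects and attributes breakpoints automatically) simply measures the step size across a known split:

```shell
#!/bin/sh
# Measure the mean shift in a series across a given split point.
pairwise_step_size() {
  split=$1; shift
  printf '%s\n' "$@" | awk -v k="$split" '
    { d[NR] = $1 }
    END {
      for (i = 1; i <= k; i++) s1 += d[i]
      for (i = k + 1; i <= NR; i++) s2 += d[i]
      printf "step size: %.2f\n", s2 / (NR - k) - s1 / k
    }'
}

# The toy difference series jumps by about 0.5 after the 4th value.
pairwise_step_size 4 0.0 0.1 0.3 0.0 0.6 0.7 0.5 0.6
```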
This step is in `Air_Ocean_Intercomparison` and has two substeps:
- [Infer coastal SSTs] Infer near-coast SSTs from coastal station-based temperatures:

```shell
./submit_AOI.sh
```
- [Generate DCENT] Generate the final product:

```shell
./submit_output_DCENT.sh
```
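The inference in the first substep rests on the relationship between coastal air temperature and nearby SST. A toy version is sketched below: fit a line to collocated LSAT/SST pairs over an overlap period, then predict SST from a new LSAT value. The data are invented, and ordinary least squares stands in for the method of Chan et al. (2023):

```shell
#!/bin/sh
# Fit SST = a + b * LSAT by least squares, then predict for a new LSAT.
fit_and_predict() {
  # arg: LSAT value to predict SST for; stdin: "lsat sst" pairs
  awk -v x0="$1" '
    { n++; sx += $1; sy += $2; sxx += $1 * $1; sxy += $1 * $2 }
    END {
      b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
      a = (sy - b * sx) / n                           # intercept
      printf "%.2f\n", a + b * x0
    }'
}

printf '10 11\n12 13\n14 15\n' | fit_and_predict 20
```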
The output of this step is the 200-member DCENT ensemble in the `DCENT` folder, which can also be downloaded from here.
- `cluster_account` (soton): account name for accessing cluster resources.
- `cluster_use` (batch): cluster usage type, indicating batch processing tasks.
- `cluster_bigmem` (highmem): memory type required, indicating the need for high-memory nodes.
- `cluster_time` (3599 seconds): maximum duration for regular jobs, allowing each job to run for just under one hour.
- `cluster_long_time` (7199 seconds): duration for longer jobs, permitting up to approximately two hours.
- `ICOADS_preprocess_mem` (50000 MB): memory required for preprocessing ICOADS data.
- `LME_bin_mem` (80000 MB): memory required for binning groupwise SST pairs.
- `download_mem` (80000 MB): memory required for download operations.
- `LME_mem` (1300000 MB): memory required for running the SST groupwise intercomparison.
- `SATH_net_mem` (10000 MB): memory required for identifying neighboring networks in the station temperature pairwise homogenization.
- `SATH_ibp_mem` (8000 MB): memory required for identifying breakpoints in the pairwise homogenization.
- `SATH_att_mem` (20000 MB): memory required for attributing breakpoints in the pairwise homogenization.
- `SATH_comb_mem` (20000 MB): memory reserved for combining breakpoints that are nearby in time.
- `SATH_est_mem` (80000 MB): memory required for estimating adjustments in the pairwise homogenization.
- `SATH_2nd_mem` (25000 MB): memory required for higher-order iterations of the pairwise homogenization.
- `SATH_grid` (30000 MB): memory required for gridding the homogenized land station temperatures.
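Put together, a hypothetical excerpt of `DCENT_config.sh` using the parameter names and default values listed above might look as follows (comments added; adapt every value to your own cluster):

```shell
# Illustrative excerpt of DCENT_config.sh; values follow this README.
cluster_account=soton         # cluster account name
cluster_use=batch             # usage type for batch jobs
cluster_bigmem=highmem        # partition for high-memory nodes
cluster_time=3599             # wall time for regular jobs
cluster_long_time=7199        # wall time for longer jobs
ICOADS_preprocess_mem=50000   # MB, ICOADS preprocessing
LME_mem=1300000               # MB, SST groupwise intercomparison
SATH_grid=30000               # MB, gridding homogenized stations
```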
This project is licensed under the Creative Commons Attribution 4.0 International License - see the LICENSE file for details.
Maintained by Duo Chan (Duo.Chan@soton.ac.uk)
Last Update: July 23, 2024
