The basic instructions to set up and run flashgg are described here and in the corresponding READMEs in the subdirectories of the repository.
If you get stuck or have questions, please first consult the FAQs page here.
FlashGG runs on the EL7 architecture, which was recently decommissioned after reaching end-of-life, so it has to be run inside a Singularity container. You may need to set up your GRID proxy before entering the container (a sketch is given after the commands below). Then, run the following, based on this README:
```
mv start_el7.sh ~/ ## general shell script to start a Singularity container that supports condor submissions
~/start_el7.sh
```
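If you need GRID access from inside the container (e.g. for CRAB submissions), one way to prepare the proxy beforehand is sketched below; this is not part of the original instructions, and the validity time simply mirrors the one used later in this README.

```
# Sketch (assumption): create a GRID proxy before entering the Singularity container
voms-proxy-init --voms cms --valid 168:00
~/start_el7.sh
```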
Get flashgg:
```
export SCRAM_ARCH=slc7_amd64_gcc700
cmsrel CMSSW_10_6_29
cd CMSSW_10_6_29/src/
cmsenv
git cms-init
git clone -b lowmass_106X https://github.com/dsperka/flashgg
cd flashgg/
git remote add myFlashGG https://github.com/elfontan/flashgg-1
git remote -v
git fetch myFlashGG
git merge myFlashGG/lowMass_106X_2018UL
cd ..   ## back to CMSSW_10_6_29/src so that the setup script path below resolves
source flashgg/setup_flashgg.sh
scram b -j 8
```
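As a quick sanity check after the build, you can verify that the release area is set and that flashgg libraries were produced; this is a generic check, not a step from the original instructions, and the exact library names may differ.

```
# Sketch (assumption): confirm the environment and that flashgg compiled
echo $CMSSW_BASE
ls $CMSSW_BASE/lib/$SCRAM_ARCH/ | grep -i flashgg | head
```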
- Manual test for microAOD production with a single file:
```
cmsRun MicroAOD/test/microAODstd.py processType=[sig] datasetName=glugluh conditionsJSON=MetaData/data/MetaConditions/Era2018_UL_lowMassDiphotonAnalysis.json
```
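To check the output of this test interactively, the standard CMSSW file utilities can be used; the output file name below is an assumption (check the name configured in `microAODstd.py`).

```
# Sketch (assumption): inspect the microAOD produced by the test above
edmFileUtil file:myMicroAODOutputFile.root                              # event count and file summary
edmDumpEventContent myMicroAODOutputFile.root | grep -i flashgg | head  # list the flashgg collections
```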
- MicroAOD step on a dataset (note that you need a proper JSON file in the campaign directory):
```
./prepareCrabJobs.py -V v0 -C analysisLM_UL18 --meta-conditions /afs/cern.ch/work/e/elfontan/private/DiPhotonAnalysis/myFlashGG/CMSSW_10_6_8/src/flashgg/MetaData/data/MetaConditions/Era2018_UL_lowMassDiphotonAnalysis.json -O T2_US_MIT -o /store/user/elfontan -s campaigns/TEST_M70_UL.json --mkPilot
```
The jobs can be submitted and monitored with commands of this kind:
```
echo pilot_*.py | xargs -n 1 crab submit                        ## it will submit the pilot test crab jobs
echo crabConfig_v0_GluGluHToGG_M*.py | xargs -n 1 crab submit   ## submission of the signal jobs
echo crab_v0_GluGluHToGG_M* | xargs -n 1 crab status            ## to check the status of the jobs
```
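If some of the jobs fail, they can usually be resubmitted through the same CRAB project directories; this is a generic CRAB usage sketch, not a step from the original instructions.

```
# Sketch (assumption): resubmit failed jobs in the existing CRAB directories
echo crab_v0_GluGluHToGG_M* | xargs -n 1 crab resubmit
```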
The output is located in DAS in the `prod/phys03` database, in the form:
```
/GluGluHToGG_M70_TuneCP5_13TeV-amcatnloFXFX-pythia8/elfontan-analysisLM_UL18-v0-v0-RunIISummer20UL18MiniAODv2-106X_upgrade2018_realistic_v16_L1v1-v1-82689a8dd9f3f8660dbba0021defef47/USER
```
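To verify that the datasets have been published, a DAS query of this kind can be used; the wildcard pattern below is an assumption, so adapt it to your own datasets.

```
# Sketch (assumption): list the published USER datasets on prod/phys03
dasgoclient --query="dataset dataset=/GluGluHToGG_M*/elfontan-analysisLM_UL18*/USER instance=prod/phys03"
```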
- First you need to create a catalogue for the newly produced datasets. From the `flashgg` directory, run:
```
fggManageSamples.py -C analysisLM_UL18 import '/GluGluHToGG_M*/elfontan-analysisLM_UL18*/USER'
```
The output will be located in a subdirectory of `MetaData/data` with the same name as the campaign, e.g.:
```
/afs/cern.ch/work/e/elfontan/private/DiPhotonAnalysis/myFlashGG/CMSSW_10_6_8/src/flashgg/MetaData/data/analysisLM_UL18/datasets.json
```
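A quick way to sanity-check the catalogue is to pretty-print the JSON and look for the imported datasets; the path below just reuses the example location above.

```
# Sketch: pretty-print the catalogue and check that the imported datasets appear
python -m json.tool MetaData/data/analysisLM_UL18/datasets.json | grep GluGluHToGG | head
```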
- Then, you can prepare the configuration file to launch your jobs and produce the final miniTrees. Note that the relevant ingredients for this step are:
- [1] the name of the campaign (and the PU profile consistent with the era under consideration: make sure to use the UL one!);
- [2] the name of the MetaConditions file: at the moment use `Era2018_UL_lowMassDiphotonAnalysis_noDiphotonBoundaries` to run without any categorization;
- [3] the name of the `Systematics` configuration file in the MetaConditions: `flashggDiPhotonSystematics2018LM_UL_cfi`.
- NOTE 1: It is crucial at this stage to choose the name of the process properly, because some steps and filters (for example the `PromptFakeFilter`) run only for certain `processId` values. If you are ntuplizing a gJets sample, use for example `gjets_promptfake`, while if you are working with QCD, you can use for example `qcd_fakefake`.
- NOTE 2: The latest scales and smearings files used by the mass measurement team are not centrally available. Copy them from `~jtao/public/ForElisa/Run2018_09Sep2021_RunFineEtaR9Et_stochastic_oldFormat*` to `EgammaAnalysis/ElectronTools/data/ScalesSmearings`.
- Technical note: copying the proxy file to the worker node is not yet supported when using HTCondor as batch system. Therefore the user must set the `X509_USER_PROXY` environment variable and run with the `--no-copy-proxy` option:
```
cd Systematics/test
voms-proxy-init -voms cms --valid 168:00
cp /tmp/MYPROXY ~/
export X509_USER_PROXY=~/MYPROXY
fggRunJobs.py --load analysisLM_UL18.json -d outdir_analysisLM_UL18 workspaceStd.py -n 300 -q workday --no-copy-proxy --no-use-tarball
```
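Once submitted, the HTCondor jobs can be monitored with the usual condor tools, and the output trees should appear in the directory passed via `-d`; the exact output file names are an assumption.

```
# Sketch (assumption): monitor the HTCondor jobs and check the produced output
condor_q
ls outdir_analysisLM_UL18/*.root | head
```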