Floating-point exception in SinglePhaseWell with multiple elements per segment #3958

@dkachuma

Description

THERMAL.xml
run.log.txt

Describe the bug
A single-phase thermal simulation with a single well fails with a floating-point exception when the well segment is discretised with two elements (InternalWell.numElementsPerSegment=2). The same case runs to completion when the segment is discretised with a single element (numElementsPerSegment=1). The failure occurs shortly after the well comes online (at day 30), during the fluid model update in the well subregion.
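The stack trace below points at an overflow in expf64 inside ThermalCompressibleSinglePhaseUpdate, which suggests the exponent of the compressibility law blows up for one of the well elements. The following sketch is hypothetical (simplified model form, illustrative names and values, not GEOS code) and only illustrates how an exponential density update can trap as SIGFPE when an element carries an unphysical pressure, e.g. one left uninitialized when the segment is split into two elements:

```python
import math

# Hypothetical exponential compressibility update, loosely modeled on
# rho = rho_ref * exp(c_p * (p - p_ref)). All names/values are illustrative.
def update_density(p, p_ref=1.0e5, rho_ref=1000.0, c_p=5.0e-10):
    return rho_ref * math.exp(c_p * (p - p_ref))

# A physically reasonable well pressure updates without issue:
rho_ok = update_density(2.0e7)

# An unphysical (e.g. uninitialized) pressure makes the exponent huge; with
# floating-point traps enabled, the overflow in expf64 raises SIGFPE rather
# than silently producing inf. Python surfaces it as OverflowError instead.
try:
    update_density(1.0e300)
    overflowed = False
except OverflowError:
    overflowed = True
```

If this is the mechanism, the underlying issue would be that the second well element never receives a valid initial pressure/temperature before the first fluid update, rather than a problem with the constitutive law itself.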

To Reproduce
Steps to reproduce the behavior:

  1. Run attached case THERMAL.xml
  2. See error

Expected behavior
The well should initialize properly when it comes online, and the simulation should proceed without numerical exceptions regardless of the number of elements per segment (1 or 2).

Screenshots
Snippet of stack trace

**** ERROR
***** SIGNAL: 8
***** LOCATION: (external error, captured by signal handler)
Floating point error encountered: 

** StackTrace of 20 frames **
Frame 2: expf64 
Frame 3: RAJA::policy::omp::forall_impl<RAJA::TypedRangeSegment<int, int>, geos::singlePhaseBaseKernels::FluidUpdateKernel::launch<geos::constitutive::ThermalCompressibleSinglePhaseUpdate<(geos::constitutive::ExponentApproximationType)0, (geos::constitutive::ExponentApproximationType)1, (geos::constitutive::ExponentApproximationType)1> >(geos::constitutive::ThermalCompressibleSinglePhaseUpdate<(geos::constitutive::ExponentApproximationType)0, (geos::constitutive::ExponentApproximationType)1, (geos::constitutive::ExponentApproximationType)1> const&, LvArray::ArrayView<double const, 1, 0, int, LvArray::ChaiBuffer> const&, LvArray::ArrayView<double const, 1, 0, int, LvArray::ChaiBuffer> const&)::{lambda(int)#1}, RAJA::policy::omp::omp_for_schedule_exec<RAJA::policy::omp::Auto>, RAJA::expt::ForallParamPack<> >(camp::resources::v1::Host, RAJA::PolicyBaseT<(RAJA::Policy)3, (RAJA::Pattern)1, (RAJA::Launch)0, (camp::resources::v1::Platform)1, RAJA::policy::omp::Parallel, RAJA::wrapper<RAJA::policy::omp::omp_for_schedule_exec<RAJA::policy::omp::Auto> > > const&, RAJA::TypedRangeSegment<int, int>&&, geos::singlePhaseBaseKernels::FluidUpdateKernel::launch<geos::constitutive::ThermalCompressibleSinglePhaseUpdate<(geos::constitutive::ExponentApproximationType)0, (geos::constitutive::ExponentApproximationType)1, (geos::constitutive::ExponentApproximationType)1> >(geos::constitutive::ThermalCompressibleSinglePhaseUpdate<(geos::constitutive::ExponentApproximationType)0, (geos::constitutive::ExponentApproximationType)1, (geos::constitutive::ExponentApproximationType)1> const&, LvArray::ArrayView<double const, 1, 0, int, LvArray::ChaiBuffer> const&, LvArray::ArrayView<double const, 1, 0, int, LvArray::ChaiBuffer> const&)::{lambda(int)#1}&&, RAJA::expt::ForallParamPack<>)::{lambda()#1}::operator()() const 
Frame 4: /build/pine/geos01/gcc-12.2.0/openmpi-5.0.5/cuda-0/release/lib/libfluidFlowSolvers.so 
Frame 5: GOMP_parallel 
Frame 6: geos::SinglePhaseWell::updateFluidModel(geos::WellElementSubRegion&) const 
Frame 7: geos::SinglePhaseWell::updateSubRegionState(geos::ElementRegionManager const&, geos::WellElementSubRegion&) 
Frame 8: geos::WellSolverBase::updateState(geos::DomainPartition&) 
...
Frame 20: _start 
=====

Platform (please complete the following information):

  • Compiler: gcc 12.2.0
  • GEOS develop @ 2acb3ef

Labels

type: bug (Something isn't working), type: new (A new issue has been created and requires attention)
