## Summary
In Spark 4.1.0, the `evalMode` field on arithmetic expressions was moved inside an `evalContext` object, requiring access via `expr.evalContext.evalMode` instead of `expr.evalMode`.
## Details

### Affected Expressions
| Expression | Spark ≤4.0.x | Spark 4.1.0+ |
|---|---|---|
| `Add` | `add.evalMode` | `add.evalContext.evalMode` |
| `Subtract` | `sub.evalMode` | `sub.evalContext.evalMode` |
| `Multiply` | `mul.evalMode` | `mul.evalContext.evalMode` |
| `Divide` | `div.evalMode` | `div.evalContext.evalMode` |
| `Remainder` | `mod.evalMode` | `mod.evalContext.evalMode` |
| `Sum` | `sum.evalMode` | `sum.evalContext.evalMode` |
| `Average` | `avg.evalMode` | `avg.evalMode` (unchanged) |
## Impact

Code that directly accesses `.evalMode` on arithmetic expressions will fail to compile against Spark 4.1.0.
Example:

```scala
// Before (Spark ≤4.0.x)
val mode = addExpr.evalMode

// After (Spark 4.1.0+)
val mode = addExpr.evalContext.evalMode
```
## Proposed Solution

Create version-specific `TryModeShim` objects:

- `spark330db/TryModeShim.scala`: uses direct `.evalMode` access
- `spark410/TryModeShim.scala`: uses `.evalContext.evalMode` access
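A minimal sketch of the shim pattern, using simplified stand-in case classes rather than Spark's real catalyst expressions. `AddV40`, `AddV41`, `EvalContext`, and the two shim objects are illustrative placeholders; in the real shims these would be `org.apache.spark.sql.catalyst.expressions.Add` and friends, with one shim source file compiled per Spark version:

```scala
// Simplified model of the two API shapes (not Spark's actual classes).
sealed trait EvalMode
object EvalMode {
  case object LEGACY extends EvalMode
  case object ANSI extends EvalMode
  case object TRY extends EvalMode
}

// Spark <=4.0.x shape: evalMode sits directly on the expression.
final case class AddV40(left: Int, right: Int, evalMode: EvalMode)

// Spark 4.1.0+ shape: evalMode is nested inside an evalContext.
final case class EvalContext(evalMode: EvalMode)
final case class AddV41(left: Int, right: Int, evalContext: EvalContext)

// Stand-in for spark330db/TryModeShim.scala: direct field access.
object TryModeShimV40 {
  def evalMode(add: AddV40): EvalMode = add.evalMode
}

// Stand-in for spark410/TryModeShim.scala: access through evalContext.
object TryModeShimV41 {
  def evalMode(add: AddV41): EvalMode = add.evalContext.evalMode
}
```

Callers would go through `TryModeShim.evalMode(expr)` instead of touching the field directly, so only the per-version shim file changes when the underlying Spark API moves.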
## Related
- Part of Spark 4.1.0 API compatibility work