Describe the issue
The current TrajFlag.APEX implementation returns the highest raw trajectory point, but this is only correct when shooting at zero look angle to a 'zero distance' target. For any non-zero impact distance or look angle, the true ballistic apex needs to be calculated by rotating the trajectory to align with the horizontal plane between the origin and the target point. Without this rotation, the returned apex point is incorrect.
Current behavior
hit_result.flag(TrajFlag.APEX) returns the highest point in the raw trajectory, effectively calculating apex for a zero-distance zero-angle shot only.
Expected behavior
The apex should be calculated relative to the line of sight between the origin and the intended impact point, i.e. the highest point after rotating the trajectory so that the origin-to-target vector is horizontal.
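To illustrate how the two definitions disagree, here is a minimal, self-contained sketch. It uses a synthetic drag-free parabola and arbitrary example angles (none of this is the library's API): the raw apex maximizes height above the horizontal, while the line-of-sight apex maximizes height above the origin-to-target sight line, which is equivalent (up to a constant `cos` factor) to the highest point after rotating the sight line horizontal.

```python
import math

# Synthetic trajectory: y = x*tan(launch) - k*x^2 (drag-free parabola, illustration only)
launch = math.radians(30)
k = 0.001
xs = [i * 0.5 for i in range(2000)]
ys = [x * math.tan(launch) - k * x * x for x in xs]

# Raw apex: the highest point of the trajectory itself
raw_i = max(range(len(xs)), key=lambda i: ys[i])

# Apex relative to a sight line at a 15-degree look angle: the point of
# greatest height above that line (equivalent, up to a cos factor, to the
# highest point after rotating the sight line to horizontal)
look = math.radians(15)
los_i = max(range(len(xs)), key=lambda i: ys[i] - xs[i] * math.tan(look))

print(xs[raw_i], xs[los_i])  # the two apex x-positions clearly differ
```

For this parabola the raw apex sits near x = tan(30°)/(2k), while the line-of-sight apex sits where the slope equals tan(15°), noticeably earlier along the trajectory.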
Current workaround
We currently work around this with the following approach — rotating the partial trajectory to align to the horizontal plane and finding the highest point after rotation:
```python
import math


def _calculate_apex_distance_height(
    default_result: HitResult,
    hit: TrajectoryData,
) -> BulletDistanceHeight:
    start_point = default_result.trajectory[0]
    max_distance_meter = -10
    max_height_meter = -10
    shift_angle_radians = _get_apex_shift_angle(
        hit.distance >> Distance.Meter,
        hit.height >> Distance.Meter,
        start_point.distance >> Distance.Meter,
        start_point.height >> Distance.Meter,
    )
    for trajectory_step in default_result.trajectory:
        (new_distance_meter, new_height_meter) = _translate_trajectory_point(
            trajectory_step.distance >> Distance.Meter,
            trajectory_step.height >> Distance.Meter,
            start_point.distance >> Distance.Meter,
            start_point.height >> Distance.Meter,
            -shift_angle_radians,
        )
        if new_height_meter < max_height_meter:
            # Height in the rotated frame started decreasing: the previous point was the apex
            break
        max_distance_meter = new_distance_meter
        max_height_meter = new_height_meter
    return BulletDistanceHeight(distance_meter=max_distance_meter, height_meter=max_height_meter)


def _get_apex_shift_angle(point_x: float, point_y: float, origin_x: float, origin_y: float) -> float:
    # Angle of the origin-to-target chord above the horizontal
    return math.atan((point_y - origin_y) / (point_x - origin_x))


def _translate_trajectory_point(
    point_x: float, point_y: float, origin_x: float, origin_y: float, angle: float
) -> tuple[float, float]:
    # Shift the point so the origin lands at (0, 0), then rotate by `angle`
    origin_shift_x = point_x - origin_x
    origin_shift_y = point_y - origin_y
    rotated_x = origin_shift_x * math.cos(angle) - origin_shift_y * math.sin(angle)
    rotated_y = origin_shift_y * math.cos(angle) + origin_shift_x * math.sin(angle)
    return (rotated_x, rotated_y)
```
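The workaround can be exercised end to end without the library's types. The sketch below stands in plain `(x, y)` tuples in meters for `TrajectoryData` points and a synthetic drag-free parabola for the dense trajectory (both are assumptions for illustration, not the real API); the helper bodies mirror the ones above:

```python
import math


def get_apex_shift_angle(px, py, ox, oy):
    # Angle of the origin-to-target chord above the horizontal
    return math.atan((py - oy) / (px - ox))


def translate_trajectory_point(px, py, ox, oy, angle):
    # Shift to the origin, then rotate by `angle`
    dx, dy = px - ox, py - oy
    return (dx * math.cos(angle) - dy * math.sin(angle),
            dy * math.cos(angle) + dx * math.sin(angle))


# Synthetic dense trajectory (drag-free parabola, 30-degree launch, 0.5 m steps)
launch, k = math.radians(30), 0.001
traj = [(x, x * math.tan(launch) - k * x * x) for x in (i * 0.5 for i in range(1200))]

origin = traj[0]
target = traj[-1]  # stand-in for the intended impact point
shift = get_apex_shift_angle(target[0], target[1], origin[0], origin[1])

best = (-10.0, -10.0)
for px, py in traj:
    nx, ny = translate_trajectory_point(px, py, origin[0], origin[1], -shift)
    if ny < best[1]:
        break  # past the rotated apex
    best = (nx, ny)

print(best)  # apex (distance, height) in the frame where the sight line is horizontal
```

Because the target here lies slightly below the origin, the rotated-frame apex lands a little beyond the raw apex, which is exactly the discrepancy the issue describes.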
The problem with the planned extra_data removal
This workaround relies on having a dense set of trajectory points to iterate over, which is currently possible via extra_data=True. With the planned removal of extra_data, we would be forced to manually specify a very small calculation step (e.g. 0.1m) to get sufficient resolution for the rotation — which is both inefficient and messy.
Question
Is there a plan to support look-angle-aware apex calculation natively in TrajFlag.APEX, accepting a target TrajectoryData point as reference for the rotation? Alternatively, is there a clean way to get dense trajectory output without extra_data that would keep this workaround viable?