Inconsistent floating-point sum results based on row order #27223

@unights

Description

Checks

  • I have checked that this issue has not already been reported.
  • I have confirmed this bug exists on the latest version of Polars.

Reproducible example

import polars as pl
import pandas as pd

# Polars behavior
print("--- Polars ---")
pl_ser_list = [
    pl.Series([390, 1006, 2628]),
    pl.Series([390, 2628, 1006]),
    pl.Series([1006, 390, 2628]),
    pl.Series([1006, 2628, 390]),
    pl.Series([2628, 390, 1006]),
    pl.Series([2628, 1006, 390]),
]

for ser in pl_ser_list:
    print(f"{ser.to_list()}: {(ser / ser.sum()).sum()}")
    
# output:
# [390, 1006, 2628]: 0.9999999999999999
# [390, 2628, 1006]: 1.0
# [1006, 390, 2628]: 0.9999999999999999
# [1006, 2628, 390]: 1.0
# [2628, 390, 1006]: 1.0
# [2628, 1006, 390]: 1.0

# Pandas behavior
print("\n--- Pandas ---")
pd_ser_list = [
    pd.Series([390, 1006, 2628]),
    pd.Series([390, 2628, 1006]),
    pd.Series([1006, 390, 2628]),
    pd.Series([1006, 2628, 390]),
    pd.Series([2628, 390, 1006]),
    pd.Series([2628, 1006, 390]),
]

for ser in pd_ser_list:
    print(f"{ser.to_list()}: {(ser / ser.sum()).sum()}")

# output:
# [390, 1006, 2628]: 1.0
# [390, 2628, 1006]: 1.0
# [1006, 390, 2628]: 1.0
# [1006, 2628, 390]: 1.0
# [2628, 390, 1006]: 1.0
# [2628, 1006, 390]: 1.0
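For reference, the order dependence is reproducible with plain Python floats, with neither Polars nor pandas involved, because it comes from the non-associativity of IEEE 754 addition. By contrast, `math.fsum` computes a correctly rounded sum and therefore returns the same value for every permutation of the same numbers. This is just an illustrative sketch, not Polars' actual implementation:

```python
import math
from itertools import permutations

values = [390.0, 1006.0, 2628.0]
total = sum(values)  # 4024.0, exactly representable as a double

naive_results = set()
fsum_results = set()
for perm in permutations(values):
    weights = [v / total for v in perm]
    naive_results.add(sum(weights))       # left-to-right: order-sensitive
    fsum_results.add(math.fsum(weights))  # correctly rounded: order-insensitive

print("naive:", sorted(naive_results))
print("fsum: ", sorted(fsum_results))
```

The naive left-to-right sum can differ between permutations by one ulp, while the `fsum` set always contains exactly one value.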

Log output

Issue description

I noticed an inconsistent floating-point calculation behavior when dividing a Series by its sum and summing the results. Depending on the order of the elements in the Series, the final sum is either 1.0 or 0.9999999999999999.

I understand this is likely due to IEEE 754 floating-point arithmetic and the order of operations affecting rounding errors. However, doing the exact same operation in pandas consistently yields 1.0 regardless of the row order.

I am not a floating-point expert, but I wanted to report this. Is there any plan for Polars to implement a summation algorithm that is less sensitive to row order, or is the current behavior strictly intended for performance reasons?
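For context, pandas delegates float reductions to NumPy, which uses pairwise (tree) summation; its rounding error grows roughly as O(log n) rather than the O(n) of a sequential loop, and it happens to land exactly on 1.0 in the outputs above. A minimal, unoptimized sketch of the idea follows (NumPy's real implementation falls back to an unrolled sequential loop below a block-size threshold):

```python
def pairwise_sum(xs):
    """Recursive pairwise summation: split the list, sum each half, add the partials.

    Rounding error grows ~O(log n) instead of O(n) for a naive left-to-right sum.
    """
    n = len(xs)
    if n <= 2:
        return sum(xs)  # tiny base case; real implementations use a larger block
    mid = n // 2
    return pairwise_sum(xs[:mid]) + pairwise_sum(xs[mid:])

print(pairwise_sum([0.1] * 16))
```

Note that pairwise summation reduces order sensitivity but does not eliminate it; only a correctly rounded sum (e.g. `math.fsum`) is fully permutation-invariant.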

Expected behavior

Ideally, the sum of the normalized values should consistently evaluate to 1.0, or at least be insensitive to the ordering of the exact same numbers, matching the behavior in pandas.
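As a practical workaround on the user side, regardless of which summation algorithm the library uses, such sums can be compared with a tolerance rather than by exact equality, e.g. with `math.isclose` (default relative tolerance 1e-09):

```python
import math

# Both results from the example above are "equal to 1.0" up to rounding error.
for s in (1.0, 0.9999999999999999):
    print(s, math.isclose(s, 1.0))
```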

Installed versions

Details
--------Version info---------
Polars:              1.39.3
Index type:          UInt32
Platform:            Windows-10-10.0.19045-SP0
Python:              3.12.11 (main, Jul 11 2025, 22:40:18) [MSC v.1944 64 bit (AMD64)]
Runtime:             rt32

----Optional dependencies----
Azure CLI            <not installed>
adbc_driver_manager  <not installed>
altair               <not installed>
azure.identity       <not installed>
boto3                <not installed>
cloudpickle          <not installed>
connectorx           <not installed>
deltalake            <not installed>
fastexcel            0.19.0
fsspec               <not installed>
gevent               <not installed>
google.auth          <not installed>
great_tables         <not installed>
matplotlib           <not installed>
numpy                2.4.4
openpyxl             <not installed>
pandas               3.0.2
polars_cloud         <not installed>
pyarrow              <not installed>
pydantic             <not installed>
pyiceberg            <not installed>
sqlalchemy           <not installed>
torch                <not installed>
xlsx2csv             <not installed>
xlsxwriter           <not installed>

    Labels

    bug: Something isn't working
    needs triage: Awaiting prioritization by a maintainer
    python: Related to Python Polars
