2 changes: 1 addition & 1 deletion doc/source/whatsnew/v2.0.0.rst
@@ -993,7 +993,7 @@ Period
^^^^^^
- Bug in :meth:`Period.strftime` and :meth:`PeriodIndex.strftime`, raising ``UnicodeDecodeError`` when a locale-specific directive was passed (:issue:`46319`)
- Bug in adding a :class:`Period` object to an array of :class:`DateOffset` objects incorrectly raising ``TypeError`` (:issue:`50162`)
-
- Bug in :class:`Period` where passing a string with finer resolution than nanosecond would result in a ``KeyError`` instead of dropping the extra precision (:issue:`50417`)

Plotting
^^^^^^^^
9 changes: 8 additions & 1 deletion pandas/_libs/tslibs/parsing.pyx
@@ -393,7 +393,7 @@ cdef parse_datetime_string_with_reso(
&out_tzoffset, False
)
if not string_to_dts_failed:
if dts.ps != 0 or out_local:
if out_bestunit == NPY_DATETIMEUNIT.NPY_FR_ns or out_local:
Member
Makes sense. To be totally correct we might need to check for ps, fs, as?

Contributor Author
Good point. What error message should we raise, or should we silently ignore the extra precision?

On 1.5.2, doing pd.Period("1970/01/01 00:00:00.000000000111") gives me

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "pandas/_libs/tslibs/period.pyx", line 2579, in pandas._libs.tslibs.period.Period.__new__
  File "pandas/_libs/tslibs/parsing.pyx", line 369, in pandas._libs.tslibs.parsing.parse_time_string
  File "pandas/_libs/tslibs/parsing.pyx", line 431, in pandas._libs.tslibs.parsing.parse_datetime_string_with_reso
KeyError: 11

Timestamp seems to ignore the extra precision, so maybe we should match that here?
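
For reference, a minimal snippet of the Timestamp behavior referred to above (a hedged illustration based on the report for 1.5.2, not code taken from this PR):

import pandas as pd

# Same string that makes Period raise KeyError: 11 on 1.5.2.
# Per the report above, Timestamp parses it and silently drops the
# digits beyond nanosecond precision.
ts = pd.Timestamp("1970/01/01 00:00:00.000000000111")
print(ts)             # 1970-01-01 00:00:00
print(ts.nanosecond)  # 0 -- the sub-nanosecond tail "111" is ignored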

Member
Definitely better than the status quo. It might be reasonable to have Timestamp warn/raise instead of silently truncating, but that's out of scope.
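
To make the intended outcome concrete, a rough sketch of the post-fix behavior (an assumption based on the whatsnew entry and the new test cases, not output reproduced from this branch):

import pandas as pd

# With the fix, the picosecond/femtosecond/attosecond digits are dropped
# instead of raising KeyError: 11, matching Timestamp.
p = pd.Period("1970/01/01 00:00:00.000000000111")
print(p)  # roughly Period('1970-01-01 00:00:00.000000000', 'N'); resolution stays nanosecond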

# TODO: the not-out_local case we could do without Timestamp;
# avoid circular import
from pandas import Timestamp
@@ -402,6 +402,13 @@ cdef parse_datetime_string_with_reso(
parsed = datetime(
dts.year, dts.month, dts.day, dts.hour, dts.min, dts.sec, dts.us
)
# Match Timestamp and drop picoseconds, femtoseconds, attoseconds
# The new resolution will just be nano
Member
GH ref pointing back here?

# GH 50417
if out_bestunit in {NPY_DATETIMEUNIT.NPY_FR_ps,
NPY_DATETIMEUNIT.NPY_FR_fs,
NPY_DATETIMEUNIT.NPY_FR_as}:
out_bestunit = NPY_DATETIMEUNIT.NPY_FR_ns
reso = {
NPY_DATETIMEUNIT.NPY_FR_Y: "year",
NPY_DATETIMEUNIT.NPY_FR_M: "month",
5 changes: 5 additions & 0 deletions pandas/tests/scalar/period/test_period.py
@@ -545,6 +545,11 @@ def test_period_cons_combined(self):
(".000000999", 999),
(".123456789", 789),
(".999999999", 999),
(".999999000", 0),
# Test femtoseconds, attoseconds, picoseconds are dropped like Timestamp
(".999999001123", 1),
(".999999001123456", 1),
(".999999001123456789", 1),
],
)
def test_period_constructor_nanosecond(self, day, hour, sec_float, expected):