Opened 9 months ago

Closed 2 months ago

#27818 closed Cleanup/optimization (fixed)

Use contextlib.suppress to suppress exceptions.

Reported by: Mads Jensen Owned by: Tim Graham <timograham@…>
Component: Core (Other) Version: master
Severity: Normal Keywords:
Cc: Adam (Chainz) Johnson Triage Stage: Accepted
Has patch: yes Needs documentation: no
Needs tests: no Patch needs improvement: no
Easy pickings: no UI/UX: no

Description

As per a comment in #23919, the pattern below can be replaced with contextlib.suppress(...), available as of Python 3.4. The resulting code looks a bit cleaner.

try:
    ...
except ...:
    pass
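For comparison, the contextlib.suppress equivalent of the pattern above might look like the following sketch; get_name is a hypothetical helper for illustration, not code from the Django patch:

```python
import contextlib


def get_name(obj):
    """Return obj.name, or None if the attribute is missing.

    Equivalent to the try/except AttributeError/pass pattern above,
    using contextlib.suppress (Python 3.4+).
    """
    with contextlib.suppress(AttributeError):
        return obj.name
    return None
```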

Change History (11)

comment:2 Changed 9 months ago by Tim Graham

Component: Uncategorized → Core (Other)
Triage Stage: Unreviewed → Accepted

comment:3 Changed 9 months ago by Mads Jensen

Has patch: set
Owner: changed from nobody to Mads Jensen
Patch needs improvement: set
Last edited 9 months ago by Tim Graham (previous) (diff)

comment:4 Changed 9 months ago by Mads Jensen

Patch needs improvement: unset

comment:5 Changed 9 months ago by Tim Graham

Easy pickings: unset

comment:6 Changed 5 months ago by Tim Graham <timograham@…>

Owner: set to Tim Graham <timograham@…>
Resolution: fixed
Status: new → closed

In 550cb3a:

Fixed #27818 -- Replaced try/except/pass with contextlib.suppress().

comment:7 Changed 3 months ago by Adam (Chainz) Johnson

I only found out about this in review from Tim the other week. I think it should be reverted because it makes things slower unnecessarily. I measure contextlib.suppress at about 10 times slower than a plain try/except; it also creates and destroys an object on every use, which can trigger garbage collection:

In [8]: def foo():
   ...:     try:
   ...:         pass
   ...:     except AttributeError:
   ...:         pass
   ...:

In [9]: def foo2():
   ...:     with contextlib.suppress(AttributeError):
   ...:         pass
   ...:

In [10]: %timeit foo()
The slowest run took 10.24 times longer than the fastest. This could mean that an intermediate result is being cached.
10000000 loops, best of 3: 111 ns per loop

In [11]: %timeit foo2()
The slowest run took 6.15 times longer than the fastest. This could mean that an intermediate result is being cached.
1000000 loops, best of 3: 1.13 µs per loop

How would we proceed? Should I take this to the mailing list, or can I just submit a revert PR?
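The IPython numbers above can be reproduced with a plain stdlib timeit script. Absolute timings will vary by machine and CPython version, and the function names here are illustrative:

```python
import contextlib
import timeit


def with_try():
    try:
        pass
    except AttributeError:
        pass


def with_suppress():
    with contextlib.suppress(AttributeError):
        pass


if __name__ == "__main__":
    n = 1_000_000
    t_try = timeit.timeit(with_try, number=n)
    t_sup = timeit.timeit(with_suppress, number=n)
    # On CPython, the suppress version is typically several times slower:
    # each call constructs a context-manager object and invokes its
    # __enter__/__exit__ methods, while try/except sets up no objects.
    print(f"try/except: {t_try:.3f}s  suppress: {t_sup:.3f}s")
```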

comment:8 Changed 3 months ago by Mads Jensen

Resolution: fixed
Status: closed → new

If a change slows things down, please revert it.

comment:9 Changed 3 months ago by Tim Graham

I'm not against reverting, but before doing so, it could be useful to ask cpython developers if the slowness should be a documented caveat of contextlib.suppress().

comment:10 Changed 2 months ago by Tim Graham

I got this input from #python-dev:

Alex_Gaynor: Without measuring, I'd bet almost anything that the majority of the overhead is in an additional function call, so it's not really avoidable. I'm not sure it makes sense to document, _any_ function call would add the same overhead (assuming I'm right). (For that reason I'd expect it to have no additional overhead on PyPy)

__ap__: Those kinds of wholesale replacements look like an anti-pattern to me. Most of the time, the un-abstracted version is as readable as the slower abstracted one.

PR to revert.
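Alex Gaynor's point, that the cost lies in the context-manager protocol itself, can be sketched with a hand-rolled equivalent of suppress. The Suppress class below is an illustrative reimplementation, not stdlib or Django code:

```python
class Suppress:
    """Minimal reimplementation of contextlib.suppress, to make the
    per-use cost visible: one object construction plus one __enter__
    and one __exit__ call on every pass through the with block."""

    def __init__(self, *exceptions):
        self.exceptions = exceptions

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # Returning True tells the interpreter to swallow the exception;
        # anything falsy lets it propagate.
        return exc_type is not None and issubclass(exc_type, self.exceptions)
```

A plain try/except, by contrast, compiles on CPython to exception-handling setup with no function calls at all, which is why it is cheaper regardless of how suppress is implemented.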

comment:11 Changed 2 months ago by Adam (Chainz) Johnson

Cc: Adam (Chainz) Johnson added

Thanks, Tim, for checking with #python-dev and writing the PR! I forgot to add myself to CC, so I didn't get email updates (I'm used to systems where commenting auto-subscribes you).

comment:12 Changed 2 months ago by GitHub <noreply@…>

Resolution: fixed
Status: new → closed

In 6e4c6281:

Reverted "Fixed #27818 -- Replaced try/except/pass with contextlib.suppress()."

This reverts commit 550cb3a365dee4edfdd1563224d5304de2a57fda
because try/except performs better.
