Fix 'Too many open files' error.

This has happened before, after continuously archiving a few hundred links.

Fix by:
* setting the process object to `None` to trigger GC finalizer cleanup of the pipe descriptors
* protecting against double cleanup
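
The two bullets above can be sketched as a standalone pattern (a minimal illustration, not the project's actual code; the `spawn_child` helper and the sleep command are hypothetical stand-ins for the real archiving subprocess):

```python
import subprocess
import sys

def spawn_child():
    # A long-running child whose stdout pipe holds an open file descriptor.
    return subprocess.Popen(
        [sys.executable, '-c', 'import time; time.sleep(60)'],
        stdout=subprocess.PIPE,
    )

p = spawn_child()

def end():
    """Terminate the child exactly once; safe to call repeatedly."""
    global p
    if p is None:   # protect from double termination
        return
    p.terminate()
    p.wait()        # reap the child so it does not linger as a zombie
    p = None        # drop the reference so the GC finalizer can close the pipes

end()
end()  # second call is a no-op thanks to the None guard
```

Without the `p = None` step, each archived link leaves its pipe descriptors open until the interpreter exits, which is how a few hundred links can exhaust the per-process fd limit and raise 'Too many open files'.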
Dima Gerasimov 2018-11-26 02:29:56 +00:00
parent 91d60364de
commit 03c1b0009c

@@ -161,7 +161,13 @@ def progress(seconds=TIMEOUT, prefix=''):
     def end():
         """immediately finish progress and clear the progressbar line"""
         nonlocal p
+        if p is None:  # protect from double termination
+            return
         p.terminate()
+        p = None
         sys.stdout.write('\r{}{}\r'.format((' ' * TERM_WIDTH), ANSI['reset']))  # clear whole terminal line
         sys.stdout.flush()