If you are wondering where the data on this site comes from, please visit https://api.github.com/users/mcepl/events. GitMemory does not store any data itself; it only uses NGINX to cache data for a period of time. The idea behind GitMemory is simply to give users a better reading experience.

mcepl/github-issues-export 17

Script to export issues from GitHub to a file which can be imported into Bugzilla

mcepl/gen-oath-safe 10

Script for generating HOTP/TOTP keys (and QR code) for LinOTP

guillaumechereau/gedit-reflow-plugin 9

A simple gedit plugin that reflows paragraphs the same way the Emacs <alt>-q command does.

mcepl/gitissius 1

Distributed Issue Tracking for GIT

mcepl/botocore 0

The low-level, core functionality of boto 3.

mcepl/bsgit 0

GIT Frontend to the Open Build Service

mcepl/bugzillatools 0

CLI and Python library for interacting with Bugzilla

mcepl/ciscoconfparse 0

Parse, Audit, Query, Build, and Modify Cisco IOS-style configurations

mcepl/citation.vim 0

Zotero and bibtex citations for Vim

mcepl/corrode 0

C to Rust translator

pull request comment openSUSE/libsolv

Don't build Python 2 bindings on SLE-15-SP4 or more recent.

I can certainly merge that, but be aware that libsolv is currently built in SP2, so the change does not do anything.

We would probably need a fork for SUSE:SLE-15-SP4:GA … otherwise, I am not sure what will happen. When you make the request for SUSE:SLE-15-SP4:GA, let me know, and I will get it to Staging:A, which is where the Python2-less world resides right now. Thank you very much.

mcepl

comment created 2 hours ago

push event mcepl/libsolv

Matěj Cepl

commit sha c56df65fc7b00be41459cfd5260276714b149bac

%suse_version macro doesn't change with SP, sle_version does.


pushed 2 hours ago

issue opened encode/starlette

Eight tests fail when running the test suite (asyncio related?)

Checklist

  • [X] The bug is reproducible against the latest release and/or master. (tarball 0.16.0)
  • [X] There are no similar issues or pull requests to fix it yet (I have certainly searched and have not found anything relevant)

Describe the bug

When running the test suite while packaging this release for openSUSE/Factory, I got eight failing tests:

[   31s] =================================== FAILURES ===================================
[   31s] _______________________ test_streaming_response[asyncio] _______________________
[   31s] 
[   31s] test_client_factory = functools.partial(<class 'starlette.testclient.TestClient'>, backend='asyncio', backend_options={})
[   31s] 
[   31s]     def test_streaming_response(test_client_factory):
[   31s]         filled_by_bg_task = ""
[   31s]     
[   31s]         async def app(scope, receive, send):
[   31s]             async def numbers(minimum, maximum):
[   31s]                 for i in range(minimum, maximum + 1):
[   31s]                     yield str(i)
[   31s]                     if i != maximum:
[   31s]                         yield ", "
[   31s]                     await anyio.sleep(0)
[   31s]     
[   31s]             async def numbers_for_cleanup(start=1, stop=5):
[   31s]                 nonlocal filled_by_bg_task
[   31s]                 async for thing in numbers(start, stop):
[   31s]                     filled_by_bg_task = filled_by_bg_task + thing
[   31s]     
[   31s]             cleanup_task = BackgroundTask(numbers_for_cleanup, start=6, stop=9)
[   31s]             generator = numbers(1, 5)
[   31s]             response = StreamingResponse(
[   31s]                 generator, media_type="text/plain", background=cleanup_task
[   31s]             )
[   31s]             await response(scope, receive, send)
[   31s]     
[   31s]         assert filled_by_bg_task == ""
[   31s]         client = test_client_factory(app)
[   31s] >       response = client.get("/")
[   31s] 
[   31s] tests/test_responses.py:101: 
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:555: in get
[   31s]     return self.request('GET', url, **kwargs)
[   31s] starlette/testclient.py:468: in request
[   31s]     return super().request(
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:542: in request
[   31s]     resp = self.send(prep, **send_kwargs)
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:655: in send
[   31s]     r = adapter.send(request, **kwargs)
[   31s] starlette/testclient.py:266: in send
[   31s]     raise exc
[   31s] starlette/testclient.py:263: in send
[   31s]     portal.call(self.app, scope, receive, send)
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] starlette/testclient.py:446: in _portal_factory
[   31s]     yield portal
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] /usr/lib/python3.9/site-packages/anyio/from_thread.py:406: in start_blocking_portal
[   31s]     run_future.result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:445: in result
[   31s]     return self.__get_result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:390: in __get_result
[   31s]     raise self._exception
[   31s] /usr/lib64/python3.9/concurrent/futures/thread.py:52: in run
[   31s]     result = self.fn(*self.args, **self.kwargs)
[   31s] /usr/lib/python3.9/site-packages/anyio/_core/_eventloop.py:56: in run
[   31s]     return asynclib.run(func, *args, **backend_options)  # type: ignore
[   31s] /usr/lib/python3.9/site-packages/anyio/_backends/_asyncio.py:230: in run
[   31s]     return native_run(wrapper(), debug=debug)
[   31s] /usr/lib64/python3.9/asyncio/runners.py:48: in run
[   31s]     loop.run_until_complete(loop.shutdown_asyncgens())
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:642: in run_until_complete
[   31s]     return future.result()
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:542: in shutdown_asyncgens
[   31s]     results = await tasks.gather(
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] 
[   31s] loop = <_UnixSelectorEventLoop running=False closed=True debug=False>
[   31s] return_exceptions = True
[   31s] coros_or_futures = (<async_generator_athrow object at 0x7ffb944e3ec0>,)
[   31s] 
[   31s]     def gather(*coros_or_futures, loop=None, return_exceptions=False):
[   31s]         """Return a future aggregating results from the given coroutines/futures.
[   31s]     
[   31s]         Coroutines will be wrapped in a future and scheduled in the event
[   31s]         loop. They will not necessarily be scheduled in the same order as
[   31s]         passed in.
[   31s]     
[   31s]         All futures must share the same event loop.  If all the tasks are
[   31s]         done successfully, the returned future's result is the list of
[   31s]         results (in the order of the original sequence, not necessarily
[   31s]         the order of results arrival).  If *return_exceptions* is True,
[   31s]         exceptions in the tasks are treated the same as successful
[   31s]         results, and gathered in the result list; otherwise, the first
[   31s]         raised exception will be immediately propagated to the returned
[   31s]         future.
[   31s]     
[   31s]         Cancellation: if the outer Future is cancelled, all children (that
[   31s]         have not completed yet) are also cancelled.  If any child is
[   31s]         cancelled, this is treated as if it raised CancelledError --
[   31s]         the outer Future is *not* cancelled in this case.  (This is to
[   31s]         prevent the cancellation of one child to cause other children to
[   31s]         be cancelled.)
[   31s]     
[   31s]         If *return_exceptions* is False, cancelling gather() after it
[   31s]         has been marked done won't cancel any submitted awaitables.
[   31s]         For instance, gather can be marked done after propagating an
[   31s]         exception to the caller, therefore, calling ``gather.cancel()``
[   31s]         after catching an exception (raised by one of the awaitables) from
[   31s]         gather won't cancel any other awaitables.
[   31s]         """
[   31s]         if loop is not None:
[   31s] >           warnings.warn("The loop argument is deprecated since Python 3.8, "
[   31s]                           "and scheduled for removal in Python 3.10.",
[   31s]                           DeprecationWarning, stacklevel=2)
[   31s] E           DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
[   31s] 
[   31s] /usr/lib64/python3.9/asyncio/tasks.py:755: DeprecationWarning
[   31s] ____________________ test_sync_streaming_response[asyncio] _____________________
[   31s] 
[   31s] test_client_factory = functools.partial(<class 'starlette.testclient.TestClient'>, backend='asyncio', backend_options={})
[   31s] 
[   31s]     def test_sync_streaming_response(test_client_factory):
[   31s]         async def app(scope, receive, send):
[   31s]             def numbers(minimum, maximum):
[   31s]                 for i in range(minimum, maximum + 1):
[   31s]                     yield str(i)
[   31s]                     if i != maximum:
[   31s]                         yield ", "
[   31s]     
[   31s]             generator = numbers(1, 5)
[   31s]             response = StreamingResponse(generator, media_type="text/plain")
[   31s]             await response(scope, receive, send)
[   31s]     
[   31s]         client = test_client_factory(app)
[   31s] >       response = client.get("/")
[   31s] 
[   31s] tests/test_responses.py:157: 
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:555: in get
[   31s]     return self.request('GET', url, **kwargs)
[   31s] starlette/testclient.py:468: in request
[   31s]     return super().request(
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:542: in request
[   31s]     resp = self.send(prep, **send_kwargs)
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:655: in send
[   31s]     r = adapter.send(request, **kwargs)
[   31s] starlette/testclient.py:266: in send
[   31s]     raise exc
[   31s] starlette/testclient.py:263: in send
[   31s]     portal.call(self.app, scope, receive, send)
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] starlette/testclient.py:446: in _portal_factory
[   31s]     yield portal
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] /usr/lib/python3.9/site-packages/anyio/from_thread.py:406: in start_blocking_portal
[   31s]     run_future.result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:445: in result
[   31s]     return self.__get_result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:390: in __get_result
[   31s]     raise self._exception
[   31s] /usr/lib64/python3.9/concurrent/futures/thread.py:52: in run
[   31s]     result = self.fn(*self.args, **self.kwargs)
[   31s] /usr/lib/python3.9/site-packages/anyio/_core/_eventloop.py:56: in run
[   31s]     return asynclib.run(func, *args, **backend_options)  # type: ignore
[   31s] /usr/lib/python3.9/site-packages/anyio/_backends/_asyncio.py:230: in run
[   31s]     return native_run(wrapper(), debug=debug)
[   31s] /usr/lib64/python3.9/asyncio/runners.py:48: in run
[   31s]     loop.run_until_complete(loop.shutdown_asyncgens())
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:642: in run_until_complete
[   31s]     return future.result()
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:542: in shutdown_asyncgens
[   31s]     results = await tasks.gather(
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] 
[   31s] loop = <_UnixSelectorEventLoop running=False closed=True debug=False>
[   31s] return_exceptions = True
[   31s] coros_or_futures = (<async_generator_athrow object at 0x7ffb9426e2c0>,)
[   31s] 
[   31s]     def gather(*coros_or_futures, loop=None, return_exceptions=False):
[   31s]         """Return a future aggregating results from the given coroutines/futures.
[   31s]     
[   31s]         Coroutines will be wrapped in a future and scheduled in the event
[   31s]         loop. They will not necessarily be scheduled in the same order as
[   31s]         passed in.
[   31s]     
[   31s]         All futures must share the same event loop.  If all the tasks are
[   31s]         done successfully, the returned future's result is the list of
[   31s]         results (in the order of the original sequence, not necessarily
[   31s]         the order of results arrival).  If *return_exceptions* is True,
[   31s]         exceptions in the tasks are treated the same as successful
[   31s]         results, and gathered in the result list; otherwise, the first
[   31s]         raised exception will be immediately propagated to the returned
[   31s]         future.
[   31s]     
[   31s]         Cancellation: if the outer Future is cancelled, all children (that
[   31s]         have not completed yet) are also cancelled.  If any child is
[   31s]         cancelled, this is treated as if it raised CancelledError --
[   31s]         the outer Future is *not* cancelled in this case.  (This is to
[   31s]         prevent the cancellation of one child to cause other children to
[   31s]         be cancelled.)
[   31s]     
[   31s]         If *return_exceptions* is False, cancelling gather() after it
[   31s]         has been marked done won't cancel any submitted awaitables.
[   31s]         For instance, gather can be marked done after propagating an
[   31s]         exception to the caller, therefore, calling ``gather.cancel()``
[   31s]         after catching an exception (raised by one of the awaitables) from
[   31s]         gather won't cancel any other awaitables.
[   31s]         """
[   31s]         if loop is not None:
[   31s] >           warnings.warn("The loop argument is deprecated since Python 3.8, "
[   31s]                           "and scheduled for removal in Python 3.10.",
[   31s]                           DeprecationWarning, stacklevel=2)
[   31s] E           DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
[   31s] 
[   31s] /usr/lib64/python3.9/asyncio/tasks.py:755: DeprecationWarning
[   31s] ________________ test_staticfiles_head_with_middleware[asyncio] ________________
[   31s] 
[   31s] tmpdir = local('/tmp/pytest-of-abuild/pytest-6/test_staticfiles_head_with_mid0')
[   31s] test_client_factory = functools.partial(<class 'starlette.testclient.TestClient'>, backend='asyncio', backend_options={})
[   31s] 
[   31s]     def test_staticfiles_head_with_middleware(tmpdir, test_client_factory):
[   31s]         """
[   31s]         see https://github.com/encode/starlette/pull/935
[   31s]         """
[   31s]         path = os.path.join(tmpdir, "example.txt")
[   31s]         with open(path, "w") as file:
[   31s]             file.write("x" * 100)
[   31s]     
[   31s]         routes = [Mount("/static", app=StaticFiles(directory=tmpdir), name="static")]
[   31s]         app = Starlette(routes=routes)
[   31s]     
[   31s]         @app.middleware("http")
[   31s]         async def does_nothing_middleware(request: Request, call_next):
[   31s]             response = await call_next(request)
[   31s]             return response
[   31s]     
[   31s]         client = test_client_factory(app)
[   31s] >       response = client.head("/static/example.txt")
[   31s] 
[   31s] tests/test_staticfiles.py:56: 
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:577: in head
[   31s]     return self.request('HEAD', url, **kwargs)
[   31s] starlette/testclient.py:468: in request
[   31s]     return super().request(
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:542: in request
[   31s]     resp = self.send(prep, **send_kwargs)
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:655: in send
[   31s]     r = adapter.send(request, **kwargs)
[   31s] starlette/testclient.py:266: in send
[   31s]     raise exc
[   31s] starlette/testclient.py:263: in send
[   31s]     portal.call(self.app, scope, receive, send)
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] starlette/testclient.py:446: in _portal_factory
[   31s]     yield portal
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] /usr/lib/python3.9/site-packages/anyio/from_thread.py:406: in start_blocking_portal
[   31s]     run_future.result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:445: in result
[   31s]     return self.__get_result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:390: in __get_result
[   31s]     raise self._exception
[   31s] /usr/lib64/python3.9/concurrent/futures/thread.py:52: in run
[   31s]     result = self.fn(*self.args, **self.kwargs)
[   31s] /usr/lib/python3.9/site-packages/anyio/_core/_eventloop.py:56: in run
[   31s]     return asynclib.run(func, *args, **backend_options)  # type: ignore
[   31s] /usr/lib/python3.9/site-packages/anyio/_backends/_asyncio.py:230: in run
[   31s]     return native_run(wrapper(), debug=debug)
[   31s] /usr/lib64/python3.9/asyncio/runners.py:48: in run
[   31s]     loop.run_until_complete(loop.shutdown_asyncgens())
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:642: in run_until_complete
[   31s]     return future.result()
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:542: in shutdown_asyncgens
[   31s]     results = await tasks.gather(
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] 
[   31s] loop = <_UnixSelectorEventLoop running=False closed=True debug=False>
[   31s] return_exceptions = True
[   31s] coros_or_futures = (<async_generator_athrow object at 0x7ffb976f8280>,)
[   31s] 
[   31s]     def gather(*coros_or_futures, loop=None, return_exceptions=False):
[   31s]         """Return a future aggregating results from the given coroutines/futures.
[   31s]     
[   31s]         Coroutines will be wrapped in a future and scheduled in the event
[   31s]         loop. They will not necessarily be scheduled in the same order as
[   31s]         passed in.
[   31s]     
[   31s]         All futures must share the same event loop.  If all the tasks are
[   31s]         done successfully, the returned future's result is the list of
[   31s]         results (in the order of the original sequence, not necessarily
[   31s]         the order of results arrival).  If *return_exceptions* is True,
[   31s]         exceptions in the tasks are treated the same as successful
[   31s]         results, and gathered in the result list; otherwise, the first
[   31s]         raised exception will be immediately propagated to the returned
[   31s]         future.
[   31s]     
[   31s]         Cancellation: if the outer Future is cancelled, all children (that
[   31s]         have not completed yet) are also cancelled.  If any child is
[   31s]         cancelled, this is treated as if it raised CancelledError --
[   31s]         the outer Future is *not* cancelled in this case.  (This is to
[   31s]         prevent the cancellation of one child to cause other children to
[   31s]         be cancelled.)
[   31s]     
[   31s]         If *return_exceptions* is False, cancelling gather() after it
[   31s]         has been marked done won't cancel any submitted awaitables.
[   31s]         For instance, gather can be marked done after propagating an
[   31s]         exception to the caller, therefore, calling ``gather.cancel()``
[   31s]         after catching an exception (raised by one of the awaitables) from
[   31s]         gather won't cancel any other awaitables.
[   31s]         """
[   31s]         if loop is not None:
[   31s] >           warnings.warn("The loop argument is deprecated since Python 3.8, "
[   31s]                           "and scheduled for removal in Python 3.10.",
[   31s]                           DeprecationWarning, stacklevel=2)
[   31s] E           DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
[   31s] 
[   31s] /usr/lib64/python3.9/asyncio/tasks.py:755: DeprecationWarning
[   31s] _______________________ test_custom_middleware[asyncio] ________________________
[   31s] 
[   31s] test_client_factory = functools.partial(<class 'starlette.testclient.TestClient'>, backend='asyncio', backend_options={})
[   31s] 
[   31s]     def test_custom_middleware(test_client_factory):
[   31s]         client = test_client_factory(app)
[   31s] >       response = client.get("/")
[   31s] 
[   31s] tests/middleware/test_base.py:52: 
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:555: in get
[   31s]     return self.request('GET', url, **kwargs)
[   31s] starlette/testclient.py:468: in request
[   31s]     return super().request(
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:542: in request
[   31s]     resp = self.send(prep, **send_kwargs)
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:655: in send
[   31s]     r = adapter.send(request, **kwargs)
[   31s] starlette/testclient.py:266: in send
[   31s]     raise exc
[   31s] starlette/testclient.py:263: in send
[   31s]     portal.call(self.app, scope, receive, send)
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] starlette/testclient.py:446: in _portal_factory
[   31s]     yield portal
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] /usr/lib/python3.9/site-packages/anyio/from_thread.py:406: in start_blocking_portal
[   31s]     run_future.result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:445: in result
[   31s]     return self.__get_result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:390: in __get_result
[   31s]     raise self._exception
[   31s] /usr/lib64/python3.9/concurrent/futures/thread.py:52: in run
[   31s]     result = self.fn(*self.args, **self.kwargs)
[   31s] /usr/lib/python3.9/site-packages/anyio/_core/_eventloop.py:56: in run
[   31s]     return asynclib.run(func, *args, **backend_options)  # type: ignore
[   31s] /usr/lib/python3.9/site-packages/anyio/_backends/_asyncio.py:230: in run
[   31s]     return native_run(wrapper(), debug=debug)
[   31s] /usr/lib64/python3.9/asyncio/runners.py:48: in run
[   31s]     loop.run_until_complete(loop.shutdown_asyncgens())
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:642: in run_until_complete
[   31s]     return future.result()
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:542: in shutdown_asyncgens
[   31s]     results = await tasks.gather(
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] 
[   31s] loop = <_UnixSelectorEventLoop running=False closed=True debug=False>
[   31s] return_exceptions = True
[   31s] coros_or_futures = (<async_generator_athrow object at 0x7ffb94402cc0>,)
[   31s] 
[   31s]     def gather(*coros_or_futures, loop=None, return_exceptions=False):
[   31s]         """Return a future aggregating results from the given coroutines/futures.
[   31s]     
[   31s]         Coroutines will be wrapped in a future and scheduled in the event
[   31s]         loop. They will not necessarily be scheduled in the same order as
[   31s]         passed in.
[   31s]     
[   31s]         All futures must share the same event loop.  If all the tasks are
[   31s]         done successfully, the returned future's result is the list of
[   31s]         results (in the order of the original sequence, not necessarily
[   31s]         the order of results arrival).  If *return_exceptions* is True,
[   31s]         exceptions in the tasks are treated the same as successful
[   31s]         results, and gathered in the result list; otherwise, the first
[   31s]         raised exception will be immediately propagated to the returned
[   31s]         future.
[   31s]     
[   31s]         Cancellation: if the outer Future is cancelled, all children (that
[   31s]         have not completed yet) are also cancelled.  If any child is
[   31s]         cancelled, this is treated as if it raised CancelledError --
[   31s]         the outer Future is *not* cancelled in this case.  (This is to
[   31s]         prevent the cancellation of one child to cause other children to
[   31s]         be cancelled.)
[   31s]     
[   31s]         If *return_exceptions* is False, cancelling gather() after it
[   31s]         has been marked done won't cancel any submitted awaitables.
[   31s]         For instance, gather can be marked done after propagating an
[   31s]         exception to the caller, therefore, calling ``gather.cancel()``
[   31s]         after catching an exception (raised by one of the awaitables) from
[   31s]         gather won't cancel any other awaitables.
[   31s]         """
[   31s]         if loop is not None:
[   31s] >           warnings.warn("The loop argument is deprecated since Python 3.8, "
[   31s]                           "and scheduled for removal in Python 3.10.",
[   31s]                           DeprecationWarning, stacklevel=2)
[   31s] E           DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
[   31s] 
[   31s] /usr/lib64/python3.9/asyncio/tasks.py:755: DeprecationWarning
[   31s] ______________________ test_middleware_decorator[asyncio] ______________________
[   31s] 
[   31s] test_client_factory = functools.partial(<class 'starlette.testclient.TestClient'>, backend='asyncio', backend_options={})
[   31s] 
[   31s]     def test_middleware_decorator(test_client_factory):
[   31s]         app = Starlette()
[   31s]     
[   31s]         @app.route("/homepage")
[   31s]         def homepage(request):
[   31s]             return PlainTextResponse("Homepage")
[   31s]     
[   31s]         @app.middleware("http")
[   31s]         async def plaintext(request, call_next):
[   31s]             if request.url.path == "/":
[   31s]                 return PlainTextResponse("OK")
[   31s]             response = await call_next(request)
[   31s]             response.headers["Custom"] = "Example"
[   31s]             return response
[   31s]     
[   31s]         client = test_client_factory(app)
[   31s]         response = client.get("/")
[   31s]         assert response.text == "OK"
[   31s]     
[   31s] >       response = client.get("/homepage")
[   31s] 
[   31s] tests/middleware/test_base.py:85: 
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:555: in get
[   31s]     return self.request('GET', url, **kwargs)
[   31s] starlette/testclient.py:468: in request
[   31s]     return super().request(
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:542: in request
[   31s]     resp = self.send(prep, **send_kwargs)
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:655: in send
[   31s]     r = adapter.send(request, **kwargs)
[   31s] starlette/testclient.py:266: in send
[   31s]     raise exc
[   31s] starlette/testclient.py:263: in send
[   31s]     portal.call(self.app, scope, receive, send)
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] starlette/testclient.py:446: in _portal_factory
[   31s]     yield portal
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] /usr/lib/python3.9/site-packages/anyio/from_thread.py:406: in start_blocking_portal
[   31s]     run_future.result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:445: in result
[   31s]     return self.__get_result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:390: in __get_result
[   31s]     raise self._exception
[   31s] /usr/lib64/python3.9/concurrent/futures/thread.py:52: in run
[   31s]     result = self.fn(*self.args, **self.kwargs)
[   31s] /usr/lib/python3.9/site-packages/anyio/_core/_eventloop.py:56: in run
[   31s]     return asynclib.run(func, *args, **backend_options)  # type: ignore
[   31s] /usr/lib/python3.9/site-packages/anyio/_backends/_asyncio.py:230: in run
[   31s]     return native_run(wrapper(), debug=debug)
[   31s] /usr/lib64/python3.9/asyncio/runners.py:48: in run
[   31s]     loop.run_until_complete(loop.shutdown_asyncgens())
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:642: in run_until_complete
[   31s]     return future.result()
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:542: in shutdown_asyncgens
[   31s]     results = await tasks.gather(
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] 
[   31s] loop = <_UnixSelectorEventLoop running=False closed=True debug=False>
[   31s] return_exceptions = True
[   31s] coros_or_futures = (<async_generator_athrow object at 0x7ffb9440f040>,)
[   31s] 
[   31s]     def gather(*coros_or_futures, loop=None, return_exceptions=False):
[   31s]         """Return a future aggregating results from the given coroutines/futures.
[   31s]     
[   31s]         Coroutines will be wrapped in a future and scheduled in the event
[   31s]         loop. They will not necessarily be scheduled in the same order as
[   31s]         passed in.
[   31s]     
[   31s]         All futures must share the same event loop.  If all the tasks are
[   31s]         done successfully, the returned future's result is the list of
[   31s]         results (in the order of the original sequence, not necessarily
[   31s]         the order of results arrival).  If *return_exceptions* is True,
[   31s]         exceptions in the tasks are treated the same as successful
[   31s]         results, and gathered in the result list; otherwise, the first
[   31s]         raised exception will be immediately propagated to the returned
[   31s]         future.
[   31s]     
[   31s]         Cancellation: if the outer Future is cancelled, all children (that
[   31s]         have not completed yet) are also cancelled.  If any child is
[   31s]         cancelled, this is treated as if it raised CancelledError --
[   31s]         the outer Future is *not* cancelled in this case.  (This is to
[   31s]         prevent the cancellation of one child to cause other children to
[   31s]         be cancelled.)
[   31s]     
[   31s]         If *return_exceptions* is False, cancelling gather() after it
[   31s]         has been marked done won't cancel any submitted awaitables.
[   31s]         For instance, gather can be marked done after propagating an
[   31s]         exception to the caller, therefore, calling ``gather.cancel()``
[   31s]         after catching an exception (raised by one of the awaitables) from
[   31s]         gather won't cancel any other awaitables.
[   31s]         """
[   31s]         if loop is not None:
[   31s] >           warnings.warn("The loop argument is deprecated since Python 3.8, "
[   31s]                           "and scheduled for removal in Python 3.10.",
[   31s]                           DeprecationWarning, stacklevel=2)
[   31s] E           DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
[   31s] 
[   31s] /usr/lib64/python3.9/asyncio/tasks.py:755: DeprecationWarning
[   31s] _____________ test_state_data_across_multiple_middlewares[asyncio] _____________
[   31s] 
[   31s] test_client_factory = functools.partial(<class 'starlette.testclient.TestClient'>, backend='asyncio', backend_options={})
[   31s] 
[   31s]     def test_state_data_across_multiple_middlewares(test_client_factory):
[   31s]         expected_value1 = "foo"
[   31s]         expected_value2 = "bar"
[   31s]     
[   31s]         class aMiddleware(BaseHTTPMiddleware):
[   31s]             async def dispatch(self, request, call_next):
[   31s]                 request.state.foo = expected_value1
[   31s]                 response = await call_next(request)
[   31s]                 return response
[   31s]     
[   31s]         class bMiddleware(BaseHTTPMiddleware):
[   31s]             async def dispatch(self, request, call_next):
[   31s]                 request.state.bar = expected_value2
[   31s]                 response = await call_next(request)
[   31s]                 response.headers["X-State-Foo"] = request.state.foo
[   31s]                 return response
[   31s]     
[   31s]         class cMiddleware(BaseHTTPMiddleware):
[   31s]             async def dispatch(self, request, call_next):
[   31s]                 response = await call_next(request)
[   31s]                 response.headers["X-State-Bar"] = request.state.bar
[   31s]                 return response
[   31s]     
[   31s]         app = Starlette()
[   31s]         app.add_middleware(aMiddleware)
[   31s]         app.add_middleware(bMiddleware)
[   31s]         app.add_middleware(cMiddleware)
[   31s]     
[   31s]         @app.route("/")
[   31s]         def homepage(request):
[   31s]             return PlainTextResponse("OK")
[   31s]     
[   31s]         client = test_client_factory(app)
[   31s] >       response = client.get("/")
[   31s] 
[   31s] tests/middleware/test_base.py:123: 
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:555: in get
[   31s]     return self.request('GET', url, **kwargs)
[   31s] starlette/testclient.py:468: in request
[   31s]     return super().request(
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:542: in request
[   31s]     resp = self.send(prep, **send_kwargs)
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:655: in send
[   31s]     r = adapter.send(request, **kwargs)
[   31s] starlette/testclient.py:266: in send
[   31s]     raise exc
[   31s] starlette/testclient.py:263: in send
[   31s]     portal.call(self.app, scope, receive, send)
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] starlette/testclient.py:446: in _portal_factory
[   31s]     yield portal
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] /usr/lib/python3.9/site-packages/anyio/from_thread.py:406: in start_blocking_portal
[   31s]     run_future.result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:445: in result
[   31s]     return self.__get_result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:390: in __get_result
[   31s]     raise self._exception
[   31s] /usr/lib64/python3.9/concurrent/futures/thread.py:52: in run
[   31s]     result = self.fn(*self.args, **self.kwargs)
[   31s] /usr/lib/python3.9/site-packages/anyio/_core/_eventloop.py:56: in run
[   31s]     return asynclib.run(func, *args, **backend_options)  # type: ignore
[   31s] /usr/lib/python3.9/site-packages/anyio/_backends/_asyncio.py:230: in run
[   31s]     return native_run(wrapper(), debug=debug)
[   31s] /usr/lib64/python3.9/asyncio/runners.py:48: in run
[   31s]     loop.run_until_complete(loop.shutdown_asyncgens())
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:642: in run_until_complete
[   31s]     return future.result()
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:542: in shutdown_asyncgens
[   31s]     results = await tasks.gather(
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] 
[   31s] loop = <_UnixSelectorEventLoop running=False closed=True debug=False>
[   31s] return_exceptions = True
[   31s] coros_or_futures = (<async_generator_athrow object at 0x7ffb9436a100>, <async_generator_athrow object at 0x7ffb7fe06bc0>, <async_generator_athrow object at 0x7ffb9444b1c0>)
[   31s] 
[   31s]     def gather(*coros_or_futures, loop=None, return_exceptions=False):
[   31s]         """Return a future aggregating results from the given coroutines/futures.
[   31s]     
[   31s]         Coroutines will be wrapped in a future and scheduled in the event
[   31s]         loop. They will not necessarily be scheduled in the same order as
[   31s]         passed in.
[   31s]     
[   31s]         All futures must share the same event loop.  If all the tasks are
[   31s]         done successfully, the returned future's result is the list of
[   31s]         results (in the order of the original sequence, not necessarily
[   31s]         the order of results arrival).  If *return_exceptions* is True,
[   31s]         exceptions in the tasks are treated the same as successful
[   31s]         results, and gathered in the result list; otherwise, the first
[   31s]         raised exception will be immediately propagated to the returned
[   31s]         future.
[   31s]     
[   31s]         Cancellation: if the outer Future is cancelled, all children (that
[   31s]         have not completed yet) are also cancelled.  If any child is
[   31s]         cancelled, this is treated as if it raised CancelledError --
[   31s]         the outer Future is *not* cancelled in this case.  (This is to
[   31s]         prevent the cancellation of one child to cause other children to
[   31s]         be cancelled.)
[   31s]     
[   31s]         If *return_exceptions* is False, cancelling gather() after it
[   31s]         has been marked done won't cancel any submitted awaitables.
[   31s]         For instance, gather can be marked done after propagating an
[   31s]         exception to the caller, therefore, calling ``gather.cancel()``
[   31s]         after catching an exception (raised by one of the awaitables) from
[   31s]         gather won't cancel any other awaitables.
[   31s]         """
[   31s]         if loop is not None:
[   31s] >           warnings.warn("The loop argument is deprecated since Python 3.8, "
[   31s]                           "and scheduled for removal in Python 3.10.",
[   31s]                           DeprecationWarning, stacklevel=2)
[   31s] E           DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
[   31s] 
[   31s] /usr/lib64/python3.9/asyncio/tasks.py:755: DeprecationWarning
[   31s] ____________________ test_app_middleware_argument[asyncio] _____________________
[   31s] 
[   31s] test_client_factory = functools.partial(<class 'starlette.testclient.TestClient'>, backend='asyncio', backend_options={})
[   31s] 
[   31s]     def test_app_middleware_argument(test_client_factory):
[   31s]         def homepage(request):
[   31s]             return PlainTextResponse("Homepage")
[   31s]     
[   31s]         app = Starlette(
[   31s]             routes=[Route("/", homepage)], middleware=[Middleware(CustomMiddleware)]
[   31s]         )
[   31s]     
[   31s]         client = test_client_factory(app)
[   31s] >       response = client.get("/")
[   31s] 
[   31s] tests/middleware/test_base.py:138: 
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:555: in get
[   31s]     return self.request('GET', url, **kwargs)
[   31s] starlette/testclient.py:468: in request
[   31s]     return super().request(
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:542: in request
[   31s]     resp = self.send(prep, **send_kwargs)
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:655: in send
[   31s]     r = adapter.send(request, **kwargs)
[   31s] starlette/testclient.py:266: in send
[   31s]     raise exc
[   31s] starlette/testclient.py:263: in send
[   31s]     portal.call(self.app, scope, receive, send)
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] starlette/testclient.py:446: in _portal_factory
[   31s]     yield portal
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] /usr/lib/python3.9/site-packages/anyio/from_thread.py:406: in start_blocking_portal
[   31s]     run_future.result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:445: in result
[   31s]     return self.__get_result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:390: in __get_result
[   31s]     raise self._exception
[   31s] /usr/lib64/python3.9/concurrent/futures/thread.py:52: in run
[   31s]     result = self.fn(*self.args, **self.kwargs)
[   31s] /usr/lib/python3.9/site-packages/anyio/_core/_eventloop.py:56: in run
[   31s]     return asynclib.run(func, *args, **backend_options)  # type: ignore
[   31s] /usr/lib/python3.9/site-packages/anyio/_backends/_asyncio.py:230: in run
[   31s]     return native_run(wrapper(), debug=debug)
[   31s] /usr/lib64/python3.9/asyncio/runners.py:48: in run
[   31s]     loop.run_until_complete(loop.shutdown_asyncgens())
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:642: in run_until_complete
[   31s]     return future.result()
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:542: in shutdown_asyncgens
[   31s]     results = await tasks.gather(
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] 
[   31s] loop = <_UnixSelectorEventLoop running=False closed=True debug=False>
[   31s] return_exceptions = True
[   31s] coros_or_futures = (<async_generator_athrow object at 0x7ffb94410080>,)
[   31s] 
[   31s]     def gather(*coros_or_futures, loop=None, return_exceptions=False):
[   31s]         """Return a future aggregating results from the given coroutines/futures.
[   31s]     
[   31s]         Coroutines will be wrapped in a future and scheduled in the event
[   31s]         loop. They will not necessarily be scheduled in the same order as
[   31s]         passed in.
[   31s]     
[   31s]         All futures must share the same event loop.  If all the tasks are
[   31s]         done successfully, the returned future's result is the list of
[   31s]         results (in the order of the original sequence, not necessarily
[   31s]         the order of results arrival).  If *return_exceptions* is True,
[   31s]         exceptions in the tasks are treated the same as successful
[   31s]         results, and gathered in the result list; otherwise, the first
[   31s]         raised exception will be immediately propagated to the returned
[   31s]         future.
[   31s]     
[   31s]         Cancellation: if the outer Future is cancelled, all children (that
[   31s]         have not completed yet) are also cancelled.  If any child is
[   31s]         cancelled, this is treated as if it raised CancelledError --
[   31s]         the outer Future is *not* cancelled in this case.  (This is to
[   31s]         prevent the cancellation of one child to cause other children to
[   31s]         be cancelled.)
[   31s]     
[   31s]         If *return_exceptions* is False, cancelling gather() after it
[   31s]         has been marked done won't cancel any submitted awaitables.
[   31s]         For instance, gather can be marked done after propagating an
[   31s]         exception to the caller, therefore, calling ``gather.cancel()``
[   31s]         after catching an exception (raised by one of the awaitables) from
[   31s]         gather won't cancel any other awaitables.
[   31s]         """
[   31s]         if loop is not None:
[   31s] >           warnings.warn("The loop argument is deprecated since Python 3.8, "
[   31s]                           "and scheduled for removal in Python 3.10.",
[   31s]                           DeprecationWarning, stacklevel=2)
[   31s] E           DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
[   31s] 
[   31s] /usr/lib64/python3.9/asyncio/tasks.py:755: DeprecationWarning
[   31s] ____________________ test_gzip_streaming_response[asyncio] _____________________
[   31s] 
[   31s] test_client_factory = functools.partial(<class 'starlette.testclient.TestClient'>, backend='asyncio', backend_options={})
[   31s] 
[   31s]     def test_gzip_streaming_response(test_client_factory):
[   31s]         app = Starlette()
[   31s]     
[   31s]         app.add_middleware(GZipMiddleware)
[   31s]     
[   31s]         @app.route("/")
[   31s]         def homepage(request):
[   31s]             async def generator(bytes, count):
[   31s]                 for index in range(count):
[   31s]                     yield bytes
[   31s]     
[   31s]             streaming = generator(bytes=b"x" * 400, count=10)
[   31s]             return StreamingResponse(streaming, status_code=200)
[   31s]     
[   31s]         client = test_client_factory(app)
[   31s] >       response = client.get("/", headers={"accept-encoding": "gzip"})
[   31s] 
[   31s] tests/middleware/test_gzip.py:72: 
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:555: in get
[   31s]     return self.request('GET', url, **kwargs)
[   31s] starlette/testclient.py:468: in request
[   31s]     return super().request(
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:542: in request
[   31s]     resp = self.send(prep, **send_kwargs)
[   31s] /usr/lib/python3.9/site-packages/requests/sessions.py:655: in send
[   31s]     r = adapter.send(request, **kwargs)
[   31s] starlette/testclient.py:266: in send
[   31s]     raise exc
[   31s] starlette/testclient.py:263: in send
[   31s]     portal.call(self.app, scope, receive, send)
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] starlette/testclient.py:446: in _portal_factory
[   31s]     yield portal
[   31s] /usr/lib64/python3.9/contextlib.py:126: in __exit__
[   31s]     next(self.gen)
[   31s] /usr/lib/python3.9/site-packages/anyio/from_thread.py:406: in start_blocking_portal
[   31s]     run_future.result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:445: in result
[   31s]     return self.__get_result()
[   31s] /usr/lib64/python3.9/concurrent/futures/_base.py:390: in __get_result
[   31s]     raise self._exception
[   31s] /usr/lib64/python3.9/concurrent/futures/thread.py:52: in run
[   31s]     result = self.fn(*self.args, **self.kwargs)
[   31s] /usr/lib/python3.9/site-packages/anyio/_core/_eventloop.py:56: in run
[   31s]     return asynclib.run(func, *args, **backend_options)  # type: ignore
[   31s] /usr/lib/python3.9/site-packages/anyio/_backends/_asyncio.py:230: in run
[   31s]     return native_run(wrapper(), debug=debug)
[   31s] /usr/lib64/python3.9/asyncio/runners.py:48: in run
[   31s]     loop.run_until_complete(loop.shutdown_asyncgens())
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:642: in run_until_complete
[   31s]     return future.result()
[   31s] /usr/lib64/python3.9/asyncio/base_events.py:542: in shutdown_asyncgens
[   31s]     results = await tasks.gather(
[   31s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   31s] 
[   31s] loop = <_UnixSelectorEventLoop running=False closed=True debug=False>
[   31s] return_exceptions = True
[   31s] coros_or_futures = (<async_generator_athrow object at 0x7ffb978b5b40>,)
[   31s] 
[   31s]     def gather(*coros_or_futures, loop=None, return_exceptions=False):
[   31s]         """Return a future aggregating results from the given coroutines/futures.
[   31s]     
[   31s]         Coroutines will be wrapped in a future and scheduled in the event
[   31s]         loop. They will not necessarily be scheduled in the same order as
[   31s]         passed in.
[   31s]     
[   31s]         All futures must share the same event loop.  If all the tasks are
[   31s]         done successfully, the returned future's result is the list of
[   31s]         results (in the order of the original sequence, not necessarily
[   31s]         the order of results arrival).  If *return_exceptions* is True,
[   31s]         exceptions in the tasks are treated the same as successful
[   31s]         results, and gathered in the result list; otherwise, the first
[   31s]         raised exception will be immediately propagated to the returned
[   31s]         future.
[   31s]     
[   31s]         Cancellation: if the outer Future is cancelled, all children (that
[   31s]         have not completed yet) are also cancelled.  If any child is
[   31s]         cancelled, this is treated as if it raised CancelledError --
[   31s]         the outer Future is *not* cancelled in this case.  (This is to
[   31s]         prevent the cancellation of one child to cause other children to
[   31s]         be cancelled.)
[   31s]     
[   31s]         If *return_exceptions* is False, cancelling gather() after it
[   31s]         has been marked done won't cancel any submitted awaitables.
[   31s]         For instance, gather can be marked done after propagating an
[   31s]         exception to the caller, therefore, calling ``gather.cancel()``
[   31s]         after catching an exception (raised by one of the awaitables) from
[   31s]         gather won't cancel any other awaitables.
[   31s]         """
[   31s]         if loop is not None:
[   31s] >           warnings.warn("The loop argument is deprecated since Python 3.8, "
[   31s]                           "and scheduled for removal in Python 3.10.",
[   31s]                           DeprecationWarning, stacklevel=2)
[   31s] E           DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
[   31s] 
[   31s] /usr/lib64/python3.9/asyncio/tasks.py:755: DeprecationWarning
[   31s] =========================== short test summary info ============================
[   31s] SKIPPED [3] tests/conftest.py:14: Trio not supported (yet!)
[   31s] =================== 8 failed, 457 passed, 3 skipped in 6.11s ===================

To reproduce

<!-- Provide a minimal example with steps to reproduce the bug locally.

NOTE: try to keep any external dependencies at an absolute minimum (middleware, servers, proxies, certificates...). In other words, remove anything that doesn't make the bug go away. -->

Expected behavior

<!-- A clear and concise description of what you expected to happen. -->

Actual behavior

<!-- A clear and concise description of what actually happens. -->

Debugging material

<!-- Any tracebacks, screenshots, etc. that can help understanding the problem.

NOTE:

  • Please list tracebacks in full (don't truncate them).
  • Consider using <details> to make tracebacks/logs collapsible if they're very large (see https://gist.github.com/ericclemmons/b146fe5da72ca1f706b2ef72a20ac39d). -->

Environment

  • OS: Linux (openSUSE/Factory)
  • Python version: 3.9.7 (passes on 3.6.15 and on 3.8.12)
  • Starlette version: 0.16.0

Additional context

Complete build log with all versions of packages used and steps taken.
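
For what it is worth, all eight failures appear to share the same mechanism: the test run treats warnings as errors (presumably via a pytest filterwarnings setting; that part is my assumption), and on Python 3.9 loop.shutdown_asyncgens() still passes the deprecated loop= argument to asyncio.gather(), as the tracebacks above show. A minimal, self-contained sketch of that interaction, outside of Starlette, where warnings.simplefilter stands in for the assumed pytest filter:

    import asyncio
    import warnings

    # Stand-in for a pytest "filterwarnings = error" setting (an assumption
    # about what promotes the DeprecationWarning to a test failure).
    warnings.simplefilter("error", DeprecationWarning)

    pending = []

    async def numbers():
        # An async generator that is started but never exhausted or closed,
        # so the event loop still has to finalize it at shutdown.
        yield 1
        yield 2

    async def main():
        gen = numbers()
        await gen.__anext__()
        pending.append(gen)  # keep a reference so it survives until shutdown

    # On Python 3.9, asyncio.run() finishes with loop.shutdown_asyncgens(),
    # which calls asyncio.gather(..., loop=self) and therefore emits the
    # deprecated-"loop"-argument warning seen above; with the filter set to
    # "error", the shutdown raises instead of completing cleanly.
    asyncio.run(main())

Running the sketch on 3.9 reproduces the DeprecationWarning-as-error at shutdown; on 3.8 and earlier the loop= argument does not warn, which would match the versions where the suite passes.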

created a day ago

issue closed python-distro/distro

Failing tests on openSUSE

Hi, I am trying to fix our openSUSE package of python-distro and to run the test suite, and this is what I am getting. I thought that Python 3 is supported? Why the encoding errors, then?

I am using the tarball from https://files.pythonhosted.org/packages/source/d/distro/distro-1.3.0.tar.gz without any patches.
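
My guess (an assumption on my part, not verified against the distro code) is that this is not a Python 3 problem as such but a locale problem: the build root runs under the POSIX/C locale, so open() without an explicit encoding= falls back to ASCII, and the UTF-8 bytes in the Fedora 19 fixtures ("Schrödinger’s Cat") cannot be decoded, as the tracebacks below show. A small illustration:

    import locale

    # Under LANG=POSIX (the usual locale in an OBS build root) the preferred
    # encoding is ASCII, so open() without encoding= decodes files as ASCII.
    print(locale.getpreferredencoding(False))

    # Bytes copied from the traceback below:
    data = b'Fedora release 19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)\n'

    try:
        data.decode("ascii")
    except UnicodeDecodeError as exc:
        print("ascii decoding fails:", exc)  # same error as in the log

    # The data itself is valid UTF-8; passing encoding="utf-8" to open()
    # (or running the tests under a UTF-8 locale) avoids the failure.
    print(data.decode("utf-8"))
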

[    5s] + py.test-2.7
[    5s] ============================= test session starts ==============================
[    5s] platform linux2 -- Python 2.7.15, pytest-3.5.1, py-1.5.3, pluggy-0.6.0
[    5s] rootdir: /home/abuild/rpmbuild/BUILD/distro-1.3.0, inifile:
[    5s] plugins: cov-2.5.1
[    5s] collected 166 items
[    5s] 
[    6s] tests/test_distro.py ................................................... [ 30%]
[    7s] ........................................................................ [ 74%]
[    8s] ...........................................                              [100%]
[    8s] 
[    8s] ========================== 166 passed in 2.96 seconds ==========================
[    8s] ++ '[' -f _current_flavor ']'
[    8s] ++ cat _current_flavor
[    8s] + python_flavor=python2
[    8s] + '[' -z python2 ']'
[    8s] + '[' python2 '!=' python3 ']'
[    8s] + '[' -d build ']'
[    8s] + mv build _build.python2
[    8s] + '[' -d _build.python3 ']'
[    8s] + mv _build.python3 build
[    8s] + echo python3
[    8s] + py.test-3.6
[    8s] ============================= test session starts ==============================
[    8s] platform linux -- Python 3.6.5, pytest-3.5.1, py-1.5.3, pluggy-0.6.0
[    8s] rootdir: /home/abuild/rpmbuild/BUILD/distro-1.3.0, inifile:
[    8s] plugins: cov-2.5.1
[    8s] collected 166 items
[    8s] 
[    9s] tests/test_distro.py .........F....................................F.... [ 30%]
[   10s] ................F...............................F....................... [ 74%]
[   10s] ..F.F....F.................................                              [100%]
[   10s] 
[   10s] =================================== FAILURES ===================================
[   10s] ____________________ TestOSRelease.test_fedora19_os_release ____________________
[   10s] 
[   10s] self = <tests.test_distro.TestOSRelease object at 0x7f2ed5b73208>
[   10s] 
[   10s]     def test_fedora19_os_release(self):
[   10s]         desired_outcome = {
[   10s]             'id': 'fedora',
[   10s]             'name': 'Fedora',
[   10s]             'pretty_name': u'Fedora 19 (Schr\u00F6dinger\u2019s Cat)',
[   10s]             'version': '19',
[   10s]             'pretty_version': u'19 (Schr\u00F6dinger\u2019s Cat)',
[   10s]             'best_version': '19',
[   10s]             'codename': u'Schr\u00F6dinger\u2019s Cat'
[   10s]         }
[   10s] >       self._test_outcome(desired_outcome)
[   10s] 
[   10s] tests/test_distro.py:200: 
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] tests/test_distro.py:123: in _test_outcome
[   10s]     assert self.distro.id() == outcome.get('id', '')
[   10s] distro.py:688: in id
[   10s]     distro_id = self.os_release_attr('id')
[   10s] distro.py:883: in os_release_attr
[   10s]     return self._os_release_info.get(attribute, '')
[   10s] distro.py:550: in __get__
[   10s]     ret = obj.__dict__[self._fname] = self._f(obj)
[   10s] distro.py:922: in _os_release_info
[   10s]     return self._parse_os_release_content(release_file)
[   10s] distro.py:953: in _parse_os_release_content
[   10s]     tokens = list(lexer)
[   10s] /usr/lib64/python3.6/shlex.py:295: in __next__
[   10s]     token = self.get_token()
[   10s] /usr/lib64/python3.6/shlex.py:105: in get_token
[   10s]     raw = self.read_token()
[   10s] /usr/lib64/python3.6/shlex.py:136: in read_token
[   10s]     nextchar = self.instream.read(1)
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] 
[   10s] self = <encodings.ascii.IncrementalDecoder object at 0x7f2ed5b73c88>
[   10s] input = b'NAME=Fedora\nVERSION="19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nID=fedora\nVERSION_ID=19\nPRETTY_NAME="Fedora 19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nANSI_COLOR="0;34"\nCPE_NAME="cpe:/o:fedoraproject:fedora:19"\n'
[   10s] final = False
[   10s] 
[   10s]     def decode(self, input, final=False):
[   10s] >       return codecs.ascii_decode(input, self.errors)[0]
[   10s] E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 29: ordinal not in range(128)
[   10s] 
[   10s] /usr/lib64/python3.6/encodings/ascii.py:26: UnicodeDecodeError
[   10s] _________________ TestDistroRelease.test_fedora19_dist_release _________________
[   10s] 
[   10s] self = <tests.test_distro.TestDistroRelease object at 0x7f2ed5b06128>
[   10s] 
[   10s]     def test_fedora19_dist_release(self):
[   10s]         desired_outcome = {
[   10s]             'id': 'fedora',
[   10s]             'name': 'Fedora',
[   10s]             'pretty_name': u'Fedora 19 (Schr\u00F6dinger\u2019s Cat)',
[   10s]             'version': '19',
[   10s]             'pretty_version': u'19 (Schr\u00F6dinger\u2019s Cat)',
[   10s]             'best_version': '19',
[   10s]             'codename': u'Schr\u00F6dinger\u2019s Cat',
[   10s]             'major_version': '19'
[   10s]         }
[   10s] >       self._test_outcome(desired_outcome, 'fedora', '19')
[   10s] 
[   10s] tests/test_distro.py:685: 
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] tests/test_distro.py:625: in _test_outcome
[   10s]     assert self.distro.id() == outcome.get('id', '')
[   10s] distro.py:696: in id
[   10s]     distro_id = self.distro_release_attr('id')
[   10s] distro.py:901: in distro_release_attr
[   10s]     return self._distro_release_info.get(attribute, '')
[   10s] distro.py:550: in __get__
[   10s]     ret = obj.__dict__[self._fname] = self._f(obj)
[   10s] distro.py:1068: in _distro_release_info
[   10s]     self.distro_release_file)
[   10s] distro.py:1134: in _parse_distro_release_file
[   10s]     return self._parse_distro_release_content(fp.readline())
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] 
[   10s] self = <encodings.ascii.IncrementalDecoder object at 0x7f2ed5b060b8>
[   10s] input = b'Fedora release 19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)\n'
[   10s] final = False
[   10s] 
[   10s]     def decode(self, input, final=False):
[   10s] >       return codecs.ascii_decode(input, self.errors)[0]
[   10s] E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 23: ordinal not in range(128)
[   10s] 
[   10s] /usr/lib64/python3.6/encodings/ascii.py:26: UnicodeDecodeError
[   10s] ______________________ TestOverall.test_fedora19_release _______________________
[   10s] 
[   10s] self = <tests.test_distro.TestOverall object at 0x7f2ed5b72240>
[   10s] 
[   10s]     def test_fedora19_release(self):
[   10s]         desired_outcome = {
[   10s]             'id': 'fedora',
[   10s]             'name': 'Fedora',
[   10s]             'pretty_name': u'Fedora 19 (Schr\u00F6dinger\u2019s Cat)',
[   10s]             'version': '19',
[   10s]             'pretty_version': u'19 (Schr\u00F6dinger\u2019s Cat)',
[   10s]             'best_version': '19',
[   10s]             'codename': u'Schr\u00F6dinger\u2019s Cat',
[   10s]             'major_version': '19'
[   10s]         }
[   10s] >       self._test_outcome(desired_outcome)
[   10s] 
[   10s] tests/test_distro.py:1047: 
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] tests/test_distro.py:914: in _test_outcome
[   10s]     assert self.distro.id() == outcome.get('id', '')
[   10s] distro.py:688: in id
[   10s]     distro_id = self.os_release_attr('id')
[   10s] distro.py:883: in os_release_attr
[   10s]     return self._os_release_info.get(attribute, '')
[   10s] distro.py:550: in __get__
[   10s]     ret = obj.__dict__[self._fname] = self._f(obj)
[   10s] distro.py:922: in _os_release_info
[   10s]     return self._parse_os_release_content(release_file)
[   10s] distro.py:953: in _parse_os_release_content
[   10s]     tokens = list(lexer)
[   10s] /usr/lib64/python3.6/shlex.py:295: in __next__
[   10s]     token = self.get_token()
[   10s] /usr/lib64/python3.6/shlex.py:105: in get_token
[   10s]     raw = self.read_token()
[   10s] /usr/lib64/python3.6/shlex.py:136: in read_token
[   10s]     nextchar = self.instream.read(1)
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] 
[   10s] self = <encodings.ascii.IncrementalDecoder object at 0x7f2ed5b723c8>
[   10s] input = b'NAME=Fedora\nVERSION="19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nID=fedora\nVERSION_ID=19\nPRETTY_NAME="Fedora 19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nANSI_COLOR="0;34"\nCPE_NAME="cpe:/o:fedoraproject:fedora:19"\n'
[   10s] final = False
[   10s] 
[   10s]     def decode(self, input, final=False):
[   10s] >       return codecs.ascii_decode(input, self.errors)[0]
[   10s] E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 29: ordinal not in range(128)
[   10s] 
[   10s] /usr/lib64/python3.6/encodings/ascii.py:26: UnicodeDecodeError
[   10s] _____________ TestOverallWithEtcNotReadable.test_fedora19_release ______________
[   10s] 
[   10s] self = <tests.test_distro.TestOverallWithEtcNotReadable object at 0x7f2ed59415f8>
[   10s] 
[   10s]     def test_fedora19_release(self):
[   10s]         desired_outcome = {
[   10s]             'id': 'fedora',
[   10s]             'name': 'Fedora',
[   10s]             'pretty_name': u'Fedora 19 (Schr\u00F6dinger\u2019s Cat)',
[   10s]             'version': '19',
[   10s]             'pretty_version': u'19 (Schr\u00F6dinger\u2019s Cat)',
[   10s]             'best_version': '19',
[   10s]             'codename': u'Schr\u00F6dinger\u2019s Cat',
[   10s]             'major_version': '19'
[   10s]         }
[   10s] >       self._test_outcome(desired_outcome)
[   10s] 
[   10s] tests/test_distro.py:1047: 
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] tests/test_distro.py:914: in _test_outcome
[   10s]     assert self.distro.id() == outcome.get('id', '')
[   10s] distro.py:688: in id
[   10s]     distro_id = self.os_release_attr('id')
[   10s] distro.py:883: in os_release_attr
[   10s]     return self._os_release_info.get(attribute, '')
[   10s] distro.py:550: in __get__
[   10s]     ret = obj.__dict__[self._fname] = self._f(obj)
[   10s] distro.py:922: in _os_release_info
[   10s]     return self._parse_os_release_content(release_file)
[   10s] distro.py:953: in _parse_os_release_content
[   10s]     tokens = list(lexer)
[   10s] /usr/lib64/python3.6/shlex.py:295: in __next__
[   10s]     token = self.get_token()
[   10s] /usr/lib64/python3.6/shlex.py:105: in get_token
[   10s]     raw = self.read_token()
[   10s] /usr/lib64/python3.6/shlex.py:136: in read_token
[   10s]     nextchar = self.instream.read(1)
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] 
[   10s] self = <encodings.ascii.IncrementalDecoder object at 0x7f2ed5941b00>
[   10s] input = b'NAME=Fedora\nVERSION="19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nID=fedora\nVERSION_ID=19\nPRETTY_NAME="Fedora 19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nANSI_COLOR="0;34"\nCPE_NAME="cpe:/o:fedoraproject:fedora:19"\n'
[   10s] final = False
[   10s] 
[   10s]     def decode(self, input, final=False):
[   10s] >       return codecs.ascii_decode(input, self.errors)[0]
[   10s] E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 29: ordinal not in range(128)
[   10s] 
[   10s] /usr/lib64/python3.6/encodings/ascii.py:26: UnicodeDecodeError
[   10s] _______________________ TestGetAttr.test_os_release_attr _______________________
[   10s] 
[   10s] self = <tests.test_distro.TestGetAttr object at 0x7f2ed5b01128>
[   10s] 
[   10s]     def test_os_release_attr(self):
[   10s] >       self._test_attr('os_release_info', 'os_release_attr')
[   10s] 
[   10s] tests/test_distro.py:1571: 
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] tests/test_distro.py:1563: in _test_attr
[   10s]     info = getattr(_distro, info_method)()
[   10s] distro.py:846: in os_release_info
[   10s]     return self._os_release_info
[   10s] distro.py:550: in __get__
[   10s]     ret = obj.__dict__[self._fname] = self._f(obj)
[   10s] distro.py:922: in _os_release_info
[   10s]     return self._parse_os_release_content(release_file)
[   10s] distro.py:953: in _parse_os_release_content
[   10s]     tokens = list(lexer)
[   10s] /usr/lib64/python3.6/shlex.py:295: in __next__
[   10s]     token = self.get_token()
[   10s] /usr/lib64/python3.6/shlex.py:105: in get_token
[   10s]     raw = self.read_token()
[   10s] /usr/lib64/python3.6/shlex.py:136: in read_token
[   10s]     nextchar = self.instream.read(1)
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] 
[   10s] self = <encodings.ascii.IncrementalDecoder object at 0x7f2ed5b01c50>
[   10s] input = b'NAME=Fedora\nVERSION="19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nID=fedora\nVERSION_ID=19\nPRETTY_NAME="Fedora 19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nANSI_COLOR="0;34"\nCPE_NAME="cpe:/o:fedoraproject:fedora:19"\n'
[   10s] final = False
[   10s] 
[   10s]     def decode(self, input, final=False):
[   10s] >       return codecs.ascii_decode(input, self.errors)[0]
[   10s] E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 29: ordinal not in range(128)
[   10s] 
[   10s] /usr/lib64/python3.6/encodings/ascii.py:26: UnicodeDecodeError
[   10s] _____________________ TestGetAttr.test_distro_release_attr _____________________
[   10s] 
[   10s] self = <tests.test_distro.TestGetAttr object at 0x7f2ed5882710>
[   10s] 
[   10s]     def test_distro_release_attr(self):
[   10s] >       self._test_attr('distro_release_info', 'distro_release_attr')
[   10s] 
[   10s] tests/test_distro.py:1577: 
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] tests/test_distro.py:1563: in _test_attr
[   10s]     info = getattr(_distro, info_method)()
[   10s] distro.py:866: in distro_release_info
[   10s]     return self._distro_release_info
[   10s] distro.py:550: in __get__
[   10s]     ret = obj.__dict__[self._fname] = self._f(obj)
[   10s] distro.py:1111: in _distro_release_info
[   10s]     distro_info = self._parse_distro_release_file(filepath)
[   10s] distro.py:1134: in _parse_distro_release_file
[   10s]     return self._parse_distro_release_content(fp.readline())
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] 
[   10s] self = <encodings.ascii.IncrementalDecoder object at 0x7f2ed5882278>
[   10s] input = b'Fedora release 19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)\n'
[   10s] final = False
[   10s] 
[   10s]     def decode(self, input, final=False):
[   10s] >       return codecs.ascii_decode(input, self.errors)[0]
[   10s] E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 23: ordinal not in range(128)
[   10s] 
[   10s] /usr/lib64/python3.6/encodings/ascii.py:26: UnicodeDecodeError
[   10s] ______________________________ TestInfo.test_all _______________________________
[   10s] 
[   10s] self = <tests.test_distro.TestInfo object at 0x7f2ed5bc0e48>
[   10s] 
[   10s]     def test_all(self):
[   10s]         """Test info() by comparing its results with the results of specific
[   10s]             consolidated accessor functions.
[   10s]             """
[   10s]         def _test_all(info, best=False, pretty=False):
[   10s]             assert info['id'] == _distro.id()
[   10s]             assert info['version'] == _distro.version(pretty=pretty, best=best)
[   10s]             assert info['version_parts']['major'] == \
[   10s]                 _distro.major_version(best=best)
[   10s]             assert info['version_parts']['minor'] == \
[   10s]                 _distro.minor_version(best=best)
[   10s]             assert info['version_parts']['build_number'] == \
[   10s]                 _distro.build_number(best=best)
[   10s]             assert info['like'] == _distro.like()
[   10s]             assert info['codename'] == _distro.codename()
[   10s]             assert len(info['version_parts']) == 3
[   10s]             assert len(info) == 5
[   10s]     
[   10s]         for dist in DISTROS:
[   10s]             self._setup_for_distro(os.path.join(DISTROS_DIR, dist))
[   10s]     
[   10s]             _distro = distro.LinuxDistribution()
[   10s]     
[   10s] >           info = _distro.info()
[   10s] 
[   10s] tests/test_distro.py:1691: 
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] distro.py:828: in info
[   10s]     id=self.id(),
[   10s] distro.py:688: in id
[   10s]     distro_id = self.os_release_attr('id')
[   10s] distro.py:883: in os_release_attr
[   10s]     return self._os_release_info.get(attribute, '')
[   10s] distro.py:550: in __get__
[   10s]     ret = obj.__dict__[self._fname] = self._f(obj)
[   10s] distro.py:922: in _os_release_info
[   10s]     return self._parse_os_release_content(release_file)
[   10s] distro.py:953: in _parse_os_release_content
[   10s]     tokens = list(lexer)
[   10s] /usr/lib64/python3.6/shlex.py:295: in __next__
[   10s]     token = self.get_token()
[   10s] /usr/lib64/python3.6/shlex.py:105: in get_token
[   10s]     raw = self.read_token()
[   10s] /usr/lib64/python3.6/shlex.py:136: in read_token
[   10s]     nextchar = self.instream.read(1)
[   10s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[   10s] 
[   10s] self = <encodings.ascii.IncrementalDecoder object at 0x7f2ed5bc0be0>
[   10s] input = b'NAME=Fedora\nVERSION="19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nID=fedora\nVERSION_ID=19\nPRETTY_NAME="Fedora 19 (Schr\xc3\xb6dinger\xe2\x80\x99s Cat)"\nANSI_COLOR="0;34"\nCPE_NAME="cpe:/o:fedoraproject:fedora:19"\n'
[   10s] final = False
[   10s] 
[   10s]     def decode(self, input, final=False):
[   10s] >       return codecs.ascii_decode(input, self.errors)[0]
[   10s] E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 29: ordinal not in range(128)
[   10s] 
[   10s] /usr/lib64/python3.6/encodings/ascii.py:26: UnicodeDecodeError
[   10s] ===================== 7 failed, 159 passed in 2.14 seconds =====================
[   10s] error: Bad exit status from /var/tmp/rpm-tmp.2vqZFC (%check)
[   10s] 
[   10s] 
[   10s] RPM build errors:
[   10s]     Bad exit status from /var/tmp/rpm-tmp.2vqZFC (%check)
[   10s] 
[   10s] milic failed "build python-distro.spec" at Wed Jun 13 15:14:52 UTC 2018.
[   10s]
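For reference, a minimal sketch (assuming the root cause is the C/ASCII locale of the isolated build environment; this is an illustration, not necessarily the fix distro adopted upstream) of reading os-release with an explicit encoding so the parse no longer depends on the locale:

import shlex

# Illustration only: opening the file with an explicit UTF-8 encoding avoids the
# ascii-codec UnicodeDecodeError above, independent of the build environment's locale.
def parse_os_release(path="/etc/os-release"):
    info = {}
    with open(path, encoding="utf-8") as release_file:
        for token in shlex.split(release_file.read(), comments=True):
            if "=" in token:
                key, _, value = token.partition("=")
                info[key.lower()] = value
    return info

if __name__ == "__main__":
    print(parse_os_release().get("pretty_name", ""))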

closed time in 2 days

mcepl

issue commentpython-distro/distro

Failing tests on openSUSE

Yes, explicitly setting the locale is not necessary anymore. Thank you.

mcepl

comment created time in 2 days

create barnchmcepl/spacewalk

branch : py3k-gzipstream

created branch time in 3 days

fork mcepl/spacewalk

The master repository for the Spacewalk Project.

https://spacewalkproject.github.io

fork in 3 days

issue commentholoviz/hvplot

Allow testing without network connection

Why don't you keep using the XDG_CACHE_HOME variable? No need to patch anything: https://www.fatiando.org/pooch/latest/api/generated/pooch.os_cache.html#pooch.os_cache

I guess it can go either way.
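For what it is worth, a minimal sketch of that suggestion (the project name "xarray_tutorial_data" is an assumption taken from the issue below, not something pooch defines): pooch.os_cache() resolves the cache directory from platform conventions, so on Linux it honours $XDG_CACHE_HOME.

import os

import pooch

# Sketch only: point XDG_CACHE_HOME at a directory pre-filled before the build starts,
# and pooch resolves the per-project cache inside it, so the tests need no network.
os.environ["XDG_CACHE_HOME"] = "/home/abuild/.cache"
print(pooch.os_cache("xarray_tutorial_data"))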

mcepl

comment created time in 5 days

issue openedholoviz/hvplot

Allow testing without network connection

ALL software version info

(this library, plus any other relevant software, e.g. bokeh, python, notebook, OS, browser, etc)

Complete build log with all package versions and steps taken (the package is built for both Python 3.8 and Python 3.9). Specifically, hvplot is version 0.7.2. The build is part of the packaging effort on Linux (specifically openSUSE/Factory as of 2021-09-23).

Description of expected behavior and the observed behavior

When running the test suite, it fails because inside the build environment there is no network connection, for security reasons (it is the same with all Linux and non-Linux distribution build systems).

This patch makes it possible to preload the cache (by default in ~/.cache/xarray_tutorial_data/) for the tests, so the test suite can run without a network connection.
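A rough sketch of the kind of pre-population step the patch enables (the source directory "preloaded-datasets" and the *.nc pattern are illustrative assumptions, not part of the patch):

import pathlib
import shutil

# Hypothetical pre-population step: copy datasets downloaded ahead of time into the
# cache directory the tests look at, so nothing is fetched during the package build.
cache_dir = pathlib.Path.home() / ".cache" / "xarray_tutorial_data"
cache_dir.mkdir(parents=True, exist_ok=True)
source = pathlib.Path("preloaded-datasets")
if source.is_dir():
    for dataset in source.glob("*.nc"):
        shutil.copy(dataset, cache_dir)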

created time in 5 days

issue commentgoogle/pasta

Some tests are broken

+1 from me while packaging google-pasta for openSUSE/Factory:

[    6s] + PYTHONPATH=/home/abuild/rpmbuild/BUILDROOT/python-google-pasta-0.2.0-0.x86_64/usr/lib/python3.9/site-packages
[    6s] + PYTHONDONTWRITEBYTECODE=1
[    6s] + pytest-3.9 --ignore=_build.python36 --ignore=_build.python39 --ignore=_build.python38 -v
[    6s] ============================= test session starts ==============================
[    6s] platform linux -- Python 3.9.7, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3.9
[    6s] cachedir: .pytest_cache
[    6s] rootdir: /home/abuild/rpmbuild/BUILD/pasta-0.2.0
[    6s] collecting ... collected 242 items
[    6s]

[ ... many PASSED lines deleted ...]

[    7s]
[    7s] =================================== FAILURES ===================================
[    7s] ___________ PrefixSuffixGoldenTest.test_golden_prefix_suffix_fstring ___________
[    7s]
[    7s] self = <pasta.base.annotate_test.PrefixSuffixGoldenTest testMethod=test_golden_prefix_suffix_fstring>
[    7s]
[    7s]     def test(self):
[    7s]       with open(input_file, 'r') as handle:
[    7s]         src = handle.read()
[    7s]       t = ast_utils.parse(src)
[    7s]       annotator = annotate.AstAnnotator(src)
[    7s]       annotator.visit(t)
[    7s]
[    7s]       def escape(s):
[    7s]         return '' if s is None else s.replace('\n', '\\n')
[    7s]
[    7s]       result = '\n'.join(
[    7s]           "{0:12} {1:20} \tprefix=|{2}|\tsuffix=|{3}|\tindent=|{4}|".format(
[    7s]               str((getattr(n, 'lineno', -1), getattr(n, 'col_offset', -1))),
[    7s]               type(n).__name__ + ' ' + _get_node_identifier(n),
[    7s]               escape(fmt.get(n, 'prefix')),
[    7s]               escape(fmt.get(n, 'suffix')),
[    7s]               escape(fmt.get(n, 'indent')))
[    7s]           for n in ast.walk(t)) + '\n'
[    7s]
[    7s]       # If specified, write the golden data instead of checking it
[    7s]       if getattr(self, 'generate_goldens', False):
[    7s]         if not os.path.isdir(os.path.dirname(golden_file)):
[    7s]           os.makedirs(os.path.dirname(golden_file))
[    7s]         with open(golden_file, 'w') as f:
[    7s]           f.write(result)
[    7s]         print('Wrote: ' + golden_file)
[    7s]         return
[    7s]
[    7s]       try:
[    7s]         with open(golden_file, 'r') as f:
[    7s]           golden = f.read()
[    7s]       except IOError:
[    7s]         self.fail('Missing golden data.')
[    7s]
[    7s] >     self.assertMultiLineEqual(golden, result)
[    7s] E     AssertionError: '(-1,[5721 chars](21, 1)      Name a               \tprefix=||\[4307 chars]||\n' != '(-1,[5721 chars](21, 3)      Name a               \tprefix=||\[4307 chars]||\n'
[    7s] E       (-1, -1)     Module               	prefix=||	suffix=||	indent=||
[    7s] E       (1, 0)       Expr                 	prefix=||	suffix=|\n|	indent=||
[    7s] E       (3, 0)       Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (5, 0)       Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (7, 0)       Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (9, 0)       Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (11, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (13, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (15, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (17, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (21, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (23, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (25, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (27, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (29, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (31, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (33, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (36, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (38, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (42, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (45, 0)      Expr                 	prefix=|\n|	suffix=|\n|	indent=||
[    7s] E       (1, 0)       JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (3, 0)       JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (5, 0)       JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (7, 0)       JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (9, 0)       JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (11, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (13, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (15, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (17, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (21, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (23, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (25, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (27, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (29, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (31, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (33, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (36, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (38, 0)      Call                 	prefix=||	suffix=||	indent=||
[    7s] E       (42, 1)      JoinedStr            	prefix=|(|	suffix=|)|	indent=||
[    7s] E       (45, 1)      JoinedStr            	prefix=|(|	suffix=|)|	indent=||
[    7s] E       (1, 0)       Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (3, 0)       FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (5, 0)       Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (5, 0)       FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (5, 0)       Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (7, 0)       Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (7, 0)       FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (9, 0)       FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (9, 0)       Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (11, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (11, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (13, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (13, 0)      FormattedValue       	prefix=|  |	suffix=||	indent=||
[    7s] E       (15, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (15, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (15, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (17, 0)      FormattedValue       	prefix=|\n  |	suffix=|\n|	indent=||
[    7s] E       (21, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (23, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (25, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (27, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (29, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (31, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (33, 0)      FormattedValue       	prefix=|\n  |	suffix=||	indent=||
[    7s] E       (36, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (38, 0)      Name foo             	prefix=||	suffix=||	indent=||
[    7s] E       (38, 4)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (42, 1)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (42, 1)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (42, 1)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (45, 1)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (45, 1)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (3, 3)       Name b               	prefix=||	suffix=||	indent=||
[    7s] E       (5, 5)       Name d               	prefix=||	suffix=||	indent=||
[    7s] E       (7, 5)       Name g               	prefix=||	suffix=||	indent=||
[    7s] E       (9, 3)       Name h               	prefix=||	suffix=||	indent=||
[    7s] E       (11, 9)      Name k               	prefix=||	suffix=||	indent=||
[    7s] E       (13, 6)      BinOp                	prefix=||	suffix=||	indent=||
[    7s] E       (15, 5)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (18, 2)      BinOp                	prefix=||	suffix=||	indent=||
[    7s] E     - (21, 1)      Name a               	prefix=||	suffix=||	indent=||
[    7s] E     ?      ^
[    7s] E     + (21, 3)      Name a               	prefix=||	suffix=||	indent=||
[    7s] E     ?      ^
[    7s] E     - (23, 1)      Name b               	prefix=||	suffix=||	indent=||
[    7s] E     ?      ^
[    7s] E     + (23, 3)      Name b               	prefix=||	suffix=||	indent=||
[    7s] E     ?      ^
[    7s] E     - (25, 1)      Name c               	prefix=||	suffix=||	indent=||
[    7s] E     ?      ^
[    7s] E     + (25, 3)      Name c               	prefix=||	suffix=||	indent=||
[    7s] E     ?      ^
[    7s] E       (25, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E     - (27, 1)      Name c               	prefix=||	suffix=|  |	indent=||
[    7s] E     ?      ^
[    7s] E     + (27, 3)      Name c               	prefix=||	suffix=|  |	indent=||
[    7s] E     ?      ^
[    7s] E       (27, 0)      JoinedStr            	prefix=||	suffix=| |	indent=||
[    7s] E     - (29, 1)      Name f               	prefix=||	suffix=||	indent=||
[    7s] E     ?      ^
[    7s] E     + (29, 3)      Name f               	prefix=||	suffix=||	indent=||
[    7s] E     ?      ^
[    7s] E       (29, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E     - (31, 1)      Name h               	prefix=||	suffix=| |	indent=||
[    7s] E     ?      ^
[    7s] E     + (31, 3)      Name h               	prefix=||	suffix=| |	indent=||
[    7s] E     ?      ^
[    7s] E       (31, 0)      JoinedStr            	prefix=||	suffix=||	indent=||
[    7s] E       (34, 2)      Name l               	prefix=||	suffix=||	indent=||
[    7s] E       (33, 0)      JoinedStr            	prefix=||	suffix=| |	indent=||
[    7s] E       (36, 3)      Attribute c          	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (38, 4)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (38, 4)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (42, 6)      Name e               	prefix=||	suffix=||	indent=||
[    7s] E       (46, 7)      Name h               	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (13, 6)      Constant             	prefix=||	suffix=| |	indent=||
[    7s] E       (-1, -1)     Add                  	prefix=||	suffix=||	indent=||
[    7s] E       (13, 10)     Constant             	prefix=||	suffix=| |	indent=||
[    7s] E       (18, 2)      Constant             	prefix=||	suffix=| |	indent=||
[    7s] E       (-1, -1)     Add                  	prefix=||	suffix=||	indent=||
[    7s] E       (18, 6)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (25, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (27, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (29, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (31, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (31, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (31, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (31, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (33, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (33, 0)      FormattedValue       	prefix=||	suffix=||	indent=||
[    7s] E       (33, 0)      Constant             	prefix=||	suffix=||	indent=||
[    7s] E       (36, 4)      Attribute b          	prefix=|(|	suffix=|)|	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (38, 7)      Name d               	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (29, 6)      Name g               	prefix=||	suffix=||	indent=||
[    7s] E       (31, 12)     Name i               	prefix=||	suffix=||	indent=||
[    7s] E       (31, 17)     Name j               	prefix=||	suffix=||	indent=||
[    7s] E       (34, 10)     Name m               	prefix=||	suffix=||	indent=||
[    7s] E       (36, 4)      Name a               	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s] E       (-1, -1)     Load                 	prefix=||	suffix=||	indent=||
[    7s]
[    7s] pasta/base/annotate_test.py:376: AssertionError
[    7s] =============================== warnings summary ===============================
[    7s] pasta/augment/inline_test.py::InlineTest::test_inline_conditional_fails
[    7s]   /home/abuild/rpmbuild/BUILD/pasta-0.2.0/pasta/augment/inline_test.py:79: DeprecationWarning: Please use assertRaisesRegex instead.
[    7s]     with self.assertRaisesRegexp(inline.InlineError,
[    7s]
[    7s] pasta/augment/inline_test.py::InlineTest::test_inline_function_fails
[    7s]   /home/abuild/rpmbuild/BUILD/pasta-0.2.0/pasta/augment/inline_test.py:71: DeprecationWarning: Please use assertRaisesRegex instead.
[    7s]     with self.assertRaisesRegexp(
[    7s]
[    7s] pasta/augment/inline_test.py::InlineTest::test_inline_non_assign_fails
[    7s]   /home/abuild/rpmbuild/BUILD/pasta-0.2.0/pasta/augment/inline_test.py:86: DeprecationWarning: Please use assertRaisesRegex instead.
[    7s]     with self.assertRaisesRegexp(
[    7s]
[    7s] pasta/augment/inline_test.py::InlineTest::test_inline_non_constant_fails
[    7s]   /home/abuild/rpmbuild/BUILD/pasta-0.2.0/pasta/augment/inline_test.py:63: DeprecationWarning: Please use assertRaisesRegex instead.
[    7s]     with self.assertRaisesRegexp(inline.InlineError,
[    7s]
[    7s] -- Docs: https://docs.pytest.org/en/stable/warnings.html
[    7s] =========================== short test summary info ============================
[    7s] FAILED pasta/base/annotate_test.py::PrefixSuffixGoldenTest::test_golden_prefix_suffix_fstring
[    7s] ============ 1 failed, 218 passed, 23 skipped, 4 warnings in 0.83s =============
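Not a fix, only a quick probe of what actually differs here: the golden files record the col_offset CPython assigns to Name nodes inside f-strings, and that reporting changed between interpreter versions, which is exactly the (21, 1) versus (21, 3) diff above.

import ast
import sys

# Probe only: print the positions the running interpreter assigns to Name nodes in an
# f-string; pasta's golden data was generated with an interpreter that reports
# different column offsets for these nodes.
tree = ast.parse('x = f"  {a}"')
for node in ast.walk(tree):
    if isinstance(node, ast.Name):
        print(sys.version_info[:2], node.id, node.lineno, node.col_offset)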
soupytwist

comment created time in 5 days

issue commentJimmXinu/FanFicFare

Cloudflare and FanFiction.net doing its thing again

Something that came up on the MR thread is that FFDL's cache doesn't get the cover image, if that's important to you.

OK, but if I use FFDL via use_nsapa_proxy:true, then is there no difference?

chocolatechipcats

comment created time in 6 days

issue commentJimmXinu/FanFicFare

Cloudflare and FanFiction.net doing its thing again

Is there some comparison of the advantages and disadvantages of flaresolverr and FanFictionDownloader? Are they essentially the same, or is there some difference?

chocolatechipcats

comment created time in 6 days

push eventmcepl/cpython

Ken Jin

commit sha f24afda5917ce1710ad08ca34b2509f1f2b16de2

bpo-26110: Add ``CALL_METHOD_KW`` opcode to speedup method calls with keywords (GH-26014) * Add CALL_METHOD_KW * Make CALL_METHOD branchless too since it shares the same code * Place parentheses in STACK_SHRINK

view details

Pablo Galindo

commit sha c5b833046d9dbb2063f776703fc513b71664b859

bpo-44139: Use a more descriptive syntax error comprehension case in the What's New for 3.10 (GH-26145)

view details

Miguel Brito

commit sha 086b5c6ce16b524629428b4fd5c9213930bfca47

bpo-32133: Improve numbers docs (GH-26124)

view details

Pablo Galindo

commit sha 80b089179fa798c8ceaab2ff699c82499b2fcacd

bpo-44143: Fix crash in the parser when raising tokenizer errors with an exception set (GH-26144)

view details

Raymond Hettinger

commit sha fdfea4ab16ff65234dc30f51ed8056138ab19005

Improve speed and accuracy for correlation() (GH-26135)

view details

Barney Gale

commit sha 1a08c5ac49b748d5e4e4b508d22d3804e3cd4dcc

bpo-39950: Fix deprecation warning in test for `pathlib.Path.link_to()` (GH-26155)

view details

Batuhan Taskaya

commit sha 51cef8be8c77dff522bec6f95d6853031bf19038

bpo-44142: drop redundant parantheses when unparsing tuples as assignment targets (GH-26156)

view details

Ashwin Ramaswami

commit sha de367378f67d7e90e4015100b19277685a3c9bb3

Fix typo in comment (GH-26162)

view details

Serhii Hidenko

commit sha a42d98ed91cd1f08b2e9734ca6ca136dd10dff5d

Remove a redundant assignment in Tools/unittestgui/unittestgui.py (GH-21438)

view details

flizzywine

commit sha be54fb5ae73db507a0cdb1884d553aca5966f0e6

fix docstring typo in bdb.py (GH-22323)

view details

Zackery Spytz

commit sha 56df20d7014bfe2df9cd46dece93271b516c50f6

Fix a typo/error in a news entry (bidst_wheel -> bdist_wheel) (GH-24284) This error was fixed recently in `Doc/whatsnew/3.10.rst`. Automerge-Triggered-By: GH:iritkatriel

view details

Rafael Fontenelle

commit sha fdc7e52f5f1853e350407c472ae031339ac7f60c

[doc] Fix typo in os module (GH-24464) Automerge-Triggered-By: GH:iritkatriel

view details

Raymond Hettinger

commit sha b3f65e819f552561294a66e350a9f5a3131f7df2

Apply edits from Allen Downey's review of the linear_regression docs. (GH-26176)

view details

Sergey B Kirpichev

commit sha b102dd598dd2666b72e93ae53ae813d1e88f186c

bpo-44154: optimize Fraction pickling (GH-26186)

view details

Gregory P. Smith

commit sha c10392e7ddb3eafbd11e9ffe335c07648426715f

bpo-44145: Release the GIL around HMAC_Update. (GH-26157) It was always meant to be released for parallelization. This now matches the other similar code in the module. Thanks michaelforney for noticing!

view details

Pete Wicken

commit sha 83f0f8d62f279f846a92fede2244beaa0149b9d8

bpo-33433 Fix private address checking for IPv4 mapped IPv6. (GH-26172) For IPv4 mapped IPv6 addresses, defer privacy check to the mapped IPv4 address. Solves bug where public mapped IPv4 addresses are considered private by the IPv6 check. Automerge-Triggered-By: GH:gpshead

view details

Igor Bolshakov

commit sha f32c7950e0077b6d9a8e217c2796fc582f18ca08

bpo-43650: Fix MemoryError on zip.read in shutil._unpack_zipfile for large files (GH-25058) `shutil.unpack_archive()` tries to read the whole file into memory, making no use of any kind of smaller buffer. Process crashes for really large files: I.e. archive: ~1.7G, unpacked: ~10G. Before the crash it can easily take away all available RAM on smaller systems. Had to pull the code form `zipfile.Zipfile.extractall()` to fix this Automerge-Triggered-By: GH:gpshead

view details

Victor Stinner

commit sha eaede0ded72e67cee4a91c086847d54cb64ca74c

bpo-44131: Test Py_FrozenMain() (GH-26126) * Add test_frozenmain to test_embed * Add Programs/test_frozenmain.py * Add Programs/freeze_test_frozenmain.py * Add Programs/test_frozenmain.h * Add make regen-test-frozenmain * Add test_frozenmain command to Programs/_testembed * _testembed.c: add error(msg) function

view details

Victor Stinner

commit sha 834498e178684a7e2da49b4efe1adea33e0026b0

bpo-44131: Fix Makefile for test_frozenmain (GH-26203) Remove Programs/test_frozenmain.h Makefile target: it ran make in parallel which caused build errors on LTO+PGO builds.

view details

uniocto

commit sha 115dea9e2602b96b63390f00cc880e90c433efa2

bpo-25872: Add unit tests for linecache and threading (GH-25913)

view details

push time in 9 days

issue openedopenSUSE/osc

osc branch --nodevelproject SUSE:SLE-15-SP4:GA python-gzipstream doesn’t work

ihome@kusansky$ osc -A ibs branch --nodevelproject SUSE:SLE-15-SP4:GA python-gzipstream

Note: The branch has been created of a different project,
              SUSE:SLE-15:GA,
      which is the primary location of where development for
      that package takes place.
      That's also where you would normally make changes against.
      A direct branch of the specified package can be forced
      with the --nodevelproject option.

A working copy of the branched package can be checked out with:

osc -A https://api.suse.de co home:mcepl:branches:SUSE:SLE-15:GA/python-gzipstream
ihome@kusansky$

I was told to use -N instead, which seems to work, but --nodevelproject seems like a no-op to me.

created time in 9 days

issue commentcool-RR/combi

Four tests fail with python >= 3.8

I don't need to (we can happily live with those patches in our package), but it would be nice for others.

mcepl

comment created time in 10 days

issue openedgagbo/tree-sitter-org

Broken parsing of orgmode tree

[screenshot attached: screenshot-2021-09-18_00-09-1631919252]

Not sure what else to add. This is the latest neovim (master branch, https://github.com/neovim/neovim/commit/5e22fdd9c) with all plugins freshly updated and this configuration:

require('orgmode').setup({
  org_agenda_files = {'~/.local/share/orgmode/*'},
  org_default_notes_file = '~/.local/share/orgmode/refile.org',
  org_todo_keywords = {'TODO', 'WAITING', 'DELEGATED', '|', 'DONE', 'WORKSFORME' },
  org_hide_leading_stars = true,
  org_todo_keyword_faces = {
    WAITING = ':foreground blue :weight bold',
    DELEGATED = ':background #FFFFFF :slant italic :underline on',
    DONE = ':background navy foreground yellow :underline on',
    WORKSFORME = ':background navy foreground white :underline on',
  },
})
vim.api.nvim_set_keymap('n', '<Leader>ww', ':e /home/matej/archiv/2021/SUSE/employee/TODO.org<CR>', { noremap = true, silent = true })
vim.cmd[[autocmd FileType org setlocal nofoldenable]]

created time in 11 days

issue commentbuggins/coolreader

Android: fake results in search (no cleanup of database on moving of files around)

Sorry, but you don't want me to write a large piece of C code.

mcepl

comment created time in 11 days

issue closedpygame/pygame

test_renderer_set_viewport fails on i586

When packaging pygame 2.0.1 (tarball z PyPI) for openSUSE test suite fails on i586 architecture:

[  212s] ======================================================================
[  212s] FAIL: test_renderer_set_viewport (pygame.tests.video_test.VideoModuleTest)
[  212s] works.
[  212s] ----------------------------------------------------------------------
[  212s] Traceback (most recent call last):
[  212s]   File "/home/abuild/rpmbuild/BUILDROOT/python-pygame-2.0.1-0.i386/usr/lib/python3.9/site-packages/pygame/tests/video_test.py", line 21, in test_renderer_set_viewport
[  212s]     self.assertEqual(renderer.get_viewport(), (0, 0, 1920, 1080))
[  212s] AssertionError: <rect(0, 0, 1917, 1077)> != (0, 0, 1920, 1080)
[  212s]
[  212s] ----------------------------------------------------------------------
[  212s] Ran 1 test in 0.002s

Complete build log listing all packages used and steps taken.

closed time in 11 days

mcepl

issue commentpygame/pygame

test_renderer_set_viewport fails on i586

Thank you very much. Obviously the patch helped.

mcepl

comment created time in 11 days

issue commentbuggins/coolreader

Android: fake results in search

No, it doesn't.

OK, I'll bite. Why doesn't it?

mcepl

comment created time in 11 days

issue openedpygame/pygame

test_renderer_set_viewport fails on i586

When packaging pygame 2.0.1 (tarball z PyPI) for openSUSE test suite fails on i586 architecture:

[  212s] ======================================================================
[  212s] FAIL: test_renderer_set_viewport (pygame.tests.video_test.VideoModuleTest)
[  212s] works.
[  212s] ----------------------------------------------------------------------
[  212s] Traceback (most recent call last):
[  212s]   File "/home/abuild/rpmbuild/BUILDROOT/python-pygame-2.0.1-0.i386/usr/lib/python3.9/site-packages/pygame/tests/video_test.py", line 21, in test_renderer_set_viewport
[  212s]     self.assertEqual(renderer.get_viewport(), (0, 0, 1920, 1080))
[  212s] AssertionError: <rect(0, 0, 1917, 1077)> != (0, 0, 1920, 1080)
[  212s]
[  212s] ----------------------------------------------------------------------
[  212s] Ran 1 test in 0.002s

Complete build log listing all packages used and steps taken.

created time in 12 days

issue commentbuggins/coolreader

Android: fake results in search

I would think that “Search for books in the folder” (or however it is called in English; this is my back-translation from Czech) would do the database cleanup, wouldn't it?

mcepl

comment created time in 12 days

issue openedbuggins/coolreader

Android: fake results in search

[screenshot attached: Screenshot_20210916-142640]

Happens to me all the time: when searching, I get some results twice (although the file is in the filesystem only once), and one of the results (the first one, in this case) just takes CoolReader back to whatever I was reading before. Any ideas?

Using CoolReader from F-Droid.

created time in 12 days

issue commentcool-RR/combi

Four tests fail with python >= 3.8

OK, so just because I did the work:

We will probably eventually remove the package from openSUSE.

mcepl

comment created time in 12 days

issue commentcool-RR/combi

Four tests fail with python >= 3.8

Can you try removing Combi, installing python_toolbox and running the tests on it?

I would rather not. Is the Combi project still alive, or did you kill it in favour of python_toolbox? Also, distributions (Linux or otherwise) usually much prefer to package individual components; the sentence “The Python Toolbox contains 100+ useful little tools.” sounds like a disaster to me.

mcepl

comment created time in 12 days

issue openedcool-RR/combi

Four tests fail with python >= 3.8

When packaging this package (version 1.1.2 from the PyPI tarball) for openSUSE, and trying to remove the dependency on nose from it (BTW, if you can rewrite that test_extensive construct into a normal pytest parametrized test, I have a bottle of a drink of your own choice to send you!), I found that when running the test suite with Python >= 3.8 these four tests error out (not even fail):

[   31s] + export PYTHONPATH=/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.6/site-packages
[   31s] + PYTHONPATH=/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.6/site-packages
[   31s] + python3.6 /home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.6/site-packages/test_combi/scripts/_test_combi.py
[  120s] ..................................
[  120s] ----------------------------------------------------------------------
[  120s] Ran 34 tests in 87.666s
[  120s]
[  120s] OK
[  121s] ++ '[' -f _current_flavor ']'
[  121s] ++ cat _current_flavor
[  121s] + last_flavor=python36
[  121s] + '[' -z python36 ']'
[  121s] + '[' python36 '!=' python39 ']'
[  121s] + '[' -d build ']'
[  121s] + '[' -d _build.python39 ']'
[  121s] + echo python39
[  121s] + python_flavor=python39
[  121s] + export PYTHONPATH=/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages
[  121s] + PYTHONPATH=/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages
[  121s] + python3.9 /home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/test_combi/scripts/_test_combi.py
[  180s] ......E...E..EE...................
[  180s] ======================================================================
[  180s] ERROR: test_combi.test_extensive.test_1
[  180s] ----------------------------------------------------------------------
[  180s] Traceback (most recent call last):
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 686, in <genexpr>
[  180s]     next(free_values_perm_iterator))
[  180s] StopIteration
[  180s]
[  180s] The above exception was the direct cause of the following exception:
[  180s]
[  180s] Traceback (most recent call last):
[  180s]   File "/usr/lib/python3.9/site-packages/nose/case.py", line 197, in runTest
[  180s]     self.test(*self.arg)
[  180s]   File "<string>", line 1, in test_1
[  180s]   File "<string>", line 1, in <lambda>
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/test_combi/test_extensive.py", line 223, in _check_variation_selection
[  180s]     assert perm_space.index(perm_space[-1]) == perm_space.length - 1
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 683, in __getitem__
[  180s]     tuple(
[  180s] RuntimeError: generator raised StopIteration
[  180s]
[  180s] ======================================================================
[  180s] ERROR: test_combi.test_extensive.test_2
[  180s] ----------------------------------------------------------------------
[  180s] Traceback (most recent call last):
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 686, in <genexpr>
[  180s]     next(free_values_perm_iterator))
[  180s] StopIteration
[  180s]
[  180s] The above exception was the direct cause of the following exception:
[  180s]
[  180s] Traceback (most recent call last):
[  180s]   File "/usr/lib/python3.9/site-packages/nose/case.py", line 197, in runTest
[  180s]     self.test(*self.arg)
[  180s]   File "<string>", line 1, in test_2
[  180s]   File "<string>", line 1, in <lambda>
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/test_combi/test_extensive.py", line 223, in _check_variation_selection
[  180s]     assert perm_space.index(perm_space[-1]) == perm_space.length - 1
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 556, in __getitem__
[  180s]     return self.perm_type(self.undapplied[i], perm_space=self)
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 683, in __getitem__
[  180s]     tuple(
[  180s] RuntimeError: generator raised StopIteration
[  180s]
[  180s] ======================================================================
[  180s] ERROR: test_combi.test_extensive.test_5
[  180s] ----------------------------------------------------------------------
[  180s] Traceback (most recent call last):
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 686, in <genexpr>
[  180s]     next(free_values_perm_iterator))
[  180s] StopIteration
[  180s]
[  180s] The above exception was the direct cause of the following exception:
[  180s]
[  180s] Traceback (most recent call last):
[  180s]   File "/usr/lib/python3.9/site-packages/nose/case.py", line 197, in runTest
[  180s]     self.test(*self.arg)
[  180s]   File "<string>", line 1, in test_5
[  180s]   File "<string>", line 1, in <lambda>
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/test_combi/test_extensive.py", line 223, in _check_variation_selection
[  180s]     assert perm_space.index(perm_space[-1]) == perm_space.length - 1
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 683, in __getitem__
[  180s]     tuple(
[  180s] RuntimeError: generator raised StopIteration
[  180s]
[  180s] ======================================================================
[  180s] ERROR: test_combi.test_extensive.test_6
[  180s] ----------------------------------------------------------------------
[  180s] Traceback (most recent call last):
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 686, in <genexpr>
[  180s]     next(free_values_perm_iterator))
[  180s] StopIteration
[  180s]
[  180s] The above exception was the direct cause of the following exception:
[  180s]
[  180s] Traceback (most recent call last):
[  180s]   File "/usr/lib/python3.9/site-packages/nose/case.py", line 197, in runTest
[  180s]     self.test(*self.arg)
[  180s]   File "<string>", line 1, in test_6
[  180s]   File "<string>", line 1, in <lambda>
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/test_combi/test_extensive.py", line 223, in _check_variation_selection
[  180s]     assert perm_space.index(perm_space[-1]) == perm_space.length - 1
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 554, in __getitem__
[  180s]     return self.unsliced[i + self.canonical_slice.start]
[  180s]   File "/home/abuild/rpmbuild/BUILDROOT/python-combi-1.1.2-0.x86_64/usr/lib/python3.9/site-packages/combi/perming/perm_space.py", line 683, in __getitem__
[  180s]     tuple(
[  180s] RuntimeError: generator raised StopIteration
[  180s]
[  180s] ----------------------------------------------------------------------
[  180s] Ran 34 tests in 57.982s
[  180s]
[  180s] FAILED (errors=4)

This is the complete build log with all versions of packages used and steps taken.
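For context, a sketch of the general PEP 479 pattern, not a tested patch against combi: since Python 3.7 a StopIteration escaping a generator (or a generator expression, as in perm_space.__getitem__ above) is converted into RuntimeError, so the next(free_values_perm_iterator) call in the tracebacks has to signal exhaustion differently, for example:

def yield_until_exhausted(free_values_perm_iterator):
    # PEP 479-safe pattern: handle exhaustion of the inner iterator explicitly instead
    # of letting StopIteration escape, which newer Pythons turn into RuntimeError.
    while True:
        try:
            yield next(free_values_perm_iterator)
        except StopIteration:
            return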

created time in 12 days

issue commentpyinstaller/pyinstaller

Help Taming RecursionError

@mcepl what version of PyInstaller is that? That test was removed in 53949b0, which should be in 4.5 or 4.5.1.

The log.txt you attached seems to suggest you are using PyInstaller 3.6?

Not my best moment; of course, when I upgraded to 4.5.1, everything works correctly. Thank you.

htgoebel

comment created time in 12 days

issue commentpyinstaller/pyinstaller

Help Taming RecursionError

I am getting it when running the test suite while packaging PyInstaller for openSUSE. My situation is slightly different: we don’t use venv (the same tools are used for building all packages, even non-Python ones), but the environment where the build happens is completely isolated by chroot, so only what’s specifically installed is there. log.txt is the complete log of all installed packages with their versions and all steps taken. The error is this:

[  113s] =================================== FAILURES ===================================
[  113s] __________________________ TestDeeplyNested.testRegr ___________________________
[  113s] [gw1] linux -- Python 3.9.7 /usr/bin/python3.9
[  113s]
[  113s] self = <test_modulegraph.test_imports.TestDeeplyNested testMethod=testRegr>
[  113s]
[  113s]     def setUp(self):
[  113s]         root = os.path.join(
[  113s]                 os.path.dirname(os.path.abspath(__file__)),
[  113s]                 'testpkg-regr6')
[  113s]         self.mf = modulegraph.ModuleGraph(path=[ root ] + sys.path)
[  113s] >       self.mf.run_script(os.path.join(root, 'script.py'))
[  113s]
[  113s] tests/unit/test_modulegraph/test_imports.py:468:
[  113s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[  113s] PyInstaller/lib/modulegraph/modulegraph.py:1408: in run_script
[  113s]     self._scan_code(m, co, co_ast)
[  113s] PyInstaller/lib/modulegraph/modulegraph.py:2643: in _scan_code
[  113s]     self._process_imports(module)
[  113s] PyInstaller/lib/modulegraph/modulegraph.py:2842: in _process_imports
[  113s]     target_module = self._safe_import_hook(*import_info, **kwargs)[0]
[  113s] PyInstaller/lib/modulegraph/modulegraph.py:2296: in _safe_import_hook
[  113s]     target_modules = self.import_hook(
[  113s] PyInstaller/lib/modulegraph/modulegraph.py:1480: in import_hook
[  113s]     target_package, target_module_partname = self._find_head_package(
[  113s] PyInstaller/lib/modulegraph/modulegraph.py:1637: in _find_head_package
[  113s]     target_package = self._safe_import_module(
[  113s] PyInstaller/lib/modulegraph/modulegraph.py:2054: in _safe_import_module
[  113s]     module = self._load_module(
[  113s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[  113s]
[  113s] self = <ModuleGraph>, fqname = 'module'
[  113s] fp = <_io.StringIO object at 0x7fcd843269d0>
[  113s] pathname = '/home/abuild/rpmbuild/BUILD/PyInstaller-3.6/tests/unit/test_modulegraph/testpkg-regr6/module.py'
[  113s] info = ('.py', 'r', 1)
[  113s]
[  113s]     def _load_module(self, fqname, fp, pathname, info):
[  113s]         suffix, mode, typ = info
[  113s]         self.msgin(2, "load_module", fqname, fp and "fp", pathname)
[  113s]
[  113s]         if typ == imp.PKG_DIRECTORY:
[  113s]             if isinstance(mode, (list, tuple)):
[  113s]                 packagepath = mode
[  113s]             else:
[  113s]                 packagepath = []
[  113s]
[  113s]             m = self._load_package(fqname, pathname, packagepath)
[  113s]             self.msgout(2, "load_module ->", m)
[  113s]             return m
[  113s]
[  113s]         if typ == imp.PY_SOURCE:
[  113s]             contents = fp.read()
[  113s]             if isinstance(contents, bytes):
[  113s]                 contents += b'\n'
[  113s]             else:
[  113s]                 contents += '\n'
[  113s]
[  113s]             try:
[  113s]                 co = compile(contents, pathname, 'exec', ast.PyCF_ONLY_AST, True)
[  113s]                 if sys.version_info[:2] == (3, 5):
[  113s]                     # In Python 3.5 some syntax problems with async
[  113s]                     # functions are only reported when compiling to bytecode
[  113s]                     compile(co, '-', 'exec', 0, True)
[  113s]             except SyntaxError:
[  113s]                 co = None
[  113s]                 cls = InvalidSourceModule
[  113s]                 self.msg(2, "load_module: InvalidSourceModule", pathname)
[  113s]
[  113s]             else:
[  113s]                 cls = SourceModule
[  113s]
[  113s]         elif typ == imp.PY_COMPILED:
[  113s]             data = fp.read(4)
[  113s]             magic = imp.get_magic()
[  113s]             if data != magic:
[  113s]                 self.msg(2, "load_module: InvalidCompiledModule, "
[  113s]                          "bad magic number", pathname, data, magic)
[  113s]                 co = None
[  113s]                 cls = InvalidCompiledModule
[  113s]             else:
[  113s]                 if sys.version_info >= (3, 7):
[  113s]                     fp.read(12)
[  113s]                 elif sys.version_info >= (3, 4):
[  113s]                     fp.read(8)
[  113s]                 else:
[  113s]                     fp.read(4)
[  113s]                 try:
[  113s]                     co = marshal.loads(fp.read())
[  113s]                     cls = CompiledModule
[  113s]                 except Exception as exc:
[  113s]                     self.msg(2, "load_module: InvalidCompiledModule, "
[  113s]                              "Cannot load code", pathname, exc)
[  113s]                     co = None
[  113s]                     cls = InvalidCompiledModule
[  113s]
[  113s]         elif typ == imp.C_BUILTIN:
[  113s]             cls = BuiltinModule
[  113s]             co = None
[  113s]
[  113s]         else:
[  113s]             cls = Extension
[  113s]             co = None
[  113s]
[  113s]         m = self.createNode(cls, fqname)
[  113s]         m.filename = pathname
[  113s]         if co is not None:
[  113s]             try:
[  113s]                 if isinstance(co, ast.AST):
[  113s]                     co_ast = co
[  113s] >                   co = compile(co_ast, pathname, 'exec', 0, True)
[  113s] E                   RecursionError: maximum recursion depth exceeded while traversing 'expr' node
[  113s]
[  113s] PyInstaller/lib/modulegraph/modulegraph.py:2156: RecursionError
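(For anyone hitting the same error on an older PyInstaller: the usual generic workaround is to raise the interpreter recursion limit before the deeply nested module is compiled; this is an illustration only, and the actual resolution in this case was upgrading to 4.5.1, as noted above.)

import sys

# Generic workaround sketch, not the resolution here: deeply nested expressions make
# the AST-to-bytecode compile() step recurse once per nesting level, so a higher limit
# avoids the RecursionError shown above.
sys.setrecursionlimit(sys.getrecursionlimit() * 5)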
htgoebel

comment created time in 12 days

issue commentPr0Ger/PyAPNs2

hyper dependency is no longer maintained

FYI, found 2 forks where hyper was replaced with httpx: https://github.com/pocketlabs/PyAPNs2

https://github.com/pocketlabs/PyAPNs2/commit/f1220b8e0b87caf08f3f3d6d3ab9146e75c90543 doesn’t let PyAPNs2 pass its test suite.

https://github.com/karlwnw/PyAPNs2

kevindice

comment created time in 14 days

issue commentjazzband/django-push-notifications

ERROR: django-push-notifications 1.6.1 has requirement apns2<0.6.0,>=0.3.0, but you'll have apns2 0.7.1 which is incompatible.

apns2 is problematic as a whole, because it depends on hyper, which is dead: https://github.com/Pr0Ger/PyAPNs2/issues/126 (and, for example, https://github.com/zulip/zulip/pull/18733).

rocchidavide

comment created time in 14 days