Fantix King (fantix) | @uc-cdis | Chicago, IL | http://about.me/fantix
I code in Python and more. @decentfox: Outsourcing, Consulting, Startup

fantix/aiocontextvars 47

Asyncio support for PEP-567 contextvars backport.

fantix/aintq 39

AintQ Is Not Task Queue - a Python asyncio task queue on PostgreSQL.

fantix/ArchLinux-x32 6

This repo has been migrated to the repos under https://github.com/archlinux-x32

fantix/ArchRepo 3

Powers the Arch Linux unofficial user repositories

d3b-center/d3b-lib-pfb-exporter 1

🏭Transform and export data from a relational database to a PFB (Portable Format for Bioinformatics) Avro file

fantix/authlib-gino 1

OpenID Connect provider implemented with Authlib and GINO.

fantix/aiohttp 0

HTTP client/server framework for asyncio

fantix/aiomysql 0

aiomysql is a library for accessing a MySQL database from asyncio.

delete branch fantix/aiocontextvars

delete branch : pyup-update-sphinx-2.0.1-to-3.1.2

delete time in 2 days

push event fantix/aiocontextvars

pyup-bot

commit sha 0646a771af5cb1e1bdb19f89bedb8214af7c0cb6

Update sphinx from 2.0.1 to 3.2.0


push time in 2 days

create branch fantix/aiocontextvars

branch : pyup-update-sphinx-2.0.1-to-3.2.0

created branch time in 2 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-tox-3.8.6-to-3.18.1

delete time in 4 days

push event fantix/aiocontextvars

pyup-bot

commit sha 381b696b7ec3f9faf883a65306aa239779414528

Update tox from 3.8.6 to 3.19.0


push time in 4 days

create branch fantix/aiocontextvars

branch : pyup-update-tox-3.8.6-to-3.19.0

created branch time in 4 days

push event uc-cdis/docker-firefox

Fantix King

commit sha fb34b78a1c0b23979961fdd86f1594d70ea84d36

allow customize landing URL


push time in 5 days

push event uc-cdis/docker-firefox

Fantix King

commit sha 7bd31ffbe43fac07fad727b034f7c0b6782a5f89

update Firefox version


push time in 5 days

push event uc-cdis/docker-firefox

Fantix King

commit sha 2b7317ba7be536e282c96fb1a381dc364a0250b9

pin to gen3.org for demo


push time in 6 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-pip-19.0.3-to-20.2

delete time in 6 days

push event fantix/aiocontextvars

pyup-bot

commit sha 61d79a00f6fa1d80d9ca585dec3779a76c6dee20

Update pip from 19.0.3 to 20.2.1


push time in 6 days

create branch fantix/aiocontextvars

branch : pyup-update-pip-19.0.3-to-20.2.1

created branch time in 6 days

create branch fantix/aiocontextvars

branch : pyup-update-pytest-4.4.0-to-6.0.1

created branch time in 11 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-pytest-4.4.0-to-6.0.0

delete time in 11 days

push event fantix/aiocontextvars

pyup-bot

commit sha db07d4a6500015e5f6602ff24eade718dfe11a46

Update pytest from 4.4.0 to 6.0.1


push time in 11 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-pytest-4.4.0-to-5.4.3

delete time in 12 days

push event fantix/aiocontextvars

pyup-bot

commit sha c0e4bda60480fbfa8966fbfe7e12b3d2c6c36524

Update pytest from 4.4.0 to 6.0.0


push time in 12 days

create branch fantix/aiocontextvars

branch : pyup-update-pytest-4.4.0-to-6.0.0

created branch time in 12 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-pip-19.0.3-to-20.1.1

delete time in 12 days

push event fantix/aiocontextvars

pyup-bot

commit sha 3a0f8ea38ba6cdb08a4dc65f768c9ff49f2c49b1

Update pip from 19.0.3 to 20.2


push time in 12 days

create branch fantix/aiocontextvars

branch : pyup-update-pip-19.0.3-to-20.2

created branch time in 12 days

create branch fantix/aiocontextvars

branch : pyup-update-tox-3.8.6-to-3.18.1

created branch time in 13 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-tox-3.8.6-to-3.18.0

delete time in 13 days

push event fantix/aiocontextvars

pyup-bot

commit sha 4809085177a86eb964b68d06222deabf9424778a

Update tox from 3.8.6 to 3.18.1


push time in 13 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-coverage-4.5.1-to-5.2

delete time in 17 days

push event fantix/aiocontextvars

pyup-bot

commit sha 3c5e1429cd023ab555be6c34ca4477da0b3cec3a

Update coverage from 4.5.1 to 5.2.1


push time in 17 days

create branch fantix/aiocontextvars

branch : pyup-update-coverage-4.5.1-to-5.2.1

created branch time in 17 days

pull request comment uc-cdis/cdis_oauth2client

pre commit hooks

Right, the PyPI server password is encrypted in Travis's recommended way, so I think that's okay.

jiaqi216

comment created time in 17 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-tox-3.8.6-to-3.17.1

delete time in 18 days

push event fantix/aiocontextvars

pyup-bot

commit sha 7eecb28101760901eee4945d27c4ff5b935a7696

Update tox from 3.8.6 to 3.18.0


push time in 18 days

create branch fantix/aiocontextvars

branch : pyup-update-tox-3.8.6-to-3.18.0

created branch time in 18 days

Pull request review comment python/cpython

bpo-41317: Remove reader on cancellation in asyncio.loop.sock_accept()

```diff
     async def sock_accept(self, sock):
         if self._debug and sock.gettimeout() != 0:
             raise ValueError("the socket must be non-blocking")
         fut = self.create_future()
-        self._sock_accept(fut, False, sock)
+        self._sock_accept(fut, sock)
         return await fut

-    def _sock_accept(self, fut, registered, sock):
+    def _sock_accept(self, fut, sock):
         fd = sock.fileno()
-        if registered:
-            self.remove_reader(fd)
-        if fut.done():
-            return
         try:
             conn, address = sock.accept()
             conn.setblocking(False)
         except (BlockingIOError, InterruptedError):
-            self.add_reader(fd, self._sock_accept, fut, True, sock)
+            self._ensure_fd_no_transport(fd)
+            handle = self.add_reader(fd, self._sock_accept, fut, sock)
```

should be self._add_reader
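The patch under review touches private loop internals, but the cancellation concern behind bpo-41317 can be sketched with the public `add_reader`/`remove_reader` API. This is an approximation of the idea, not the CPython code: a done-callback unregisters the reader when the accept future is cancelled, so the fd is not left registered on the selector.

```python
import asyncio


async def sock_accept_sketch(loop, sock):
    """Hedged sketch of the bpo-41317 idea using only public loop APIs.

    A reader callback completes a future when the listening socket is
    ready; a done-callback removes the reader if the future is cancelled.
    """
    fut = loop.create_future()
    fd = sock.fileno()

    def on_ready():
        loop.remove_reader(fd)
        try:
            conn, address = sock.accept()
        except (BlockingIOError, InterruptedError):
            loop.add_reader(fd, on_ready)  # spurious wakeup: keep waiting
            return
        conn.setblocking(False)
        if not fut.done():
            fut.set_result((conn, address))
        else:
            conn.close()  # future was cancelled concurrently

    def on_done(f):
        # The fix in the PR, expressed with public APIs: drop the reader
        # when the waiting future is cancelled.
        if f.cancelled():
            loop.remove_reader(fd)

    loop.add_reader(fd, on_ready)
    fut.add_done_callback(on_done)
    return await fut
```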

agronholm

comment created time in 18 days

push event fantix/aiocontextvars

pyup-bot

commit sha 08891e53f766e06c1430efc1c2d923b321c7414f

Update cryptography from 2.6.1 to 3.0


push time in 20 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-cryptography-2.6.1-to-2.9.2

delete time in 20 days

create branch fantix/aiocontextvars

branch : pyup-update-cryptography-2.6.1-to-3.0

created branch time in 20 days

push event decentfox/gapp-login

Fantix King

commit sha 0493ab6cac8ad53e435377fedb8932a04616140b

refactor SMS provider


push time in 22 days

started USCreditCardGuide/airlines-to-china-covid-19

started time in 23 days

Pull request review comment uc-cdis/metadata-service

(PXP-5524): feat/objects

```diff
+from .main import get_app
+from . import logger
+
+app = get_app()
+
+
+@app.on_event("shutdown")
+def shutdown_event():
+    logger.info("Closing async client.")
+    global app
+    app.async_client.close()
```

could we pls put this into get_app()?

Avantol13

comment created time in 25 days

Pull request review comment uc-cdis/metadata-service

(PXP-5524): feat/objects

```diff
+from collections import Iterable
+from enum import Enum
+
+from authutils.token.fastapi import access_token
+from asyncpg import UniqueViolationError
+from fastapi import Depends, HTTPException, APIRouter, Security
+from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
+from gen3authz.client.arborist.async_client import ArboristClient
+import httpx
+from starlette.requests import Request
+from starlette.responses import JSONResponse
+from starlette.status import (
+    HTTP_201_CREATED,
+    HTTP_409_CONFLICT,
+    HTTP_400_BAD_REQUEST,
+    HTTP_401_UNAUTHORIZED,
+    HTTP_403_FORBIDDEN,
+    HTTP_500_INTERNAL_SERVER_ERROR,
+)
+from pydantic import BaseModel
+
+from . import config, logger
+from .models import Metadata
+
+mod = APIRouter()
+arborist = ArboristClient()
+
+# auto_error=False prevents FastAPI from raises a 403 when the request is missing
+# an Authorization header. Instead, we want to return a 401 to signify that we did
+# not recieve valid credentials
+bearer = HTTPBearer(auto_error=False)
+
+
+class FileUploadStatus(str, Enum):
+    NOT_STARTED = "not_uploaded"
+    DONE = "uploaded"
+    ERROR = "error"
+
+
+class CreateObjInput(BaseModel):
+    """
+    Create object.
+
+    file_name (str): Name for the file being uploaded
+    authz (dict): authorization block with requirements for what's being uploaded
+    aliases (list, optional): unique name to allow using in place of whatever GUID gets
+        created for this upload
+    metadata (dict, optional): any additional metadata to attach to the upload
+    """
+
+    file_name: str
+    authz: dict
+    aliases: list = None
+    metadata: dict = None
+
+
+@mod.post("/objects")
+async def create_object(
+    body: CreateObjInput,
+    request: Request,
+    token: HTTPAuthorizationCredentials = Security(bearer),
+):
+    """
+    Create object placeholder and attach metadata, return Upload url to the user.
+
+    Args:
+        body (CreateObjInput): input body for create object
+    """
+    try:
+        # NOTE: token can be None if no Authorization header was provided, we expect
+        #       this to cause a downstream exception since it is invalid
+        token_claims = await access_token("user", "openid", purpose="access")(token)
+    except Exception as exc:
+        logger.error(exc, exc_info=True)
+        raise HTTPException(
+            HTTP_401_UNAUTHORIZED,
+            f"Could not verify, parse, and/or validate scope from provided access token.",
+        )
+
+    file_name = body.dict().get("file_name")
+    authz = body.dict().get("authz")
+    aliases = body.dict().get("aliases") or []
+    metadata = body.dict().get("metadata")
+    logger.debug(f"validating authz block input: {authz}")
+
+    if not _is_authz_version_supported(authz):
+        raise HTTPException(HTTP_400_BAD_REQUEST, f"Unsupported authz version: {authz}")
+
+    logger.debug(f"validated authz.resource_paths: {authz.get('resource_paths')}")
+
+    if not isinstance(authz.get("resource_paths"), Iterable):
+        raise HTTPException(
+            HTTP_400_BAD_REQUEST,
+            f"Invalid authz.resource_paths, must be valid list of resources, got: {authz.get('resource_paths')}",
+        )
+
+    metadata = metadata or {}
+
+    # get user id from token claims
+    uploader = token_claims.get("sub")
+    auth_header = str(request.headers.get("Authorization", ""))
+
+    blank_guid, signed_upload_url = await _create_blank_record_and_url(
+        file_name, authz, auth_header
+    )
+
+    if aliases:
+        await _create_aliases_for_record(aliases, blank_guid, auth_header)
+
+    await _add_metadata(blank_guid, metadata, authz, uploader)
+
+    response = {
+        "guid": blank_guid,
+        "aliases": aliases,
+        "metadata": metadata,
+        "upload_url": signed_upload_url,
+    }
+
+    return JSONResponse(response, HTTP_201_CREATED)
+
+
+async def _create_aliases_for_record(aliases: list, blank_guid: str, auth_header: str):
+    aliases_data = {"aliases": [{"value": alias} for alias in aliases]}
+    logger.debug(f"trying to create aliases: {aliases_data}")
+    try:
+        async with httpx.AsyncClient() as client:
```

Right, I think it is okay to do that. To be clean, we may want to call close() on application shutdown.
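A minimal sketch of that cleanup pattern, using stand-in classes rather than FastAPI or httpx (all names here are illustrative, not the service's actual API): the app factory registers a shutdown hook that closes the shared client it created.

```python
class FakeClient:
    """Stand-in for an HTTP client that must be closed on shutdown."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


class App:
    """Minimal app object holding a shared client and shutdown hooks."""

    def __init__(self):
        self.async_client = FakeClient()
        self._shutdown_hooks = []

    def on_shutdown(self, func):
        self._shutdown_hooks.append(func)
        return func

    def shutdown(self):
        for hook in self._shutdown_hooks:
            hook()


def get_app():
    """Build the app and register the cleanup inside the factory,
    as the review comment suggests, so callers get it for free."""
    app = App()

    @app.on_shutdown
    def _close_client():
        app.async_client.close()

    return app
```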

Avantol13

comment created time in 25 days

Pull request review comment decentfox/gapp-login

Add SMS provider

```diff
  config: Config = load_entry_point("config", Config)

+
+def get_sms_provider(settings: str):
+    """Configs used to create a SMS provider instance. Sample:
+    {
+        "params": {
+            "secret_id": "xxx",
+            "secret_key": "xxx",
+            "sms_app_id": "234123",
+            "sms_template_id": "123456"
+            "sms_sign": "HiDay"
+        },
+        "type": "gapp_login.sms.provider.Tencent"
```

This is a good idea - would be better if we could make SMS providers plugins - the Tencent one can be built-in, but would be nice if we could support arbitrary providers from 3rd-party libraries.
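The plugin idea can be sketched with a dotted-path loader. `get_sms_provider` and the `"type"`/`"params"` keys come from the diff above; `load_class` is a hypothetical helper, and any importable third-party class would work as a provider.

```python
import importlib


def load_class(dotted_path: str):
    """Resolve a dotted path like "gapp_login.sms.provider.Tencent" to a class.

    Splitting on the last dot gives the module path and the attribute name,
    so third-party providers only need to be importable to be usable.
    """
    module_path, _, class_name = dotted_path.rpartition(".")
    module = importlib.import_module(module_path)
    return getattr(module, class_name)


def get_sms_provider(settings: dict):
    """Instantiate the provider named by settings["type"] with settings["params"]."""
    provider_cls = load_class(settings["type"])
    return provider_cls(**settings.get("params", {}))
```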

KyleXie

comment created time in 25 days

Pull request review comment decentfox/gapp-login

Add SMS provider

```diff
  config: Config = load_entry_point("config", Config)

+
+def get_sms_provider(settings: str):
+    """Configs used to create a SMS provider instance. Sample:
+    {
+        "params": {
+            "secret_id": "xxx",
+            "secret_key": "xxx",
+            "sms_app_id": "234123",
+            "sms_template_id": "123456"
+            "sms_sign": "HiDay"
```

I'd suggest making these config values separate ones in a separate config file for "Tencent SMS" only.

KyleXie

comment created time in 25 days

delete branch fantix/aiocontextvars

delete branch : pyup-update-tox-3.8.6-to-3.17.0

delete time in 25 days

push event fantix/aiocontextvars

pyup-bot

commit sha 5bccedad69739095bce13b1b8193b8c25c98b275

Update tox from 3.8.6 to 3.17.1


push time in 25 days

create branch fantix/aiocontextvars

branch : pyup-update-tox-3.8.6-to-3.17.1

created branch time in 25 days

Pull request review comment uc-cdis/metadata-service

(PXP-5524): feat/objects

```diff
+from collections import Iterable
+from enum import Enum
+
+from authutils.token.fastapi import access_token
+from asyncpg import UniqueViolationError
+from fastapi import Depends, HTTPException, APIRouter, Security
+from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
+from gen3authz.client.arborist.async_client import ArboristClient
+import httpx
+from starlette.requests import Request
+from starlette.responses import JSONResponse
+from starlette.status import (
+    HTTP_201_CREATED,
+    HTTP_409_CONFLICT,
+    HTTP_400_BAD_REQUEST,
+    HTTP_401_UNAUTHORIZED,
+    HTTP_403_FORBIDDEN,
+    HTTP_500_INTERNAL_SERVER_ERROR,
+)
+from pydantic import BaseModel
+
+from . import config, logger
+from .models import Metadata
+
+mod = APIRouter()
+arborist = ArboristClient()
```

(trivial) unused variable arborist

Avantol13

comment created time in a month

Pull request review comment uc-cdis/metadata-service

(PXP-5524): feat/objects

```diff
+from collections import Iterable
+from enum import Enum
+
+from authutils.token.fastapi import access_token
+from asyncpg import UniqueViolationError
+from fastapi import Depends, HTTPException, APIRouter, Security
+from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
+from gen3authz.client.arborist.async_client import ArboristClient
+import httpx
+from starlette.requests import Request
+from starlette.responses import JSONResponse
+from starlette.status import (
+    HTTP_201_CREATED,
+    HTTP_409_CONFLICT,
+    HTTP_400_BAD_REQUEST,
+    HTTP_401_UNAUTHORIZED,
+    HTTP_403_FORBIDDEN,
+    HTTP_500_INTERNAL_SERVER_ERROR,
+)
+from pydantic import BaseModel
+
+from . import config, logger
+from .models import Metadata
+
+mod = APIRouter()
+arborist = ArboristClient()
+
+# auto_error=False prevents FastAPI from raises a 403 when the request is missing
+# an Authorization header. Instead, we want to return a 401 to signify that we did
+# not recieve valid credentials
+bearer = HTTPBearer(auto_error=False)
+
+
+class FileUploadStatus(str, Enum):
+    NOT_STARTED = "not_uploaded"
+    DONE = "uploaded"
+    ERROR = "error"
+
+
+class CreateObjInput(BaseModel):
+    """
+    Create object.
+
+    file_name (str): Name for the file being uploaded
+    authz (dict): authorization block with requirements for what's being uploaded
+    aliases (list, optional): unique name to allow using in place of whatever GUID gets
+        created for this upload
+    metadata (dict, optional): any additional metadata to attach to the upload
+    """
+
+    file_name: str
+    authz: dict
+    aliases: list = None
+    metadata: dict = None
+
+
+@mod.post("/objects")
+async def create_object(
+    body: CreateObjInput,
+    request: Request,
+    token: HTTPAuthorizationCredentials = Security(bearer),
+):
+    """
+    Create object placeholder and attach metadata, return Upload url to the user.
+
+    Args:
+        body (CreateObjInput): input body for create object
+    """
+    try:
+        # NOTE: token can be None if no Authorization header was provided, we expect
+        #       this to cause a downstream exception since it is invalid
+        token_claims = await access_token("user", "openid", purpose="access")(token)
+    except Exception as exc:
+        logger.error(exc, exc_info=True)
+        raise HTTPException(
+            HTTP_401_UNAUTHORIZED,
+            f"Could not verify, parse, and/or validate scope from provided access token.",
+        )
+
+    file_name = body.dict().get("file_name")
+    authz = body.dict().get("authz")
+    aliases = body.dict().get("aliases") or []
+    metadata = body.dict().get("metadata")
```

just curious - wouldn't body.file_name work?

Avantol13

comment created time in a month

Pull request review comment uc-cdis/metadata-service

(PXP-5524): feat/objects

```diff
+from collections import Iterable
+from enum import Enum
+
+from authutils.token.fastapi import access_token
+from asyncpg import UniqueViolationError
+from fastapi import Depends, HTTPException, APIRouter, Security
+from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
+from gen3authz.client.arborist.async_client import ArboristClient
+import httpx
+from starlette.requests import Request
+from starlette.responses import JSONResponse
+from starlette.status import (
+    HTTP_201_CREATED,
+    HTTP_409_CONFLICT,
+    HTTP_400_BAD_REQUEST,
+    HTTP_401_UNAUTHORIZED,
+    HTTP_403_FORBIDDEN,
+    HTTP_500_INTERNAL_SERVER_ERROR,
+)
+from pydantic import BaseModel
+
+from . import config, logger
+from .models import Metadata
+
+mod = APIRouter()
+arborist = ArboristClient()
+
+# auto_error=False prevents FastAPI from raises a 403 when the request is missing
+# an Authorization header. Instead, we want to return a 401 to signify that we did
+# not recieve valid credentials
+bearer = HTTPBearer(auto_error=False)
+
+
+class FileUploadStatus(str, Enum):
+    NOT_STARTED = "not_uploaded"
+    DONE = "uploaded"
+    ERROR = "error"
+
+
+class CreateObjInput(BaseModel):
+    """
+    Create object.
+
+    file_name (str): Name for the file being uploaded
+    authz (dict): authorization block with requirements for what's being uploaded
+    aliases (list, optional): unique name to allow using in place of whatever GUID gets
+        created for this upload
+    metadata (dict, optional): any additional metadata to attach to the upload
+    """
+
+    file_name: str
+    authz: dict
+    aliases: list = None
+    metadata: dict = None
+
+
+@mod.post("/objects")
+async def create_object(
+    body: CreateObjInput,
+    request: Request,
+    token: HTTPAuthorizationCredentials = Security(bearer),
+):
+    """
+    Create object placeholder and attach metadata, return Upload url to the user.
+
+    Args:
+        body (CreateObjInput): input body for create object
+    """
+    try:
+        # NOTE: token can be None if no Authorization header was provided, we expect
+        #       this to cause a downstream exception since it is invalid
+        token_claims = await access_token("user", "openid", purpose="access")(token)
+    except Exception as exc:
+        logger.error(exc, exc_info=True)
+        raise HTTPException(
+            HTTP_401_UNAUTHORIZED,
+            f"Could not verify, parse, and/or validate scope from provided access token.",
+        )
+
+    file_name = body.dict().get("file_name")
+    authz = body.dict().get("authz")
+    aliases = body.dict().get("aliases") or []
+    metadata = body.dict().get("metadata")
+    logger.debug(f"validating authz block input: {authz}")
```

(not a big deal) https://blog.pilosus.org/posts/2020/01/24/python-f-strings-in-logging/
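The linked post's point can be shown in a few lines: %-style arguments defer formatting until a record is actually emitted (and keep the message template constant for log aggregators), while an f-string always renders. A small self-contained sketch:

```python
import io
import logging

# Route records to a buffer so the difference is observable.
buffer = io.StringIO()
logger = logging.getLogger("mds.sketch")
logger.addHandler(logging.StreamHandler(buffer))
logger.setLevel(logging.INFO)

authz = {"version": 0}

# f-string: the message is rendered eagerly, even though DEBUG is disabled
# and the record is then discarded.
logger.debug(f"validating authz block input: {authz}")

# %-style: rendering is deferred until a handler actually emits the record,
# so at INFO level nothing is formatted here at all.
logger.debug("validating authz block input: %s", authz)

# At an enabled level both styles produce the same text.
logger.info("validating authz block input: %s", authz)
```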

Avantol13

comment created time in a month

Pull request review comment uc-cdis/metadata-service

(PXP-5524): feat/objects

```diff
+from collections import Iterable
+from enum import Enum
+
+from authutils.token.fastapi import access_token
+from asyncpg import UniqueViolationError
+from fastapi import Depends, HTTPException, APIRouter, Security
+from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
+from gen3authz.client.arborist.async_client import ArboristClient
+import httpx
+from starlette.requests import Request
+from starlette.responses import JSONResponse
+from starlette.status import (
+    HTTP_201_CREATED,
+    HTTP_409_CONFLICT,
+    HTTP_400_BAD_REQUEST,
+    HTTP_401_UNAUTHORIZED,
+    HTTP_403_FORBIDDEN,
+    HTTP_500_INTERNAL_SERVER_ERROR,
+)
+from pydantic import BaseModel
+
+from . import config, logger
+from .models import Metadata
+
+mod = APIRouter()
+arborist = ArboristClient()
+
+# auto_error=False prevents FastAPI from raises a 403 when the request is missing
+# an Authorization header. Instead, we want to return a 401 to signify that we did
+# not recieve valid credentials
+bearer = HTTPBearer(auto_error=False)
+
+
+class FileUploadStatus(str, Enum):
+    NOT_STARTED = "not_uploaded"
+    DONE = "uploaded"
+    ERROR = "error"
+
+
+class CreateObjInput(BaseModel):
+    """
+    Create object.
+
+    file_name (str): Name for the file being uploaded
+    authz (dict): authorization block with requirements for what's being uploaded
+    aliases (list, optional): unique name to allow using in place of whatever GUID gets
+        created for this upload
+    metadata (dict, optional): any additional metadata to attach to the upload
+    """
+
+    file_name: str
+    authz: dict
+    aliases: list = None
+    metadata: dict = None
+
+
+@mod.post("/objects")
+async def create_object(
+    body: CreateObjInput,
+    request: Request,
+    token: HTTPAuthorizationCredentials = Security(bearer),
+):
+    """
+    Create object placeholder and attach metadata, return Upload url to the user.
+
+    Args:
+        body (CreateObjInput): input body for create object
+    """
+    try:
+        # NOTE: token can be None if no Authorization header was provided, we expect
+        #       this to cause a downstream exception since it is invalid
+        token_claims = await access_token("user", "openid", purpose="access")(token)
+    except Exception as exc:
+        logger.error(exc, exc_info=True)
+        raise HTTPException(
+            HTTP_401_UNAUTHORIZED,
+            f"Could not verify, parse, and/or validate scope from provided access token.",
+        )
+
+    file_name = body.dict().get("file_name")
+    authz = body.dict().get("authz")
+    aliases = body.dict().get("aliases") or []
+    metadata = body.dict().get("metadata")
+    logger.debug(f"validating authz block input: {authz}")
+
+    if not _is_authz_version_supported(authz):
+        raise HTTPException(HTTP_400_BAD_REQUEST, f"Unsupported authz version: {authz}")
+
+    logger.debug(f"validated authz.resource_paths: {authz.get('resource_paths')}")
+
+    if not isinstance(authz.get("resource_paths"), Iterable):
+        raise HTTPException(
+            HTTP_400_BAD_REQUEST,
+            f"Invalid authz.resource_paths, must be valid list of resources, got: {authz.get('resource_paths')}",
+        )
+
+    metadata = metadata or {}
+
+    # get user id from token claims
+    uploader = token_claims.get("sub")
+    auth_header = str(request.headers.get("Authorization", ""))
+
+    blank_guid, signed_upload_url = await _create_blank_record_and_url(
+        file_name, authz, auth_header
+    )
+
+    if aliases:
+        await _create_aliases_for_record(aliases, blank_guid, auth_header)
+
+    await _add_metadata(blank_guid, metadata, authz, uploader)
+
+    response = {
+        "guid": blank_guid,
+        "aliases": aliases,
+        "metadata": metadata,
+        "upload_url": signed_upload_url,
+    }
+
+    return JSONResponse(response, HTTP_201_CREATED)
+
+
+async def _create_aliases_for_record(aliases: list, blank_guid: str, auth_header: str):
+    aliases_data = {"aliases": [{"value": alias} for alias in aliases]}
+    logger.debug(f"trying to create aliases: {aliases_data}")
+    try:
+        async with httpx.AsyncClient() as client:
```

sorry for this very late decision - we might want to share a single HTTP client instance across the application: https://www.python-httpx.org/advanced/#why-use-a-client

Avantol13

comment created time in a month

delete branch fantix/aiocontextvars

delete branch : pyup-update-tox-3.8.6-to-3.16.1

delete time in a month

push event fantix/aiocontextvars

pyup-bot

commit sha df67a9e2f3538694410c5b0326fc4d6a4f26f196

Update tox from 3.8.6 to 3.17.0


push time in a month

create branch fantix/aiocontextvars

branch : pyup-update-tox-3.8.6-to-3.17.0

created branch time in a month

Pull request review comment uc-cdis/fence

(fix) RAS prefer UserID

```diff
     def get_user_id(self, code):
             return {"error": err_msg}

         username = None
-        if userinfo.get("preferred_username"):
-            username = userinfo["preferred_username"]
-        elif userinfo.get("UserID"):
+        if userinfo.get("UserID"):
             username = userinfo["UserID"]
         elif claims.get("sub"):
```

Yes, it is! Thanks, Binam 👍

BinamB

comment created time in a month

pull request comment python/cpython

New asyncio ssl implementation

Yes, I believe it's fine to move on. The only missing part is

porting a part of functional tests from uvloop

I'll try to help with the porting when I have time.

asvetlov

comment created time in a month

push event uc-cdis/fence

vpsx

commit sha d335c3551d376bfd686308ec64d7342e2507c738

fix(clear-user-visas): clear user visas on login before fetch


Fantix King

commit sha 06081cf619121dcc1842ac877cbfa37f51fe8aaf

Merge pull request #794 from uc-cdis/feat/clear-user-visas fix(clear-user-visas): clear user visas on login before fetch


push time in a month

PR merged uc-cdis/fence

fix(clear-user-visas): clear user visas on login before fetch

New Features

  • Clear user ga4gh visas upon login, before fetching new ones; assumes RAS is sole visa issuer for now.
+7 -0

2 comments

1 changed file

vpsx

pr closed time in a month

Pull request review comment uc-cdis/gen3utils

PXP-6317 Initialize Poetry as dependency management tool

```diff
+[tool.poetry]
+name = "gen3utils"
+version = "0.4.14"
+description = "Gen3 Library Template"
+authors = ["CTDS UChicago <cdis@uchicago.edu>"]
+license = "Apache-2.0"
+readme = "README.md"
+repository = "https://github.com/uc-cdis/gen3utils"
+include = [
+  "CHANGELOG.md",
+  "gen3utils/manifest/validation_config.yaml",
+  "LICENSE",
+  "NOTICE",
+  "README.md",
```

I think LICENSE, README.md and the yaml file in source code are included by default.
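If that's right, the list could shrink to only the files Poetry wouldn't pick up on its own. A sketch of the trimmed fragment, worth verifying against Poetry's packaging behavior before committing:

```toml
include = [
  "CHANGELOG.md",
  "NOTICE",
]
```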

johnfrancismccann

comment created time in a month

Pull request review comment uc-cdis/gen3utils

PXP-6317 Initialize Poetry as dependency management tool

```diff
 dist: xenial
 language: python
 python:
 - 3.6
-cache:
-- pip
-- apt
+env:
+  global:
+  - secure: KY3stltR4M1obId8R1s98gUYdxCqZdfnUMJ+5qRiXIdMy+ZsyHlZ/cVtjpIqPBkmDVXRpC9sp66j7ipNl/WEA1Su5mkQSuec/h2d9aN57MrzXClPs4qSadyCvOfA8wjfKbuw6eTBvX6WF7UA6eC4nv07q+sAp2CuaMRm8fi5jbk9fcF+3QCFmd5ORuBXoVBRnU3lvsf3ih8l0ESxFfgjAfsuDoXYV9/8eyhknn9rpSHuX1lP0Ut/+Cs40bF8M3ujknA8GfdkMF9dRdSOvqyX7Jz4A/AiXKG69ZJXxR1XdZuZmdlMytjBw/XrSjNm/m2/gYca+MAPGA1/FEO20yf7a9VItJVBppO+QJ6QFZ+gB0vz7lRy88grAk1iFwi0+kMmXgcInjlb+iStw+MIbdBTzPW7AFkIEDqyN5XKhJgah8cFvYE0GOP/ROXr3AgBEvT3yj7kO+fHE7bmCkpjEaTIKkosb+/jTO3ty8YWHjT7NVaNFmX5i6gxKa3jbiSLtPkEsByNZOYtuIOWIkvnLOVNiC9Y9E6ZN0F+qiKinuDYqvmxx+HvHhJmjO+/E5DSh0cqgN3OFYUB/1kVEI/B+MaX/KbVdAA8KCdXvaClURcbBSZlD4nqOsGUcDmGbri/lMmG/9ozVrmlwyfZ6cfRapmBRouC6EcQ2FGLqZe4r3LYngc= # pragma: allowlist secret
 install:
-- pip install pipenv
-- pipenv install --dev --deploy
-- pipenv graph
+- 'curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python'
+- source $HOME/.poetry/env
+- which poetry
+- poetry install -vv
 script:
-- pipenv run pytest -vv --cov=gen3utils --cov-report xml ./tests
-after_script:
-- pipenv run python-codacy-coverage -r coverage.xml
+- poetry run pytest -vv ./tests
```

Looks like Codacy is not installed for the uc-cdis org. Let's worry about this later.

johnfrancismccann

comment created time in a month

push event uc-cdis/fence

BinamB

commit sha 0a46282d5030f02162d9d98710a5c3e05ba8521b

(feat) Support RAS as idp


BinamB

commit sha da5b9697f26c9008e8b63c2a42d2eff901356c1a

ras


BinamB

commit sha b76a7599e3e7c0fb6c52f3c19b64f75bd93eecb4

change userdatamodel


BinamB

commit sha 67519ad9ac01c20c63c43b111c32a55f1f4465f6

callback-fix


BinamB

commit sha 37983d330f296de4c5b28f3a547eec990976ab15

add useid logic


BinamB

commit sha c78fdd915d49ced7fe59aa6602fe91285b8a08d2

access_token


BinamB

commit sha 11b7666c26b7d6697a47ec1f17d37346b4ab772f

review changes


BinamB

commit sha 2e4a33e51ed888263789a04c46d26755c570961f

move err_msg


BinamB

commit sha 096aa7434c7fb88c561052d18b288469d7fa20d9

move err catch


BinamB

commit sha bfe40dba576e82e88ecb0d46fdca25bef1ce1707

scopes


BinamB

commit sha 6063f6017f3c77b636731603e27c2c7001609496

pin dependency and what not


BinamB

commit sha 703c69729523a0002a72d41041d6a02bee1bce62

fromat


BinamB

commit sha 998ec1f19c34718eb14596ff37022773c1ab5008

Merge branch 'master' of github.com:uc-cdis/fence into feat/ras/oidc-handshake


BinamB

commit sha 067ae24c77e8c960dd67a4de69b02e244d3936a2

Merge branch 'master' of github.com:uc-cdis/fence into feat/ras/oidc-handshake


Fantix King

commit sha c4e3114d5a2b5d5a6a259ec6a0b217e0c194b726

Merge pull request #787 from uc-cdis/feat/ras/oidc-handshake PXP-6047 Support RAS as idp


push time in a month

PR merged uc-cdis/fence

PXP-6047 Support RAS as idp
test-google-googleDataAccessTest

Description about what this pull request does.

Please make sure to follow the DEV guidelines before asking for review.

New Features

  • Implemented RAS as identity provider.


+147 -5

2 comments

8 changed files

BinamB

pr closed time in a month

delete branch fantix/aiocontextvars

delete branch : pyup-update-coverage-4.5.1-to-5.1

delete time in a month

push event fantix/aiocontextvars

pyup-bot

commit sha 14e0cc7bda144d4770272b7c243b6fbe4a8d6517

Update coverage from 4.5.1 to 5.2


push time in a month

create branch fantix/aiocontextvars

branch : pyup-update-coverage-4.5.1-to-5.2

created branch time in a month

delete branch fantix/aiocontextvars

delete branch : pyup-update-sphinx-2.0.1-to-3.1.1

delete time in a month

push event fantix/aiocontextvars

pyup-bot

commit sha f3be4f7943738c0c641bd2e6dbb1d6c9a657d341

Update sphinx from 2.0.1 to 3.1.2


push time in a month

create branch fantix/aiocontextvars

branch : pyup-update-sphinx-2.0.1-to-3.1.2

created branch time in a month

delete branch fantix/aiocontextvars

delete branch : pyup-update-tox-3.8.6-to-3.16.0

delete time in a month

push event fantix/aiocontextvars

pyup-bot

commit sha 8c4a783fbb5b748019b47a16d3ae2a3a3eb78e60

Update tox from 3.8.6 to 3.16.1


push time in a month

create branch fantix/aiocontextvars

branch : pyup-update-tox-3.8.6-to-3.16.1

created branch time in a month

Pull request review comment uc-cdis/gen3sdk-python

PXP-6220 Add bucket manifest merging

+import os
+import sys
+import glob
+
+import logging
+import csv
+
+from collections import OrderedDict
+
+
+def merge_bucket_manifests(
+    directory=".",
+    manifest_extension="tsv",
+    delimiter="\t",
+    merge_column="md5",
+    output_manifest="merged-bucket-manifest.tsv",
+):
+    """
+    Merge all of the input manifests in the provided directory into a single
+    output manifest. Files contained in the input manifests are merged on the
+    basis of a common hash (i.e. merge_column). The url and authz values for
+    matching input files are concatenated with spaces in the merged output file
+    record.
+
+    Args:
+        directory(str): path of the directory containing the input manifests
+        manifest_extension(str): the extension for the input manifests
+        delimiter(str): the delimiter that should be used for reading the input
+        and writing the output manifests
+        merge_column(str): the common hash used to merge files. it is unique
+        for every file in the output manifest
+        output_manifest(str): the file to write the output manifest to
+
+    Returns:
+        None
+    """
+    all_rows = {}
+    logging.info(f"Iterating over manifests in {directory} directory")
+    for input_manifest in glob.glob(os.path.join(directory, f"*.{manifest_extension}")):
+        with open(input_manifest) as f:
+            logging.info(
+                f"Reading, parsing, and merging files from {input_manifest} manifest"
+            )
+            total_row_count = sum(1 for row in f)
+            f.seek(0)
+
+            reader = csv.reader(f, delimiter=delimiter)
+            headers = OrderedDict([(h, i) for i, h in enumerate(next(reader))])
+            for i, row in enumerate(reader):
+                _merge_row(all_rows, headers, row, merge_column)
+                if i % 1000 == 0 or i + 2 == total_row_count:
+                    _update_progress(i + 2, total_row_count)
+            logging.info("")
+
+    with open(output_manifest, "w") as csvfile:
+        logging.info(f"Writing merged manifest to {output_manifest}")
+        writer = csv.writer(csvfile, delimiter=delimiter)
+        writer.writerow(headers.keys())
+
+        total_row_count = len(all_rows)
+        for i, hashh in enumerate(all_rows):
+            writer.writerow(all_rows[hashh])
+            if i % 1000 == 0 or i + 2 == total_row_count:
+                _update_progress(i + 2, total_row_count)
+        logging.info("")
+
+
+def _merge_row(all_rows, headers, row_to_merge, merge_column):
+    """
+    Update all_rows with row_to_merge.
+
+    Args:
+        all_rows(dict): maps a merge_column value to a single merged row
+        headers(OrderedDict): maps manifest headers to index of each header
+        (e.g. for "url, size, md5, authz", headers would be { "url": 0, "size":
+        1, "md5": 2, "authz": 3 })
+        row_to_merge(list(str)): the row to update all_rows with
+        merge_column(str): the header of the column that is used to merge rows
+        (e.g. "md5")
+
+    Returns:
+        None
+    """
+    hashh = row_to_merge[headers[merge_column]]
+    if hashh in all_rows:
+        size = row_to_merge[headers["size"]]
+        if size != all_rows[hashh][headers["size"]]:
+            raise csv.Error(
+                f"Could not merge file with {merge_column} equal to {hashh} because of size mismatch."
+            )
+
+        url = row_to_merge[headers["url"]]
+        if url not in all_rows[hashh][headers["url"]]:
+            all_rows[hashh][headers["url"]] += f" {url}"
+
+        authz = row_to_merge[headers["authz"]]
+        if authz not in all_rows[hashh][headers["authz"]]:
+            all_rows[hashh][headers["authz"]] += f" {authz}"
+    else:
+        all_rows[hashh] = row_to_merge
+
+
+def _update_progress(rows_read, total_row_count):
+    """
+    Print progress bar and percentage to STDOUT for how much of a file has been
+    read from or written to.
+
+    Args:
+        rows_read(int): the number of rows read so far in file
+        total_row_count(int): the total number of rows in file
+
+    Returns:
+        None
+    """
+    progress = int(rows_read / total_row_count * 100)
+    pound_signs = progress // 10
+    print(f"\r[{'#'*(pound_signs)}{' '*(10-pound_signs)}] {progress}%", end="")
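The merge-on-hash logic in the commit above can be sketched independently of the file I/O. This is an illustrative helper (`merge_rows` is not part of the commit; it works on `csv.DictReader` rows rather than index-based rows):

```python
import csv
import io

def merge_rows(rows, merge_column="md5"):
    """Merge manifest rows that share the same hash, space-concatenating
    url and authz values; raise on a size mismatch."""
    merged = {}
    for row in rows:
        key = row[merge_column]
        if key in merged:
            kept = merged[key]
            if kept["size"] != row["size"]:
                raise ValueError(f"size mismatch for {key}")
            for col in ("url", "authz"):
                # Only append values we haven't already recorded for this hash.
                if row[col] not in kept[col].split(" "):
                    kept[col] += f" {row[col]}"
        else:
            merged[key] = dict(row)
    return list(merged.values())

# Two manifest rows pointing at the same md5 in different buckets:
tsv = "url\tsize\tmd5\tauthz\ns3://a/f\t10\tabc\t/p1\ns3://b/f\t10\tabc\t/p2\n"
rows = list(csv.DictReader(io.StringIO(tsv), delimiter="\t"))
merged = merge_rows(rows)
```

The single merged record ends up with `url` equal to `"s3://a/f s3://b/f"` and `authz` equal to `"/p1 /p2"`, matching the behavior the commit's docstring describes.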

If we reduce the sampling frequency - raising 1000 to 10000, for example - would the impact become 1.5%?
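Whether the overhead drops to exactly 1.5% depends on the per-call cost, but the number of progress updates does scale inversely with the interval. A quick sketch (`update_count` is an illustrative helper, mirroring the `i % interval == 0` check in the commit):

```python
def update_count(total_rows, interval):
    """Number of progress refreshes emitted if we update every `interval`
    rows, plus one final update at the end."""
    return sum(1 for i in range(total_rows) if i % interval == 0) + 1

# Raising the interval from 1000 to 10000 cuts the update count ~10x:
a = update_count(1_000_000, 1000)
b = update_count(1_000_000, 10000)
```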

johnfrancismccann

comment created time in a month

delete branch fantix/aiocontextvars

delete branch : pyup-update-tox-3.8.6-to-3.15.2

delete time in a month

push eventfantix/aiocontextvars

pyup-bot

commit sha ae0ce425810849aa16231919c0c53315ff421e9d

Update tox from 3.8.6 to 3.16.0

view details

push time in a month

create branch fantix/aiocontextvars

branch : pyup-update-tox-3.8.6-to-3.16.0

created branch time in a month

delete branch fantix/aiocontextvars

delete branch : pyup-update-watchdog-0.9.0-to-0.10.2

delete time in 2 months

push eventfantix/aiocontextvars

pyup-bot

commit sha c3e1fed4b4a281d411298405af61d4e56ae3651a

Update watchdog from 0.9.0 to 0.10.3

view details

push time in 2 months

create branch fantix/aiocontextvars

branch : pyup-update-watchdog-0.9.0-to-0.10.3

created branch time in 2 months

issue commentMagicStack/uvloop

test_sockets.py::TestAIOSockets test timing out with 3.8.1

Right - sorry, I think we should skip Python 3.8 entirely, as the fix was only backported to 3.9. I'll create a PR for that.
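The proposed skip could be sketched with stdlib unittest. `should_skip` is an illustrative helper (the real uvloop suite may use its own markers), and the class name is borrowed from the issue title:

```python
import sys
import unittest

def should_skip(version_info):
    """True on any CPython 3.8.x, since the fix only landed in 3.9+."""
    return tuple(version_info[:2]) == (3, 8)

class TestAIOSockets(unittest.TestCase):
    @unittest.skipIf(should_skip(sys.version_info), "fix only backported to 3.9")
    def test_sockets(self):
        pass
```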

doko42

comment created time in 2 months

delete branch fantix/aiocontextvars

delete branch : pyup-update-pytest-asyncio-0.10.0-to-0.12.0

delete time in 2 months

push eventfantix/aiocontextvars

pyup-bot

commit sha e185fd421b1a1ea63a7642a649b736a8af2256e7

Update pytest-asyncio from 0.10.0 to 0.14.0

view details

push time in 2 months

push eventuc-cdis/metadata-service

Fantix King

commit sha 487d695876e695b4abdbea4e88d74d5be4596006

fix failing test

view details

push time in 2 months

Pull request review commentdecentfox/gapp-login

Added update wechat user info endpoint

 async def login_wechat(
     return rv
 
+@router.post("/users/update/wxa")
+async def update_wechat_account_info(
+    encrypted_data: str = Form(...),
+    iv: str = Form(...),
+    user: User = Security(require_user),
+):
+    identity = await WeChatIdentity.get(user.get_identity_id())
+    wechat_session_key = identity.profile.get("wechat_session_key")
+    session_key = base64.b64decode(wechat_session_key)
+    encrypted_data = base64.b64decode(encrypted_data)
+    iv = base64.b64decode(iv)
+    decryptor = Cipher(
+        algorithms.AES(session_key), modes.CBC(iv), backend=default_backend(),
+    ).decryptor()
+    data = decryptor.update(encrypted_data) + decryptor.finalize()
+    unpadder = PKCS7(algorithms.AES.block_size).unpadder()
+    data = unpadder.update(data) + unpadder.finalize()
+    data = json.loads(data)
+    open_id = data.pop("openId")
+    unionid = data.pop("unionId", None)
+    wm = data.pop("watermark")
+    if wm["appid"] not in config.WECHAT_CLIENTS:
+        raise HTTPException(status.HTTP_400_BAD_REQUEST, "Bad wechat AppID")
+
+    if open_id != identity.sub:
+        raise HTTPException(status.HTTP_400_BAD_REQUEST, "Wechat openId mismatched")
+
+    await identity.update(wechat_user_info=data, wechat_unionid=unionid,).apply()
+    return dict(data=data)
+
+
+@router.get("/users/wechat")
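The hunk above decrypts WeChat payloads with AES-CBC and then strips PKCS7 padding via the `cryptography` package. The unpadding step alone can be sketched in pure Python (`pkcs7_unpad` is an illustrative name, not the endpoint's code):

```python
def pkcs7_unpad(data: bytes, block_size: int = 16) -> bytes:
    """Strip PKCS7 padding: the last byte gives the pad length, and
    every pad byte must equal that length."""
    if not data or len(data) % block_size:
        raise ValueError("invalid padded length")
    n = data[-1]
    if not 1 <= n <= block_size or data[-n:] != bytes([n]) * n:
        raise ValueError("invalid padding")
    return data[:-n]

# 11 bytes of plaintext padded up to one 16-byte AES block:
padded = b"hello world" + bytes([5]) * 5
```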

(future) authlib-gino 0.4 would probably want to include this in the user_info endpoint, but we could use this for now

XeniaLu

comment created time in 2 months

Pull request review commentdecentfox/gapp-login

Added update wechat user info endpoint

 async def login_wechat(
     openid = data["openid"]
     unionid = data.get("unionid")
     async with db.transaction() as tx:
-        result = await (
+        user = await (
             WeChatIdentity.outerjoin(User)
             .select()
             .with_for_update(of=WeChatIdentity)
             .where(WeChatIdentity.sub == openid)
             .where(WeChatIdentity.idp == idp)
-            .gino.load((WeChatIdentity, User))
+            .gino.load((User.load(current_identity=WeChatIdentity)))

(trivial) redundant parentheses

XeniaLu

comment created time in 2 months

delete branch fantix/authlib-gino

delete branch : token-identity

delete time in 2 months

push eventpython-gino/authlib-gino

Fantix King

commit sha 547252d2e0c2e2d0d8cdb74e4a039f2c7b29906a

add identity in token

view details

push time in 2 months

PR merged python-gino/authlib-gino

Add identity in token
+105 -33

0 comment

8 changed files

fantix

pr closed time in 2 months

PR opened python-gino/authlib-gino

Add identity in token
+105 -33

0 comment

8 changed files

pr created time in 2 months

push eventfantix/authlib-gino

Fantix King

commit sha 4ff2e998e543d0d2b16aebd100cb8fa98e4be623

add identity in token

view details

push time in 2 months

create branch fantix/authlib-gino

branch : token-identity

created branch time in 2 months

fork fantix/authlib-gino

OpenID Connect provider implemented with Authlib and GINO.

fork in 2 months

Pull request review commentdecentfox/gapp-login

support json response for wxa login

 async def login_wechat(
         rv = await auth.create_authorization_response(request, user, ctx)
         if rv.status_code >= 400:
             tx.raise_rollback()
+        rv_location = rv.headers["location"]
+        if rv_location.startswith("wxa://"):
+            params = parse.parse_qs(rv_location.split("?", 1)[1])
+            return auth.handle_response(200, dict(code=params.get("code")), {})

Besides code, I think state is also needed
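The suggestion above (returning state alongside code) could be sketched like this, using the same `parse.parse_qs` call as the hunk (`extract_auth_params` and the example URL are illustrative):

```python
from urllib import parse

def extract_auth_params(location: str) -> dict:
    """Pull both code and state out of a redirect location's query string.
    parse_qs returns a list per key, matching the reviewed code's usage."""
    query = location.split("?", 1)[1]
    params = parse.parse_qs(query)
    return dict(code=params.get("code"), state=params.get("state"))

rv = extract_auth_params("wxa://callback?code=abc123&state=xyz")
```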

XeniaLu

comment created time in 2 months

Pull request review commentdecentfox/gapp-login

support json response for wxa login

 async def login_wechat(
         rv = await auth.create_authorization_response(request, user, ctx)
         if rv.status_code >= 400:
             tx.raise_rollback()
+        rv_location = rv.headers["location"]
+        if rv_location.startswith("wxa://"):
+            params = parse.parse_qs(rv_location.split("?", 1)[1])
+            return auth.handle_response(200, dict(code=params.get("code")), {})

should be fine to return just a dict instance

XeniaLu

comment created time in 2 months

Pull request review commentdecentfox/gapp-login

Added update wechat user info endpoint

 async def login_wechat(
     return rv
 
+@router.post("/users/update/wxa")
+async def update_wechat_account_info(
+    user: User = Security(require_user),
+    encrypted_data: str = Form(...),
+    iv: str = Form(...),
+    app_id: str = Form(...),
+):
+    encrypted_data = base64.b64decode(encrypted_data)
+    iv = base64.b64decode(iv)
+    identity = await (
+        WeChatIdentity.query.select_from(User.outerjoin(WeChatIdentity))
+        .where(User.id == user.id)
+        .where(WeChatIdentity.wechat_app_id == app_id)
+        .where(db.func.starts_with(WeChatIdentity.idp, "WECHAT"))

I think we could add client_id to the authorization code. I'll add to authlib-gino with a PR.

XeniaLu

comment created time in 2 months

pull request commentpython-gino/gino

[WIP] mysql support

Codacy Here is an overview of what got changed by this pull request:

Issues
======
- Added 3

Complexity increasing per file
==============================
- src/gino/dialects/aiomysql.py  7
- src/gino/dialects/base.py  4
- src/gino/strategies.py  1

Complexity decreasing per file
==============================
+ src/gino/crud.py  -2

Clones added
============
- src/gino/dialects/aiomysql.py  4

See the complete overview on Codacy

wwwjfy

comment created time in 2 months

push eventdecentfox/gapp-login

Xenia Lu

commit sha 8422591f6475116121277766338ed3b242174dc4

added WeChat login framework (#1)

* added WeChat login framework
* fixed wechat login progress
* check unionid to use existing user
* fixed async http client close issue & index issue
* simplified get_wechat_clients cast function

view details

push time in 2 months

PR merged decentfox/gapp-login

added WeChat login framework
+459 -0

0 comment

11 changed files

XeniaLu

pr closed time in 2 months

Pull request review commentdecentfox/gapp-login

added WeChat login framework

+import logging
+from urllib import parse
+
+from authlib_gino.fastapi_session.api import auth, login_context
+from authlib_gino.fastapi_session.models import User
+from fastapi import APIRouter, HTTPException, Form, FastAPI, Depends
+from starlette import status
+from starlette.requests import Request
+
+from . import config
+from .models import db, WeChatIdentity
+from .utils import WeChatAuthError
+
+log = logging.getLogger(__name__)
+router = APIRouter()
+
+
+@router.put("/login/wechat")
+async def login_wechat(
+    request: Request, code: str = Form(...), ctx=Depends(login_context)
+):
+    idp = ctx["idp"]
+    idp_params = dict(parse.parse_qsl(parse.unquote(ctx["idp_params"])))
+    appid = idp_params.get("appid")
+    if not appid or appid not in config.WECHAT_CLIENTS:
+        raise HTTPException(status.HTTP_403_FORBIDDEN, "Wrong wechat AppID")
+
+    async with config.WECHAT_CLIENTS[appid] as client:
+        try:
+            data = await client.request_token(code=code)
+        except WeChatAuthError as e:
+            raise HTTPException(status.HTTP_403_FORBIDDEN, e.error)
+
+        scope = idp_params.get("scope")
+        user_info = {}
+        if scope in {"snsapi_userinfo", "snsapi_login"}:
+            try:
+                user_info = await client.get_user_info(
+                    access_token=data["access_token"], openid=data["openid"]
+                )
+            except WeChatAuthError as e:
+                raise HTTPException(status.HTTP_403_FORBIDDEN, e.error)
+
+    openid = data["openid"]
+    unionid = data.get("unionid")
+    async with db.transaction() as tx:
+        result = await (
+            WeChatIdentity.outerjoin(User)
+            .select()
+            .with_for_update(of=WeChatIdentity)
+            .where(WeChatIdentity.sub == openid)
+            .where(WeChatIdentity.idp == idp)
+            .gino.load((WeChatIdentity, User))
+            .first()
+        )
+        identity_data = dict(
+            wechat_unionid=unionid,
+            wechat_session_key=data.get("session_key"),
+            wechat_refresh_token=data.get("refresh_token"),
+            wechat_user_info=user_info,
+        )
+        if result is None:
+            if unionid:
+                user = await (
+                    User.query.select_from(WeChatIdentity.outerjoin(User))
+                    .where(WeChatIdentity.wechat_unionid == unionid)
+                    .gino.first()
+                )
+                if not user:
+                    user = await User.create(name=user_info.get("nickname") or openid)
+            await WeChatIdentity.create(
+                sub=openid, idp=idp, user_id=user.id, **identity_data,
+            )
+        else:
+            identity, user = result
+            identity_data["wechat_user_info"].update(identity.wechat_user_info or {})

This is wrong - this will overwrite the new user info with old data. If we're now putting the whole user info in a separate field, I think it is okay to just remove this line (letting the new user info overwrite the previous one without merging).
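The direction of `dict.update` is easy to get backwards; a quick demo of the mistake the review points out (the sample keys here are made up):

```python
old = {"nickname": "old-name", "city": "Beijing"}  # stale stored info
new = {"nickname": "new-name"}                     # fresh info from WeChat

# What the reviewed line does: update the NEW payload with the OLD one,
# so stale values win on every conflicting key.
merged_wrong = dict(new)
merged_wrong.update(old)

# The reviewer's suggestion: just keep the fresh payload as-is.
merged_right = dict(new)

# If merging were actually wanted, fresh values should win instead:
merged_fresh_wins = {**old, **new}
```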

XeniaLu

comment created time in 2 months

Pull request review commentdecentfox/gapp-login

added WeChat login framework

+from typing import Optional

file name: maybe clients.py?

XeniaLu

comment created time in 2 months

Pull request review commentdecentfox/gapp-login

added WeChat login framework

+import logging
+from urllib import parse
+
+from authlib_gino.fastapi_session.api import auth, login_context
+from authlib_gino.fastapi_session.models import User
+from fastapi import APIRouter, HTTPException, Form, FastAPI, Depends
+from starlette import status
+from starlette.requests import Request
+
+from . import config
+from .models import db, WeChatIdentity
+from .utils import WeChatAuthError
+
+log = logging.getLogger(__name__)
+router = APIRouter()
+
+
+@router.put("/login/wechat")
+async def login_wechat(
+    request: Request, code: str = Form(...), ctx=Depends(login_context)
+):
+    idp = ctx["idp"]
+    idp_params = dict(parse.parse_qsl(parse.unquote(ctx["idp_params"])))
+    appid = idp_params.get("appid")
+    if not appid or appid not in config.WECHAT_CLIENTS:
+        raise HTTPException(status.HTTP_403_FORBIDDEN, "Wrong wechat AppID")
+
+    async with config.WECHAT_CLIENTS[appid] as client:
+        try:
+            data = await client.request_token(code=code)

(trivial) request_token(code)

XeniaLu

comment created time in 2 months

Pull request review commentdecentfox/gapp-login

added WeChat login framework

+import json
+import typing
+
+from authlib_gino.fastapi_session.gino_app import load_entry_point
+from starlette.config import Config
+
+from . import utils
+
+config: Config = load_entry_point("config", Config)
+
+
+class WeChatClients:
+    """A list of WeChat clients, the value can be a JSON string like this:
+    [
+        {
+            "AppID": "...",
+            "AppSecret": "...",
+            "Type": "gapp_login.wechat.MiniProgramClient"
+        }
+    ]
+    """
+
+    def __init__(self, value: str):
+        clients = json.loads(value)
+        self._clients = clients
+        self._client_mapping = {client["AppID"]: client for client in clients}
+
+    def __len__(self) -> int:
+        return len(self._clients)
+
+    def __getitem__(self, appid: str) -> utils.WeChatBaseClient:
+        client_conf = self._client_mapping.get(appid)
+        client_cls = client_conf["Type"].split(".")[-1]
+        return getattr(utils, client_cls)(appid, client_conf["AppSecret"])
import importlib
module, cls_name = client_conf["Type"].rsplit(".", 1)
client_cls = getattr(importlib.import_module(module), cls_name)

This should be done in __init__ I think, so that the client instances can be shared across the application. That would make this class just a regular dict - maybe we only need a function as the cast here?
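The dotted-path resolution suggested above can be isolated as a small helper (`load_class` is an illustrative name; the demo resolves a stdlib class, not a gapp_login one):

```python
import importlib

def load_class(dotted_path: str):
    """Resolve 'package.module.ClassName' to the class object, as the
    review suggests doing once in __init__ so instances can be shared."""
    module_path, cls_name = dotted_path.rsplit(".", 1)
    return getattr(importlib.import_module(module_path), cls_name)

resolved = load_class("collections.OrderedDict")
```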

XeniaLu

comment created time in 2 months

Pull request review commentdecentfox/gapp-login

added WeChat login framework

+import logging
+from urllib import parse
+
+from authlib_gino.fastapi_session.api import auth, login_context
+from authlib_gino.fastapi_session.models import User
+from fastapi import APIRouter, HTTPException, Form, FastAPI, Depends
+from starlette import status
+from starlette.requests import Request
+
+from . import config
+from .models import db, WeChatIdentity
+from .utils import WeChatAuthError
+
+log = logging.getLogger(__name__)
+router = APIRouter()
+
+
+@router.put("/login/wechat")
+async def login_wechat(
+    request: Request, code: str = Form(...), ctx=Depends(login_context)
+):
+    idp = ctx["idp"]
+    idp_params = dict(parse.parse_qsl(parse.unquote(ctx["idp_params"])))
+    appid = idp_params.get("appid")
+    if not appid or appid not in config.WECHAT_CLIENTS:
+        raise HTTPException(status.HTTP_403_FORBIDDEN, "Wrong wechat AppID")
+
+    async with config.WECHAT_CLIENTS[appid] as client:

In order to properly share the underlying AsyncClient, we only need to aclose() the AsyncClient instances once, right before the application stops, and remove this async with here. (AsyncClient.__aenter__ doesn't do anything, as you noticed too.)
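The "close everything once at shutdown" pattern could look roughly like this. `SharedClient` is a stand-in for an httpx.AsyncClient-like object, and the registry name is made up; this is a sketch of the idea, not the project's code:

```python
import asyncio

class SharedClient:
    """Stand-in for an AsyncClient-like object with an aclose() coroutine."""
    def __init__(self):
        self.closed = False

    async def aclose(self):
        self.closed = True

# One long-lived client per AppID, shared across requests
# instead of being entered/exited per request.
clients = {"appid-1": SharedClient(), "appid-2": SharedClient()}

async def shutdown():
    # Invoked once when the application stops (e.g. from a shutdown hook);
    # closes every shared client exactly once.
    for client in clients.values():
        await client.aclose()

asyncio.run(shutdown())
```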

XeniaLu

comment created time in 2 months
