profile
mike bayer zzzeek Red Hat http://techspot.zzzeek.org SQLAlchemy, Alembic, Mako, dogpile.cache

zzzeek/alembic 408

THIS IS NOT THE OFFICIAL REPO - PLEASE SUBMIT PRs ETC AT: http://github.com/sqlalchemy/alembic

zzzeek/mako 166

Mirror of Mako Templates for Python - github main is now at https://github.com/sqlalchemy/mako

gordthompson/sqlalchemy-access 14

A Microsoft Access dialect for SQLAlchemy.

zzzeek/nova_poc 7

Proof of concept of some ORM "quick win" patterns in OpenStack, as applied to one particular API method in Nova

zzzeek/akiban_python 6

Python DBAPI wrapper for Akiban server

zzzeek/oslo.db 2

OpenStack Common DB Code

zzzeek/findimports 1

Static analysis of Python import statements

sqlalchemy/testgerrit 0

a project used to test our infrastructure, e.g. git pushes, mirroring, CI, etc.

zzzeek/aiogevent 0

fork of https://bitbucket.org/haypo/aiogevent

zzzeek/asyncpg 0

A fast PostgreSQL Database Client Library for Python/asyncio.

issue opened sqlalchemy/sqlalchemy

write log messages to stderr

Hello everyone.

Describe the bug At the moment sqlalchemy prints out messages to stdout by default, e.g. when setting echo=True in create_engine. This is also stated clearly in the docs.

Messages from sqlalchemy are usually useful for diagnostics. According to https://www.gnu.org/software/libc/manual/html_node/Standard-Streams.html it is recommended to use stderr for this purpose. This becomes more important if the program in which sqlalchemy is used writes its output to stdout, as there is then no (easy) way to tell which messages are the output of the program and which are debug messages from sqlalchemy.

Expected behavior Messages from sqlalchemy should be printed to stderr by default, or at least this should be configurable.

Is this the correct line that is responsible for the current behavior?
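As a workaround sketch (not from the issue): instead of relying on echo=True, whose default handler writes to stdout, the "sqlalchemy.engine" logger can be configured directly with a stderr handler. The logger name and INFO level follow SQLAlchemy's documented logging conventions; the handler setup itself is illustrative.

```python
import logging
import sys

# Attach an explicit stderr handler to the "sqlalchemy.engine" logger,
# instead of letting echo=True install its stdout handler.
handler = logging.StreamHandler(sys.stderr)
handler.setFormatter(logging.Formatter("%(asctime)s %(name)s %(message)s"))

engine_logger = logging.getLogger("sqlalchemy.engine")
engine_logger.addHandler(handler)
engine_logger.setLevel(logging.INFO)  # INFO logs SQL; DEBUG also logs result rows
```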

fin swimmer

created time in a few seconds

push event zzzeek/alembic

CaselIT

commit sha e43d175233f206336c0637488e9ed231f421539a

Fix workflow test window fail

Need to disable asyncio before calling into pytest session start or Windows will fail randomly.

Closes: #783
Pull-request: https://github.com/sqlalchemy/alembic/pull/783
Pull-request-sha: c92f7a2c0460899c942fcc668cbffe672b59df01
Change-Id: I9026334db651aa977fbc809494f449e38ca16a6f

view details

push time in 13 minutes

PR closed sqlalchemy/alembic

Reviewers
Fix workflow test window fail

Need to disable asyncio before calling into pytest session start or Windows will fail randomly.

Successful run on windows https://github.com/CaselIT/alembic/actions/runs/496948121

+9 -1

4 comments

2 changed files

CaselIT

pr closed time in 13 minutes

push event sqlalchemy/alembic

CaselIT

commit sha e43d175233f206336c0637488e9ed231f421539a

Fix workflow test window fail

Need to disable asyncio before calling into pytest session start or Windows will fail randomly.

Closes: #783
Pull-request: https://github.com/sqlalchemy/alembic/pull/783
Pull-request-sha: c92f7a2c0460899c942fcc668cbffe672b59df01
Change-Id: I9026334db651aa977fbc809494f449e38ca16a6f

view details

push time in 13 minutes

pull request comment sqlalchemy/alembic

Fix workflow test window fail

Gerrit review https://gerrit.sqlalchemy.org/c/sqlalchemy/alembic/+/2491 has been merged. Congratulations! :)

CaselIT

comment created time in 14 minutes

pull request comment sqlalchemy/alembic

Fix workflow test window fail

mike bayer (zzzeek) wrote:

Code-Review+2 Workflow+2

View this in Gerrit at https://gerrit.sqlalchemy.org/c/sqlalchemy/alembic/+/2491

CaselIT

comment created time in 14 minutes

issue comment sqlalchemy/dogpile.cache

Make the Region Invalidation Strategy more flexible

I like the extensions that @zzzeek proposed, however...

Is this best implemented with a region invalidation though?

This looks like a use case for "wraps" or a custom deserializer/backend. At least I have done similar things with that concept. (I'm not sure if the internal payload is available in the wraps hooks or not.) You can just perform the date operations there and issue a miss so it repopulates with the generator.
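The "wraps" idea can be sketched with a duck-typed proxy in the spirit of dogpile.cache's ProxyBackend. Everything here is illustrative and stdlib-only: the class name, the STALE_WINDOW threshold, and the assumption that stored values carry a creation-time metadata dict (the comment itself notes uncertainty about whether the internal payload is accessible).

```python
import time

NO_VALUE = object()   # stand-in for dogpile.cache.api.NO_VALUE
STALE_WINDOW = 3600   # hypothetical: entries older than an hour force a miss

class FreshnessProxy:
    """Wraps a backend, ProxyBackend-style, and reports a miss for stale
    entries so that get_or_create regenerates them."""

    def __init__(self, proxied):
        self.proxied = proxied

    def get(self, key):
        value = self.proxied.get(key)
        if value is NO_VALUE:
            return NO_VALUE
        payload, metadata = value  # assumed (payload, {"ct": ...}) shape
        if time.time() - metadata["ct"] > STALE_WINDOW:
            return NO_VALUE  # report a miss -> value is repopulated
        return value

    def set(self, key, value):
        self.proxied.set(key, value)
```

In dogpile.cache proper this would subclass ProxyBackend and be installed via the region's wrap parameter; the metadata inspection is the uncertain part.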

JonathanWylie

comment created time in an hour

issue comment sqlalchemy/sqlalchemy

echo=true with hide_parameters=false not working

That is right, but I can't get the query to come with the parameters replaced :)

tiagotaveiragomes

comment created time in an hour

issue comment sqlalchemy/sqlalchemy

Index columns are not correctly reflected

I'm sorry, but I think that makes it clearer.

Thank you very much!

pythonmeister

comment created time in 2 hours

push event zzzeek/sqlalchemy

Mike Bayer

commit sha e48d97b46a4ad19d4efe3c22d23d89d0ed92a310

Document Table/Column accessors

As Sphinx will not allow us to add attributes to the .rst file while maintaining order, these have to be added as class-level attributes. Include notes that "index" and "unique" parameters, while indicated by Column.index / Column.unique, do not actually indicate if the column is part of an index.

Fixes: #5851
Change-Id: I18fbaf6c504c4b1005b4c51057f80397fb48b387

view details

mike bayer

commit sha 3df145c57c966e1511bccc117161563123ec6f0b

Merge "Document Table/Column accessors"

view details

push time in 2 hours

push event sqlalchemy/sqlalchemy

Mike Bayer

commit sha e48d97b46a4ad19d4efe3c22d23d89d0ed92a310

Document Table/Column accessors

As Sphinx will not allow us to add attributes to the .rst file while maintaining order, these have to be added as class-level attributes. Include notes that "index" and "unique" parameters, while indicated by Column.index / Column.unique, do not actually indicate if the column is part of an index.

Fixes: #5851
Change-Id: I18fbaf6c504c4b1005b4c51057f80397fb48b387

view details

mike bayer

commit sha 3df145c57c966e1511bccc117161563123ec6f0b

Merge "Document Table/Column accessors"

view details

push time in 2 hours

issue closed sqlalchemy/sqlalchemy

Index columns are not correctly reflected

Description I created a small test to learn about the reflection capabilities of SQLAlchemy. I used Python 3.9.1 amd64 under Windows 10 and SQLAlchemy 1.4.0b1 for this.


from sqlalchemy import create_engine, MetaData

engine = create_engine('sqlite:///:memory:')
connection = engine.connect()
connection.execute("CREATE TABLE T1(ID INTEGER PRIMARY KEY, NAME CHAR(10) )")
connection.execute("CREATE TABLE T2(ID INTEGER PRIMARY KEY, T1_ID INTEGER, NAME VARCHAR, FOREIGN KEY (T1_ID) REFERENCES T1(ID))")
connection.execute("CREATE INDEX T1_NAME ON T1(NAME)")

metadata = MetaData(bind=connection)
metadata.reflect()

for t in metadata.sorted_tables:
    print(t.name)
    for c in t.columns:
        print(c.name, c.type, "indexed" if c.index else "not indexed", c.autoincrement, c.foreign_keys, c.primary_key)

Expected behavior The value of c.index must be True for T1.ID, T2.ID and T1.NAME.

To Reproduce See above.


Versions.

  • OS: Windows 10.0.19042.746
  • Python: 3.9.1
  • SQLAlchemy: 1.4.0b1
  • Database: SQLite
  • DBAPI:

Additional context N/A

Have a nice day!
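For reference, the reflection approach hinted at elsewhere in this thread ("The Inspector class does its job") can be sketched with SQLAlchemy's Inspector, using the 1.4-style API; the table and index names below are simplified from the example above.

```python
from sqlalchemy import create_engine, inspect, text

engine = create_engine("sqlite:///:memory:")
with engine.connect() as conn:
    conn.execute(text("CREATE TABLE t1 (id INTEGER PRIMARY KEY, name CHAR(10))"))
    conn.execute(text("CREATE INDEX t1_name ON t1 (name)"))
    # Column.index only mirrors the Column(index=...) flag; actual index
    # membership is reported per-table by the Inspector instead.
    indexes = inspect(conn).get_indexes("t1")

for ix in indexes:
    print(ix["name"], ix["column_names"])
```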

closed time in 2 hours

pythonmeister

issue closed sqlalchemy/sqlalchemy

Nested subqueries with aliased queries/tables

Describe your question I'm trying to do a multi-level nested query, but I'm having some trouble. I need to unnest several arrays in JSON fields and select some of them.

Example

  • table is check, it has id (int) and response (json) fields
  • json example (contents of response column):
{
  "Results": [
    {
      "Name": "FooBar",
      "Field": "SomeValue"
    },
    {
      "Name": "SpamEggs",
      "Field": "AnotherValue"
    }
  ]
}

Here's simplified SQL example which I'm trying to achieve:

SELECT result_elem -> 'Field' as field
FROM "check" AS check_, json_array_elements(
    (
      SELECT check_inside.response -> 'Results'
      FROM "check" as check_inside
      WHERE check_inside.id = check_.id
    )
) AS result_elem
WHERE result_elem ->> 'Name' = 'FooBar';

At first I'm trying to implement the inner subquery.

python code:

from sqlalchemy import func
from sqlalchemy.orm import aliased
from sqlalchemy.dialects import postgresql

check_inside = aliased(Check, name="check_inside")
check_ = aliased(Check, name="check_")

q_result_elems = session.query(
    check_inside.response["Results"],
).filter(
    check_inside.id == check_.id,
)
subq_result_elems = q_result_elems.subquery(name="subq_result_elems")

print(str(q_result_elems.statement.compile(compile_kwargs={"literal_binds": True}, dialect=postgresql.dialect())))

prints this:

SELECT check_inside.response -> 'Results' AS anon_1 
FROM "check" AS check_inside, "check" AS check_ 
WHERE check_inside.id = check_.id

It already differs on the FROM line. Then I'm completely lost; this is kind of pseudo-code, which doesn't work at all:

q_foobar_fields = session.query(
    subq_result_elems.op("->")("Field").label("field"),
).select_from(
    check_,
    func.json_array_elements(
        subq_result_elems,
    ).label("result_elem")
).filter(
    subq_result_elems.op("->>")("Name") == "FooBar"
)

I tried a bit another way:

# added a label 'results' above and then creating a new query
q_foobar_fields = session.query(
    func.json_array_elements(subq_result_elems.c.results.op("->")("Field"))
).filter(
    subq_result_elems.c.results.op("->>")("Name") == "FooBar",
)

print(str(q_foobar_fields.statement.compile(compile_kwargs={"literal_binds": True}, dialect=postgresql.dialect())))

but it creates this SQL:

SELECT json_array_elements(subq_result_elems.results -> 'Field') AS json_array_elements_1
FROM (SELECT check_inside.response -> 'Results' AS results
      FROM "check" AS check_inside,
           "check" AS check_
      WHERE check_inside.id = check_.id) AS subq_result_elems
WHERE (subq_result_elems.results ->> 'Name') = 'FooBar';

and here the FROM is different from what I need (and it creates multiple rows for each check item)

How can I pass an aliased subquery and refer to it in the 'select' part?

Thanks in advance!

Versions

  • SQLAlchemy: 1.3.18
  • Database: Postgres

Have a nice day!
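One possible direction (not from the thread, written with the 1.4-style Core API and an illustrative Table standing in for the Check model): correlate the inner SELECT to the outer alias and render it as a scalar subquery, which keeps check_ out of the inner FROM list.

```python
from sqlalchemy import Column, Integer, MetaData, Table, select
from sqlalchemy.dialects import postgresql
from sqlalchemy.dialects.postgresql import JSONB

metadata = MetaData()
# Hypothetical stand-in for the Check model in the question
check = Table(
    "check", metadata,
    Column("id", Integer, primary_key=True),
    Column("response", JSONB),
)

check_ = check.alias("check_")
check_inside = check.alias("check_inside")

# Correlating the inner SELECT to the outer alias prevents "check_" from
# reappearing in the inner FROM list, which was the symptom above.
inner = (
    select(check_inside.c.response["Results"])
    .where(check_inside.c.id == check_.c.id)
    .correlate(check_)
    .scalar_subquery()
)

stmt = select(inner).select_from(check_)
print(stmt.compile(dialect=postgresql.dialect()))
```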

closed time in 5 hours

mahenzon

issue comment sqlalchemy/sqlalchemy

echo=true with hide_parameters=false not working

So I ended up passing a logger at the DBAPI level by building the connection object in the do_connect event handler.

import logging
logging.basicConfig(level=logging.DEBUG)

import psycopg2
import psycopg2.extras
import sqlalchemy as sqa
from sqlalchemy import event

handler = logging.FileHandler(filename="sql.log", mode="w")

sql_logger = logging.getLogger(__name__)
sql_logger.addHandler(handler)

@event.listens_for(sqa.engine.Engine, "do_connect")
def receive_do_connect(dialect, conn_rec, cargs, cparams):
    cparams["connection_factory"] = psycopg2.extras.LoggingConnection
    conn = psycopg2.connect(*cargs, **cparams)
    conn.initialize(sql_logger)
    return conn

engine = sqa.create_engine(metadata.conn)  # Will log the actual queries to the file

df.to_sql(engine, ...)

Thank you for your help understanding this matter!

tiagotaveiragomes

comment created time in 6 hours

issue comment sqlalchemy/sqlalchemy

Index columns are not correctly reflected

Looks good! Precise and clear.

pythonmeister

comment created time in 9 hours

issue opened sqlalchemy/dogpile.cache

Make the Region Invalidation Strategy more flexible

I have a use case where I can only determine if a cache entry has expired by knowing the key and the time it was created. As it stands, the RegionInvalidationStrategy interface only receives the creation time. If it were passed the key as well, then I could implement the behaviour I need.

The use case is this: I am querying the Google Analytics API and caching the results in Redis. Although you can query for recent data, it is not guaranteed to be up to date. The key is the datetime for the analytics data; I want to expire a cache entry if it was created less than a day after the key (i.e. it was recent data when it was created), but only if it is now more than an hour after it was created. I have currently implemented this by overriding Region._is_cache_miss in a subclass. This is not great, because it only applies to the get_or_create class of methods; get_multi and get use self._unexpired_value_fn, which of course I could also override, but given that RegionInvalidationStrategy is the proper way to manage invalidation, I think it should be done there, which also means it only has to be done in one place.
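The requested interface change can be sketched stdlib-only, following the method names of dogpile.cache's documented RegionInvalidationStrategy; the extra key parameter on is_hard_invalidated is the proposed extension, not the current API, and the keys-encode-timestamps assumption is purely illustrative.

```python
import time

class KeyAwareInvalidator:
    """Sketch of the proposed key-aware invalidation strategy.

    Method names follow dogpile.cache's RegionInvalidationStrategy; the
    `key` parameter is the hypothetical extension this issue asks for.
    """

    def __init__(self, stale_window=3600):
        self.stale_window = stale_window  # hypothetical threshold

    def invalidate(self, hard=True):
        pass  # region-wide invalidation is not used by this strategy

    def was_hard_invalidated(self):
        return True

    def was_soft_invalidated(self):
        return False

    def is_soft_invalidated(self, timestamp):
        return False

    def is_hard_invalidated(self, timestamp, key=None):
        # Proposed behavior: expire entries that were "recent data" when
        # created (creation time within a day of the key's datetime) once
        # they are older than the stale window.
        if key is None:
            return False
        key_time = float(key)  # assume the key encodes a timestamp
        was_recent = (timestamp - key_time) < 86400
        is_old = (time.time() - timestamp) > self.stale_window
        return was_recent and is_old
```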

created time in 14 hours

issue comment sqlalchemy/sqlalchemy

echo=true with hide_parameters=false not working

Thank you very much for the clarification!

Maybe I can illustrate my difficulty with my use case. I need to write to a file the SQL generated from pandas' df.to_sql() method using an engine object.

With the echo=True flag the queries are logged to stdout with the actual parameter values. However, when I do logging.getLogger('sqlalchemy.engine').addHandler(logging.FileHandler(...)), the queries logged to the file do not show the bound parameter values.

Thank you so much!

tiagotaveiragomes

comment created time in 18 hours

pull request comment sqlalchemy/alembic

Fix workflow test window fail

Federico Caselli (CaselIT) wrote:

Code-Review+1 Workflow+1

Sometimes on Windows, when pytest calls the session start, the module alembic.testing has not yet had the chance to run, so asyncio is still enabled

Successful pipeline https://github.com/CaselIT/alembic/actions/runs/496948121

View this in Gerrit at https://gerrit.sqlalchemy.org/c/sqlalchemy/alembic/+/2491

CaselIT

comment created time in 19 hours

pull request comment sqlalchemy/alembic

Fix workflow test window fail

New Gerrit review created for change c92f7a2c0460899c942fcc668cbffe672b59df01: https://gerrit.sqlalchemy.org/c/sqlalchemy/alembic/+/2491

CaselIT

comment created time in 19 hours

PR opened sqlalchemy/alembic

Fix workflow test window fail

Need to disable asyncio before calling into pytest session start or Windows will fail randomly.

+9 -1

0 comments

2 changed files

pr created time in 19 hours

issue comment sqlalchemy/sqlalchemy

Index columns are not correctly reflected

Mike Bayer has proposed a fix for this issue in the master branch:

Document Table/Column accessors https://gerrit.sqlalchemy.org/c/sqlalchemy/sqlalchemy/+/2490

pythonmeister

comment created time in 20 hours

push event zzzeek/alembic

Mike Bayer

commit sha f902d0c7d9975bacd5ccf5caebc69d2bda66dfc8

retrieve 1.3 transaction from branched connection properly

Fixed regression where Alembic would fail to create a transaction properly if the :class:`sqlalchemy.engine.Connection` were a so-called "branched" connection, that is, one where the ``.connect()`` method had been called to create a "sub" connection.

Change-Id: I5319838a08686ede7dc873ce5d39428b1afdf6ff
Fixes: #782

view details

push time in 20 hours

push event sqlalchemy/alembic

Mike Bayer

commit sha f902d0c7d9975bacd5ccf5caebc69d2bda66dfc8

retrieve 1.3 transaction from branched connection properly

Fixed regression where Alembic would fail to create a transaction properly if the :class:`sqlalchemy.engine.Connection` were a so-called "branched" connection, that is, one where the ``.connect()`` method had been called to create a "sub" connection.

Change-Id: I5319838a08686ede7dc873ce5d39428b1afdf6ff
Fixes: #782

view details

push time in 20 hours

issue closed sqlalchemy/alembic

1.5 transaction model did not expect branched connections

Describe the bug Hello, after the recent alembic 1.5.0 release our alembic upgrades have started failing with a stack trace in SQLAlchemy code.

The bottom of the stack trace is:

connection = <sqlalchemy.engine.base.Connection object at 0x144efe990>

    def _get_connection_transaction(connection):
        if sqla_14:
            return connection.get_transaction()
        else:
>           return connection._Connection__transaction
E           AttributeError: 'Connection' object has no attribute '_Connection__transaction'

We're currently on SQLAlchemy==1.3.22 (the latest non-beta release). Upgrading to SQLAlchemy==1.4.0b1 fixes the problem.

Expected behavior Alembic upgrades succeed without needing to upgrade to a beta version of SQLAlchemy

To Reproduce Check out the repo at https://github.com/dagster-io/dagster and run our automated test suite against master. Can provide more details/repro steps here if needed.

Error


=============================================================================================================== FAILURES ===============================================================================================================
_______________________________________________________________________________________________________ test_asset_key_structure _______________________________________________________________________________________________________

    def test_asset_key_structure():
        src_dir = file_relative_path(__file__, "compat_tests/snapshot_0_9_16_asset_key_structure")
        with copy_directory(src_dir) as test_dir:
            asset_storage = ConsolidatedSqliteEventLogStorage(test_dir)
            asset_keys = asset_storage.get_all_asset_keys()
            assert len(asset_keys) == 5
    
            # get a structured asset key
            asset_key = AssetKey(["dashboards", "cost_dashboard"])
    
            # check that backcompat events are read
            assert asset_storage.has_asset_key(asset_key)
            events = asset_storage.get_asset_events(asset_key)
            assert len(events) == 1
            run_ids = asset_storage.get_asset_run_ids(asset_key)
            assert len(run_ids) == 1
    
>           asset_storage.upgrade()

python_modules/dagster/dagster_tests/core_tests/storage_tests/test_assets.py:273: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
python_modules/dagster/dagster/core/storage/event_log/sqlite/consolidated_sqlite_event_log.py:105: in upgrade
    run_alembic_upgrade(alembic_config, conn)
python_modules/dagster/dagster/core/storage/sql.py:28: in run_alembic_upgrade
    upgrade(alembic_config, rev)
../.pyenv/versions/3.7.8/envs/dagster-3.7.8/lib/python3.7/site-packages/alembic/command.py:294: in upgrade
    script.run_env()
../.pyenv/versions/3.7.8/envs/dagster-3.7.8/lib/python3.7/site-packages/alembic/script/base.py:481: in run_env
    util.load_python_file(self.dir, "env.py")
../.pyenv/versions/3.7.8/envs/dagster-3.7.8/lib/python3.7/site-packages/alembic/util/pyfiles.py:97: in load_python_file
    module = load_module_py(module_id, path)
../.pyenv/versions/3.7.8/envs/dagster-3.7.8/lib/python3.7/site-packages/alembic/util/compat.py:182: in load_module_py
    spec.loader.exec_module(module)
<frozen importlib._bootstrap_external>:728: in exec_module
    ???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
    ???
python_modules/dagster/dagster/core/storage/event_log/sqlite/alembic/env.py:15: in <module>
    run_migrations_online(context, config, target_metadata)
python_modules/dagster/dagster/core/storage/sqlite.py:23: in run_migrations_online
    run_migrations_online_(*args, **kwargs)
python_modules/dagster/dagster/core/storage/sql.py:131: in run_migrations_online
    context.run_migrations()
<string>:8: in run_migrations
    ???
../.pyenv/versions/3.7.8/envs/dagster-3.7.8/lib/python3.7/site-packages/alembic/runtime/environment.py:813: in run_migrations
    self.get_context().run_migrations(**kw)
../.pyenv/versions/3.7.8/envs/dagster-3.7.8/lib/python3.7/site-packages/alembic/runtime/migration.py:549: in run_migrations
    with self.begin_transaction(_per_migration=True):
../.pyenv/versions/3.7.8/envs/dagster-3.7.8/lib/python3.7/site-packages/alembic/runtime/migration.py:395: in begin_transaction
    self.connection
../.pyenv/versions/3.7.8/envs/dagster-3.7.8/lib/python3.7/site-packages/alembic/util/sqla_compat.py:84: in _safe_begin_connection_transaction
    transaction = _get_connection_transaction(connection)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

connection = <sqlalchemy.engine.base.Connection object at 0x144efe990>

    def _get_connection_transaction(connection):
        if sqla_14:
            return connection.get_transaction()
        else:
>           return connection._Connection__transaction
E           AttributeError: 'Connection' object has no attribute '_Connection__transaction'

../.pyenv/versions/3.7.8/envs/dagster-3.7.8/lib/python3.7/site-packages/alembic/util/sqla_compat.py:105: AttributeError

Versions.

  • OS:
  • Python: 3.7.8
  • Alembic: 1.5.1
  • SQLAlchemy: 1.3.22
  • Database:
  • DBAPI:

Have a nice day!

closed time in 20 hours

gibsondan

push event zzzeek/sqlalchemy

Federico Caselli

commit sha 442ca5c000aab9faa69d514e7902c9d903cbd987

Disallow non-native psycopg2 Unicode in Python 3; update docs

Fixed issue where the psycopg2 dialect would silently pass the ``use_native_unicode=False`` flag without actually having any effect under Python 3, as the psycopg2 DBAPI uses Unicode unconditionally under Python 3. This usage now raises an :class:`_exc.ArgumentError` when used under Python 3. Added test support for Python 2. Additionally, added documentation for the client_encoding parameter that may be passed to libpq directly via psycopg2.

Change-Id: I40ddf6382c157fa9399c21f0e01064197ea100f8

view details

mike bayer

commit sha 0af22b4cfc21a8798b6fab125275b735302575fa

Merge "Disallow non-native psycopg2 Unicode in Python 3; update docs"

view details

push time in 20 hours

push event sqlalchemy/sqlalchemy

Federico Caselli

commit sha 442ca5c000aab9faa69d514e7902c9d903cbd987

Disallow non-native psycopg2 Unicode in Python 3; update docs

Fixed issue where the psycopg2 dialect would silently pass the ``use_native_unicode=False`` flag without actually having any effect under Python 3, as the psycopg2 DBAPI uses Unicode unconditionally under Python 3. This usage now raises an :class:`_exc.ArgumentError` when used under Python 3. Added test support for Python 2. Additionally, added documentation for the client_encoding parameter that may be passed to libpq directly via psycopg2.

Change-Id: I40ddf6382c157fa9399c21f0e01064197ea100f8

view details

mike bayer

commit sha 0af22b4cfc21a8798b6fab125275b735302575fa

Merge "Disallow non-native psycopg2 Unicode in Python 3; update docs"

view details

push time in 20 hours

issue comment sqlalchemy/sqlalchemy

Index columns are not correctly reflected

Hi,

The Inspector class does its job, thanks for the hint. Perhaps it would be best to document that.

pythonmeister

comment created time in 21 hours

issue comment sqlalchemy/alembic

1.5 transaction model did not expect branched connections

Mike Bayer has proposed a fix for this issue in the master branch:

retrieve 1.3 transaction from branched connection properly https://gerrit.sqlalchemy.org/c/sqlalchemy/alembic/+/2489

gibsondan

comment created time in 21 hours

issue opened sqlalchemy/sqlalchemy

echo=true with hide_parameters=false not working

Describe the bug I am trying to use sqa.create_engine(conn, echo=True, hide_parameters=False) to log queries with actual parameter values, but it looks like the parameters are not being replaced.

To Reproduce

import sqlalchemy as sqa

echo_engine = sqa.create_engine(
  "postgresql://localhost/postgres",
  echo=True,
  hide_parameters=False
)

echo_engine.execute(sqa.text("select :some_private_name"), {"some_private_name": "1"})

Error

2021-01-19 19:23:07,829 INFO sqlalchemy.engine.base.Engine select version()
2021-01-19 19:23:07,829 INFO sqlalchemy.engine.base.Engine {}
2021-01-19 19:23:07,832 INFO sqlalchemy.engine.base.Engine select current_schema()
2021-01-19 19:23:07,833 INFO sqlalchemy.engine.base.Engine {}
2021-01-19 19:23:07,840 INFO sqlalchemy.engine.base.Engine SELECT CAST('test plain returns' AS VARCHAR(60)) AS anon_1
2021-01-19 19:23:07,840 INFO sqlalchemy.engine.base.Engine {}
2021-01-19 19:23:07,844 INFO sqlalchemy.engine.base.Engine SELECT CAST('test unicode returns' AS VARCHAR(60)) AS anon_1
2021-01-19 19:23:07,844 INFO sqlalchemy.engine.base.Engine {}
2021-01-19 19:23:07,845 INFO sqlalchemy.engine.base.Engine show standard_conforming_strings
2021-01-19 19:23:07,845 INFO sqlalchemy.engine.base.Engine {}
2021-01-19 19:23:07,847 INFO sqlalchemy.engine.base.Engine select %(some_private_name)s
2021-01-19 19:23:07,847 INFO sqlalchemy.engine.base.Engine {'some_private_name': '1'}

I would expect select %(some_private_name)s to be select 1

Versions.

  • OS: MacOS BigSur
  • Python: python3.8
  • SQLAlchemy: 1.3.22
  • Database: postgresql
  • DBAPI: postgresql://

Have a nice day!
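A sketch, not from the thread: echo logging prints the SQL string and its parameters separately and never inlines values into the statement; to render a statement with the literal values substituted, the statement can be compiled with literal_binds (safe only for simple types).

```python
import sqlalchemy as sqa

# Compile a statement with its bound values rendered inline, rather than
# expecting echo logging to do the substitution.
stmt = sqa.text("select :some_private_name").bindparams(some_private_name="1")
compiled = stmt.compile(compile_kwargs={"literal_binds": True})
print(compiled)  # renders the value as a SQL literal
```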

created time in 21 hours

issue comment sqlalchemy/sqlalchemy

warn when postgresql distinct columns used on other dialect

Could you provide an example? I'm not sure I understand the problem

sqlalchemy-bot

comment created time in 21 hours
