
A Python library for connecting securely to your Cloud SQL instances.

License: Apache License 2.0


cloud-sql-python-connector's Introduction


Cloud SQL Python Connector


The Cloud SQL Python Connector is a Cloud SQL connector designed for use with the Python language. Using a Cloud SQL connector provides a native alternative to the Cloud SQL Auth Proxy while providing the following benefits:

  • IAM Authorization: uses IAM permissions to control who/what can connect to your Cloud SQL instances
  • Improved Security: uses robust, updated TLS 1.3 encryption and identity verification between the client connector and the server-side proxy, independent of the database protocol.
  • Convenience: removes the requirement to use and distribute SSL certificates, as well as manage firewalls or source/destination IP addresses.
  • (optionally) IAM DB Authentication: provides support for Cloud SQL’s automatic IAM DB AuthN feature.

The Cloud SQL Python Connector is a package to be used alongside a database driver. The currently supported drivers are:

  • pymysql (MySQL)
  • pg8000 (PostgreSQL)
  • asyncpg (PostgreSQL, asyncio)
  • pytds (SQL Server)

Installation

You can install this library with pip install, specifying the driver based on your database dialect.

MySQL

pip install "cloud-sql-python-connector[pymysql]"

Postgres

There are two different database drivers that are supported for the Postgres dialect:

pg8000

pip install "cloud-sql-python-connector[pg8000]"

asyncpg

pip install "cloud-sql-python-connector[asyncpg]"

SQL Server

pip install "cloud-sql-python-connector[pytds]"

APIs and Services

This package requires the following to successfully make Cloud SQL Connections:

  • IAM principal (user, service account, etc.) with the Cloud SQL Client role. This IAM principal will be used for credentials.
  • The Cloud SQL Admin API to be enabled within your Google Cloud Project. By default, the API will be called in the project associated with the IAM principal.

Credentials

This library uses the Application Default Credentials (ADC) strategy for resolving credentials. Please see these instructions for how to set your ADC (Google Cloud Application vs Local Development, IAM user vs service account credentials), or consult the google.auth package.

To explicitly set a specific source for the credentials, see Configuring the Connector below.

Usage

This package provides several functions for authorizing and encrypting connections. These functions are used with your database driver to connect to your Cloud SQL instance.

The instance connection name for your Cloud SQL instance is always in the format "project:region:instance".

How to use this Connector

To connect to Cloud SQL using the connector, initialize a Connector object and call its connect method with the proper input parameters.

The Connector itself creates connection objects by calling its connect method but does not manage database connection pooling. For this reason, it is recommended to use the connector alongside a library that can create connection pools, such as SQLAlchemy. This will allow for connections to remain open and be reused, reducing connection overhead and the number of connections needed.

In the Connector's connect method below, pass your instance connection name as the first positional argument and the name of the database driver as the second positional argument. Supply the rest of your connection keyword arguments, such as user, password, and database. You can also set the optional timeout or ip_type keyword arguments.

To use this connector with SQLAlchemy, use the creator argument for sqlalchemy.create_engine:

from google.cloud.sql.connector import Connector
import pymysql
import sqlalchemy

# initialize Connector object
connector = Connector()

# function to return the database connection
def getconn() -> pymysql.connections.Connection:
    conn: pymysql.connections.Connection = connector.connect(
        "project:region:instance",
        "pymysql",
        user="my-user",
        password="my-password",
        db="my-db-name"
    )
    return conn

# create connection pool
pool = sqlalchemy.create_engine(
    "mysql+pymysql://",
    creator=getconn,
)

The returned connection pool engine can then be used to query and modify the database.

# insert statement
insert_stmt = sqlalchemy.text(
    "INSERT INTO my_table (id, title) VALUES (:id, :title)",
)

with pool.connect() as db_conn:
    # insert into database
    db_conn.execute(insert_stmt, parameters={"id": "book1", "title": "Book One"})

    # query database
    result = db_conn.execute(sqlalchemy.text("SELECT * from my_table")).fetchall()

    # commit transaction (SQLAlchemy v2.x uses commit-as-you-go)
    db_conn.commit()

    # Do something with the results
    for row in result:
        print(row)

To close the Connector object's background resources, call its close() method as follows:

connector.close()

Note

For more examples of using SQLAlchemy to manage connection pooling with the connector, please see Cloud SQL SQLAlchemy Samples.

Configuring the Connector

If you need to customize something about the connector, or want to specify defaults for each connection it makes, you can initialize a Connector object as follows:

from google.cloud.sql.connector import Connector

# Note: all parameters below are optional
connector = Connector(
    ip_type="public",  # can also be "private" or "psc"
    enable_iam_auth=False,
    timeout=30,
    credentials=custom_creds,  # google.auth.credentials.Credentials
    refresh_strategy="lazy",  # can be "lazy" or "background"
)
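
The custom_creds object above can be any google.auth credentials instance. As a minimal sketch (not from the original README), assuming a service account key file at a hypothetical path:

from google.cloud.sql.connector import Connector
from google.oauth2 import service_account

# load explicit credentials from a service account key file
# (the file path below is a placeholder)
custom_creds = service_account.Credentials.from_service_account_file(
    "/path/to/service-account-key.json"
)

# pass the explicit credentials to the Connector instead of relying on ADC
connector = Connector(credentials=custom_creds)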

Using Connector as a Context Manager

The Connector object can also be used as a context manager in order to automatically close and cleanup resources, removing the need for explicit calls to connector.close().

Connector as a context manager:

from google.cloud.sql.connector import Connector
import pymysql
import sqlalchemy

# helper function to return SQLAlchemy connection pool
def init_connection_pool(connector: Connector) -> sqlalchemy.engine.Engine:
    # function used to generate database connection
    def getconn() -> pymysql.connections.Connection:
        conn = connector.connect(
            "project:region:instance",
            "pymysql",
            user="my-user",
            password="my-password",
            db="my-db-name"
        )
        return conn

    # create connection pool
    pool = sqlalchemy.create_engine(
        "mysql+pymysql://",
        creator=getconn,
    )
    return pool

# initialize Cloud SQL Python Connector as context manager
with Connector() as connector:
    # initialize connection pool
    pool = init_connection_pool(connector)
    # insert statement
    insert_stmt = sqlalchemy.text(
        "INSERT INTO my_table (id, title) VALUES (:id, :title)",
    )

    # interact with Cloud SQL database using connection pool
    with pool.connect() as db_conn:
        # insert into database
        db_conn.execute(insert_stmt, parameters={"id": "book1", "title": "Book One"})

        # commit transaction (SQLAlchemy v2.x uses commit-as-you-go)
        db_conn.commit()

        # query database
        result = db_conn.execute(sqlalchemy.text("SELECT * from my_table")).fetchall()

        # Do something with the results
        for row in result:
            print(row)

Configuring a Lazy Refresh (Cloud Run, Cloud Functions etc.)

The Connector's refresh_strategy argument can be set to "lazy" to configure the Python Connector to retrieve connection info lazily and as-needed. Otherwise, a background refresh cycle runs to retrieve the connection info periodically. This setting is useful in environments where the CPU may be throttled outside of a request context, e.g., Cloud Run, Cloud Functions, etc.

To set the refresh strategy, set the refresh_strategy keyword argument when initializing a Connector:

connector = Connector(refresh_strategy="lazy")

Specifying IP Address Type

The Cloud SQL Python Connector can be used to connect to Cloud SQL instances using both public and private IP addresses, as well as Private Service Connect (PSC). To specify which IP address type to connect with, set the ip_type keyword argument when initializing a Connector() or when calling connector.connect().

Possible values for ip_type are "public" (default value), "private", and "psc".

Example:

conn = connector.connect(
    "project:region:instance",
    "pymysql",
    ip_type="private",  # use private IP
    # ... insert other kwargs ...
)

Important

If specifying Private IP or Private Service Connect (PSC), your application must be attached to the proper VPC network to connect to your Cloud SQL instance. For most applications this will require the use of a VPC Connector.

Automatic IAM Database Authentication

Connections using Automatic IAM database authentication are supported when using Postgres or MySQL drivers. First, make sure to configure your Cloud SQL Instance to allow IAM authentication and add an IAM database user.

Now, you can connect using user or service account credentials instead of a password. In the call to connect, set the enable_iam_auth keyword argument to True and the user argument to the appropriately formatted IAM principal.

Postgres: For an IAM user account, this is the user's email address. For a service account, it is the service account's email without the .gserviceaccount.com domain suffix.

MySQL: For an IAM user account, this is the user's email address without the @ or domain name. For example, for a user whose email is test-user@example.com, set the user argument to test-user. For a service account, this is the service account's email address without the @project-id.iam.gserviceaccount.com suffix.

Example:

conn = connector.connect(
    "project:region:instance",
    "pg8000",
    user="my-iam-user@example.com",  # IAM principal (see formatting rules above)
    db="my-db-name",
    enable_iam_auth=True,
)

SQL Server (MSSQL)

Important

If your SQL Server instance is set to enforce SSL connections, you need to download the CA certificate for your instance and include cafile={path to downloaded certificate} and validate_host=False. This is a workaround for a known issue.
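
As a rough sketch of this workaround (the certificate path below is an assumption, not from the original docs):

# connect to a SQL Server instance that enforces SSL connections
conn = connector.connect(
    "project:region:instance",
    "pytds",
    user="my-user",
    password="my-password",
    db="my-db-name",
    cafile="/path/to/server-ca.pem",  # CA certificate downloaded for the instance
    validate_host=False,              # required alongside cafile for this workaround
)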

Active Directory Authentication

Active Directory authentication for SQL Server instances is currently only supported on Windows. First, make sure to follow these steps to set up a Managed AD domain and join your Cloud SQL instance to the domain. See here for more info on Cloud SQL Active Directory integration.

Once you have followed the steps linked above, you can run the following code to return a connection object:

conn = connector.connect(
    "project:region:instance",
    "pytds",
    db="my-db-name",
    active_directory_auth=True,
    server_name="public.[instance].[location].[project].cloudsql.[domain]",
)

Or, if using Private IP:

conn = connector.connect(
    "project:region:instance",
    "pytds",
    db="my-db-name",
    active_directory_auth=True,
    server_name="private.[instance].[location].[project].cloudsql.[domain]",
    ip_type="private"
)

Using the Python Connector with Python Web Frameworks

The Python Connector can be used alongside popular Python web frameworks such as Flask and FastAPI to integrate Cloud SQL databases into your web applications.

Note

For serverless environments such as Cloud Functions and Cloud Run, it may be beneficial to initialize the Connector with the lazy refresh strategy, i.e. Connector(refresh_strategy="lazy").

See Configuring a Lazy Refresh

Flask-SQLAlchemy

Flask-SQLAlchemy is an extension for Flask that adds support for SQLAlchemy to your application. It aims to simplify using SQLAlchemy with Flask by providing useful defaults and extra helpers that make it easier to accomplish common tasks.

You can configure Flask-SQLAlchemy to connect to a Cloud SQL database from your web application through the following:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from google.cloud.sql.connector import Connector


# initialize Python Connector object
connector = Connector()

# Python Connector database connection function
def getconn():
    conn = connector.connect(
        "project:region:instance-name", # Cloud SQL Instance Connection Name
        "pg8000",
        user="my-user",
        password="my-password",
        db="my-database",
        ip_type="public"  # "private" for private IP
    )
    return conn


app = Flask(__name__)

# configure Flask-SQLAlchemy to use Python Connector
app.config['SQLALCHEMY_DATABASE_URI'] = "postgresql+pg8000://"
app.config['SQLALCHEMY_ENGINE_OPTIONS'] = {
    "creator": getconn
}

# initialize the app with the extension
db = SQLAlchemy()
db.init_app(app)

For more details on how to use Flask-SQLAlchemy, check out the Flask-SQLAlchemy Quickstarts.
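
As a small usage sketch (not part of the original snippet), queries can then be issued through the extension's session inside the application context:

import sqlalchemy

# run a simple query using the Flask-SQLAlchemy session backed by the connector
with app.app_context():
    result = db.session.execute(sqlalchemy.text("SELECT NOW()")).fetchone()
    print(result)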

FastAPI

FastAPI is a modern, fast (high-performance), web framework for building APIs with Python based on standard Python type hints.

You can configure FastAPI to connect to a Cloud SQL database from your web application using SQLAlchemy ORM through the following:

from sqlalchemy import create_engine
from sqlalchemy.engine import Engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
from google.cloud.sql.connector import Connector

# helper function to return SQLAlchemy connection pool
def init_connection_pool(connector: Connector) -> Engine:
    # Python Connector database connection function
    def getconn():
        conn = connector.connect(
            "project:region:instance-name", # Cloud SQL Instance Connection Name
            "pg8000",
            user="my-user",
            password="my-password",
            db="my-database",
            ip_type="public"  # "private" for private IP
        )
        return conn

    SQLALCHEMY_DATABASE_URL = "postgresql+pg8000://"

    engine = create_engine(
        SQLALCHEMY_DATABASE_URL, creator=getconn
    )
    return engine

# initialize Cloud SQL Python Connector
connector = Connector()

# create connection pool engine
engine = init_connection_pool(connector)

# create SQLAlchemy ORM session
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

Base = declarative_base()

To learn more about integrating a database into your FastAPI application, follow along the FastAPI SQL Database guide.
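
A hypothetical request-scoped dependency built on the SessionLocal factory above could look like the following (route and function names are illustrative, not from the README):

from fastapi import Depends, FastAPI
from sqlalchemy import text
from sqlalchemy.orm import Session

app = FastAPI()

# yield a session per request and make sure it is closed afterwards
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/ping-db")
def ping_db(db: Session = Depends(get_db)):
    db.execute(text("SELECT 1"))  # simple connectivity check against Cloud SQL
    return {"status": "ok"}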

Async Driver Usage

The Cloud SQL Connector is compatible with asyncio to improve the speed and efficiency of database connections through concurrency. You can use all non-asyncio drivers through the Connector.connect_async function, in addition to the following asyncio database driver:

  • asyncpg (PostgreSQL)

The Cloud SQL Connector has a helper create_async_connector function that is recommended for asyncio database connections. It returns a Connector object that uses the current thread's running event loop. This differs from Connector(), which by default initializes a new event loop in a background thread.

The create_async_connector function accepts all the same input arguments as the Connector object.

Once a Connector object is returned by create_async_connector you can call its connect_async method, just as you would the connect method:

import asyncpg

import sqlalchemy
from sqlalchemy.ext.asyncio import AsyncEngine, create_async_engine

from google.cloud.sql.connector import Connector, create_async_connector

async def init_connection_pool(connector: Connector) -> AsyncEngine:
    # initialize Connector object for connections to Cloud SQL
    async def getconn() -> asyncpg.Connection:
        conn: asyncpg.Connection = await connector.connect_async(
            "project:region:instance",  # Cloud SQL instance connection name
            "asyncpg",
            user="my-user",
            password="my-password",
            db="my-db-name"
            # ... additional database driver args
        )
        return conn

    # The Cloud SQL Python Connector can be used along with SQLAlchemy using the
    # 'async_creator' argument to 'create_async_engine'
    pool = create_async_engine(
        "postgresql+asyncpg://",
        async_creator=getconn,
    )
    return pool

async def main():
    # initialize Connector object for connections to Cloud SQL
    connector = await create_async_connector()

    # initialize connection pool
    pool = await init_connection_pool(connector)

    # example query
    async with pool.connect() as conn:
        await conn.execute(sqlalchemy.text("SELECT NOW()"))

    # close Connector
    await connector.close_async()

    # dispose of connection pool
    await pool.dispose()

For more details on additional database arguments with an asyncpg.Connection, please visit the official documentation.

Async Context Manager

An alternative to using the create_async_connector function is initializing a Connector as an async context manager, removing the need for explicit calls to connector.close_async() to cleanup resources.

Note

This alternative requires that the running event loop be passed in as the loop argument to Connector().

import asyncio
import asyncpg

import sqlalchemy
from sqlalchemy.ext.asyncio import AsyncEngine, create_async_engine

from google.cloud.sql.connector import Connector

async def init_connection_pool(connector: Connector) -> AsyncEngine:
    # initialize Connector object for connections to Cloud SQL
    async def getconn() -> asyncpg.Connection:
        conn: asyncpg.Connection = await connector.connect_async(
            "project:region:instance",  # Cloud SQL instance connection name
            "asyncpg",
            user="my-user",
            password="my-password",
            db="my-db-name"
            # ... additional database driver args
        )
        return conn

    # The Cloud SQL Python Connector can be used along with SQLAlchemy using the
    # 'async_creator' argument to 'create_async_engine'
    pool = create_async_engine(
        "postgresql+asyncpg://",
        async_creator=getconn,
    )
    return pool

async def main():
    # initialize Connector object for connections to Cloud SQL
    loop = asyncio.get_running_loop()
    async with Connector(loop=loop) as connector:
        # initialize connection pool
        pool = await init_connection_pool(connector)

        # example query
        async with pool.connect() as conn:
            await conn.execute(sqlalchemy.text("SELECT NOW()"))

        # dispose of connection pool
        await pool.dispose()

Debug Logging

The Cloud SQL Python Connector uses the standard Python logging module for debug logging support.

Add the below code to your application to enable debug logging with the Cloud SQL Python Connector:

import logging

logging.basicConfig(format="%(asctime)s [%(levelname)s]: %(message)s")
logger = logging.getLogger(name="google.cloud.sql.connector")
logger.setLevel(logging.DEBUG)

For more details on configuring logging, please refer to the Python logging docs.

Support policy

Major version lifecycle

This project uses semantic versioning, and uses the following lifecycle regarding support for a major version:

  • Active - Active versions get all new features and security fixes (that wouldn't otherwise introduce a breaking change). New major versions are guaranteed to be "active" for a minimum of 1 year.
  • Deprecated - Deprecated versions continue to receive security and critical bug fixes, but do not receive new features. Deprecated versions will be publicly supported for 1 year.
  • Unsupported - Any major version that has been deprecated for >=1 year is considered publicly unsupported.

Supported Python Versions

We follow the Python Version Support Policy used by Google Cloud Libraries for Python. Changes in supported Python versions will be considered a minor change, and will be listed in the release notes.

Release cadence

This project aims for a minimum monthly release cadence. If no new features or fixes have been added, a new PATCH version with the latest dependencies is released.

Contributing

We welcome outside contributions. Please see our Contributing Guide for details on how best to contribute.

cloud-sql-python-connector's Issues

Certificate error after running program for 24 hours

Bug Description

SQL connection works fine, but after running program for 24 hours, I get a bad certificate error.

My program launches AI platform instances every 12 hours. Once I do a db query after the 24 hour mark, I get this error:
Error: (2013, 'Lost connection to MySQL server during query ([SSL: SSLV3_ALERT_BAD_CERTIFICATE] sslv3 alert bad certificate (_ssl.c:2622))')

Example code (or command)

def executeQuery(query, params=None):
    db = getConnection()
    cursor = db.cursor(DictCursor)
    response = cursor.execute(query, params)
    result = cursor.fetchall()
    cursor.close()
    db.close()

    return result

async def executeQueryAsync(query, params=None):
    return await get_event_loop().run_in_executor(None, lambda: executeQuery(query, params=params))


def getConnection():
    return _getPyConnection()
    

def _getPyConnection():
    return connector.connect(
        instanceName,
        "pymysql",
        **dbCloudSqlConnector,
        autocommit=True
    )

Environment

  1. OS type and version: Windows
  2. Python version: 3.8.5
  3. Connector version: 1.0.2

Allow credential impersonation

Would like to use service account impersonation.

I've executed gcloud auth application-default login --impersonate-service-account=<name>@<project>.iam.gserviceaccount.com.

Currently, using the library throws:

    raise exceptions.DefaultCredentialsError(
google.auth.exceptions.DefaultCredentialsError: The file /home/gcg/.config/gcloud/application_default_credentials.json does not have a valid type. Type is impersonated_service_account, expected one of ('authorized_user', 'service_account', 'external_account').
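
A possible workaround sketch, assuming plain ADC (without --impersonate-service-account) and a Connector version that accepts an explicit credentials argument; the service account name below is illustrative:

import google.auth
from google.auth import impersonated_credentials
from google.cloud.sql.connector import Connector

# base credentials from ADC, then impersonate the target service account in code
source_credentials, _ = google.auth.default()
target_credentials = impersonated_credentials.Credentials(
    source_credentials=source_credentials,
    target_principal="my-sa@my-project.iam.gserviceaccount.com",
    target_scopes=["https://www.googleapis.com/auth/sqlservice.admin"],
)

connector = Connector(credentials=target_credentials)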

Why are we getting 429 "Too Many Requests" responses?

Bug Description

Facing runtime error 429, Too Many Requests to www.googleapis.com/sql/v1beta4/projects//instances/

Stacktrace

File "/usr/local/lib/python3.7/site-packages/google/cloud/sql/connector/connector.py", line 123, in connect raise (e) File "/usr/local/lib/python3.7/site-packages/google/cloud/sql/connector/connector.py", line 119, in connect return icm.connect(driver, ip_types, timeout, **kwargs) File "/usr/local/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 495, in connect connection = connect_future.result(timeout) File "/usr/local/lib/python3.7/concurrent/futures/_base.py", line 435, in result return self.__get_result() File "/usr/local/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result raise self._exception File "/usr/local/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 531, in _connect instance_data = await self._current File "/usr/local/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 424, in _perform_refresh await refresh_task File "/usr/local/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 322, in _get_instance_data metadata, ephemeral_cert = await asyncio.gather(metadata_task, ephemeral_task) File "/usr/local/lib/python3.7/site-packages/google/cloud/sql/connector/refresh_utils.py", line 88, in _get_metadata resp = await client_session.get(url, headers=headers, raise_for_status=True) File "/usr/local/lib/python3.7/site-packages/aiohttp/client.py", line 625, in _request resp.raise_for_status() File "/usr/local/lib/python3.7/site-packages/aiohttp/client_reqrep.py", line 1005, in raise_for_status headers=self.headers, RuntimeError: aiohttp.client_exceptions.ClientResponseError: 429, message='Too Many Requests', url=URL('https://www.googleapis.com/sql/v1beta4/projects/<project-id>/instances/<instance-name>')

How to reproduce

When trying to write to a Cloud SQL (MySQL) database. Our use case involves writing a few hundred thousand records, but we do it in batches.

Environment

  1. OS type and version: Linux
  2. Python version: 3.8
  3. Connector version: 0.2.1

connector.connect() hangs on exit

Repro:

python3 -m venv cloudsql_env
source cloudsql_env/bin/activate
pip install git+https://github.com/GoogleCloudPlatform/cloud-sql-python-connector@41e317ecdf6131d933ee455cd68fc4006aac8584

ipython
> from google.cloud.sql.connector import connector
> conn = connector.connect(instance_connection_string=<instance>, driver='pymysql', user=<user>, password=<password>, db=<db>)
> exit
[hangs]

This is hanging because the _loop.run_forever thread created on connector.py#L33 is still running in the background. I was able to fix this by setting it as a daemon thread, so that it gets killed when the program exits:

_thread = Thread(target=_loop.run_forever, daemon=True)

@crwilcox I believe this was Ryan's intern project, but I'd really appreciate if this could be fixed, even if this is no longer actively maintained. Would you mind making the change? I've confirmed that with them, I can make a connection using this library. (FWIW I believe the creator of the previous issue was not affected because Dataflow might kill the process on its own.)

Add timeout in InstanceConnectionManager.connect

Currently InstanceConnectionManager.connect blocks and waits for the result of self._current_instance_data before moving on to connect using the OpenSSL SSLContext object. connect needs some way to time out.
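
One possible approach, sketched here with illustrative names: bound the wait on the current instance data with asyncio.wait_for so connect raises instead of blocking indefinitely.

import asyncio

async def _current_with_timeout(current_instance_data, timeout: float):
    # shield the shared refresh future so a timeout here does not cancel it,
    # and raise asyncio.TimeoutError if instance data is not ready in time
    return await asyncio.wait_for(asyncio.shield(current_instance_data), timeout=timeout)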

AttributeError: 'Credentials' object has no attribute 'with_scopes'

Reproduce:

python3 -m venv cloudsql_env
source cloudsql_env/bin/activate
pip install git+https://github.com/GoogleCloudPlatform/cloud-sql-python-connector@5b02e80149864369a0b8f6bc142d55c278a6e796

ipython
> from google.cloud.sql.connector import connector
> conn = connector.connect("<my-project>:us-central1:sql-1", "pymysql", user="root", password="<my-password", db="<my-db>")

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-2-2f8a588de201> in <module>
      4     user="root",
      5     password="<>",
----> 6     db="<>")

~/workspace/scio/cloudsql_env/lib/python3.7/site-packages/google/cloud/sql/connector/connector.py in connect(instance_connection_string, driver, **kwargs)
     66 
     67     loop = _get_loop()
---> 68     icm = InstanceConnectionManager(instance_connection_string, loop)
     69     return icm.connect(driver, user=kwargs.pop("user"), **kwargs)

~/workspace/scio/cloudsql_env/lib/python3.7/site-packages/google/cloud/sql/connector/InstanceConnectionManager.py in __init__(self, instance_connection_string, loop)
    152 
    153         self._loop = loop
--> 154         self._auth_init()
    155         self._priv_key, pub_key = generate_keys()
    156         self._pub_key = pub_key.decode("UTF-8")

~/workspace/scio/cloudsql_env/lib/python3.7/site-packages/google/cloud/sql/connector/InstanceConnectionManager.py in _auth_init(self)
    373 
    374         credentials, project = google.auth.default()
--> 375         scoped_credentials = credentials.with_scopes(
    376             [
    377                 "https://www.googleapis.com/auth/sqlservice.admin",

AttributeError: 'Credentials' object has no attribute 'with_scopes'

Clean up perform_refresh and schedule_refresh methods

These methods are a bit confusing - see if we can make them a bit cleaner now that we aren't using locking on the backend.

  • Update _perform_refresh to return an InstanceMetadata object instead of a task to eliminate use of nested tasks
  • Move static methods to their own files

SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:852)'))

pymysql.err.OperationalError: (2003, "Can't connect to MySQL server on '35.226.49.18' (timed out)")
The above exception was the direct cause of the following exception:

Is there any way to avoid this timeout issue?

def getconn() -> pymysql.connections.Connection:
    conn: pymysql.connections.Connection = connector.connect(
        connection_name,
        "pymysql",
        user=db_user,
        password=db_pass,
        db=db_name,
        port=3306,
    )
    return conn

pool = sqlalchemy.create_engine(
    # Equivalent URL:
    # mysql+pymysql://<db_user>:<db_pass>@<db_host>:<db_port>/<db_name>
    "mysql+pymysql://",
    creator=getconn,
    **db_config
)

'Connection' object has no attribute 'cursor'

Bug Description

I'm not sure exactly what is going wrong. I can use pg8000 for inserting data but selecting data doesn't seem to work. Maybe a better error could be added if this is an authentication or configuration problem.

Example code (or command)

def init_connection_engine(
    cloudsql_prn, cloudsql_connstring
) -> sqlalchemy.engine.Engine:
    def get_dbname(url):
        return re.match(".*dbname='(.*?)'", url).groups()[0]

    def get_user(url):
        return re.match(".*user='(.*?)'", url).groups()[0]

    def get_password(url):
        return re.match(".*password='(.*?)'", url).groups()[0]

    def getconn() -> pg8000.dbapi.Connection:
        return connector.connect(
            instance_connection_string=cloudsql_prn,
            driver="pg8000",
            user=get_user(cloudsql_connstring),
            password=get_password(cloudsql_connstring),
            db=get_dbname(cloudsql_connstring),
        )

    engine = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)
    engine.dialect.description_encoding = None
    return engine

metadata_dev = init_connection_engine(
    os.getenv("CLOUDSQL_METADATA_DEV_PRN"),
    os.getenv("CLOUDSQL_METADATA_DEV_CONNSTRING"),
)

def fetchall_dict(engine, *args):
    inspect(engine)
    with engine.connect() as conn:
        try:
            inspect(conn)
            cur = conn.cursor()
            log.debug(*args)
            cur.execute(*args)

            rows = cur.fetchall()
            keys = [k[0] for k in cur.description]
            return [dict(zip(keys, row)) for row in rows]

        except Exception as e:
            log.info(*args)
            log.error(format_exc())
            raise Exception(e)

fetchall_dict(metadata_dev, "SELECT 1")

Stacktrace

[14:23:15] DEBUG    Using selector: EpollSelector                                                                                                                                               selector_events.py:59
           DEBUG    Checking None for explicit credentials as part of auth process...                                                                                                                 _default.py:206
           DEBUG    Checking Cloud SDK credentials as part of auth process...                                                                                                                         _default.py:181
[14:23:17] DEBUG    Updating instance data                                                                                                                                         instance_connection_manager.py:260
           DEBUG    Creating context                                                                                                                                               instance_connection_manager.py:303
           DEBUG    Making request: POST https://oauth2.googleapis.com/token                                                                                                                          requests.py:182
           DEBUG    Starting new HTTPS connection (1): oauth2.googleapis.com:443                                                                                                                connectionpool.py:971
           DEBUG    https://oauth2.googleapis.com:443 "POST /token HTTP/1.1" 200 None                                                                                                           connectionpool.py:452
           DEBUG    Requesting metadata                                                                                                                                                           refresh_utils.py:86
           DEBUG    Requesting ephemeral certificate                                                                                                                                             refresh_utils.py:135
           DEBUG    Entered connect method                                                                                                                                         instance_connection_manager.py:515
[14:23:18] DEBUG    Entering sleep                                                                                                                                                 instance_connection_manager.py:464
╭──────────────────────────────── <class 'sqlalchemy.engine.base.Connection'> ─────────────────────────────────╮
│ Provides high-level functionality for a wrapped DB-API connection.                                           │
│                                                                                                              │
│ ╭──────────────────────────────────────────────────────────────────────────────────────────────────────────╮ │
│ │ <sqlalchemy.engine.base.Connection object at 0x7ffbcf37acd0>                                             │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                                              │
│                   closed = False                                                                             │
│               connection = <sqlalchemy.pool.base._ConnectionFairy object at 0x7ffb9cb8bfa0>                  │
│  default_isolation_level = 'READ COMMITTED'                                                                  │
│                  dialect = <sqlalchemy.dialects.postgresql.pg8000.PGDialect_pg8000 object at 0x7ffbcaeecd30> │
│                 dispatch = <sqlalchemy.event.base.JoinedConnectionEventsDispatch object at 0x7ffb9cb73c70>   │
│                   engine = Engine(postgresql+pg8000://)                                                      │
│                     info = {}                                                                                │
│              invalidated = False                                                                             │
│ should_close_with_result = False                                                                             │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
[14:23:19] ERROR    Traceback (most recent call last):                                                                                                                                             pg_metadata.py:146
                      File "pg_metadata.py", line 136, in fetchall_dict                                                                                      
                        cur = conn.cursor()                                                                                                                                                                          
                    AttributeError: 'Connection' object has no attribute 'cursor'                                                                                                                                    
                                                                                                                                                                                                                     
Traceback (most recent call last):
  File "pg_metadata.py", line 136, in fetchall_dict
    cur = conn.cursor()
AttributeError: 'Connection' object has no attribute 'cursor'

Environment

  1. OS type and version: Fedora 35
  2. Python version: Python 3.9.9
  3. Connector version: 0.4.3

Adding support for service account credentials

Feature Description

Adding an optional credentials parameter to connector.Connect() so we can specify our own service account credentials.
It would be passed in as google.oauth2.service_account:

  • service_account.Credentials.from_service_account_info() or
  • service_account.Credentials.from_service_account_file().

If not specified, the default credentials would apply as implemented now.

Action Required: Fix Renovate Configuration

There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved.

Error type: Cannot find preset's package (github>whitesource/merge-confidence:beta)

Setting up connector - Credentials

Hi. I'm trying to set up a connection from my virtual machine instance in Google Cloud to a MySQL instance, also in Google Cloud but on a different account.

I have installed the google.cloud.sql.connector library. When I run:

connector.connect(
"your:connection:string:",
"pymysql",
user="root",
password="shhh",
db="your-db-name"
)

with my correct details, I get the following error:

AttributeError: 'Credentials' object has no attribute 'with_scopes'

Any ideas?
I'm trying to run the connector in Python. I have

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Open

These updates have all been created already. Click a checkbox below to force a retry/rebase of any.

Detected dependencies

github-actions
.github/workflows/codeql.yml
  • actions/checkout v4.1.7@692973e3d937129bcbf40652eb9f2f61becf3332
  • github/codeql-action v3.25.11@b611370bb5703a7efb587f9d136a52ea24c5c38c
  • github/codeql-action v3.25.11@b611370bb5703a7efb587f9d136a52ea24c5c38c
  • github/codeql-action v3.25.11@b611370bb5703a7efb587f9d136a52ea24c5c38c
.github/workflows/coverage.yml
  • actions/github-script v7.0.1@60a0d83039c74a4aee543508d2ffcb1c3799cdea
  • actions/setup-python v5.1.1@39cd14951b08e74b54015e9e001cdefcf80e669f
  • actions/checkout v4.1.7@692973e3d937129bcbf40652eb9f2f61becf3332
  • actions/checkout v4.1.7@692973e3d937129bcbf40652eb9f2f61becf3332
.github/workflows/labels.yaml
  • actions/checkout v4.1.7@692973e3d937129bcbf40652eb9f2f61becf3332
  • micnncim/action-label-syncer v1.3.0@3abd5ab72fda571e69fffd97bd4e0033dd5f495c
.github/workflows/lint.yml
  • actions/github-script v7.0.1@60a0d83039c74a4aee543508d2ffcb1c3799cdea
  • actions/setup-python v5.1.1@39cd14951b08e74b54015e9e001cdefcf80e669f
  • actions/checkout v4.1.7@692973e3d937129bcbf40652eb9f2f61becf3332
.github/workflows/scorecard.yml
  • actions/checkout v4.1.7@692973e3d937129bcbf40652eb9f2f61becf3332
  • ossf/scorecard-action v2.3.3@dc50aa9510b46c811795eb24b2f1ba02a914e534
  • actions/upload-artifact v4.3.4@0b2256b8c012f0828dc542b3febcab082c67f72b
  • github/codeql-action v3.25.11@b611370bb5703a7efb587f9d136a52ea24c5c38c
.github/workflows/tests.yml
  • actions/github-script v7.0.1@60a0d83039c74a4aee543508d2ffcb1c3799cdea
  • actions/checkout v4.1.7@692973e3d937129bcbf40652eb9f2f61becf3332
  • actions/setup-python v5.1.1@39cd14951b08e74b54015e9e001cdefcf80e669f
  • google-github-actions/auth v2.1.3@71fee32a0bb7e97b4d33d548e7d957010649d8fa
  • google-github-actions/get-secretmanager-secrets v2.1.3@dc4a1392bad0fd60aee00bb2097e30ef07a1caae
  • actions/github-script v7.0.1@60a0d83039c74a4aee543508d2ffcb1c3799cdea
  • actions/checkout v4.1.7@692973e3d937129bcbf40652eb9f2f61becf3332
  • actions/setup-python v5.1.1@39cd14951b08e74b54015e9e001cdefcf80e669f
  • google-github-actions/auth v2.1.3@71fee32a0bb7e97b4d33d548e7d957010649d8fa
pip_requirements
requirements-test.txt
  • pytest ==8.2.2
  • mock ==5.1.0
  • pytest-cov ==5.0.0
  • pytest-asyncio ==0.23.7
  • SQLAlchemy ==2.0.31
  • sqlalchemy-pytds ==1.0.0
  • sqlalchemy-stubs ==0.4
  • PyMySQL ==1.1.1
  • pg8000 ==1.31.2
  • asyncpg ==0.29.0
  • python-tds ==1.15.0
  • aioresponses ==0.7.6
  • pytest-aiohttp ==1.0.5
requirements.txt
  • aiohttp ==3.9.5
  • cryptography ==42.0.8
  • Requests ==2.32.3
  • google-auth ==2.32.0
pip_setup
setup.py
  • cryptography >=42.0.0
  • google-auth >=2.28.0
  • PyMySQL >=1.1.0
  • pg8000 >=1.31.1
  • python-tds >=1.15.0
  • asyncpg >=0.29.0

  • Check this box to trigger a request for Renovate to run again on this repository

Handle deletion of ICMs asynchronously

Currently, the __del__ method for ICMs forces synchronous deletion. Instead, we should implement a coroutine for task cancellation and deletion in the ICM class and call that on all ICMs at the same time.
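
A minimal sketch of this direction, using a hypothetical close_async coroutine on each ICM:

import asyncio

async def close_all_icms(icms):
    # cancel each manager's background tasks and await their cleanup concurrently
    await asyncio.gather(*(icm.close_async() for icm in icms))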

Change 'ip_types' Connector parameter to 'ip_type'.

The Connector object and connect method currently have an ip_types parameter. This parameter should be changed to ip_type to align with documentation and add clarity to the parameter's functionality.

Add deprecation warning for ip_types as it will be removed in a future release.
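
A hedged sketch of how the deprecation shim might look (the helper below is illustrative, not the connector's actual code):

import warnings

def _resolve_ip_type(ip_type=None, ip_types=None):
    # accept the legacy 'ip_types' argument but steer callers toward 'ip_type'
    if ip_types is not None:
        warnings.warn(
            "'ip_types' is deprecated and will be removed in a future release; "
            "use 'ip_type' instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        return ip_type if ip_type is not None else ip_types
    return ip_type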

Creating connection fails with "There is no current event loop in thread 'ThreadPoolExecutor-0_0'"

Bug Description

When creating connection using the google.cloud.sql.connector, I run into the following error. "There is no current event loop in thread 'ThreadPoolExecutor-0_0'". This was not the case 3 days ago before the 0.5.0 version was released. I was using 0.4.1, but that broke as well.

Example code (or command)

from google.cloud.sql.connector import connector

connector.connect(instance_connection_string=instance_connection_string, driver="pg8000", user=username, password=password, db=db_name)

Stacktrace

File "/layers/google.python.pip/pip/lib/python3.8/site-packages/leaguestock_player_index_processor/CloudSqlUtility.py", line 14, in __init__
    self.connection = connector.connect(instance_connection_string=instance_connection_string,
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/sql/connector/connector.py", line 176, in connect
    return _default_connector.connect(instance_connection_string, driver, **kwargs)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/sql/connector/connector.py", line 116, in connect
    icm = InstanceConnectionManager(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 277, in __init__
    self._refresh_rate_limiter = AsyncRateLimiter(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/sql/connector/rate_limiter.py", line 49, in __init__
    self._lock = asyncio.Lock()
  File "/opt/python3.8/lib/python3.8/asyncio/locks.py", line 164, in __init__
    self._loop = events.get_event_loop()
  File "/opt/python3.8/lib/python3.8/asyncio/events.py", line 639, in get_event_loop
    raise RuntimeError('There is no current event loop in thread %r.'
RuntimeError: There is no current event loop in thread 'ThreadPoolExecutor-0_0'."

How to reproduce

  1. Call connector.connect with appropriate arguments

Environment

  1. OS type and version: This runs from gcp cloud functions with python38 runtime
  2. Python version: 3.8.2
  3. Connector version: 0.5.0 AND 0.4.1 (0.4.1. was working until 0.5.0 was released)

Move static functions into private module functions

We have a couple of static methods being used in the ICM class that would probably be better as standalone private functions. It should be possible to pull the refresh functionality out into its own file. Examples:

get_metadata:

@staticmethod
async def _get_metadata(
    client_session: aiohttp.ClientSession,
    credentials: Credentials,
    project: str,
    instance: str,
) -> Dict[str, Union[Dict, str]]:

get_ephemeral:

@staticmethod
async def _get_ephemeral(
    client_session: aiohttp.ClientSession,
    credentials: Credentials,
    project: str,
    instance: str,
    pub_key: str,
) -> str:

Update unit tests to run on mock instances

Currently the tests in /unit run on a real instance. We should probably update those to use a mock instance, and let the system/e2e tests run against real instances instead.

Update tests with Connector object.

SSL Issue "certificate verify failed: unable to get local issuer certificate "

Bug Description

I'm trying to setup the library to connect to a postgresql instance. This is the code to connect:

from google.cloud.sql.connector import connector

def get_conn():
      conn = connector.connect(
          "<instance_name>",
          "pg8000",
          user=PG_USER,
          password=PG_PASS,
          db=PG_DB,
      ) # type: Connection
...

I have run the below to authenticate

gcloud auth application-default login

I run into an error with SSL certification failing.

Stack trace

An error occurred while performing refresh.Scheduling another refresh attempt immediately
Traceback (most recent call last):
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 969, in _wrap_create_connection
    return await self._loop.create_connection(*args, **kwargs)  # type: ignore  # noqa
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py", line 989, in create_connection
    ssl_handshake_timeout=ssl_handshake_timeout)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py", line 1017, in _create_connection_transport
    await waiter
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/sslproto.py", line 530, in data_received
    ssldata, appdata = self._sslpipe.feed_ssldata(data)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/sslproto.py", line 189, in feed_ssldata
    self._sslobj.do_handshake()
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/ssl.py", line 774, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1091)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 424, in _perform_refresh
    await refresh_task
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 322, in _get_instance_data
    metadata, ephemeral_cert = await asyncio.gather(metadata_task, ephemeral_task)
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/refresh_utils.py", line 169, in _get_ephemeral
    url, headers=headers, json=data, raise_for_status=True
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/client.py", line 521, in _request
    req, traces=traces, timeout=real_timeout
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 535, in connect
    proto = await self._create_connection(req, traces, timeout)
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 892, in _create_connection
    _, proto = await self._create_direct_connection(req, traces, timeout)
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 1051, in _create_direct_connection
    raise last_exc
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 1032, in _create_direct_connection
    client_error=client_error,
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 971, in _wrap_create_connection
    raise ClientConnectorCertificateError(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorCertificateError: Cannot connect to host www.googleapis.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1091)')]
Traceback (most recent call last):
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 969, in _wrap_create_connection
    return await self._loop.create_connection(*args, **kwargs)  # type: ignore  # noqa
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py", line 989, in create_connection
    ssl_handshake_timeout=ssl_handshake_timeout)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py", line 1017, in _create_connection_transport
    await waiter
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/sslproto.py", line 530, in data_received
    ssldata, appdata = self._sslpipe.feed_ssldata(data)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/sslproto.py", line 189, in feed_ssldata
    self._sslobj.do_handshake()
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/ssl.py", line 774, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1091)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/tests/sanity/gcloud_postgre.py", line 4, in test
    conn = create_postgre_connection()
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/src/init.py", line 60, in create_postgre_connection
    db=PG_DB,
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/connector.py", line 123, in connect
    raise (e)
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 433, in _perform_refresh
    instance_data = await self._current
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/connector.py", line 119, in connect
    return icm.connect(driver, ip_types, timeout, **kwargs)
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 495, in connect
    connection = connect_future.result(timeout)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/concurrent/futures/_base.py", line 435, in result
    return self.__get_result()
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result
    raise self._exception
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 531, in _connect
    instance_data = await self._current
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 462, in _schedule_refresh
    delay = await self.seconds_until_refresh()
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 390, in seconds_until_refresh
    expiration = (await self._current).expiration
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/instance_connection_manager.py", line 322, in _get_instance_data
    metadata, ephemeral_cert = await asyncio.gather(metadata_task, ephemeral_task)
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/google/cloud/sql/connector/refresh_utils.py", line 169, in _get_ephemeral
    url, headers=headers, json=data, raise_for_status=True
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/client.py", line 521, in _request
    req, traces=traces, timeout=real_timeout
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 535, in connect
    proto = await self._create_connection(req, traces, timeout)
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 892, in _create_connection
    _, proto = await self._create_direct_connection(req, traces, timeout)
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 1051, in _create_direct_connection
    raise last_exc
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 1032, in _create_direct_connection
    client_error=client_error,
  File "/Users/farzadsharifbakhtiar/projects/project/project-AI/.venv/lib/python3.7/site-packages/aiohttp/connector.py", line 971, in _wrap_create_connection
    raise ClientConnectorCertificateError(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorCertificateError: Cannot connect to host www.googleapis.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1091)')]

Environment

  1. OS type and version: macOS 10.15.7
  2. Python version: 3.7.9
  3. Connector version: pg8000==1.21.1, cloud-sql-python-connector==0.4.0

Add tests and verify compatibility with Windows

Python has different behaviors for Windows, with the major two being:

  1. NamedTemporaryFile objects cannot be reopened on Windows (Python docs).

  2. Python uses the ProactorEventLoop instead of the SelectorEventLoop (Python docs). We might want to run some tests on Windows to ensure nothing breaks.

Add support for Python 3.10

Python 3.10 was released in October 2021. Certain async event loop properties have been deprecated in this new release. Because of this, the Cloud SQL Python Connector has a few pieces of code that need to be cleaned up and the deprecated functionality removed.

Example error with Python 3.10:
TypeError: As of 3.10, the *loop* parameter was removed from Lock() since it is no longer necessary
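
For illustration only (a sketch, not the connector's code), the fix direction on Python 3.10+ is to create asyncio primitives while an event loop is already running instead of passing a loop argument:

import asyncio

async def make_lock() -> asyncio.Lock:
    # asyncio.Lock() no longer accepts a loop argument; it binds to the
    # running event loop when created inside a coroutine
    return asyncio.Lock()

lock = asyncio.run(make_lock())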
