### Running All Unit Tests with Poetry and Pytest

Source: https://github.com/databricks/databricks-sql-python/blob/main/CONTRIBUTING.md

This command executes all unit tests for the `databricks-sql-connector` project. Unit tests do not require a Databricks account and are run using Pytest via Poetry's isolated environment.

```bash
poetry run python -m pytest tests/unit
```

--------------------------------

### Rendered SQL for pyformat Paramstyle Example

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

This SQL snippet shows the result of the `pyformat` parameter substitution from the corresponding Python example. The parameters `value1` and `value2` are inlined directly into the SQL query as literal values, demonstrating the effect of `use_inline_params=True`.

```sql
SELECT field FROM table WHERE field = 'foo' AND another_field = 20
```

--------------------------------

### Executing PySQLCoreTestSuite End-to-End Tests (Bash)

Source: https://github.com/databricks/databricks-sql-python/blob/main/CONTRIBUTING.md

This command executes the `PySQLCoreTestSuite` using `pytest` via `poetry run`. This suite covers the basic features and behaviors of the Databricks SQL Python connector and is the default location for new tests.

```bash
poetry run python -m pytest tests/e2e/driver_tests.py::PySQLCoreTestSuite
```

--------------------------------

### Rendered SQL for Sequence Parameter Example

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

This SQL output shows how a sequence parameter passed from Python is rendered into the SQL query. The list of values is expanded into a comma-separated list enclosed in parentheses, suitable for an `IN` clause.
```sql
SELECT field FROM table WHERE field IN (1,2,3,4,5)
```

--------------------------------

### Connecting to Databricks SQL and Querying Data (Python)

Source: https://github.com/databricks/databricks-sql-python/blob/main/README.md

This Python example demonstrates establishing a connection to Databricks SQL using the `databricks-sql-connector`, retrieving connection details from environment variables. It shows how to create a cursor, execute a parameterized SQL query, fetch and iterate through the results, and ensure proper resource cleanup by closing the cursor and connection.

```python
import os
from databricks import sql

host = os.getenv("DATABRICKS_HOST")
http_path = os.getenv("DATABRICKS_HTTP_PATH")

connection = sql.connect(server_hostname=host, http_path=http_path)
cursor = connection.cursor()

cursor.execute('SELECT :param `p`, * FROM RANGE(10)', {"param": "foo"})
result = cursor.fetchall()
for row in result:
    print(row)

cursor.close()
connection.close()
```

--------------------------------

### Running a Specific Unit Test File

Source: https://github.com/databricks/databricks-sql-python/blob/main/CONTRIBUTING.md

This command allows developers to run tests contained within a single specified unit test file, such as `tests.py`, using Pytest and Poetry. This is useful for focused testing during development, targeting only relevant changes.

```bash
poetry run python -m pytest tests/unit/tests.py
```

--------------------------------

### Checking Code Formatting with Black (Bash)

Source: https://github.com/databricks/databricks-sql-python/blob/main/CONTRIBUTING.md

This command uses Black, a Python code formatter, to check for formatting issues in the `src` directory. The `--check` flag ensures that no files are modified; non-compliant files are reported instead.
```bash
poetry run python3 -m black src --check
```

--------------------------------

### Explicit Type Inference with TDbsqlParameter in Python

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

Illustrates how to explicitly define parameter types using `StringParameter` and `IntegerParameter` from `databricks.sql.parameters`. It provides examples for both `named` and `?` (positional) paramstyles, emphasizing that `TDbsqlParameter` objects must be passed within a list.

```python
from databricks import sql
from databricks.sql.parameters import StringParameter, IntegerParameter

# with `named` markers
with sql.connect(...) as conn:
    with conn.cursor() as cursor:
        query = "SELECT field FROM table WHERE field = :value1 AND another_field = :value2"
        parameters = [
            StringParameter(name="value1", value="foo"),
            IntegerParameter(name="value2", value=20),
        ]
        result = cursor.execute(query, parameters=parameters).fetchone()

# with `?` markers
with sql.connect(...) as conn:
    with conn.cursor() as cursor:
        query = "SELECT field FROM table WHERE field = ? AND another_field = ?"
        parameters = [
            StringParameter(value="foo"),
            IntegerParameter(value=20),
        ]
        result = cursor.execute(query, parameters=parameters).fetchone()
```

--------------------------------

### Running a Specific Unit Test Method

Source: https://github.com/databricks/databricks-sql-python/blob/main/CONTRIBUTING.md

This command targets and executes a single test method within a specific test class and file. It's used for highly granular testing, enabling developers to quickly verify changes to a particular function or behavior without running the entire test suite.
```bash
poetry run python -m pytest tests/unit/tests.py::ClientTestSuite::test_closing_connection_closes_commands
```

--------------------------------

### Using %s Positional Paramstyle with Databricks SQL Python Connector

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

This example illustrates the use of the `%s` paramstyle, where parameters are passed as a list. While functional with `use_inline_params=True`, this method is not PEP-249 compliant and can cause issues with SQL `LIKE` clauses due to syntax overlap.

```python
from databricks import sql

with sql.connect(..., use_inline_params=True) as conn:
    with conn.cursor() as cursor:
        query = "SELECT field FROM table WHERE field = %s AND another_field = %s"
        parameters = ["foo", 20]
        result = cursor.execute(query, parameters=parameters).fetchone()
```

--------------------------------

### Configuring Databricks SQL Connection in test.env File

Source: https://github.com/databricks/databricks-sql-python/blob/main/CONTRIBUTING.md

Alternatively, connection details for the Databricks SQL endpoint can be stored in a `test.env` file at the repository root. This file contains the host, HTTP path, access token, and a staging ingestion user for test execution.

```plaintext
host="****.cloud.databricks.com"
http_path="/sql/1.0/warehouses/***"
access_token="dapi***"
staging_ingestion_user="***@example.com"
```

--------------------------------

### Configuring Databricks SQL Connection Environment Variables (Bash)

Source: https://github.com/databricks/databricks-sql-python/blob/main/CONTRIBUTING.md

This snippet shows how to set environment variables for connecting to a Databricks SQL endpoint. These variables are crucial for running end-to-end tests, specifying the host, HTTP path, access token, catalog, and schema.
```bash
export host=""
export http_path=""
export access_token=""
export catalog=""
export schema=""
```

--------------------------------

### Signing Git Commits

Source: https://github.com/databricks/databricks-sql-python/blob/main/CONTRIBUTING.md

This snippet illustrates the required sign-off line to be added to every Git commit message when contributing to the project. It certifies that the contributor has the right to submit the patch under an open-source license, and contributors must use their real name.

```git
Signed-off-by: Joe Smith
```

--------------------------------

### Executing Query with Qmark Parameters - Python

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

Illustrates executing a SQL query using the `qmark` paramstyle in Python with the `databricks-sql-connector`. Parameters are passed as a list, where the order of values corresponds to the order of `qmark` variables in the SQL query. The list length must match the variable marker count.

```python
from databricks import sql

with sql.connect(...) as conn:
    with conn.cursor() as cursor:
        query = "SELECT field FROM table WHERE field = ? AND another_field = ?"
        parameters = ["foo", 20]
        result = cursor.execute(query, parameters=parameters).fetchone()
```

--------------------------------

### SQL Syntax for Native Parameters - Multiple Paramstyles

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

Illustrates the three supported PEP-249 paramstyles for SQL queries: `named` (e.g., `:value`), `qmark` (e.g., `?`), and `pyformat` (e.g., `%(value)s`). A query must use exactly one paramstyle. The `pyformat` style is legacy and will be deprecated.

```sql
-- named paramstyle
SELECT * FROM table WHERE field = :value

-- qmark paramstyle
SELECT * FROM table WHERE field = ?
```
```sql
-- pyformat paramstyle (legacy)
SELECT * FROM table WHERE field = %(value)s
```

--------------------------------

### Executing Query with Named Parameters - Python

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

Demonstrates how to execute a SQL query using the `named` paramstyle in Python with the `databricks-sql-connector`. Parameters are passed as a dictionary where keys match the variable markers in the SQL query. The number of parameters must exactly match the variable markers.

```python
from databricks import sql

with sql.connect(...) as conn:
    with conn.cursor() as cursor:
        query = "SELECT field FROM table WHERE field = :value1 AND another_field = :value2"
        parameters = {"value1": "foo", "value2": 20}
        result = cursor.execute(query, parameters=parameters).fetchone()
```

--------------------------------

### Using pyformat Paramstyle with Databricks SQL Python Connector

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

This snippet demonstrates how to use the `pyformat` paramstyle, which requires parameters to be passed as a dictionary. This method is compliant with PEP-249 and utilizes `use_inline_params=True` for parameter substitution directly into the query string.

```python
from databricks import sql

with sql.connect(..., use_inline_params=True) as conn:
    with conn.cursor() as cursor:
        query = "SELECT field FROM table WHERE field = %(value1)s AND another_field = %(value2)s"
        parameters = {"value1": "foo", "value2": 20}
        result = cursor.execute(query, parameters=parameters).fetchone()
```

--------------------------------

### Rewriting SQL Queries from pyformat to Named Paramstyle

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

Demonstrates how SQL queries written with `pyformat` style parameters (e.g., `%(param)s`) are rewritten to the `named` paramstyle (e.g., `:param`) by the Databricks SQL connector when `use_inline_params=False`.
This feature is for backward compatibility and will be deprecated.

```sql
-- a query written for databricks-sql-connector==2.9.3 and below
SELECT field1, field2, %(param1)s FROM table WHERE field4 = %(param2)s

-- rewritten for databricks-sql-connector==3.0.0 and above
SELECT field1, field2, :param1 FROM table WHERE field4 = :param2
```

--------------------------------

### Setting Databricks Connection Environment Variables (Bash)

Source: https://github.com/databricks/databricks-sql-python/blob/main/README.md

This snippet illustrates how to set the `DATABRICKS_HOST` and `DATABRICKS_HTTP_PATH` environment variables in a Bash shell. These variables are crucial for configuring the connection to a Databricks workspace or SQL endpoint, allowing the Python connector to locate the target Databricks instance.

```bash
export DATABRICKS_HOST=********.databricks.com
export DATABRICKS_HTTP_PATH=/sql/1.0/endpoints/****************
```

--------------------------------

### SQL Syntax for Deprecated Inline Parameters

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

Shows the SQL syntax for using `pyformat` (`%(param)s`) and positional (`%s`) markers when the deprecated `use_inline_params=True` setting is enabled. This method is not SQL injection safe and will be removed in future releases.

```sql
-- pyformat paramstyle is used for named parameters
SELECT * FROM table WHERE field = %(value)s

-- %s is used for positional parameters
SELECT * FROM table WHERE field = %s
```

--------------------------------

### Passing Sequence Parameters for IN Clauses in Databricks SQL Python

Source: https://github.com/databricks/databricks-sql-python/blob/main/docs/parameters.md

This snippet demonstrates passing a sequence (list) as a parameter value, commonly used for `WHERE ... IN` clauses. This behavior is specific to the inline renderer and is not specified by PEP-249, requiring `use_inline_params=True`.
```python
from databricks import sql

with sql.connect(..., use_inline_params=True) as conn:
    with conn.cursor() as cursor:
        query = "SELECT field FROM table WHERE field IN %(value_list)s"
        parameters = {"value_list": [1, 2, 3, 4, 5]}
        result = cursor.execute(query, parameters=parameters).fetchone()
```
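
--------------------------------

The rendered-SQL examples above (for the `pyformat` and sequence parameters) can be reproduced with a simplified sketch of inline rendering. This is an illustration only, not the connector's actual escaping logic, and the helper name `render_inline` is hypothetical:

```python
def render_inline(query: str, parameters: dict) -> str:
    """Simplified sketch of pyformat-style inline rendering.

    NOT the connector's real implementation: strings are single-quoted
    (with embedded quotes doubled), sequences expand to a parenthesized
    comma-separated list, and other values are inlined as-is.
    """
    def render_value(value):
        if isinstance(value, str):
            return "'" + value.replace("'", "''") + "'"
        if isinstance(value, (list, tuple)):
            return "(" + ",".join(render_value(item) for item in value) + ")"
        return str(value)

    return query % {key: render_value(value) for key, value in parameters.items()}


print(render_inline(
    "SELECT field FROM table WHERE field = %(value1)s AND another_field = %(value2)s",
    {"value1": "foo", "value2": 20},
))
# SELECT field FROM table WHERE field = 'foo' AND another_field = 20

print(render_inline(
    "SELECT field FROM table WHERE field IN %(value_list)s",
    {"value_list": [1, 2, 3, 4, 5]},
))
# SELECT field FROM table WHERE field IN (1,2,3,4,5)
```

Because inline rendering ultimately builds the SQL string by substitution, it is not SQL injection safe — which is why the connector's native parameters (`use_inline_params=False`) are preferred.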
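
--------------------------------

The `Signed-off-by` line from the Signing Git Commits snippet does not have to be typed by hand: `git commit -s` appends it automatically from the configured `user.name` and `user.email`. A minimal sketch in a throwaway repository (paths and identity are illustrative):

```shell
# Create a throwaway repository and make a signed-off commit;
# the -s flag appends the Signed-off-by trailer automatically.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.name "Joe Smith"
git config user.email "joe@example.com"
echo "hello" > file.txt
git add file.txt
git commit -q -s -m "Add file"
git log -1 --format=%B
# The commit message ends with: Signed-off-by: Joe Smith <joe@example.com>
```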