Commit ae5f2db

reduce changes in docstrings
Signed-off-by: varun-edachali-dbx <varun.edachali@databricks.com>
1 parent 809b39e

1 file changed: +4 −1 lines changed

src/databricks/sql/backend/databricks_client.py

Lines changed: 4 additions & 1 deletion
@@ -96,7 +96,7 @@ def execute_command(
     max_rows: Maximum number of rows to fetch in a single fetch batch
     max_bytes: Maximum number of bytes to fetch in a single fetch batch
     lz4_compression: Whether to use LZ4 compression for result data
-    cursor: The cursor object that will handle the results
+    cursor: The cursor object that will handle the results. The command id is set in this cursor.
     use_cloud_fetch: Whether to use cloud fetch for retrieving large result sets
     parameters: List of parameters to bind to the query
     async_op: Whether to execute the command asynchronously
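For context on the reworded line ("The command id is set in this cursor"): a concrete backend implementing this abstract method is expected to record the id of the launched command on the cursor it receives. The sketch below is purely hypothetical and only illustrative of that contract; the class name, the active_command_id attribute, and the trimmed parameter list are assumptions, not the connector's actual internals.

import uuid

class InMemoryBackend:
    """Hypothetical backend sketch illustrating the documented contract."""

    def __init__(self):
        self._commands = {}

    def execute_command(self, operation, max_rows, max_bytes, lz4_compression,
                        cursor, use_cloud_fetch, parameters=None, async_op=False):
        # Launch the command and remember it.
        command_id = str(uuid.uuid4())
        self._commands[command_id] = operation
        # Per the updated docstring: the command id is set in this cursor,
        # so the caller can later cancel or poll the command through it.
        cursor.active_command_id = command_id
        # Return nothing for an async launch; otherwise a stubbed result.
        return None if async_op else [("ok",)]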
@@ -282,7 +282,9 @@ def get_tables(
     max_bytes: Maximum number of bytes to fetch in a single batch
     cursor: The cursor object that will handle the results
     catalog_name: Optional catalog name pattern to filter by
+        if catalog_name is None, we fetch across all catalogs
     schema_name: Optional schema name pattern to filter by
+        if schema_name is None, we fetch across all schemas
     table_name: Optional table name pattern to filter by
     table_types: Optional list of table types to filter by (e.g., ['TABLE', 'VIEW'])
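Not part of this commit, but a usage sketch of the documented behaviour: assuming the public connector Cursor exposes tables() with matching parameters, leaving catalog_name and schema_name unset (None) makes the backend fetch across all catalogs and schemas. The hostname, HTTP path, and token below are placeholders.

from databricks import sql

with sql.connect(server_hostname="<workspace-host>",
                 http_path="<http-path>",
                 access_token="<token>") as connection:
    with connection.cursor() as cursor:
        # catalog_name / schema_name omitted (None): per the updated
        # docstring, tables are fetched across all catalogs and schemas.
        cursor.tables(table_name="%")
        for table in cursor.fetchall():
            print(table)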
288290
@@ -321,6 +323,7 @@ def get_columns(
     catalog_name: Optional catalog name pattern to filter by
     schema_name: Optional schema name pattern to filter by
     table_name: Optional table name pattern to filter by
+        if table_name is None, we fetch across all tables
     column_name: Optional column name pattern to filter by

     Returns:
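Continuing the same sketch for get_columns: with table_name left as None, columns are fetched across all tables matching the given catalog and schema patterns ("main" and "default" below are placeholder patterns, and the columns() signature is assumed to mirror this backend method).

        # Inside the same cursor block as above:
        cursor.columns(catalog_name="main", schema_name="default")
        print(cursor.fetchall())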
