### ⚠ BREAKING CHANGES

* add required param 'engine' to multimodal functions ([#1834](https://github.com/googleapis/python-bigquery-dataframes/issues/1834))

### Features
* Add `bpd.options.compute.maximum_result_rows` option to limit client data download ([#1829](https://github.com/googleapis/python-bigquery-dataframes/issues/1829)) ([e22a3f6](https://github.com/googleapis/python-bigquery-dataframes/commit/e22a3f61a02cc1b7a5155556e5a07a1a2fea1d82))
* Add `bpd.options.display.repr_mode = "anywidget"` to create an interactive display of the results ([#1820](https://github.com/googleapis/python-bigquery-dataframes/issues/1820)) ([be0a3cf](https://github.com/googleapis/python-bigquery-dataframes/commit/be0a3cf7711dadc68d8366ea90b99855773e2a2e))
* Add DataFrame.ai.forecast() support ([#1828](https://github.com/googleapis/python-bigquery-dataframes/issues/1828)) ([7bc7f36](https://github.com/googleapis/python-bigquery-dataframes/commit/7bc7f36fc20d233f4cf5ed688cc5dcaf100ce4fb))
* Add describe() method to Series ([#1827](https://github.com/googleapis/python-bigquery-dataframes/issues/1827)) ([a4205f8](https://github.com/googleapis/python-bigquery-dataframes/commit/a4205f882012820c034cb15d73b2768ec4ad3ac8))
* Add required param 'engine' to multimodal functions ([#1834](https://github.com/googleapis/python-bigquery-dataframes/issues/1834)) ([37666e4](https://github.com/googleapis/python-bigquery-dataframes/commit/37666e4c137d52c28ab13477dfbcc6e92b913334))
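
A minimal sketch of how some of these additions might be used together; it assumes a configured BigQuery project, and the table and column names (`my_dataset.my_table`, `total_amount`) are placeholders rather than anything shipped with the release:

```python
import bigframes.pandas as bpd

# Cap how many rows any single result download may contain (new option in this release).
bpd.options.compute.maximum_result_rows = 10_000

# Opt in to the interactive, widget-based repr (new display mode in this release).
bpd.options.display.repr_mode = "anywidget"

# Placeholder table name, for illustration only.
df = bpd.read_gbq("my_dataset.my_table")

# Series now supports describe(), mirroring the DataFrame method.
print(df["total_amount"].describe())
```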

### Performance Improvements
* Produce simpler sql ([#1836](https://github.com/googleapis/python-bigquery-dataframes/issues/1836)) ([cf9c22a](https://github.com/googleapis/python-bigquery-dataframes/commit/cf9c22a09c4e668a598fa1dad0f6a07b59bc6524))

bigframes/_config/compute_options.py (39 additions, 30 deletions)

@@ -55,29 +55,7 @@ class ComputeOptions:
         {'test2': 'abc', 'test3': False}

     Attributes:
-        maximum_bytes_billed (int, Options):
-            Limits the bytes billed for query jobs. Queries that will have
-            bytes billed beyond this limit will fail (without incurring a
-            charge). If unspecified, this will be set to your project default.
-            See `maximum_bytes_billed`: https://cloud.google.com/python/docs/reference/bigquery/latest/google.cloud.bigquery.job.QueryJobConfig#google_cloud_bigquery_job_QueryJobConfig_maximum_bytes_billed.
-
-        enable_multi_query_execution (bool, Options):
-            If enabled, large queries may be factored into multiple smaller queries
-            in order to avoid generating queries that are too complex for the query
-            engine to handle. However this comes at the cost of increase cost and latency.
-
-        extra_query_labels (Dict[str, Any], Options):
-            Stores additional custom labels for query configuration.
-
-        semantic_ops_confirmation_threshold (int, optional):
-            .. deprecated:: 1.42.0
-                Semantic operators are deprecated. Please use AI operators instead
-
-        semantic_ops_threshold_autofail (bool):
-            .. deprecated:: 1.42.0
-                Semantic operators are deprecated. Please use AI operators instead
-
-        ai_ops_confirmation_threshold (int, optional):
+        ai_ops_confirmation_threshold (int | None):
             Guards against unexpected processing of large amount of rows by semantic operators.
             If the number of rows exceeds the threshold, the user will be asked to confirm
             their operations to resume. The default value is 0. Set the value to None
@@ -87,26 +65,57 @@ class ComputeOptions:
             Guards against unexpected processing of large amount of rows by semantic operators.
             When set to True, the operation automatically fails without asking for user inputs.

-        allow_large_results (bool):
+        allow_large_results (bool | None):
             Specifies whether query results can exceed 10 GB. Defaults to False. Setting this
             to False (the default) restricts results to 10 GB for potentially faster execution;
             BigQuery will raise an error if this limit is exceeded. Setting to True removes
             this result size limit.
+
+        enable_multi_query_execution (bool | None):
+            If enabled, large queries may be factored into multiple smaller queries
+            in order to avoid generating queries that are too complex for the query
+            engine to handle. However this comes at the cost of increase cost and latency.
+
+        extra_query_labels (Dict[str, Any] | None):
+            Stores additional custom labels for query configuration.
+
+        maximum_bytes_billed (int | None):
+            Limits the bytes billed for query jobs. Queries that will have
+            bytes billed beyond this limit will fail (without incurring a
+            charge). If unspecified, this will be set to your project default.
+            See `maximum_bytes_billed`: https://cloud.google.com/python/docs/reference/bigquery/latest/google.cloud.bigquery.job.QueryJobConfig#google_cloud_bigquery_job_QueryJobConfig_maximum_bytes_billed.
+
+        maximum_result_rows (int | None):
+            Limits the number of rows in an execution result. When converting
+            a BigQuery DataFrames object to a pandas DataFrame or Series (e.g.,
+            using ``.to_pandas()``, ``.peek()``, ``.__repr__()``, direct
+            iteration), the data is downloaded from BigQuery to the client
+            machine. This option restricts the number of rows that can be
+            downloaded. If the number of rows to be downloaded exceeds this
+            limit, a ``bigframes.exceptions.MaximumResultRowsExceeded``
+            exception is raised.
+
+        semantic_ops_confirmation_threshold (int | None):
+            .. deprecated:: 1.42.0
+                Semantic operators are deprecated. Please use AI operators instead
+
+        semantic_ops_threshold_autofail (bool):
+            .. deprecated:: 1.42.0
+                Semantic operators are deprecated. Please use AI operators instead
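
As a usage-level illustration of the newly documented options, the sketch below sets `maximum_result_rows` and `maximum_bytes_billed` globally and handles the `MaximumResultRowsExceeded` exception named in the docstring; the table name is a placeholder and the import path is assumed from that docstring:

```python
import bigframes.pandas as bpd
from bigframes.exceptions import MaximumResultRowsExceeded  # path taken from the docstring above

# Fail any client-side download that would exceed 1,000 rows.
bpd.options.compute.maximum_result_rows = 1_000
# Fail any query that would bill more than ~10 GB, without incurring a charge.
bpd.options.compute.maximum_bytes_billed = 10**10

df = bpd.read_gbq("my_dataset.large_table")  # placeholder table name

try:
    local_df = df.to_pandas()  # triggers a download to the client machine
except MaximumResultRowsExceeded:
    # Fall back to a small, bounded preview instead of the full download.
    local_df = df.peek(100)
```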