Invalid region error when accessing Dataproc with the google-cloud-python API
I am trying to create a cluster in Dataproc using the google-cloud-python library; however, when setting region = 'us-central1' I get the exception below:
google.api_core.exceptions.InvalidArgument: 400 Region 'us-central1' is invalid.
Please see https://cloud.google.com/dataproc/docs/concepts/regional-endpoints
for additional information on regional endpoints
Code (based on the example):
#!/usr/bin/python
from google.cloud import dataproc_v1
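# No transport is given here, so the client talks to the default global endpoint (dataproc.googleapis.com:443)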
client = dataproc_v1.ClusterControllerClient()
project_id = 'my-project'
region = 'us-central1'
cluster = {...}
response = client.create_cluster(project_id, region, cluster)
Dataproc uses the region field to route REST requests; however, that field is not used by the gRPC clients (hence the error).
Only the global multi-region can be accessed through the default endpoint. To use a regional endpoint such as us-central1, you have to configure the endpoint address on the client's transport.
The Dataproc regional endpoints follow the pattern <region>-dataproc.googleapis.com:443. The region field should be set to the same value as the region in the endpoint.
Example:
#!/usr/bin/python
from google.cloud import dataproc_v1
from google.cloud.dataproc_v1.gapic.transports import cluster_controller_grpc_transport
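# Build a gRPC transport that points at the regional endpoint matching the region used below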
transport = cluster_controller_grpc_transport.ClusterControllerGrpcTransport(
    address='us-central1-dataproc.googleapis.com:443')
client = dataproc_v1.ClusterControllerClient(transport)
project_id = 'my-project'
region = 'us-central1'
cluster = {...}
response = client.create_cluster(project_id, region, cluster)
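If you are on a newer release of google-cloud-dataproc (2.x or later), the gapic transports module used above no longer lives at that path; a minimal sketch of the same fix, assuming the 2.x client_options interface and request-style method arguments, looks like this:

#!/usr/bin/python
from google.cloud import dataproc_v1

# Assumes google-cloud-dataproc >= 2.0: the regional endpoint is supplied
# through client_options instead of a manually constructed transport.
client = dataproc_v1.ClusterControllerClient(
    client_options={'api_endpoint': 'us-central1-dataproc.googleapis.com:443'})

project_id = 'my-project'
region = 'us-central1'
cluster = {...}  # same cluster definition as in the question

response = client.create_cluster(
    request={'project_id': project_id, 'region': region, 'cluster': cluster})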