Google Cloud Dataflow "Cannot read and write in different locations" error (Java SDK v2.0.0)


I'm using Google Cloud Dataflow. When I execute this code:

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public static void main(String[] args) {
    String query = "SELECT * FROM [*****.*****]";
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).withValidation().create());
    PCollection<TableRow> lines = p.apply(BigQueryIO.read().fromQuery(query));
    p.run();
}

I get the following error:

(332b4f3b83bd3397): java.io.IOException: Query job beam_job_d1772eb4136d4982be55be20d173f63d_testradiateurmodegfcvsoasc07281159145481871-query failed, status: {
  "errorResult" : {
    "message" : "Cannot read and write in different locations: source: EU, destination: US",
    "reason" : "invalid"
  },
  "errors" : [ {
    "message" : "Cannot read and write in different locations: source: EU, destination: US",
    "reason" : "invalid"
  } ],
  "state" : "DONE"
}.
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryQuerySource.executeQuery(BigQueryQuerySource.java:173)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryQuerySource.getTableToExtract(BigQueryQuerySource.java:120)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase.split(BigQuerySourceBase.java:87)
    at com.google.cloud.dataflow.worker.runners.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:261)
    at com.google.cloud.dataflow.worker.runners.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:209)
    at com.google.cloud.dataflow.worker.runners.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:184)
    at com.google.cloud.dataflow.worker.runners.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:161)
    at com.google.cloud.dataflow.worker.runners.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:47)
    at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.executeWork(DataflowWorker.java:341)
    at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:297)
    at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
    at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:125)
    at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:105)
    at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:92)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

I read these posts: 37298504, 42135002, and https://github.com/GoogleCloudPlatform/DataflowJavaSDK/issues/405, but none of the solutions worked for me.

Some more information:

  • The BigQuery table is located in the EU.
  • I tried starting the job with --zone=europe-west1-b and --region=europe-west1-b (the programmatic equivalent is sketched after this list).
  • I am using DataflowRunner.
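
For reference, here is a minimal sketch of setting those same options in code instead of via flags, assuming Beam 2.0.0's DataflowPipelineOptions; "my-eu-bucket" is a placeholder and would have to be a bucket actually created in the EU:

import org.apache.beam.runners.dataflow.DataflowRunner;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class EuPipeline {
    public static void main(String[] args) {
        // Pin the job and its temp storage to the EU so nothing defaults to the US.
        DataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setZone("europe-west1-b");                 // worker zone
        options.setTempLocation("gs://my-eu-bucket/temp"); // placeholder; must be an EU bucket
        Pipeline p = Pipeline.create(options);
        p.apply(BigQueryIO.read().fromQuery("SELECT * FROM [*****.*****]"));
        p.run();
    }
}

As far as I can tell, this does not avoid the error either, because the temporary dataset that BigQueryIO creates to hold the query results still ends up in the US.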

When I go to the BigQuery web UI, I can see these temporary datasets.
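
To clean those up, the bq command-line tool can list and delete them; a sketch, where my-project and temporary_dataset_id are placeholders for what actually shows up in the UI:

bq ls --project_id=my-project
bq rm -r -f -d my-project:temporary_dataset_id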

Edit: I solved the problem by using version 1.9.0 of the Dataflow SDK.
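
For anyone wanting to do the same, and assuming a Maven build, pinning the pre-Beam SDK looks like this. Note that the 1.x SDK lives under the com.google.cloud.dataflow.sdk packages rather than org.apache.beam, so the imports change accordingly.

<!-- Dataflow SDK 1.9.0 (pre-Beam); packages are com.google.cloud.dataflow.sdk.* -->
<dependency>
  <groupId>com.google.cloud.dataflow</groupId>
  <artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
  <version>1.9.0</version>
</dependency>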

