0 votes
0 answers
32 views

I have 2 existing dataflows in a Dataverse environment, dataflow1 and dataflow2. I have created a 3rd dataflow (dataflow3) and I want to get data from dataflow1 and dataflow2 into dataflow3. I have ...
grasshopper
0 votes
1 answer
92 views

In the latest Apache Beam 2.68.0, they have changed the behavior of Coders for non-primitive objects (see the changelog here). Therefore, I get a warning like this on GCP Dataflow: "Using ...
Praneeth Peiris
1 vote
0 answers
35 views

I have a table in Power BI which has a calculated column based on Azure Table and Dataverse. (Dataverse is used for a write-back feature using Power Automate, as a few of the column values get changed.) Table ...
A_B • 68
0 votes
1 answer
104 views

I have created a Data Fusion instance, enabled the needed APIs, and assigned the needed permissions. I ran my pipelines several times with success status and have lineage in the UI. Then I try to get lineage using REST ...
Sergey Shabalov
0 votes
0 answers
52 views

I am using Airflow 2.10.5 to trigger Dataflow using BeamRunJavaPipelineOperator. Here I get logs saying the Dataflow job was submitted, and the Dataflow ID is 2025-09-18_01_16_18-13565731205394440306 ....
Rex Ubald
0 votes
1 answer
90 views

I am trying to build a Python-based Apache Beam pipeline which is going to read from Kafka. Kafka requires Truststore and Keystore JKS file-based authentication. kafka_consumer_config = { "...
Bhargav Velisetti
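The truncated kafka_consumer_config above can be sketched as a plain dict of standard Kafka client SSL properties. All broker addresses, file paths, and passwords below are placeholders, not values from the question:

```python
# Sketch of a JKS-based Kafka consumer config, e.g. for Beam's ReadFromKafka.
# The keys are standard Kafka client SSL properties; every value is a
# placeholder.
kafka_consumer_config = {
    "bootstrap.servers": "broker:9093",
    "security.protocol": "SSL",
    "ssl.truststore.location": "/tmp/kafka.truststore.jks",
    "ssl.truststore.password": "changeit",
    "ssl.keystore.location": "/tmp/kafka.keystore.jks",
    "ssl.keystore.password": "changeit",
    "ssl.key.password": "changeit",
}
```

Note that on Dataflow the consumers run on the workers, so local JKS paths only resolve if the files are staged there (for example, baked into a custom worker container).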
0 votes
1 answer
119 views

I'm trying to create a job to mirror a view that I have in my PostgreSQL DB to a BigQuery table in my Google Cloud Project through Dataflow. I created the job using the "Job builder", and I'...
Gustavo Trivelatto
0 votes
1 answer
62 views

When I use us-central-2 for the following Dataflow job, I get an error; using us-central-1 is fine. Is this expected? Is there a way to use us-central-2 with Prime? I have to use us-central-2 as my ...
bill • 722
0 votes
1 answer
66 views

def check_worker_logs(event_uuid, dataflow_project, dataflow_job, timeframe_mins=30):
    # Start time of the worker log
    start_time = (datetime.utcnow() - timedelta(minutes=timeframe_mins))....
katsu • 1
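The truncated snippet above computes a look-back window for fetching Dataflow worker logs. A self-contained sketch of that part is below; the Cloud Logging query itself is omitted, and the filter-building helper is only an illustration of what such a function might assemble (the function names are mine, not the question's):

```python
from datetime import datetime, timedelta, timezone

def worker_log_window(timeframe_mins=30):
    """Return (start, end) RFC 3339 timestamps for a look-back window.

    datetime.utcnow() is deprecated in Python 3.12, so the timezone-aware
    datetime.now(timezone.utc) is used instead.
    """
    end_time = datetime.now(timezone.utc)
    start_time = end_time - timedelta(minutes=timeframe_mins)
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    return start_time.strftime(fmt), end_time.strftime(fmt)

def dataflow_worker_filter(job_id, start, end):
    """Hypothetical Cloud Logging filter for Dataflow worker logs."""
    return (
        f'resource.type="dataflow_step" '
        f'resource.labels.job_id="{job_id}" '
        f'timestamp>="{start}" timestamp<="{end}"'
    )

start, end = worker_log_window(30)
log_filter = dataflow_worker_filter("2025-09-18_01_16_18-0000000000", start, end)
```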
0 votes
0 answers
50 views

I am doing a 3-tier data flow from ci to bo to till nodes using SymmetricDS. Whenever I do an update in a table of the ci database, it is syncing to the bo database but not to the till database. I've set ...
venu • 1
0 votes
0 answers
55 views

I've been trying for some time to get a Beam pipeline to do data transformations for a fairly simple machine-learning transformation, but Apache Beam and TensorFlow Transform won't play nicely ...
George Chapman-Brown
0 votes
0 answers
54 views

I would like to use an ErrorHandler to catch all the errors that happen during my pipeline. I have seen that there is an interface which allows this: https://beam.apache.org/releases/javadoc/...
Dev Yns • 229
0 votes
0 answers
54 views

I am doing a 3-tier data flow from ci to bo to till nodes. Data is flowing from ci to bo successfully, and I am able to see in the ci server's logs that the initial load has ended for the bo node. But when I see ...
venu • 1
1 vote
2 answers
727 views

I am working on updating the Python version to 3.12.0 in my Cloud Function. After upgrading, the Dataflow call was outputting the following message: "httplib2 transport does not support per-request timeout." ...
Winston Kyu
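That message appears to come from the Google API client stack (which uses httplib2) rather than from Dataflow itself, and is informational. If a timeout is genuinely needed, one process-wide workaround is sketched below; this is an assumption about httplib2's fallback behavior, not verified against every client version:

```python
import socket

# Hypothetical workaround: httplib2 cannot take a per-request timeout,
# but sockets opened without an explicit timeout fall back to the
# process-wide default configured here. Set it before building the
# API client.
socket.setdefaulttimeout(300)  # seconds
```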
0 votes
1 answer
53 views

I'm experiencing inconsistent behavior between Apache Beam's DirectRunner (local) and DataflowRunner (GCP) when using AvroCoder with an immutable class. Problem: I have an immutable class defined using ...
Nihal sharma
