0 votes
1 answer
64 views

Can someone tell me if IBM InfoSphere Data Replication for Kafka (CDC 2 Kafka) is capable of communicating with a Kafka cluster/broker that requires authentication via Microsoft Entra ID / OAuth2 with ...
MKAI
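For context, a Kafka client authenticating via OAuth 2.0 (e.g. Microsoft Entra ID) typically needs SASL/OAUTHBEARER properties along the lines of the sketch below. This is a hedged example: the tenant ID, client ID, and scope are placeholders, the callback-handler class name varies by Kafka version, and whether CDC for Kafka exposes these client properties at all is exactly what the question asks.

```properties
security.protocol=SASL_SSL
sasl.mechanism=OAUTHBEARER
sasl.oauthbearer.token.endpoint.url=https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token
sasl.login.callback.handler.class=org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler
sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
  clientId="<client-id>" \
  clientSecret="<client-secret>" \
  scope="<scope>";
```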
-3 votes
1 answer
111 views

I’m working with an STM32 microcontroller using USB CDC (Virtual COM Port) and handling incoming data in CDC_Receive_FS(). I want to understand the behavior for UART-style communication: If the host ...
Keerti Madhuvantika
0 votes
2 answers
96 views

I’m looking for practical ways to let Service A get the data it needs from Service B without querying Service B’s database directly. I’m not very experienced with distributed systems yet, so I want to ...
przhevallsky
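Two common patterns for the situation above are (a) a synchronous API call from Service A to Service B, and (b) an event-carried local replica, where Service A maintains its own copy of the data it needs, updated from Service B's change events. A minimal in-memory sketch of pattern (b) — all class names and the event shape are invented for illustration:

```python
class ServiceAReplica:
    """Service A's local, read-only copy of the Service B data it needs."""

    def __init__(self):
        self._users = {}  # user_id -> only the fields Service A cares about

    def apply_event(self, event):
        # Events published by Service B (e.g. via Kafka or CDC); shape is hypothetical.
        op, user_id = event["op"], event["user_id"]
        if op in ("created", "updated"):
            self._users[user_id] = {"email": event["email"]}
        elif op == "deleted":
            self._users.pop(user_id, None)

    def get_email(self, user_id):
        # Service A reads locally -- no runtime call into Service B's database.
        user = self._users.get(user_id)
        return user["email"] if user else None


replica = ServiceAReplica()
replica.apply_event({"op": "created", "user_id": 1, "email": "a@example.com"})
replica.apply_event({"op": "updated", "user_id": 1, "email": "b@example.com"})
print(replica.get_email(1))  # -> b@example.com
```

The trade-off: pattern (a) is simpler but couples A's availability to B; pattern (b) decouples them at the cost of eventual consistency.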
1 vote
1 answer
237 views

We are trying to replicate Db2 changes from z/OS to Confluent Kafka. Everything works fine for producing simple JSON records in Kafka. As our policies for Kafka Topics and Schemas are ...
MKAI
0 votes
2 answers
136 views

I have a pipeline that drops and re-creates several Snowflake tables every day (effectively a full refresh using CREATE OR REPLACE TABLE). I want to capture daily deltas (inserts/updates/deletes) for ...
NickS
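When a table is fully rebuilt each day with CREATE OR REPLACE TABLE, one workable approach is to retain yesterday's snapshot and diff it against today's by primary key. A language-agnostic sketch of the diff logic (column names and keys are made up; inside Snowflake itself this is typically expressed as a FULL OUTER JOIN between the two snapshot tables, or avoided entirely by using a stream):

```python
def snapshot_delta(old_rows, new_rows):
    """Diff two snapshots keyed by primary key -> (inserts, updates, deletes)."""
    inserts = {k: v for k, v in new_rows.items() if k not in old_rows}
    deletes = {k: v for k, v in old_rows.items() if k not in new_rows}
    updates = {k: new_rows[k] for k in new_rows.keys() & old_rows.keys()
               if new_rows[k] != old_rows[k]}
    return inserts, updates, deletes


yesterday = {1: {"name": "a"}, 2: {"name": "b"}}
today = {1: {"name": "a"}, 2: {"name": "B"}, 3: {"name": "c"}}
ins, upd, dels = snapshot_delta(yesterday, today)
print(ins)   # {3: {'name': 'c'}}
print(upd)   # {2: {'name': 'B'}}
print(dels)  # {}
```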
0 votes
0 answers
63 views

I am a newbie to the EMQX architecture. Recently, for test purposes, I deployed a single-node EMQX Enterprise 6 with a Kafka producer connector integration (Action Sink). The compatibility matrix of the ...
4Register Tony
0 votes
0 answers
80 views

I'm using Flink CDC + Apache Hudi in Flink to sync data from MySQL to AWS S3. My Flink job looks like: parallelism = 1 env = StreamExecutionEnvironment.get_execution_environment(config) ...
Rinze
0 votes
1 answer
164 views

I am trying to build an ADF pipeline that incrementally updates a table using Change Data Capture (CDC), and I'm running into an issue when I try to use a built-in CDC table function in an ADF Lookup ...
Phil A
0 votes
0 answers
49 views

I am using Debezium 3.0 in my Python project. The Postgres Debezium connector has the following structure: { "name": "dbz_name", "config": { "connector.class"...
Марина Лисниченко
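For reference, a minimal Debezium Postgres connector config usually has the shape sketched below. Hostname, credentials, and names are placeholders; the property names follow the Debezium Postgres connector documentation, but treat this as an illustrative fragment, not the asker's actual config:

```json
{
  "name": "dbz_name",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "plugin.name": "pgoutput",
    "database.hostname": "<host>",
    "database.port": "5432",
    "database.user": "<user>",
    "database.password": "<password>",
    "database.dbname": "<dbname>",
    "topic.prefix": "<prefix>",
    "table.include.list": "public.<table>"
  }
}
```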
2 votes
1 answer
184 views

I am using Flink with Debezium to consume CDC changes from Oracle DB tables via LogMiner. For some tables, everything works fine. For example, the following table works without issues: CREATE TABLE ...
Parth Vyas
0 votes
2 answers
131 views

I'm using CDC services (like Debezium) on Mongo and Postgres, but I've run into a situation where I need CDC on Redis — for example, getting a stream of events that occur in Redis, such as adding new ...
soroush safari
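Redis has no WAL-style CDC log, but it can publish keyspace notifications (enabled via the `notify-keyspace-events` server setting), which is the closest built-in equivalent. Events arrive on Pub/Sub channels named `__keyspace@<db>__:<key>` (payload = event name) and `__keyevent@<db>__:<event>` (payload = key). A small standalone parser for those channel names — no live Redis required; wiring it to a real subscription is left out:

```python
import re

# Channel format documented for Redis keyspace notifications.
CHANNEL_RE = re.compile(r"^__(keyspace|keyevent)@(\d+)__:(.+)$")


def parse_notification(channel, payload):
    """Turn a Redis keyspace-notification message into (db, key, event)."""
    m = CHANNEL_RE.match(channel)
    if not m:
        raise ValueError(f"not a keyspace notification channel: {channel}")
    kind, db, rest = m.group(1), int(m.group(2)), m.group(3)
    if kind == "keyspace":  # channel carries the key, payload carries the event
        return db, rest, payload
    else:                   # keyevent: channel carries the event, payload the key
        return db, payload, rest


print(parse_notification("__keyspace@0__:user:42", "set"))  # -> (0, 'user:42', 'set')
print(parse_notification("__keyevent@0__:del", "user:42"))  # -> (0, 'user:42', 'del')
```

Note that notifications are fire-and-forget Pub/Sub: a disconnected subscriber misses events, so this is weaker than log-based CDC.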
3 votes
1 answer
678 views

I want to sync a Postgres table (2 million records) to Redis. Requirements: the full table should sync initially; after that, realtime inserts/updates/deletes should sync to Redis automatically (no polling) ...
shubham jha
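The usual shape for this requirement is: bulk-load the table once, then subscribe to logical replication changes (e.g. via Debezium or wal2json) and apply each change to Redis. A minimal sketch of the apply step, with a plain dict standing in for Redis and an invented change-event shape:

```python
class RedisMirror:
    """Dict stand-in for Redis; swap for redis-py SET/DEL against a real server."""

    def __init__(self):
        self.store = {}

    def initial_load(self, rows):
        # Step 1: one-time full snapshot of the Postgres table.
        for row in rows:
            self.store[f"user:{row['id']}"] = row

    def apply_change(self, change):
        # Step 2: per-row CDC events from logical replication (shape is hypothetical).
        key = f"user:{change['id']}"
        if change["op"] in ("insert", "update"):
            self.store[key] = change["row"]
        elif change["op"] == "delete":
            self.store.pop(key, None)


mirror = RedisMirror()
mirror.initial_load([{"id": 1, "name": "a"}])
mirror.apply_change({"op": "update", "id": 1, "row": {"id": 1, "name": "a2"}})
mirror.apply_change({"op": "delete", "id": 1})
print("user:1" in mirror.store)  # -> False
```

One subtlety the sketch glosses over: the snapshot and the stream must overlap (or the stream must start from a position at or before the snapshot) so no change is lost between the two steps.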
0 votes
0 answers
54 views

I want to replicate a collection and sync in real time. The CDC events are streamed to Kafka and I’ll be listening to it and based on operationType (insert/delete/update) I’ll have to process the ...
PanicLion
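A common way to structure the consumer described above is a dispatch table keyed by `operationType`. MongoDB change-stream events carry `documentKey` plus, for inserts, `fullDocument`, and for updates, `updateDescription.updatedFields`. A standalone sketch with sample events — no Kafka or Mongo connection, and the handler bodies are placeholders:

```python
target = {}  # stand-in for the replica collection being kept in sync


def on_insert(event):
    target[event["documentKey"]["_id"]] = event["fullDocument"]


def on_update(event):
    # updateDescription.updatedFields is the patch MongoDB reports for updates
    target[event["documentKey"]["_id"]].update(
        event["updateDescription"]["updatedFields"])


def on_delete(event):
    target.pop(event["documentKey"]["_id"], None)


HANDLERS = {"insert": on_insert, "update": on_update, "delete": on_delete}


def process(event):
    handler = HANDLERS.get(event["operationType"])
    if handler:
        handler(event)


process({"operationType": "insert", "documentKey": {"_id": 1},
         "fullDocument": {"_id": 1, "x": 1}})
process({"operationType": "update", "documentKey": {"_id": 1},
         "updateDescription": {"updatedFields": {"x": 2}}})
print(target)  # -> {1: {'_id': 1, 'x': 2}}
```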
0 votes
0 answers
74 views

I am using an AWS MSK cluster and have created a MySQL Debezium connector that runs on an EC2 instance and reads from a specific table. It is working fine, but there could be a use case where the user ...
RushHour
2 votes
0 answers
31 views

I'm developing a new project using an STM32 MCU, and USB is one of the most important features in my project. But unfortunately I always see the manufacturer as "Microsoft" and the COM port name as "...
durukan oktay
