  • Thanks, but I'm not sure your solution would work - see my edits in the "issues" part. Commented Jul 1, 2015 at 9:29
  • Since there are no updated-time fields in the source database, we are left to pull the qualifying data rows based on a checksum or hash. Commented Jul 1, 2015 at 9:45
  • Since your source is DB2, how do you intend to pull the data from it? Via some web service or API? Commented Jul 1, 2015 at 9:48
  • A DSN has been set up using an ODBC driver. I can connect and run queries using pyodbc in Python. Commented Jul 1, 2015 at 9:59
  • Alright, that's good. Since you can query the remote DB with pyodbc, you can do one more thing: pull the product data straight into a new "staging table" in your target DB, in the same format as the source, without any checks or validations. That way you get the live data in a single shot into your target DB under the staging tables. Then, in a second step, you can perform the checksum operations and update the target transactional table. This avoids having to evaluate the hash or checksum against the source DB data in real time (see the sketch after this list). Commented Jul 1, 2015 at 10:49
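
For reference, a minimal sketch of that two-step staging approach with pyodbc. The DSN names (DB2_SOURCE, TARGET_DB), the table and column names (products, product_stage, products_target, row_hash), and the idea of storing a hash per row are all assumptions for illustration, not details from the original setup; the final insert/update into the transactional table would depend on the target DBMS.

```python
# Sketch: step 1 copies the source rows into a staging table untouched,
# step 2 uses stored hashes to find new or changed rows inside the target DB.
import hashlib

import pyodbc

src = pyodbc.connect("DSN=DB2_SOURCE")   # assumed source DSN (DB2 via ODBC)
tgt = pyodbc.connect("DSN=TARGET_DB")    # assumed target DSN


def row_hash(*cols):
    """Hash every column value so changed rows can be detected later."""
    return hashlib.md5("|".join(str(c) for c in cols).encode("utf-8")).hexdigest()


src_cur = src.cursor()
tgt_cur = tgt.cursor()

# Step 1: pull the product data as-is into the staging table, no validation.
tgt_cur.execute("DELETE FROM product_stage")
for product_id, name, price in src_cur.execute(
        "SELECT product_id, name, price FROM products"):
    tgt_cur.execute(
        "INSERT INTO product_stage (product_id, name, price, row_hash) "
        "VALUES (?, ?, ?, ?)",
        product_id, name, price, row_hash(product_id, name, price))
tgt.commit()

# Step 2: compare hashes against the transactional table (assumed to keep a
# row_hash column) and pick up only new or changed rows. How these rows are
# then applied (MERGE, upsert, ...) depends on the target DBMS.
tgt_cur.execute("""
    SELECT s.product_id, s.name, s.price, s.row_hash
    FROM product_stage s
    LEFT JOIN products_target t ON t.product_id = s.product_id
    WHERE t.product_id IS NULL OR t.row_hash <> s.row_hash
""")
changed_rows = tgt_cur.fetchall()

src.close()
tgt.close()
```

Computing and storing the hash once per staged row means the comparison in step 2 runs entirely inside the target DB, which is the point of the staging step: the source DB is only touched for the single bulk pull.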