I have about 45GB of data in a BigQuery table that I want to transfer to Elasticsearch. Currently, I am fetching rows from my BigQuery table as JSON and then indexing them into Elasticsearch. The whole process took about 2 weeks to complete. I just wanted to know if there is a better, more efficient way to do this.
- Is the 45GB a single file or chunks of files? – Thangarajan Pannerselvam, Jun 22, 2020 at 9:46
- It is chunks of files. – Achal Gambhir, Jun 22, 2020 at 11:27
- Try to use a generator. – Thangarajan Pannerselvam, Jun 22, 2020 at 11:54
- @ThangarajanPannerselvam Could you please elaborate on that? I am not very familiar with generators. – Achal Gambhir, Jun 22, 2020 at 13:36
- Hi! I would like to ask you to take a look at the following Stack Overflow thread: stackoverflow.com/questions/39252484/… where you can find different scenarios. Let me know if it's sufficient for your needs. – aga, Jun 26, 2020 at 7:16
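To illustrate the generator idea from the comments: instead of loading all rows into memory before indexing, you can lazily batch them into Elasticsearch bulk actions so memory stays constant regardless of table size. This is a minimal sketch of the batching pattern only; the `rows` generator below is a hypothetical stand-in for rows streamed from BigQuery (e.g. via `google.cloud.bigquery.Client.list_rows`), and in practice you would feed the actions into `elasticsearch.helpers.streaming_bulk` rather than collecting them in a list.

```python
from itertools import islice

def bulk_actions(rows, index_name, chunk_size=1000):
    """Lazily turn an iterable of row dicts into chunks of
    Elasticsearch bulk actions, one chunk at a time."""
    it = iter(rows)
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            break
        # Each action tells the bulk API which index the document goes to.
        yield [{"_index": index_name, "_source": row} for row in chunk]

# Hypothetical stand-in for rows streamed from BigQuery.
rows = ({"id": i, "value": i * 2} for i in range(2500))

# Collected into a list here only to demonstrate the chunking;
# real indexing would consume the generator chunk by chunk.
chunks = list(bulk_actions(rows, "my-index"))
print(len(chunks))  # 2500 rows at chunk_size=1000 -> 3 chunks
```

Because the rows are consumed lazily, only one chunk of actions exists in memory at a time; combined with Elasticsearch's bulk API (and tuning `refresh_interval` during the load), this is typically far faster than indexing documents one by one.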