I have about 45 GB of data in a BigQuery table that I want to transfer to Elasticsearch. Currently, I fetch rows from the table as JSON and then index them into Elasticsearch. The whole process took about two weeks to complete. Is there a better, more efficient way to do this?
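A common bottleneck with this pattern is indexing documents one request at a time; Elasticsearch's bulk API can reduce round trips dramatically. As a minimal sketch (the index name `my_index` and the indexing call are assumptions, not from the question), a generator can turn BigQuery rows into bulk actions lazily, so the 45 GB never has to sit in memory at once:

```python
def to_bulk_actions(rows, index_name):
    """Convert an iterable of row dicts into Elasticsearch bulk actions.

    Yields lazily, one action per row, so the full result set is never
    materialized in memory.
    """
    for row in rows:
        yield {
            "_index": index_name,   # hypothetical target index
            "_source": row,
        }

# With the elasticsearch-py client (assumed installed), the actions
# would be consumed like:
#   from elasticsearch import Elasticsearch, helpers
#   es = Elasticsearch("http://localhost:9200")  # hypothetical endpoint
#   helpers.bulk(es, to_bulk_actions(rows, "my_index"), chunk_size=5000)
```

Tuning `chunk_size` (and using `helpers.parallel_bulk` for concurrent requests) is usually where the real speedup comes from, but the right values depend on your cluster.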

  • Is the 45 GB a single file or a chunk of files? Commented Jun 22, 2020 at 9:46
  • It is a chunk of files. Commented Jun 22, 2020 at 11:27
  • Try using a generator. Commented Jun 22, 2020 at 11:54
  • @ThangarajanPannerselvam Could you please elaborate on that? I am not very familiar with generators. Commented Jun 22, 2020 at 13:36
  • Hi! I would like to ask you to take a look at the following Stack Overflow thread: stackoverflow.com/questions/39252484/… where you can find different scenarios. Let me know if it is sufficient for your needs. Commented Jun 26, 2020 at 7:16
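The "use a generator" suggestion in the comments refers to streaming rows lazily in fixed-size batches rather than loading the whole table into memory before indexing. A minimal, self-contained illustration of the idea (the batch size is an arbitrary example value):

```python
def stream_in_batches(rows, batch_size=1000):
    """Group a row iterator into fixed-size batches.

    Only one batch is held in memory at a time, so this works for
    result sets far larger than RAM.
    """
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:            # flush the final, possibly smaller batch
        yield batch

# Example: 2500 rows split into batches of 1000, 1000, 500
# for batch in stream_in_batches(iter(range(2500)), 1000):
#     ...  # send each batch as one bulk request
```

Each yielded batch can then be sent to Elasticsearch as a single bulk request instead of one request per row.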
