
I want to insert more than 500,000 points into InfluxDB using the curl utility, so I want to set a "batch size" in curl for better performance, but I can't find any option for batch size.

I have already read the InfluxDB standard manual section "Write Syntax" - "Write a Batch of Points with curl".

I think the number of lines in points.txt in curl -X POST 'http://localhost:8086/write' --data-urlencode 'db=mydb&rp=myrp&u=root&p=root' --data-binary @points.txt is the batch size, but I'm not sure.

So I tried splitting the insert file into 5,000-line files and inserting each file into InfluxDB, but I didn't find it efficient.

Could anyone tell me how this actually works?

  • What are the specs of the machine you are using? What version of InfluxDB are you running? Commented Nov 13, 2015 at 1:00

1 Answer


There isn't a batch size setting in InfluxDB. All points in the file submitted via curl are considered one batch. Batches should be about 5k points for best throughput, although on high-power servers or with very regular data larger batch sizes can be more efficient.
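For reference, each line in the posted file is one point in InfluxDB's line protocol, so "batch size" is simply the number of lines you send in a single request. A sketch of what such a file might contain (the measurement, tag, and field names here are made up for illustration):

```
cpu,host=server01 value=0.64 1447372800000000000
cpu,host=server01 value=0.71 1447372810000000000
mem,host=server01 free=2048i 1447372800000000000
```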

I want to insert more than 500,000 points into InfluxDB

That's too big to send in one batch, so you will need to split it into chunks.
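One way to do the chunking is with split and a small loop. This is a sketch, not a definitive script: it generates made-up demo data, and it passes the question's db/rp/user values as query parameters on the HTTP write endpoint (adjust these for your setup).

```shell
# Demo data: 12,000 made-up points in line protocol (measurement, tag,
# and field names are invented for illustration).
for i in $(seq 1 12000); do
  echo "cpu,host=server01 value=0.$((i % 100)) $((1440000000 + i))000000000"
done > points.txt

# Split into 5,000-line chunks named chunk_aa, chunk_ab, chunk_ac, ...
split -l 5000 points.txt chunk_

# POST each chunk as one batch. '|| true' keeps the loop going if one
# request fails (e.g. the server is temporarily unreachable).
for f in chunk_*; do
  curl -s -X POST 'http://localhost:8086/write?db=mydb&rp=myrp&u=root&p=root' \
       --data-binary @"$f" || true
done
```

Each iteration sends one 5,000-point batch, which matches the throughput guidance above.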

I think the number of lines in points.txt in curl -X POST 'http://localhost:8086/write' --data-urlencode 'db=mydb&rp=myrp&u=root&p=root' --data-binary @points.txt is the batch size, but I'm not sure.

Yes, cat points.txt | wc -l will give you the number of points in the file.

So I tried splitting the insert file into 5,000-line files and inserting each file into InfluxDB, but I didn't find it efficient.

There's no way to know what you didn't find efficient, so I can't respond to this. 500k points in 5k batches should take maybe 5-60 seconds on a reasonable server.


