I'm looking for a very fast method to read a CSV file. My data structure looks like this:
    timestamp,float,string,ip,string
    1318190061,1640851625,lore ipsum,84.169.42.48,appname

I'm using fgetcsv to read this data into arrays.
The problem is performance: on a regular basis the script has to read (and process) more than 10,000 entries.
My first attempt is very simple:
    //Performance: 0,141 seconds / 13.5 MB
    while (!feof($statisticsfile)) {
        $temp = fgetcsv($statisticsfile);
        $timestamp[] = $temp[0];
        $value[]     = $temp[1];
        $text[]      = $temp[2];
        $ip[]        = $temp[3];
        $app[]       = $temp[4];
    }

My second attempt:
    //Performance: 0,125 seconds / 10.8 MB
    while (($userinfo = fgetcsv($statisticsfile)) !== FALSE) {
        list($timestamp[], $value[], $text, $ip, $app) = $userinfo;
    }

- Is there any way to improve performance even further, or is my method already about as fast as it can get?
- Probably more important: is there any way to define which columns are read? For example, sometimes only the timestamp and float columns are needed. Is there any better way than mine (have a look at my second attempt)? See the sketch below.
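For illustration, here is a minimal sketch of one way to keep only selected columns, assuming the same $statisticsfile handle as above: list() simply ignores any trailing fields, and an empty slot skips a column in the middle.

    // Sketch: keep only the timestamp and float columns.
    // Trailing fields in $userinfo are ignored by list().
    while (($userinfo = fgetcsv($statisticsfile)) !== FALSE) {
        list($timestamp[], $value[]) = $userinfo;
        // To skip a middle column instead, leave its slot empty:
        // list($timestamp[], , $text[]) = $userinfo;
    }

Note that fgetcsv still parses every field of each line; this only avoids storing the unwanted columns.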
Thanks :)
You can't say in advance which of the alternatives (fgetcsv, implode, SplFileObject, sscanf) is fastest in your real-world environment. You have to benchmark it yourself.

    $data = array_map("str_getcsv", file($filename));

is the speediest method, as it reads in the whole CSV file at once.
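As a starting point for such a benchmark, a minimal timing harness along these lines could compare the two approaches on your own data ($filename is assumed to exist; memory_get_peak_usage() is cumulative within a process, so run each variant in a separate process if you also want to compare memory):

    // Hypothetical benchmark sketch; $filename is assumed to exist.
    // Variant A: read the whole file at once, one str_getcsv() call per line.
    // The file() flags strip trailing newlines so str_getcsv() does not
    // keep them in the last field.
    $start = microtime(true);
    $data  = array_map('str_getcsv',
        file($filename, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
    printf("array_map/str_getcsv: %.3f s\n", microtime(true) - $start);

    // Variant B: line-by-line fgetcsv(), as in the question.
    $start  = microtime(true);
    $handle = fopen($filename, 'r');
    while (($row = fgetcsv($handle)) !== FALSE) {
        $timestamp[] = $row[0];
        $value[]     = $row[1];
    }
    fclose($handle);
    printf("fgetcsv loop: %.3f s\n", microtime(true) - $start);

The whole-file approach trades memory for speed: file() loads everything into an array first, which is usually fine at 10,000 rows but worth rechecking if the file grows.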