performance - PHP looping through a huge text file is very slow, can you improve it? -


The data contained in the text file (actually a .dat) looks like:

lin*1234*up*abcde*33*0*ea lin*5678*up*fghij*33*0*ea lin*9101*up*klmno*33*23*ea 

There are 500,000 such lines in the file.

This is what I'm using now:

// retrieve file once
$file = file_get_contents('/data.dat');
$file = explode('lin', $file);

...some code

foreach ($list as $item) { // an array containing 10 items
    foreach ($file as $line) { // checking if these items are in the huge list
        $info = explode('*', $line);
        if ($info[3] == $item[0]) {
            ...do stuff...
            break; // stop checking once found
        }
    }
}

The problem is that it runs very slowly - about 1.5 seconds for each iteration. I've separately confirmed that it's not the '...do stuff...' part impacting speed; rather, it's the search for the correct item.

How can I speed this up? Thank you.

If each item is on its own line, then instead of loading the whole thing into memory, it might be better to use fgets() instead:

$f = fopen('text.txt', 'rt');

while (!feof($f)) {
    $line = rtrim(fgets($f), "\r\n");
    $info = explode('*', $line);
    // etc.
}

fclose($f);

PHP file streams are buffered (~8 KB), so this should be decent in terms of performance.

The other piece of logic can be rewritten as follows (instead of iterating over the file multiple times):

if (in_array($info[3], $items)) // $info[3] is inside an array of 10 things

Or, if $items is suitably indexed:

if (isset($items[$info[3]])) { ... } 
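Putting both suggestions together, here is a minimal runnable sketch: stream the file with fgets() and check each line's key field against an isset() lookup table, so the file is scanned only once instead of once per item. The sample data and the contents of $list are assumptions modelled on the question.

```php
<?php
// Write the sample lines to a temporary file so the sketch is self-contained.
$path = tempnam(sys_get_temp_dir(), 'dat');
file_put_contents($path,
    "lin*1234*up*abcde*33*0*ea\n" .
    "lin*5678*up*fghij*33*0*ea\n" .
    "lin*9101*up*klmno*33*23*ea\n");

// Hash map keyed by the values we search for: isset() is an O(1) lookup,
// unlike rescanning the whole file for each of the 10 items.
$list = [['abcde'], ['klmno']];   // hypothetical: key in slot 0 of each item
$wanted = [];
foreach ($list as $item) {
    $wanted[$item[0]] = true;
}

$found = [];
$f = fopen($path, 'rt');
while (($line = fgets($f)) !== false) {
    $info = explode('*', rtrim($line, "\r\n"));
    // with lines like "lin*1234*up*abcde*33*0*ea", field 3 holds the key
    if (isset($info[3]) && isset($wanted[$info[3]])) {
        $found[] = $info[1];          // ...do stuff...
    }
}
fclose($f);
unlink($path);

print_r($found);   // IDs of the matching lines
```

This keeps memory flat regardless of file size and turns the inner 10-item loop into a constant-time hash lookup.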
