i’ve been working on a major integration project with quickbase and vtrenz. one of the things i had to do this week was take a ~10K row csv, scrub the cells to a specific length, and then ftp it to a remote server.
so i wrote a php script that would connect to quickbase, pull down the csv, write it to file, open it, turn the csv into an array, scrub it, rebuild the csv from the array, write it to file again, and then send it on its merry way via ftp.
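the broad shape of that pipeline looked something like this. a minimal sketch, not the production script: the hostname, credentials, paths, and the 50-character limit are all placeholders, and it assumes $rows already holds the parsed csv (the parser sketch is further down).

```php
<?php
// sketch: scrub the parsed rows, rebuild the csv, and ftp it out.
// assumes $rows is an array of heading => value rows.
$maxLen = 50; // placeholder cell-length limit

// scrub: truncate every cell to the allowed length
foreach ($rows as $i => $row) {
    foreach ($row as $key => $value) {
        $rows[$i][$key] = substr($value, 0, $maxLen);
    }
}

// recreate the csv, headings first
$out = fopen('/tmp/scrubbed.csv', 'w');
fputcsv($out, array_keys($rows[0]));
foreach ($rows as $row) {
    fputcsv($out, $row);
}
fclose($out);

// send it on its merry way (host and login are placeholders)
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'pass');
ftp_put($conn, 'incoming/scrubbed.csv', '/tmp/scrubbed.csv', FTP_ASCII);
ftp_close($conn);
?>
```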
all of it was easy except turning the csv into an array. i didn’t want to step through each character looking for newlines or my delimiter (a comma), and i needed something that would use the first row of the csv as the column headings.
i was able to find some scripts on the web that halfway worked. in fact, the one i used before writing my own did work; it was just super slow and inefficient, literally using a gig of memory for 5K records.
so here’s my solution for taking a very large csv and using the first row to create a key=>value associative array.
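the core idea, in sketch form: php’s built-in fgetcsv() reads and splits one line at a time (handling quoting and the delimiter for you), and array_combine() pairs the heading row with each data row. the function name and path here are just for illustration.

```php
<?php
// sketch: parse a large csv into an array of key => value rows,
// using the first row as the column headings. fgetcsv() splits one
// line at a time, so we never scan character by character.
function csv_to_assoc_array($filename, $delimiter = ',')
{
    $handle = fopen($filename, 'r');
    if ($handle === false) {
        return false;
    }

    // first row becomes the keys for every row that follows
    $headings = fgetcsv($handle, 0, $delimiter);

    $rows = array();
    while (($fields = fgetcsv($handle, 0, $delimiter)) !== false) {
        // skip ragged lines so array_combine() never chokes
        if (count($fields) === count($headings)) {
            $rows[] = array_combine($headings, $fields);
        }
    }

    fclose($handle);
    return $rows;
}

// usage (path is illustrative)
$rows = csv_to_assoc_array('/tmp/export.csv');
?>
```

reading line by line is what keeps the memory footprint sane: only the finished array lives in memory, not a second character-by-character copy of the raw file.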
it now takes only a few seconds to turn a considerably large csv into an array. hope this helps someone out there.