I've been processing large CSV files with Feeds and keep seeing the same records imported multiple times, without the import ever continuing past a certain point in the file. It seems as if one part of the CSV file gets reprocessed over and over in a loop. I haven't been able to determine the cause, so as a workaround I've deleted the records from the feeds_source table and tried again with a new CSV file that starts at the first unprocessed record.
I have noticed that small problems in the CSV file itself can break Feeds. For example, a stray double quote within a field is treated as a text qualifier, so multiple records get combined into a single field (everything up to the next double quote). When that happens, processing tends to stop.
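Feeds parses CSV in PHP, but the quoting rule it trips over is the standard RFC 4180 convention, and Python's csv module (used here purely as an illustration, with made-up sample data) makes the effect easy to see: a literal double quote inside a field must be escaped by doubling it, and a stray unescaped quote silently swallows everything up to the next quote, newlines included.

```python
import csv
import io

# Well-formed CSV: a literal double quote inside a field is escaped
# by doubling it ("" inside a quoted field).
good = 'id,comment\n1,"She said ""hi"" to me"\n2,done\n'
rows_good = list(csv.reader(io.StringIO(good)))
# rows_good[1] is ['1', 'She said "hi" to me'] -- three records total.

# Malformed CSV: a stray, unescaped quote after "1," opens a quoted
# field, so the parser consumes the newline and the start of the next
# line until it finds the next quote, merging two lines into one record.
bad = 'id,comment\n1,"oops\n2,second"\n3,third\n'
rows_bad = list(csv.reader(io.StringIO(bad)))
# The second record becomes ['1', 'oops\n2,second'] -- two data lines
# collapsed into one record with an embedded newline.
```

This matches the symptom described above: once a quote goes astray, whole records vanish into a neighboring field rather than being reported as an error.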
I'm hoping someone has seen this behavior and can point me in the right direction on the restarting issue. My hunch is that it, too, is caused by a problem with the CSV file itself, because it doesn't happen every time or with every file.
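One way to test that hunch before handing a file to Feeds is a quick pre-flight check. The sketch below (a hypothetical helper, not part of Feeds) flags parsed records whose field count differs from the header's, or that contain an embedded newline, both of which are common symptoms of a stray quote merging lines.

```python
import csv
import io

def find_suspect_records(lines, expected_fields):
    """Flag CSV records that look like the result of broken quoting.

    Record numbers count parsed CSV records, not raw file lines,
    because a stray quote can merge several lines into one record.
    """
    suspects = []
    for recno, row in enumerate(csv.reader(lines), start=1):
        if len(row) != expected_fields or any('\n' in field for field in row):
            suspects.append((recno, row))
    return suspects

# Sample data with a stray quote that merges two lines into one record.
sample = 'id,comment\n1,"oops\n2,second"\n3,third\n'
suspects = find_suspect_records(io.StringIO(sample), expected_fields=2)
# suspects == [(2, ['1', 'oops\n2,second'])]
```

Running a check like this against a file that triggers the restart loop would at least confirm or rule out malformed quoting as the cause.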