I've tried importing the /tests/node.csv but am getting this error:
"An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /batch?id=20&op=do StatusText: Internal Server Error ResponseText:"
Feels to me like a server rewrite/permissions problem, but I'm not sure whether it's something to do with .htaccess or settings.php. In admin permissions I have everything checked OK, so I can't understand it. I believe my .htaccess is the standard D7 issue [see attached].
I'm also running within a customized sub-theme of Zen.
Comment | File | Size | Author |
---|---|---|---|
#25 | drush_feeds_import.tgz | 1.76 KB | j0rd |
| htaccess.txt | 1.75 KB | deeve |
Comments
Comment #1
deeve CreditAttribution: deeve commented
Just as an update: I've noticed that the .csv file I'm trying to import does make it to my temp folder 'sites/default/files/feeds/', but it seems to get no further. Could this be something to do with the .htaccess file in the files folder? All I have in there is the following:
Comment #2
deeve CreditAttribution: deeve commented
An interesting development: when I receive the error above, if I then immediately refresh the page, the loading bar reaches 100% and I receive the following error: 'Cannot acquire lock for source node / 0.'
Comment #3
deeve CreditAttribution: deeve commented
Beginning to feel like I'm talking to myself on here!
I just read in the 7.x-2.x-dev release notes not to use that version, when my admin had advised me to upgrade to it! So I downgraded to the 7.x-2.0-alpha4 version thinking this might make a difference - it didn't! The exact same problem(s). Am I therefore to understand that the D7 version of Feeds does not yet work when importing CSV files?
Comment #4
deeve CreditAttribution: deeve commented
I've changed the version in this thread to 7.x-2.x-dev, as I tried it and received the same error. I took a look at the CSV in the temp folder on the server, and it looks like the parser and/or processor isn't able to distinguish between rows. Does anyone on here have a nodes.csv, other than the one supplied in the tests folder, which I could try? I'm confused as to precisely how the file should be formatted.
Comment #5
deeve CreditAttribution: deeve commented
I did some more reading yesterday and followed an article I found elsewhere by someone who experienced a similar problem when trying to use Excel spreadsheets on a Mac. He recommended using Google Docs, so I tried it and, sure enough, it worked first time!
It must be something to do with the default CSV export in MS Excel, which either adds or omits formatting that Feeds needs in order to interpret the file correctly. Anyway, all I know now is that Feeds 7.x-2.x-dev seems to be working fine for me using Google Docs.
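For anyone hitting the same wall: Excel for Mac historically saved CSV files with classic-Mac CR (`\r`) line terminators, which a line-based parser does not treat as row separators. This is an assumption about what Google Docs fixed, but it matches the symptom of rows not being distinguished. A minimal command-line sketch (file names are made up for the example):

```shell
# Simulate a CSV saved with classic-Mac CR line endings, the kind
# older Excel-for-Mac exports produce.
printf 'title,body\rFirst,Hello\rSecond,World' > mac.csv

# With no LF characters, the file looks like zero complete lines:
wc -l < mac.csv

# Convert CR to LF so each record sits on its own line.
tr '\r' '\n' < mac.csv > unix.csv
wc -l < unix.csv
cat unix.csv
```

If the export instead uses Windows CRLF endings, `tr -d '\r'` performs the equivalent cleanup.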
Comment #6
Dale Baldwin CreditAttribution: Dale Baldwin commented
Hey, I'm getting the same error using XML files, so it may be more than just a CSV formatting issue:
An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /?q=batch&id=34&op=do StatusText: Internal Server Error ResponseText:
Comment #7
deeve CreditAttribution: deeve commented
Have you tried the Google Docs route?
Comment #8
Dale Baldwin CreditAttribution: Dale Baldwin commented
Google Docs would be great; however, I'm talking about pulling down a few thousand XML files on a daily basis, so it's not really an option.
Comment #9
deeve CreditAttribution: deeve commented
Maybe something's not quite right in the parser, as I found it was having trouble returning line breaks correctly, even with the CSVs in the tests folder. Having worked with databases in the past, I was under the impression you had to declare a separator for both column and row. That said, mine now appears to work fine when formatted in Google Docs. Good luck with all those feeds; sorry I can't be of more help.
Comment #11
John Bryan CreditAttribution: John Bryan commented
If you are not using taxonomy with Feed items then this comment is not relevant.
If you are trying to have RSS or XML (etc.) "Feed Item" nodes inherit a taxonomy term from the parent "Feed" importer node and are seeing "An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows.":
Multiple Feeds project issues have been created for this or a similar message, but the relevant one appears to be:
"taxonomy_node_get_terms doesn't work with drupal 7"
http://drupal.org/node/959984
Comment #12
brandy.brown CreditAttribution: brandy.brown commented
Google Docs does not solve my problem. I am able to upload some CSV documents without a problem, but for others it throws this error. It seems to be an intermittent problem.
Comment #13
rsgracey CreditAttribution: rsgracey commented
Never mind... still not working...
Comment #14
pwaterz CreditAttribution: pwaterz commented
I highly recommend not using Feeds for this and doing something custom instead. At my job we process 100,000+ XML files into Drupal, and Feeds cannot do that.
Comment #15
brandy.brown CreditAttribution: brandy.brown commented
I don't think not using Feeds is a solution to this problem.
Perhaps upping the PHP memory limit is a solution? Just a thought.
Comment #17
Sivert CreditAttribution: Sivert commented
I had the same error message. The patch at http://drupal.org/files/feeds-unsupported_opperand_types-1213472-28.patch fixed the problem for me. Or see comment #28 at http://drupal.org/node/1213472.
Comment #18
j0rd CreditAttribution: j0rd commented
I believe this has something to do with bad characters in the CSV.
I created a CSV file with entries with crazy characters and it fails immediately.
Then I imported some files with no bad characters and it works fine.
Then I tried a file with both, and I got 28% of the way through before it failed.
Here are some text samples from my 100%-fail file.
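One quick way to test the bad-characters theory before an import is to validate the file's encoding up front: `iconv` exits non-zero as soon as it hits a byte that is not valid UTF-8. A sketch with fabricated sample files (a stray Windows-1252 smart quote, byte 0x92, is a typical Excel artifact; the file names are assumptions for the example):

```shell
# A clean file and one containing a Windows-1252 right single quote
# (byte 0x92, octal \222), which is not valid UTF-8 on its own.
printf 'title,body\nGood,plain ascii\n' > clean.csv
printf 'title,body\nBad,don\222t import me\n' > dirty.csv

# Validate: iconv fails on a byte it cannot decode as UTF-8.
iconv -f UTF-8 -t UTF-8 clean.csv > /dev/null && echo "clean.csv: valid UTF-8"
iconv -f UTF-8 -t UTF-8 dirty.csv > /dev/null 2>&1 || echo "dirty.csv: invalid UTF-8"

# If the file really came from Excel on Windows, transcoding from
# Windows-1252 usually repairs it:
iconv -f WINDOWS-1252 -t UTF-8 dirty.csv > fixed.csv
iconv -f UTF-8 -t UTF-8 fixed.csv > /dev/null && echo "fixed.csv: valid UTF-8"
```

Running the whole file through a check like this narrows a "fails at 28%" import down to the exact offending rows.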
Comment #19
j0rd CreditAttribution: j0rd commented
I seem to be able to get past the 'Cannot acquire lock for source node / 0.' error by dealing with these three items.
Additionally, if you have mega files like me:
PHP max_execution time will eventually kill the batch process main thread.
Additionally, if you're running fastcgi like me:
FPM's request_terminate_timeout will eventually kill the batch process main thread.
Additionally, if you're like me and running into timeouts:
The acquire_lock in {semaphore} will remain until it times out and you'll need to DELETE it from the {semaphore} table before restarting the batch.
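For the record, a hedged sketch of the cleanup described above, assuming a stock Drupal 7 schema with no table prefix and drush available on the server; the lock-name pattern and timeout values here are illustrative guesses, not canonical:

```shell
# 1. Raise PHP's execution limit so the batch thread survives (php.ini):
#      max_execution_time = 300
# 2. Raise FPM's kill switch (php-fpm pool config), if you run FastCGI:
#      request_terminate_timeout = 300s
# 3. Clear the stale Feeds lock left in the {semaphore} table after a
#    timed-out batch, then restart the import:
drush sql-query "DELETE FROM semaphore WHERE name LIKE 'feeds%'"
```

Verify the pattern against your own `semaphore` table contents before deleting; deleting a lock that is genuinely held by a running process can corrupt an in-progress import.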
The best way to import would be through the CLI and Drush. I don't believe Feeds currently provides anything to handle this... but I did find this sandbox, though I have yet to test it out.
http://drupal.org/sandbox/enzo/1865202
Comment #20
Summit CreditAttribution: Summit commented
Hi,
Also getting error 500 on an XML file with 130,000 lines... how can I get this imported with Feeds without having to cut the XML into 10 pieces?
The background option is not working either; that also crashes the site.
Greetings,
Martijn
Comment #21
J0keR CreditAttribution: J0keR commented
Same problem here when I upload the CSV file, but there are only 500+ rows in my CSV file. Maybe it's bad Chinese characters, I guess... any ideas?
Comment #22
nyancat CreditAttribution: nyancat commented
So characters such as ñ are causing Feeds to spit out an error? Mine was freezing at 40%; I'm trying to figure out which character in my file is causing this issue.
Comment #23
Grubber CreditAttribution: Grubber commented
I found a fix (not really, but...). Unlocking the importer showed me that 30% of the items were imported; then I re-imported the same file, it went to 30% in less than a second, then got stuck at 56%. I was able to import in 3-4 tries. I know this isn't the right way, but I've got no choice right now. Could the PHP upload limit be the culprit? By default it allows uploads of only 2 MB and my file is 5 MB.
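The upload-limit theory is easy to sanity-check: compare the file's size against PHP's `upload_max_filesize` (2M is the stock php.ini default). A sketch, with the 5 MB file simulated since we don't have the real one:

```shell
# Stock php.ini default for upload_max_filesize is 2M (assumption: the
# server has not overridden it).
limit=$((2 * 1024 * 1024))

# Simulate the 5 MB CSV from the comment above.
head -c $((5 * 1024 * 1024)) /dev/zero > big.csv

size=$(wc -c < big.csv | tr -d ' ')
if [ "$size" -gt "$limit" ]; then
  echo "big.csv ($size bytes) exceeds the $limit-byte upload limit"
fi
```

If that's the case, raising `upload_max_filesize` (and `post_max_size`, which must be at least as large) in php.ini is the usual fix.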
Comment #24
j0rd CreditAttribution: j0rd commented
I would recommend using a drush script to import. I posted enzo's sandbox in #19 and have patched it for my needs; I believe I posted those patches in his sandbox. There's also a Feeds variable that needs to be updated, which limits an import to a certain number of items, if you're using that script.
I believe anyone who's doing this via the web is running into PHP execution-time problems, memory-limit problems, or whatever other problems, which can be avoided by using the CLI via drush (provided your CLI has high limits).
I imported a couple thousand items 2-3 days ago without issue using that script. My CSV for that import had many strange UTF-8 characters, and as long as you make sure your encoding is OK, you will not have problems.
Comment #25
j0rd CreditAttribution: j0rd commented
I've .tgz'd up enzo's sandbox with my patches for ease of use. I also wrote a small README.txt in the folder for those who are interested. I know there are other drush feed importers, but I've been using this one for about a year and it works fine for my needs.
Comment #26
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
See #2866431: 500 Internal Server Error or #1219296: Partial import then ajax message - 500 error.