per subject :)
Comment | File | Size | Author |
---|---|---|---|
#9 | FeedsHTTPFetcher.inc_.patch | 3.16 KB | serbanghita |
Comments
Comment #1
alex_b CreditAttribution: alex_b commented
This is a feature request. It would have to be implemented at the fetcher level, in both the file fetcher and the HTTP fetcher.
Comment #2
wayout CreditAttribution: wayout commented
My source only uses compressed feeds :( . Does anyone have a patch for this by any chance? (Sorry, I'm not a programmer.)
Comment #3
wayout CreditAttribution: wayout commented
Comment #4
tomcatuk CreditAttribution: tomcatuk commented
Also interested. I've got access to a feed, but only compressed (zip or gzip).
Comment #5
alex_b CreditAttribution: alex_b commented
The challenge here is going to be detecting whether a resource is compressed and, if so, in which compression format. There is no standard way of doing this. We could try, in that order: the content type from the HTTP header, then the file extension.
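A minimal sketch of the detection order described here (Content-Type header first, then file extension). The function name and the MIME-type map are illustrative assumptions, not code from any Feeds patch:

```php
<?php

// Hypothetical helper: guess the compression format of a fetched resource.
// Checks the Content-Type header first, then falls back to the URL's
// file extension, per the order suggested in comment #5.
function feeds_guess_compression($content_type, $url) {
  $mime_map = array(
    'application/gzip' => 'gzip',
    'application/x-gzip' => 'gzip',
    'application/zip' => 'zip',
  );
  if (isset($mime_map[$content_type])) {
    return $mime_map[$content_type];
  }
  // Fall back to the file extension of the URL path.
  $path = parse_url($url, PHP_URL_PATH);
  $ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
  if ($ext == 'gz') {
    return 'gzip';
  }
  if ($ext == 'zip') {
    return 'zip';
  }
  return FALSE;
}
```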
Comment #6
Anonymous (not verified) CreditAttribution: Anonymous commented
I'm using the custom Feeds XML Parser (found in the issue queue somewhere).
I was able to implement a quick fix to read the gzip, in my FeedsXMLParser.inc:
find:
change it to:
Basically, the Feeds parser will read the $file as 'compress.zlib://http://www.mysite.com/feed.xml.gz'.
This may look strange, but the "compress.zlib://" prefix is actually a PHP stream wrapper that behaves like gunzip().
Another example:
You should find the right line in your Parser.inc to implement this yourself.
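For anyone trying this approach, here is a self-contained illustration of the compress.zlib:// stream wrapper the commenter is relying on; the file path and feed content are made up for the demo:

```php
<?php

// Demonstration of PHP's compress.zlib:// stream wrapper: reading a
// gzip-compressed file through it decompresses the data on the fly.
$data = '<rss><channel><title>Example feed</title></channel></rss>';
$path = sys_get_temp_dir() . '/feed.xml.gz';
file_put_contents($path, gzencode($data));

// A plain read would return the compressed bytes; the wrapper
// transparently gunzips and returns the original XML.
$xml = file_get_contents('compress.zlib://' . $path);
unlink($path);
```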
Comment #7
alex_b CreditAttribution: alex_b commented
This should be implemented at the fetcher/batch level. When a file is uploaded from the client or downloaded from the web, uncompress it right away. Note: of course, let's not uncompress enclosures.
Comment #8
serbanghita CreditAttribution: serbanghita commented
alex_b is right. I've already implemented the code; I'll post a patch or the code here in a day or two.
I will code the same for zip support.
@alex_b, is there any way I can modify this from another module, like feeds_social?
Thanks!
Comment #9
serbanghita CreditAttribution: serbanghita commented
Here is the solution: full support for gzip and zip.
I've modified the getRaw() method in modules/feeds/plugins/FeedsHTTPFetcher.inc.
I've also attached a patch.
Check it out!
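The actual patch is attached above as FeedsHTTPFetcher.inc_.patch; as a rough, standalone sketch of what gzip/zip handling at the getRaw() level can look like (this is an illustration under my own assumptions, not the attached patch; note gzdecode() requires PHP 5.4+):

```php
<?php

// Illustrative decompression step for raw fetched content. Picks the
// format from the URL's file extension and returns the input unchanged
// when no compression is detected.
function feeds_decompress_raw($raw, $url) {
  $path = parse_url($url, PHP_URL_PATH);
  $ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
  if ($ext == 'gz') {
    $decoded = gzdecode($raw);
    return $decoded !== FALSE ? $decoded : $raw;
  }
  if ($ext == 'zip') {
    // ZipArchive can only open files on disk, so buffer the bytes to a
    // temp file and extract the first entry.
    $tmp = tempnam(sys_get_temp_dir(), 'feeds');
    file_put_contents($tmp, $raw);
    $zip = new ZipArchive();
    if ($zip->open($tmp) === TRUE && $zip->numFiles > 0) {
      $raw = $zip->getFromIndex(0);
      $zip->close();
    }
    unlink($tmp);
    return $raw;
  }
  return $raw;
}
```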
Comment #10
tomcatuk CreditAttribution: tomcatuk commented
OK, this might sound like a dumb question, but would I be right in thinking this patch is only intended for compressed XML?
Comment #11
serbanghita CreditAttribution: serbanghita commented
Oh, I get it. I should have used a more general set of messages. Still, the patch works for any other files.
I'll repost the improved code and messages.
Comment #12
tomcatuk CreditAttribution: tomcatuk commented
Thanks serbanghita, I just patched the latest release before uploading it. Initial results are looking great.
Alex, is this likely to make it into the next release?
Comment #13
alex_b CreditAttribution: alex_b commented
#12: Here is what I see as still open:
- Break the decompression functionality out into its own helper method in FeedsImportBatch. Goal: make it available to both the FeedsFileBatch and FeedsHTTPBatch classes.
- Implement decompression for getRaw() and getFilePath() in both the FeedsHTTPBatch and FeedsFileBatch classes.
- Clean up code.
- Add tests.
Comment #14
Michsk CreditAttribution: Michsk commented
Oh my, this would be awesome.
Comment #15
pepej CreditAttribution: pepej commented
Comment #16
pepej CreditAttribution: pepej commented
Comment #17
johnhorning CreditAttribution: johnhorning commented
Did this ever go anywhere? I could use this feature for my Commission Junction product data feed.
Comment #18
ashcin47 CreditAttribution: ashcin47 commented
@serbanghita The patch does not work on Drupal 7. Can somebody please share a solution for Drupal 7?
Comment #19
twistor CreditAttribution: twistor commented
Comment #20
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
This is a duplicate of #2114565: Import from a zip/tar archive.