The module leaves ugly markers in the output. You can use the change below, or better yet, change the manual pagebreak marker to an HTML comment, which does no harm.
At the bottom of paging_nodeapi('view'):

// Inexplicably, the constant for MANUAL_BREAK is a translated string.
elseif ($paging == 1) {
  $node->content['body']['#value'] = str_replace('[ pagebreak ]', '', $node->content['body']['#value']);
}
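As a sketch of a more tolerant variant (assuming, as the maintainer notes below, that the marker may contain arbitrary whitespace inside the brackets), the fixed-string str_replace() could be swapped for a preg_replace(); the $body variable here is only illustrative, not part of the module:

```php
<?php
// Hypothetical sketch: strip the manual pagebreak marker while
// tolerating arbitrary whitespace inside the brackets, so that
// '[pagebreak]', '[ pagebreak ]' and '[  pagebreak  ]' are all removed.
$body = 'Page one.[ pagebreak ]Page two.[pagebreak]Page three.';
$body = preg_replace('/\[\s*pagebreak\s*\]/', '', $body);
echo $body;
?>
```

This prints the body text with every marker variant removed, whereas the str_replace() above only catches the exact string '[ pagebreak ]'.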
Comment | File | Size | Author
---|---|---|---
#5 | ch10text.zip | 138.16 KB | adam_b
Comments
Comment #1
mundanity commented:
Hi Moshe,
Not using HTML comments was a design decision, although I'm not particularly attached to the method I'm using. I didn't want users to have to enable a filter as part of the setup, but over time I'm coming to think that would be a small price to pay.
The [ pagebreak ] marker needs to be more flexible than a fixed string (it must tolerate random whitespace, for instance), so the module uses regexes to find matches (and, sadly, to strip the values out afterwards). If you have an example where the marker still shows in the output, please set up a demo I can take a look at.
Comment #2
mundanity commented.
Comment #3
adam_b commented:
You can find a sample (at the moment; it may not be around for long) at
http://cpag.adam-browne.com/node/4785?page=show
The first example is:
[ header = Who can claim housing benefit ]
Comment #4
mundanity commented:
Hi Adam,
Hmm, I can't reproduce this in my test environment, could you send me your raw text for that article? Thanks for your help!
Comment #5
adam_b commented:
Here's the text. The input format is Full HTML, and I'm using the WYSIWYG module for input where necessary. I'm also using:
- the Footnotes module, but I don't think there are any here
- the Glossary module normally, but it times out with this much text, so I've turned it off for the moment
I've just noticed on extracting the text that the header has enclosed itself in <p> tags:
<p>[ header = Who can claim housing benefit ]</p>
Maybe this is why? Perhaps it comes from the WYSIWYG filter? If so, maybe I need to go the <h3> route.
Comment #6
mundanity commented:
Hi Adam,
Thanks for the attachment. Version 1.4 and above can handle the <p> tags without issue, so in theory it should be parsing fine. I'll take a look at the text you sent and figure out why it isn't.
Comment #7
mundanity commented:
Hi Adam,
Just an update on this, I was unable to reproduce the issue with the text you sent me, but I'll try a few more options with your configuration to see if I can track it down. Are you seeing this on all your nodes, or just this specific one?
Comment #8
adam_b commented:
I'm afraid I've only experimented with it on the one node, so that's the only example.
Comment #9
mundanity commented:
Hi Adam,
Unfortunately I'm still unable to reproduce this. Which footnote filters are you using? (I doubt that's the issue, but I'd like to replicate your environment as closely as possible.) Also, in what order are your filters arranged?
Comment #10
mundanity commented:
Hi Adam,
Unfortunately I was never able to reproduce this. I did notice some memory issues due to the size of the text you provided, and I wonder whether that contributed to the text not being parsed. If you can get me any more information on this, please feel free to re-open the issue.
Comment #11
adam_b commented:
Hi. Sorry I wasn't able to provide more information, but (a) the test node now causes a PHP memory error when I try to edit it, and (b) we're changing the data structure so that long articles will be broken into separate nodes rather than using pagebreaks. Since nobody else seems to have had the problem, I agree that you're right to close it. Thanks for the attempts to help.