Tokens made available by the Token Filter module are removed by CKEditor

Any idea?

Thanks for help

Comments

jcisio’s picture

Status: Active » Postponed (maintainer needs more info)

"Tokens" are text and not removed by CKEditor. Could you please explain a bit more?

jvieille’s picture

Update:
Tokens are not stripped when I create a node; they only disappear when I edit it. The tokens are removed by CKEditor at the time the node is edited.
If I set CKEditor to be disabled by default, the tokens stay in the text field. As soon as I switch to "rich text editor", the tokens are removed.

Also, only node tokens are removed; global and user tokens do not seem to be affected.
As it stands, Token Filter does not handle node tokens. I modified the module as discussed here:
https://drupal.org/node/730078#comment-7777011
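As a possible workaround (an assumption, not something suggested in this thread): CKEditor 4 has a `config.protectedSource` setting, an array of regular expressions for source fragments the editor must pass through untouched. A pattern matching Token Filter's `[token type name]` syntax could keep the tokens from being altered when switching to the rich text editor. A minimal sketch:

```javascript
// Hypothetical workaround: protect Token Filter tokens from CKEditor.
// The pattern matches the [token type name] syntax used in the samples
// below (type and name are lowercase words with -, _, : allowed).
const tokenPattern = /\[token\s+[a-z_-]+\s+[a-z0-9:_-]+\]/gi;

// In a CKEditor 4 config file (e.g. ckeditor.config.js) this would be:
//   config.protectedSource.push(tokenPattern);

// Quick check of the pattern against text like the sample in the report:
const sample =
  'le site [token global site-url] ... <br />' +
  '[token node field_campagne_url_pd-formatted]<br />';
const matches = sample.match(tokenPattern);
console.log(matches);
```

Whether `protectedSource` survives the Drupal-side XSS filtering discussed later in this thread is a separate question; this only stops CKEditor's own processing from touching the tokens.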

jcisio’s picture

What do the token and replace text look like?

jvieille’s picture

Before CKEditor messes things up, a sample text looks like this:

Vous pourrez y souscrire en allant sur le site [token global site-url] ou directement &agrave; l&rsquo;aide du lien ci-dessous : &nbsp;<br />
[token node field_campagne_url_pd-formatted]<br />

After:

Vous pourrez y souscrire en allant sur le site https://www.see.asso.fr/ ou directement &agrave; l&rsquo;aide du lien ci-dessous : &nbsp;<br />

The global token [token global site-url] is replaced by its current value, so it has to be entered again to keep its function.
The node token is simply erased.
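A plausible explanation for the different behaviour (an assumption, not confirmed in the thread): global tokens can be resolved without any context, while node tokens need the node object. If token replacement runs outside node rendering (no node available) and clears unresolved tokens, global tokens come back as values while node tokens vanish. A minimal simulation of that behaviour:

```javascript
// Hypothetical simulation of context-dependent token replacement:
// tokens resolvable from the available data are replaced with their
// values; unresolved tokens are cleared (replaced with nothing).
function replaceTokens(text, data) {
  return text.replace(/\[token\s+(\w+)\s+([\w:-]+)\]/g, (full, type, name) => {
    const value = data[type] && data[type][name];
    return value !== undefined ? value : ''; // unresolved token is erased
  });
}

// Global context is always available; node context is missing here,
// as it would be when the filter runs outside the node pipeline.
const data = { global: { 'site-url': 'https://www.see.asso.fr/' } };

const out = replaceTokens(
  'site [token global site-url] lien: [token node field_campagne_url_pd-formatted]',
  data
);
console.log(out);
// The global token becomes the URL; the node token disappears,
// matching the symptom described above.
```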

jcisio’s picture

Status: Postponed (maintainer needs more info) » Active

It looks like the CKEditor XSS Filter (ckeditor_filter_xss) applies the Token Filter, even though it should not.

jvieille’s picture

What does that mean? What can I do?

jvieille’s picture

Is there any fix for this?

jcisio’s picture

Priority: Major » Normal

#5 is a starting point. You'll have to debug the code.

jvieille’s picture

Which one is the culprit, CKEditor or Token Filter?

jvieille’s picture

Status: Active » Closed (won't fix)

OK, I am giving up on the buggy Token Filter module.
Also, this approach does not seem appropriate for rendering token information through Rules (a token inside a token will certainly never work).
Thanks for the directions, they were helpful.