As I have explained in the issue queues several times, the Search API currently has a fixed set of (scalar) data types, and there are good reasons for that: service classes have to know beforehand all data types that might come at them, so adding new ones in contrib modules is pretty much out of the question.
Recently, though, I have become more and more convinced that something like this is in fact needed, mostly in the context of Geolocation searches, though there are of course other use cases.
The best approach currently seems to me to require new data type definitions to also include one of the default data types as a fall-back. Service classes that don't know a custom data type can then simply use the standard one, even if the stored information might be rather useless in that form. We could also (just coming up with this) allow a special value "complex" as the fall-back (not as a normal data type) to specify that service classes not knowing the type shouldn't even bother indexing the field. Or maybe we could accomplish the same thing by just allowing custom data type definitions to leave out the fall-back.
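To make this concrete, here is a rough sketch of what such a data type definition could look like. The hook name and array keys are purely illustrative, not a finished API; only the idea of a required "fallback" key (or the special "complex" value) comes from the proposal above.

```php
<?php

/**
 * Hypothetical hook defining custom Search API data types.
 *
 * Each definition names one of the fixed scalar types as a fall-back for
 * service classes that don't know the custom type. The special value
 * 'complex' would tell unaware service classes to skip the field entirely.
 */
function mymodule_search_api_data_type_info() {
  return array(
    'geopoint' => array(
      'name' => t('Geographical point'),
      // Unaware service classes index this as a plain string.
      'fallback' => 'string',
    ),
    'location_area' => array(
      'name' => t('Location area'),
      // No sensible scalar representation: unaware backends should skip it.
      'fallback' => 'complex',
    ),
  );
}
```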
Hm, thinking about this, we could just require service classes to declare the custom data types they support via special "search_api_type_$type" features in supportsFeature(). That way, the framework code could take care of substituting the fall-back data type, and we wouldn't have to change the API contract for service classes (making this only an API addition, I guess).
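The framework-side substitution could then look roughly like the following. Everything except supportsFeature() and the "search_api_type_$type" feature naming is an illustrative assumption, not existing Search API code.

```php
<?php

/**
 * Sketch: resolve the data type to actually use for a given service class.
 *
 * Built-in scalar types pass through unchanged. For a custom type, the
 * service class is asked via supportsFeature("search_api_type_$type")
 * whether it knows the type natively; if not, the declared fall-back is
 * substituted. A 'complex' (or missing) fall-back yields NULL, meaning
 * "don't index this field at all". Function and parameter names are
 * hypothetical.
 */
function mymodule_resolve_data_type($type, $service, array $custom_types) {
  if (!isset($custom_types[$type])) {
    // One of the fixed scalar types: nothing to do.
    return $type;
  }
  if ($service->supportsFeature("search_api_type_$type")) {
    // The service class handles this custom type itself.
    return $type;
  }
  $fallback = isset($custom_types[$type]['fallback'])
    ? $custom_types[$type]['fallback']
    : 'complex';
  return $fallback === 'complex' ? NULL : $fallback;
}
```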
Attached patches:
- #19: 1260834--custom-data-types-19.patch (18.43 KB) by drunken monkey
- #17: 1260834--custom-data-types-17.patch (15.75 KB) by drunken monkey
- #10: 1260834--custom-data-types-10.patch (10.91 KB) by drunken monkey
- #9: 1260834--custom-data-types-9.patch (4.51 KB) by drunken monkey