Problem/Motivation

First, a little bit of historical context

Even before shipping Drupal 8.0.0, and before the Drupal 8 ecosystem started adding functionality to Drupal core's REST/API-First support, it was already clear that the Symfony Serialization component we use is severely flawed in Drupal's plug-and-play model: #2575761: Discuss a better system for discovering and selecting normalizers.

When Drupal 8's REST module was at the height of its initial development, in 2013, the HAL normalization seemed to be the future. Five years later, it is still an IETF spec draft (iteration 8), last updated in May 2016.
Besides HAL, we also support a "default" normalization, which is mostly the same, but without any hypermedia metadata.

Since then, JSON API has risen in popularity, reaching 1.0 in mid-2015 … i.e. after Drupal 8 was frozen and only critical bugfixes could still go in. A contrib module has been developed and is quickly gaining popularity: https://www.drupal.org/project/jsonapi. It offers a far better DX than core's REST module, because it solves many more problems (including collections, filtering, sorting, pagination and sparse fieldsets).

How does this relate to the issue title?

For many field types (@FieldType plugins), the normalizations of fields of those types were, at the time Drupal 8 shipped, either:

  1. incomplete, e.g. for @FieldType=text fields, the processed text wasn't available.
  2. not-so-great DX, e.g. for @FieldType=timestamp fields, a UNIX timestamp was returned, which led to bug reports about how to interpret it.

We worked on addressing those problems. We followed the example of the existing normalizers, which led to, for example, #2768651: Let TimestampItem (de)normalize to/from RFC3339 timestamps, not UNIX timestamps, for better DX (all the others are still in progress). Problem solved!

Different formats structure their normalizations differently

This may sound obvious, but it is not.

  1. For the "default" normalization (provided by the serialization module for the json and xml formats), no top-level metadata keys are added the the normalization.
  2. For the "HAL" normalization (hal module for the hal_json format), top-level _links and _embedded keys are added.
  3. For the "JSON API" normalization (jsonapi module for the api_json format), everything is shifted two levels under a data and then attributes key, except for the UUID, which is relabeled to id and sits below data, next to attributes, and entity reference fields are shifted to a relationships key under data, next to attributes.

As you can imagine, this requires the field-level normalization to be implemented differently for each format.

Which finally brings us to the problem that this issue aims to solve: the normalizer added in #2768651: Let TimestampItem (de)normalize to/from RFC3339 timestamps, not UNIX timestamps, for better DX works for both the "default" normalization and the "HAL" normalization … because those happen to be similar enough. It doesn't work for the "JSON API" normalization. Which means that a contrib module developer would have to do the work:

  1. once for "default" + "HAL" (likely to happen because in core)
  2. again for "JSON API" (less likely to happen because in contrib)
  3. again for new contrib/custom normalizations (same)

Nobody had ever given this much thought. In fact, I even blamed the JSON API module for this at #2860350-14: Document why JSON API only supports @DataType-level normalizers.

Strengthening API-First Drupal

If we continue along the path we're on right now, contrib modules will keep adding @FieldType-level normalizers, with the problems described above. Which means that new/contrib normalizations are always going to lag very far behind core's normalizations. Which means some data will simply not be accessible.

Now, if we instead implement @DataType-level normalizers (in our example: \Drupal\Core\TypedData\Plugin\DataType\Timestamp aka @DataType=timestamp instead of \Drupal\Core\Field\Plugin\Field\FieldType\TimestampItem aka @FieldType=timestamp), then non-core normalizations would automatically support those field types as well.

99% of all field items (@FieldType-level) behave the same; it's only their property values (@DataType-level) that really need normalizers. The primary exception to that rule is @FieldType=entity_reference, because it's what determines relationships/hyperlinks, and almost every normalization has its own way of dealing with those.

IOW:

  1. have format-specific high-level normalizers (at entity, field item list, field item levels)
  2. have generic (formatless) @DataType normalizers (at property value level)
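
To make that split concrete, here is a minimal sketch of the difference, assuming the serialization module's NormalizerBase and the core timestamp classes; the two *Sketch class names are hypothetical and are not core's actual normalizers. The only real difference is which class the normalizer declares support for, but that choice decides whether every normalization (default, HAL, JSON API, future contrib formats) can reuse it.

<?php

use Drupal\Core\Field\Plugin\Field\FieldType\TimestampItem;
use Drupal\Core\TypedData\Plugin\DataType\Timestamp;
use Drupal\serialization\Normalizer\NormalizerBase;

// @FieldType-level: tied to the field item, so it only fits normalizations
// whose field structure happens to match the one it assumes.
class TimestampItemNormalizerSketch extends NormalizerBase {

  protected $supportedInterfaceOrClass = TimestampItem::class;

  public function normalize($field_item, $format = NULL, array $context = []) {
    // Has to assume how the surrounding field structure looks.
    return ['value' => date(\DateTime::RFC3339, (int) $field_item->value)];
  }

}

// @DataType-level: normalizes only the property value, so any format's
// field item / field item list normalizers can delegate to it.
class TimestampNormalizerSketch extends NormalizerBase {

  protected $supportedInterfaceOrClass = Timestamp::class;

  public function normalize($data, $format = NULL, array $context = []) {
    return date(\DateTime::RFC3339, $data->getCastedValue());
  }

}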

Proposed resolution

  1. Never implement @FieldType-level normalizers, only implement @DataType-level normalizers.
  2. Enforce this via test coverage: make the test discover all normalizer services and check the classes/interfaces each one supports; those may not be field types.
  3. Exempt the entity reference field type.
  4. This test should run for both core and contrib modules, which means some contrib modules' automated tests could start failing. This is necessary to nudge them to transition to @DataType-level normalizers.
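
A rough sketch of the check such a test (or a runtime assertion) could perform, assuming the core field interfaces; the helper name is illustrative, and $supported_by_normalizer stands in for the classes/interfaces a discovered normalizer service declares support for:

<?php

use Drupal\Core\Field\FieldItemInterface;
use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Field\Plugin\Field\FieldType\EntityReferenceItem;

// TRUE if a class/interface supported by a normalizer is @FieldType-level.
function _normalizer_supports_field_type($name) {
  return in_array($name, [FieldItemInterface::class, FieldItemListInterface::class], TRUE)
    || is_subclass_of($name, FieldItemInterface::class, TRUE)
    || is_subclass_of($name, FieldItemListInterface::class, TRUE);
}

// For every discovered normalizer service, flag @FieldType-level support,
// except for the exempted entity reference field type.
foreach ((array) $supported_by_normalizer as $name) {
  if (_normalizer_supports_field_type($name) && $name !== EntityReferenceItem::class) {
    @trigger_error('@FieldType-level normalizers are discouraged; implement a @DataType-level normalizer instead.', E_USER_DEPRECATED);
  }
}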

Remaining tasks

  1. Write a test that detects @FieldType-level normalizers and fails. This should fail in HEAD, because HEAD currently contains a @FieldType-level timestamp normalizer.
  2. Fix #2926508: Add DateTimeNormalizer+TimestampNormalizer, deprecate TimestampItemNormalizer: @DataType-level normalizers are reusable by JSON API, which should make the test pass.

User interface changes

None.

API changes

None.

Data model changes

None.


Comments

Wim Leers created an issue. See original summary.

Wim Leers’s picture

Issue summary: View changes

For context, this issue was created because I didn't even realize this was a problem. In fact, I blamed the JSON API module at #2860350-14: Document why JSON API only supports @DataType-level normalizers for being clearly buggy, because it wasn't picking up the improved normalization that #2768651: Let TimestampItem (de)normalize to/from RFC3339 timestamps, not UNIX timestamps, for better DX added.

@e0ipso explained at #2860350-21: Document why JSON API only supports @DataType-level normalizers why it worked this way; I didn't fully understand him at first (comment 23). Gradually, my understanding became clearer, and for the first time I was grateful for the painfully slow rate of progress in fixing the normalizations of multiple field types … which, thanks to him, I realized should be fixed at the data type level instead. I summarized my understanding plus the current status in #2860350-33: Document why JSON API only supports @DataType-level normalizers.

Wim Leers’s picture

I'm happy to report that #2926508: Add DateTimeNormalizer+TimestampNormalizer, deprecate TimestampItemNormalizer: @DataType-level normalizers are reusable by JSON API has proven to work. In fact, it will solve another issue at the same time: the normalization of datetime and daterange fields (#2867206: [PP-1] API-First DX of "date" and "date range" fields: add 'datetime_iso8601' @DataType (de)normalizer using RFC3339 timestamps with timezones) … thanks to the fact that we're fixing this at the @DataType level!

I first moved the normalization logic from \Drupal\Core\Field\Plugin\Field\FieldType\TimestampItem to \Drupal\Core\TypedData\Plugin\DataType\Timestamp, and then realized (see #2926508-18: Add DateTimeNormalizer+TimestampNormalizer, deprecate TimestampItemNormalizer: @DataType-level normalizers are reusable by JSON API) that Timestamp is declared as class Timestamp extends IntegerData implements DateTimeInterface, which means I might as well fix it a level up in the interface tree: at the DateTimeInterface level.
Doing that means it also automatically supports \Drupal\Core\TypedData\Plugin\DataType\DateTimeIso8601, which is declared as class DateTimeIso8601 extends StringData implements DateTimeInterface. And by supporting that @DataType, the datetime and daterange fields work automatically!
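
In other words, a single @DataType-level normalizer keyed to that interface covers both data types. A minimal sketch, assuming the serialization module's NormalizerBase; the class name is hypothetical, and #2926508 contains the real implementation:

<?php

use Drupal\Core\TypedData\Type\DateTimeInterface;
use Drupal\serialization\Normalizer\NormalizerBase;

class DateTimeInterfaceNormalizerSketch extends NormalizerBase {

  // Supporting the interface means every data type implementing it
  // (timestamp, datetime_iso8601, …) gets the same normalization.
  protected $supportedInterfaceOrClass = DateTimeInterface::class;

  public function normalize($datetime, $format = NULL, array $context = []) {
    // DateTimeInterface::getDateTime() returns a DrupalDateTime, which we
    // render as an RFC3339 date string.
    return $datetime->getDateTime()->format(\DateTime::RFC3339);
  }

}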

Wim Leers’s picture

I'll work on a patch that provides the test described in the issue summary later this week.

Wim Leers’s picture

It'd be great if Drupal 8.5 shipped with this, to discourage in code what I've already been actively discouraging in issue comments and discussions.

I'll be working on a patch for this shortly.

Wim Leers’s picture

While working on this, I learned something new about the Serialization module: there's a NullNormalizer, which allows one to blacklist arbitrary classes encountered during normalization!
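
A hedged example of what that looks like, modeled on how core's user module hides password field items. In practice the normalizer is registered in a module's *.services.yml with the normalizer tag; it is instantiated directly here only to show the effect, and I'm assuming the constructor takes the class/interface to blacklist:

<?php

use Drupal\Core\Field\Plugin\Field\FieldType\PasswordItem;
use Drupal\serialization\Normalizer\NullNormalizer;

// Anything that is an instance of the given class/interface is "normalized"
// to NULL, i.e. effectively blacklisted from the output.
$normalizer = new NullNormalizer(PasswordItem::class);
// $normalizer->supportsNormalization($password_item) returns TRUE;
// $normalizer->normalize($password_item) returns NULL.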

Also: I started by writing the change record: https://www.drupal.org/node/2934779 :)

Wim Leers’s picture

Status: Active » Needs review
Issue tags: +@deprecated
FileSize
6.04 KB

Status: Needs review » Needs work

The last submitted patch, 8: 2926507-8.patch, failed testing. View results
- codesniffer_fixes.patch Interdiff of automated coding standards fixes only.

Wim Leers’s picture

Assigned: Wim Leers » Unassigned
Status: Needs work » Needs review
FileSize
3.13 KB
6.29 KB

5 coding standards violations, telling us to create use statements. That'll be less clear, but sure, let's do that.

And most importantly:

00:17:45.777 Fatal error: Cannot declare class Drupal\Tests\serialization\Functional\EntityResource\NoFieldTypeNormalizersTest, because the name is already in use in /var/www/html/core/modules/serialization/tests/src/Kernel/NoFieldTypeNormalizersTest.php on line 0

… probably because I got the namespace wrong 🤦‍♂️

Wim Leers’s picture

Green patch, hurray! Would be very very valuable to have this ship in 8.5.0 :)

gabesullice’s picture

Status: Needs review » Needs work
  1. +++ b/core/modules/serialization/src/Normalizer/NormalizerBase.php
    @@ -97,4 +106,48 @@ protected function addCacheableDependency(array $context, $data) {
    +      // being normalized. By default, this is applied only  tothe 'password'
    

    s/ tothe/ to the/g

  2. +++ b/core/modules/serialization/src/Normalizer/NormalizerBase.php
    @@ -97,4 +106,48 @@ protected function addCacheableDependency(array $context, $data) {
    +      return in_array($name, [FieldItemInterface::class, FieldItemListInterface::class], TRUE)
    +        || is_subclass_of($name, FieldItemInterface::class, TRUE)
    +        || is_subclass_of($name, FieldItemListInterface::class, TRUE);
    

    This feels like it's trying to win the "most compact if-statement award".

  3. +++ b/core/modules/serialization/src/Normalizer/NormalizerBase.php
    @@ -97,4 +106,48 @@ protected function addCacheableDependency(array $context, $data) {
    +    array_walk($supported, function ($name) use ($normalizer_class, $allowed_exceptions, $is_disallowed) {
    

    I am the biggest sinner when it comes to this, but a foreach would be much more clear here.

gabesullice’s picture

+++ b/core/modules/serialization/tests/src/Kernel/NoFieldTypeNormalizersTest.php
@@ -0,0 +1,49 @@
+    $listing = new ExtensionDiscovery(\Drupal::root());

$this->root should work inside KernelTestBase

Wim Leers’s picture

Status: Needs work » Needs review
FileSize
2.56 KB
6.21 KB

#12:

  1. 🤦‍♂️ — fixed!
  2. Lots of these in core. Grep for ||, for example. Some more were just added in #2933991: Deprecation tests fail when all PHPUnit tests are run via PHPUnit. So … not sure what you're suggesting?
  3. 👍, done!

#13: hah, great catch!

gabesullice’s picture

Status: Needs review » Reviewed & tested by the community

#14.2 When these get a little too "compact" I try to break them apart into variables that are named according to what they "prove," like $is_item or $is_item_list. Then you can write return $is_item || $is_item_list;. However, I just tried to do that for this particular case and it really didn't add anything in the way of legibility so /shrug.

LGTM now :)

Wim Leers’s picture

Oh, that makes sense! That's definitely a best practice, but here it'd likely add little.

Thanks for paying attention to details like that! 👍

larowlan’s picture

Status: Reviewed & tested by the community » Needs review
+++ b/core/modules/serialization/src/Normalizer/NormalizerBase.php
@@ -97,4 +106,48 @@ protected function addCacheableDependency(array $context, $data) {
+      // TextItemSillyNormalizer is a test-only normalizer.
+      TextItemSillyNormalizer::class,

So the fact we've got to reference this here perhaps indicates that we might need a container parameter for this list - or some sort of API.

For example, the entity reference revisions normalizer in contrib is also a special case that does lookup/resolution to set the revision ID.

Similarly, Entity Pilot provides an ER normalizer that works with unsaved entities in its 'preview' feature. It can do so because it has an alternate implementation of the entity resolver that can handle unsaved entities.

Can these be removed and replaced with @DataType-level normalizers? If not, how do they go about white-listing themselves?

And then if they do, how do we stop the floodgates?

Wim Leers’s picture

Status: Needs review » Reviewed & tested by the community
FileSize
2.93 KB
6.54 KB

Great feedback!

  1. EntityReferenceRevisionsFieldItemList (http://cgit.drupalcode.org/entity_reference_revisions/tree/src/EntityReferenceRevisionsFieldItemList.php?h=8.x-1.x) at minimum needs serious cleaning up: referencedEntities() is 95% identical, with several of the bugfixes in core not having been ported (for example #2404331: Can't serialise an entity with an entity reference to an unsaved entity (remove EntityReferenceItem::$NEW_MARKER in favour of a validation constraint), which you worked on!), and processDefaultValue() is 100% identical. This is risky/dangerous.
  2. RE: Can this be replaced with a @DataType-level normalizer? As explained in the IS, entity reference fields are the exception. I even said this in the proposed resolution:
    1. Exempt the entity reference field type.

    … and then I failed to do so in the actual patch!

    Fixed :)

Berdir’s picture

Status: Reviewed & tested by the community » Needs review

Yeah, also very unsure about this.

ERR is a good example: there's the current implementation, but there are also issues to add more logic, for example to embed composite entities like paragraphs. That's IMHO logic that belongs on the field level, and it's also format-specific anyway, as it basically recursively normalizes the entity using the HAL serialization; I doubt that another format would just work.

We can document that, when possible, it should be done in a way that works everywhere, but completely deprecating it? I'm not convinced that makes sense.

Another example is https://www.drupal.org/project/block_field/issues/2882727: how should that be done on a lower level? It's just a standard map property; we only want to act on block_field fields, not everything. (That is a workaround until core supports that natively, but still.)

Wim Leers’s picture

Yeah, also very unsure about this.

ERR is a good example: there's the current implementation, but there are also issues to add more logic, for example to embed composite entities like paragraphs. That's IMHO logic that belongs on the field level, and it's also format-specific anyway, as it basically recursively normalizes the entity using the HAL serialization; I doubt that another format would just work.

We can document that, when possible, it should be done in a way that works everywhere, but completely deprecating it? I'm not convinced that makes sense.

Based on this, I think you didn't read #18's interdiff, because that specifically addressed this?

As the #18 patch, the #18 interdiff, the #18.2 comment and the detailed rationale in the issue summary all state: yes, entity reference fields are special snowflakes, because they are what express relationships between different pieces of data stored in Drupal, and each format represents relationships differently. That's why we're adding this test to actively discourage adding more @FieldType-level normalizers, except for entity reference fields!

Another example is https://www.drupal.org/project/block_field/issues/2882727: how should that be done on a lower level? It's just a standard map property; we only want to act on block_field fields, not everything. (That is a workaround until core supports that natively, but still.)

This is exactly an example of why solving things at a lower level results in things working for everything. That issue indeed initially set out to only change the normalization for @FieldType=block_field. And the only thing that that patch (#2882727-8: Add custom HAL normalizer) does is rely on the default normalization for all properties:

+    // Let HAL field item normalizer do the initial normalization.
+    $data = parent::normalize($field_item, $format, $context);

followed by the only logic it really intends to add, to deal with BlockFieldItem's settings property, which is of @DataType=map:

+    // The block's settings aren't normalized. Add them here.
+    $data[$field_item->getParent()->getName()][0]['settings'] = $field_item->settings;

As @JamesK points out at #2882727-13: Add custom HAL normalizer, there is no more need for the @FieldType=block_field normalizer once there is a @DataType=map normalizer, which is being added in #2895532: @DataType=map cannot be normalized, affects @FieldType=link, @FieldType=map.
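
For illustration, a @DataType=map normalizer can be as small as the sketch below (hypothetical class name, not necessarily what #2895532 ends up committing). Once it exists, every field type whose properties use the map data type, block_field included, gets those values normalized in every format:

<?php

use Drupal\Core\TypedData\Plugin\DataType\Map;
use Drupal\serialization\Normalizer\NormalizerBase;

class MapDataNormalizerSketch extends NormalizerBase {

  protected $supportedInterfaceOrClass = Map::class;

  public function normalize($object, $format = NULL, array $context = []) {
    // A map's value is already a plain PHP array, so it can be returned
    // as-is and encoded by whichever format's encoder is in use.
    return $object->getValue();
  }

}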

Wim Leers’s picture

Title: Add test to prevent adding new @FieldType-level normalizers » Discourage @FieldType-level normalizers, encourage @DataType-level normalizers, to strengthen the API-First ecosystem

Also, I see that the issue title was still more aggressive than it should be. That was the original issue title. The CR that I created a few days ago is more nuanced: https://www.drupal.org/node/2934779. I've made the CR title the issue title as well.

Wim Leers’s picture

Also, thanks @Berdir, for asking the critical questions :) 👌

Wim Leers’s picture

Discussed this with @Berdir in IRC:

14:44:04 <WimLeers> berdir: https://www.drupal.org/project/drupal/issues/2926507#comment-12414339
14:44:28 <WimLeers> berdir: thanks for asking critical questions :)
14:46:02 <berdir> WimLeers: yeah, so about "As #2882727-13: Add custom HAL normalizer points out: there is no more need for the @FieldType=block_field normalizer once there is a @DataType=map normalizer, which is being added in #2895532: MapData cannot be serialized .". @DataType=map is great *if* I want to affect all map data properties. but what if I only want to affect the one in my field? What if I only want to affect a string or integer or
14:46:03 <berdir>  or common field type for my field type?
14:46:44 <berdir> WimLeers: will the entity normalizer check also include all subclasses of that?
14:47:03 <WimLeers> berdir: so you're asking "what if I want different normalization behavior for a data type in my field type, i.e. different from the default normalization of that generic data type" — correct?
14:47:12 <berdir> yes
14:47:28 <WimLeers> berdir: then you should specify your own data type that subclasses the existing data type
14:47:47 <WimLeers> berdir: because a data type X should be normalized exactly the same way EVERYWHERE
14:48:05 <WimLeers> berdir: otherwise, it's also impossible to generate JSON Schema/OpenAPI and therefore API docs are impossible
14:48:36 <WimLeers> (just like @DataType=timestamp subclasses @DataType=integer for example)
14:50:32 <berdir> WimLeers: well, \Drupal\Core\TypedData\Plugin\DataType\Timestamp also has different logic, not sure that creating an empty data type jut to make normalization happy makes sense. then other logic might not apply anymore to that, e.g. using that property as a (plugin system) context that then wouldn't have the same type anymore
14:51:06 <WimLeers> berdir: Let's take a step back.
14:51:35 <WimLeers> berdir: Why does a particular generic data type need a different normalization in a particular field type compared to its uses elsewhere?
14:53:04 <berdir> WimLeers: a core example is the current situation with PathItem, as mentioned in that issue I opened in default_content, the fact that we currently normalize the (internal) pid is a bit weird. Lets say I would want to instead expose a UUID (lets assume that would exist for this example) and then the opposite when denormalizing
14:53:14 <berdir> WimLeers: currently that is an integer
14:54:13 <berdir> WimLeers: quite possibly, I don't want to expose the pid property at all and instead want to expose a pid_uuid, but just like in entity references, I don't want to load/deal with that all the time as a typed data element, just for normalization
14:54:13 <WimLeers> berdir: okay
14:54:25 <WimLeers> berdir: thanks for that example
14:54:42 <WimLeers> berdir: It actually ends up illustrating exactly what I wanted to say, but now I can do so with a concrete example :)
14:54:49 <WimLeers> berdir: $properties['pid'] = DataDefinition::create('integer')
14:54:58 <WimLeers> berdir: (from PathItem::propertyDefinitions())
14:55:07 <WimLeers> berdir: so we say that the 'pid' property is an integer
14:55:20 <WimLeers> berdir: it therefore is given the semantics of "@DataType=integer"
14:55:38 <WimLeers> berdir: but now you'e saying that it's not really an integer, it's actually a reference to something
14:55:54 <WimLeers> berdir: and because it's a reference to something, actually UUID might be a better normalization
14:56:59 <WimLeers> berdir: This makes sense! And that's also the exact same reason I am saying it'd be better to create a different DataType for this use case, because then you can not only assign it a different normalizer, you can also clearly communicate the extra meaning/semantics!
14:57:54 <berdir> WimLeers: well, the whole PathItem is the reference, just like EntityReferenceItem, its target_id is also integer or string
15:00:58 <WimLeers> berdir: but EntityReferenceItem does DataReferenceTargetDefinition::create('integer')
15:01:09 <WimLeers> berdir: PathItem does DataDefinition::create('integer')
15:01:46 <WimLeers> berdir: but yes, there are flaws everywhere. That's what this issue is about: addressing oversights that we are only now realizing to be oversights
15:02:31 <berdir> WimLeers: true, but the resulting data type is the same, that's just a different data definition class that you can't address through the type. and the technically, the pid is not actually stored/used as a refernce, the reference does happen through the parent, so the pid is computed on the field item level.. the business logic is there
15:03:35 <WimLeers> berdir: IDK what your point is in your last message?
15:05:13 <berdir> WimLeers: the first part is that you could not do DataType=data_reference_integer or something, it's still just an integer. the second that the current logic of how PathItem works is on the PathItem class not its properties, so imho, related normalization logic should also be tied to that. but let me think about this stuff a bit more, in the middle of something else atm
15:05:39 <WimLeers> berdir: ok
15:06:19 <WimLeers> berdir: agreed with the first — I think \Drupal\Core\Field\Plugin\Field\FieldType\EntityReferenceItem::propertyDefinitions() should also be updated to use more specific data types
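
To make the "subclass the existing data type" suggestion from the chat above concrete, here is a hypothetical sketch (a path_alias_reference data type does not exist in core; it only illustrates giving extra semantics their own @DataType so that a normalizer, and tooling like OpenAPI, can target it):

<?php

// Assumed to live in \Drupal\mymodule\Plugin\DataType for plugin discovery.
use Drupal\Core\TypedData\Plugin\DataType\IntegerData;

/**
 * Hypothetical data type: an integer that references a path alias.
 *
 * @DataType(
 *   id = "path_alias_reference",
 *   label = @Translation("Path alias reference")
 * )
 */
class PathAliasReference extends IntegerData {
}

A field type would then declare its property with DataDefinition::create('path_alias_reference') instead of 'integer', and a single @DataType-level normalizer for that class (for example one exposing a UUID) would apply everywhere the data type is used.
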
dawehner’s picture

I'm wondering whether instead of hardcoding ER fields we could add something like an interface / annotation for you to opt into not warning about custom normalizers. I could imagine that pretty much every field type which potentially has more than one field would need some additional logic, not just on the data type level.

Wim Leers’s picture

I could imagine that pretty much every field type which potentially has more than one field would need some additional logic, not just on the data type level.

Where you said "more than one field", you probably meant "more than one property"?

Berdir’s picture

That was basically my argument as well. If a field type has custom logic (and not its data types), then it is possible that the normalization logic needs something custom too. Wim's argument was that the logic then should be on the data types and custom data types should be created, but I'm not sure about that.

How can we possibly claim that EntityReferenceItem is indeed the only case that's special and nobody else can possibly come up with another valid use case?

And if we add an interface/annotation then we're not really deprecating anything anymore. Those plugins already say they want to work on the FieldType; now they need to add a @YesIReallyReallyWantTo too? :)

I think I still prefer having good documentation about what should be done in 80%/90% of cases, to support as many different formats and serializations as possible, over forcing it.

Wim Leers’s picture

First off: this issue is not about outright forbidding something. This is about nudging all contrib/custom developers in a direction where the normalizers they write work for and with the entire API-First ecosystem where possible:

  • normalizers should work for all formats in the API-First ecosystem where possible: when you have the choice to write a normalizer at the @DataType or @FieldType level, you should always prefer the @DataType level if possible, because that means it'll work for the default normalization (json + xml formats), the HAL normalization (hal_json format) and the JSON API normalization (api_json format) … because the generic field normalizer is enough, since in >90% of cases (I'd say >95%), your field really just contains N properties, and each of those properties needs to be normalized. In those cases, @DataType-level normalizers are sufficient (see the sketch after this list).
  • normalizers should work with the entire API-First ecosystem where possible: if they're written at the @DataType level, then docs-generating modules like OpenAPI can be made aware of the shape of data, and have that work across all field types. If every field type has its own normalizer, it can't. If those @FieldType-level normalizers have perfectly matching normalization behavior, then OpenAPI would still need to know about all those field types (which really just reuse data types 95% of the time). But more likely, they don't have perfectly matching normalization behavior, in which case it's a complete nightmare.
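
As a sketch of the first point above: conceptually, a format's generic field item normalizer only needs to delegate each property to the serializer, which then picks whatever @DataType-level normalizer matches. This is a simplified illustration, not core's FieldItemNormalizer verbatim:

<?php

use Drupal\Core\Field\FieldItemInterface;
use Drupal\serialization\Normalizer\NormalizerBase;

class GenericFieldItemNormalizerSketch extends NormalizerBase {

  protected $supportedInterfaceOrClass = FieldItemInterface::class;

  public function normalize($field_item, $format = NULL, array $context = []) {
    $values = [];
    // Delegate every property to the serializer; the matching @DataType-level
    // normalizer decides how each value is represented, whatever the format.
    foreach ($field_item->getProperties(TRUE) as $name => $property) {
      $values[$name] = $this->serializer->normalize($property, $format, $context);
    }
    return $values;
  }

}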

Sorry for repeating this again, but it feels necessary. It's about those things: nudging towards the choices that strengthen the API-First ecosystem. If we don't do this, we'll always have issues like #2931496: Add serializer for json/xml, where modules implemented one normalizer (in that case: HAL), but not others (default and JSON API), which leads to confusion and frustration for those using Drupal for decoupled use cases. (And yes, I know, that particular issue is an entity reference field, but you get the point.)

Wim Leers’s picture

How can we possibly claim that EntityReferenceItem is indeed the only case that's special and nobody else can possibly come up with another valid use case?

It's the only case we currently know of. We can totally expand this in the future if we find more. Let me quote the IS:

99% of all field items (@FieldType-level) behave the same; it's only their property values (@DataType-level) that really need normalizers. The primary exception to that rule is @FieldType=entity_reference, because it's what determines relationships/hyperlinks, and almost every normalization has its own way of dealing with those.

And if we add an interface/annotation then we're not really deprecating anything anymore.

We're not deprecating any API; we're strongly discouraging a pattern. I think an interface actually can make sense, because that is an explicit signal of "no, really, this is a special field type, it needs its own normalizer, and it'll need that in every normalization!" In fact, such an interface would even allow us to automatically surface to developers that a certain field type only has a normalizer for format X, but not for formats Y and Z!

I think I still prefer having good documentation about what should be done in 80%/90% of cases, to support as many different formats and serializations as possible, over forcing it.

I'd be fine with that if it worked. But which developers read documentation thoroughly? And especially D8's REST docs, which have been historically weak (to put it nicely).

I am okay with postponing this issue and adding docs instead. But I think we really should explore a stronger signal: documentation is passive (the developer has to discover it), while the E_USER_DEPRECATED error I'm throwing in this test is active (the developer will be notified of it). Although not nearly actively enough, because they do need to run this test.

So, that is the question: what do you think is the best way to reach developers of new normalizers in an active way, to give them guidance and nudge them in the direction that strengthens the Drupal API-First ecosystem?

Version: 8.5.x-dev » 8.6.x-dev

Drupal 8.5.0-alpha1 will be released the week of January 17, 2018, which means new developments and disruptive changes should now be targeted against the 8.6.x-dev branch. For more information see the Drupal 8 minor version schedule and the Allowed changes during the Drupal 8 release cycle.

damiankloip’s picture

I agree that it would be really nice if we could push people down this route of using property-level normalizers where possible, but I also share the same concern as berdir (and others) that there are potentially a lot of legitimate cases for field-item-level normalizers too. If we added an interface or annotation, I see the main problem being that a normalizer gets created at the field level, someone sees the warning in their logs, and then does whatever the easy fix is to remedy it, which would be just adding whatever is needed to their normalizer. It is also a good point, though, that seeing the notice will generate more awareness of the potential issues with using normalizers at this level.

So I think we would certainly want good documentation to supplement this. The main concern I would have with the current patch is that, being in the actual normalizer itself, this is more of a runtime check. This will generate a lot of errors in the logs. Could we maybe do a requirements check on the status page instead? That way we wouldn't be generating a ton of duplicate errors. I think an interface or something could still work in conjunction with this.

Wim Leers’s picture

Could we maybe do a requirements check on the status page instead?

That won't trigger deprecation errors, meaning developers won't see it. Status report is meant to be actionable, especially for the site builder/owner. This would not be actionable. This is why I started with pure test coverage, but ended up also doing a runtime check, because otherwise only core developers would ever see the failing test.

Given that, it sounds like we're shifting towards a "documentation-only" approach. I can live with that. It's not ideal, but neither is the current patch.

Everyone +1 to just documenting this in serialization.api.php and rest.api.php?

dawehner’s picture

I wonder whether there is a weaker form of error than deprecation errors. Basically, you know, a way for us to log and gather data so people tell us: "We are using field-level normalizers." Based upon that, we could make an educated decision one day.

Wim Leers’s picture

@dawehner: so basically … more data than what https://www.drupal.org/project/usage/drupal (and hence core's update module) is giving us? That'd be nice, but collecting more data is … very sensitive. I agree it'd be very nice, though!

dawehner’s picture

Yeah, something which reports extended class methods etc. Sure, it would be an opt-in tool, but it could be dramatically useful.

Berdir’s picture

One additional thing that I just figured out is that this only works for *normalizing* … it's not possible to do denormalization at the property level, because \Drupal\serialization\Normalizer\FieldItemNormalizer does that itself.

One more argument why, for now, this doesn't make much sense IMHO.