Rework traversal strategy in Policies#applyMerges. (#5880)
Previously, I thought it was important to process any child object fields
with custom merge functions before processing their parents, which
required a post-order traversal of the result tree (in other words,
calling policies.applyMerges recursively on incoming.__value *before*
invoking the custom merge function). I have come to realize this was a
mistake, for a few different and somewhat subtle reasons.
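The difference between the two strategies can be sketched abstractly. This is a simplified illustration, not the real applyMerges signature: `TreeNode` and `visit` are hypothetical stand-ins for a result-tree value and a field's merge step.

```typescript
// Post-order (the old strategy): children are processed before their
// parent, so the parent's merge step sees already-transformed children.
// Pre-order (the new strategy): the parent's merge step runs first, on
// raw incoming data, and recursion into children happens afterwards.
type TreeNode = { name: string; children: TreeNode[] };

function postOrder(node: TreeNode, visit: (n: TreeNode) => void): void {
  node.children.forEach(child => postOrder(child, visit)); // children first
  visit(node); // parent visited last
}

function preOrder(node: TreeNode, visit: (n: TreeNode) => void): void {
  visit(node); // parent visited first, with children still untouched
  node.children.forEach(child => preOrder(child, visit));
}
```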

First, a merge function should be able to return merged data in any format
(specified by the TExisting type), but it should also be able to assume
the incoming data has not been modified by merge functions. Reasoning
about data types is much simpler if the incoming data is always just the
type of data returned by the GraphQL server/schema (StoreValue, most
generally), and the existing type is always some other type chosen by the
developer (TExisting), possibly but not necessarily the same as the schema
type (for example, an indexed lookup table, or an ordered history of
results). Calling policies.applyMerges recursively *before* invoking the
merge function violated this assumption, because nested incoming data
could be turned into TExisting data before the merge function saw it.
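For example (a hypothetical field policy sketch, not code from this commit), a merge function might maintain an id-indexed lookup table as its TExisting type, while relying on incoming always being the raw array shape delivered by the server:

```typescript
// Hypothetical merge function: existing data is a lookup table keyed by
// id (the developer-chosen TExisting type), while incoming data is
// always the raw array shape returned by the GraphQL server (the schema
// type). The names Comment/CommentTable/mergeComments are illustrative.
type Comment = { id: string; text: string };
type CommentTable = Record<string, Comment>;

function mergeComments(
  existing: CommentTable | undefined,
  incoming: Comment[], // assumed to be unmodified server data
): CommentTable {
  const table: CommentTable = { ...existing };
  incoming.forEach(comment => {
    table[comment.id] = comment;
  });
  return table;
}
```

This only works reliably if incoming is guaranteed to still be the schema type; under the old post-order traversal, a child merge function could already have replaced nested values with TExisting-shaped data.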

Note: this is not just a case of trying to make the type system happy,
because TypeScript's flexible type inference isn't powerful enough to
provide any reliable warnings or errors. A stronger static type system
would make enforcement easier, but even in a completely dynamic language
this kind of type-level reasoning would still be important, because it has
actual semantic implications.

Second, there's no universal way for the cache to merge arrays
automatically (should it replace, concatenate, and/or deduplicate? what if
the other value is not an array?), so blindly calling policies.applyMerges
on array values before the merge function had a chance to process them was
a risky plunge. This commit severs the false linkage between items in the
existing and incoming arrays that merely happen to share an index.
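To illustrate why no single default is safe, two fields can legitimately want opposite array behaviors. Both merge functions below are hypothetical sketches, not Apollo APIs:

```typescript
// A paginated list field wants new pages appended to existing ones...
function concatMerge<T>(existing: T[] | undefined, incoming: T[]): T[] {
  return existing ? [...existing, ...incoming] : incoming;
}

// ...while a "current state" field wants wholesale replacement.
function replaceMerge<T>(_existing: T[] | undefined, incoming: T[]): T[] {
  return incoming;
}
```

Only the field's author knows which behavior is right, so the cache must hand the raw incoming array to the custom merge function rather than merging array items pairwise by index.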

Third, I have been hoping to provide an options.merge helper function to
facilitate forced merges of unidentified data, like { ...existing,
...incoming } but with appropriate regard for custom merge functions.
Unfortunately, because already-merged data could end up being merged again,
all merge functions would have needed to be idempotent (a useful quality for
most merge functions to have, but not always possible). I feared that
implementing options.merge was going to require a new field function in
addition to read and merge (like merge but taking two TExisting values and
merging them idempotently), but thankfully the changes in this commit
allow us to implement that functionality with just read and merge.
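The idempotency concern is easy to see with a concatenating merge function (again a hypothetical sketch): applying it to its own output does not leave the result unchanged.

```typescript
// Concatenation is not idempotent: merging the same incoming data into
// already-merged data duplicates items. A helper that might re-run merge
// functions over previously merged results would therefore be unsafe for
// merge functions like this one.
function concatMerge(
  existing: string[] | undefined,
  incoming: string[],
): string[] {
  return existing ? [...existing, ...incoming] : incoming;
}

const merged = concatMerge(["a"], ["b"]);    // ["a", "b"]
const remerged = concatMerge(merged, ["b"]); // ["a", "b", "b"] — item duplicated
```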
benjamn authored Jan 30, 2020
1 parent 4da32e7 commit 87b9fd2
Showing 1 changed file with 69 additions and 65 deletions.
134 changes: 69 additions & 65 deletions src/cache/inmemory/policies.ts
@@ -576,78 +576,82 @@ export class Policies {
     if (isFieldValueToBeMerged(incoming)) {
       const field = incoming.__field;
       const fieldName = field.name.value;
-      const policy = policies.getFieldPolicy(
+      // This policy and its merge function are guaranteed to exist
+      // because the incoming value is a FieldValueToBeMerged object.
+      const { merge } = policies.getFieldPolicy(
         incoming.__typename, fieldName, false);
 
-      // The incoming data can have multiple layers of nested objects, so we
-      // need to handle child merges before handling parent merges. This
-      // post-order traversal also ensures that the incoming data passed to
-      // parent merge functions never contains any FieldValueToBeMerged
-      // objects for fields within child objects.
-      const applied = policies.applyMerges(
-        existing,
-        incoming.__value as T,
-        getFieldValue,
-        variables,
-        storageKeys,
-      );
-
-      const merge = policy && policy.merge;
-      if (merge) {
-        // If storage ends up null, that just means no options.storage object
-        // has ever been created for a read function for this field before, so
-        // there's nothing this merge function could do with options.storage
-        // that would help the read function do its work. Most merge functions
-        // will never need to worry about options.storage, but if you're
-        // reading this comment then you probably have good reasons for
-        // wanting to know esoteric details like these, you wizard, you.
-        const storage = storageKeys
-          ? policies.storageTrie.lookupArray(storageKeys)
-          : null;
-
-        return merge(existing, applied, {
-          args: argumentsObjectFromField(field, variables),
-          field,
-          fieldName,
-          variables,
-          policies,
-          isReference,
-          toReference: policies.toReference,
-          readField<T>(
-            nameOrField: string | FieldNode,
-            foreignObjOrRef: StoreObject | Reference,
-          ) {
-            // Unlike options.readField for read functions, we do not fall
-            // back to the current object if no foreignObjOrRef is provided,
-            // because it's not clear what the current object should be for
-            // merge functions: the (possibly undefined) existing object, or
-            // the incoming object? If you think your merge function needs
-            // to read sibling fields in order to produce a new value for
-            // the current field, you might want to rethink your strategy,
-            // because that's a recipe for making merge behavior sensitive
-            // to the order in which fields are written into the cache.
-            // However, readField(name, ref) is useful for merge functions
-            // that need to deduplicate child objects and references.
-            return policies.readField<T>(
-              foreignObjOrRef,
-              nameOrField,
-              getFieldValue,
-              variables,
-            );
-          },
-          storage,
-          invalidate() {
-            if (storage) {
-              policies.fieldDep.dirty(storage);
-            }
-          },
-        }) as T;
-      }
-
-      return applied;
+      // If storage ends up null, that just means no options.storage object
+      // has ever been created for a read function for this field before, so
+      // there's nothing this merge function could do with options.storage
+      // that would help the read function do its work. Most merge functions
+      // will never need to worry about options.storage, but if you're reading
+      // this comment then you probably have good reasons for wanting to know
+      // esoteric details like these, you wizard, you.
+      const storage = storageKeys
+        ? policies.storageTrie.lookupArray(storageKeys)
+        : null;
+
+      incoming = merge(existing, incoming.__value, {
+        args: argumentsObjectFromField(field, variables),
+        field,
+        fieldName,
+        variables,
+        policies,
+        isReference,
+        toReference: policies.toReference,
+        readField<T>(
+          nameOrField: string | FieldNode,
+          foreignObjOrRef: StoreObject | Reference,
+        ) {
+          // Unlike options.readField for read functions, we do not fall
+          // back to the current object if no foreignObjOrRef is provided,
+          // because it's not clear what the current object should be for
+          // merge functions: the (possibly undefined) existing object, or
+          // the incoming object? If you think your merge function needs
+          // to read sibling fields in order to produce a new value for
+          // the current field, you might want to rethink your strategy,
+          // because that's a recipe for making merge behavior sensitive
+          // to the order in which fields are written into the cache.
+          // However, readField(name, ref) is useful for merge functions
+          // that need to deduplicate child objects and references.
+          return policies.readField<T>(
+            foreignObjOrRef,
+            nameOrField,
+            getFieldValue,
+            variables,
+          );
+        },
+        storage,
+        invalidate() {
+          if (storage) {
+            policies.fieldDep.dirty(storage);
+          }
+        },
+      }) as T;
     }
 
     if (incoming && typeof incoming === "object") {
+      if (isReference(incoming)) {
+        return incoming;
+      }
+
+      if (Array.isArray(incoming)) {
+        return incoming.map(item => policies.applyMerges(
+          // Items in the same position in different arrays are not
+          // necessarily related to each other, so there is no basis for
+          // merging them. Passing void here means any FieldValueToBeMerged
+          // objects within item will be handled as if there was no existing
+          // data. Also, we do not pass storageKeys because the array itself
+          // is never an entity with a __typename, so its indices can never
+          // have custom read or merge functions.
+          void 0,
+          item,
+          getFieldValue,
+          variables,
+        )) as T;
+      }
+
       const e = existing as StoreObject | Reference;
       const i = incoming as object as StoreObject;
