> I think 90% of it is solved with GraphQL/ protobuf like rules (“only ever introduce new fields”)
Agreed, that’s the only sensible thing to do. Not sure it’s 90% though.
> but we’re hoping to provide patterns to mitigate these
Hope is not confidence-inspiring for the most difficult problem at the heart of the system. That doesn’t mean it has to be impossible, but it needs to be taken seriously, not treated as an afterthought.
Another thing you have to think about is what happens when data of a new schema is sent to a client on an older schema. Does the “merging” work with unknown fields? Does it ignore and drop them? Or do you enforce clients are up to date in some way so that you don’t have new-data-old-software?
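To make that failure mode concrete, here’s a minimal Python sketch of the two merge behaviours (field names and schemas are hypothetical, just for illustration): an old client that only copies fields it knows about silently drops newer data, while a protobuf-style merge carries unknown fields through opaquely so a round-trip doesn’t destroy them.

```python
# Fields the old client's schema knows about (illustrative only).
OLD_SCHEMA_FIELDS = {"id", "name"}

def merge_drop_unknown(local: dict, incoming: dict) -> dict:
    """Naive merge: only copies fields the old schema knows.
    Unknown (newer) fields are silently dropped -- lost forever if
    the client later writes the record back to the server."""
    merged = dict(local)
    for key, value in incoming.items():
        if key in OLD_SCHEMA_FIELDS:
            merged[key] = value
    return merged

def merge_preserve_unknown(local: dict, incoming: dict) -> dict:
    """Protobuf-style behaviour: unknown fields are carried along
    opaquely, so new-schema data survives an old client."""
    merged = dict(local)
    merged.update(incoming)
    return merged

local = {"id": 1, "name": "old"}
# "avatar" pretends to be a field added in a newer schema version.
incoming = {"id": 1, "name": "new", "avatar": "a.png"}

print(merge_drop_unknown(local, incoming))      # avatar is gone
print(merge_preserve_unknown(local, incoming))  # avatar survives
```

The preserve-unknown variant is roughly what protobuf’s unknown-field retention buys you; without it, the only safe options are forcing clients to upgrade or rejecting payloads with unrecognised fields.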