Tell me, what is the fundamental difference between using EnsureValue and a more complex solution? Is it really necessary in this case?
UPD: I’m looking very closely at the patch suggested above, and I don’t quite understand what the essential difference in the behaviour of such an implementation is. In essence, it looks like a very complex operation, full of checks, that performs a very similar action.
@gregsn The implication above is that it splits the object into many channels. That would mean that if one field of the object is changed in the copy, this implementation changes the same field in the original, and vice versa? And if we used EnsureValue not on object fields but on the matched objects themselves, would we get unwanted side effects?
I understand the problem to be solved like this:
- Merge the data.
- Whenever the original model changes, take that data and transfer it over.
- But let me also change the model copy myself!
The sync shall cooperate with other ways of storing data in stage 1.
Reacting on the main model channel (stage 0) by writing the main object to the main channel (stage 1) would make sure that all properties are the same. But that’s not the idea. We want to be able to have different data on stage 1, allowing data to stick around that was set differently, not via this sync.
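To make that concrete, here is a minimal Python sketch of the idea, under my own assumptions (the names `Model`, `Channel`, and `sync_field` are illustrative, not VL’s actual Channel API): only the field that changed on stage 0 is written through to stage 1, so values set on stage 1 by other means stick around.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Model:
    name: str = ""
    count: int = 0

class Channel:
    """A tiny stand-in for a reactive channel holding one value."""
    def __init__(self, value):
        self.value = value
        self.subscribers = []

    def write(self, value):
        self.value = value
        for s in self.subscribers:
            s(value)

def sync_field(stage0: Channel, stage1: Channel, field: str):
    # On each stage-0 change, copy only `field` into the stage-1 object,
    # leaving every other stage-1 field untouched.
    def on_change(new_model):
        stage1.write(replace(stage1.value, **{field: getattr(new_model, field)}))
    stage0.subscribers.append(on_change)

stage0 = Channel(Model())
stage1 = Channel(Model())
sync_field(stage0, stage1, "name")

stage1.write(replace(stage1.value, count=42))  # set independently on stage 1
stage0.write(Model(name="main", count=7))      # only `name` is synced over
print(stage1.value)  # Model(name='main', count=42): count=42 stuck around
```

Note that a naive sync that wrote the whole stage-0 object into stage 1 would have clobbered `count=42`, which is exactly what we want to avoid.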
Guys, first of all thank you for the very interesting discussion.
I’ve been thinking about this case for a while and suddenly I had an insight.
Earlier I asked: do you want to copy arbitrary object fields between objects? Strictly speaking, in your case they are not really arbitrary fields: you want to copy fields that have changed (by tracking the change event). This is a really interesting feature. It might be worth experimenting more in this area.
Now I understand the point of such an implementation - the ability to copy object changes while leaving the rest of the object unchanged. There could be a long discussion about the correctness of such an implementation within the mutable/immutable topic, but now the meaning is clear.
I think it is not about the model-view-runtime pattern as such, but about the functionality of objects in Gamma and the functionality of Channels. I also think that the name ‘copy’ is confusing, as these are not copies but states (or stages). I’m also not sure it’s about ‘synchronisation’ at all; it could rather be called ‘copying arbitrary fields between objects in channels (on change)’: very similar, but different.
I thought it would be cool to have an EnsureValue node with filters on which fields should be ensured.
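Sketching that proposal in Python (the `fields` filter parameter is the hypothetical addition here; this is not an existing EnsureValue node, just an illustration of the signature I have in mind):

```python
from dataclasses import dataclass, replace, fields as dc_fields

@dataclass(frozen=True)
class Model:
    name: str = ""
    count: int = 0

def ensure_value(target: Model, source: Model, fields=None) -> Model:
    # Ensure only the listed fields match the source; leave the rest untouched.
    # fields=None behaves like a plain EnsureValue on the whole object.
    names = fields if fields is not None else {f.name for f in dc_fields(source)}
    return replace(target, **{n: getattr(source, n) for n in names})

target = Model(name="local edit", count=42)
source = Model(name="main", count=7)
print(ensure_value(target, source, fields={"name"}))
# Model(name='main', count=42)
```

With `fields=None` the call degenerates to copying the whole object, so the filtered version is a strict generalisation of the plain one.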
Well, sometimes the most convenient choice is not the most efficient in terms of programming. The idea in this case is to provide the user with the easiest approach, where we sacrifice efficiency for usability.
I’m talking more about architecture and programming patterns here. It’s hard for me to think of counterarguments to this approach at the moment, and it wasn’t a criticism. It’s just that this approach is controversial because it creates mutability where the chosen data types don’t provide for it. That could potentially lead to undesirable consequences in the future, or it might not; that’s what a discussion would be for. By the way, I don’t intend to start that discussion. I have enjoyed your invention of original approaches very much anyway. And I’m certainly not talking about efficiency.
By the way, can you share the crawler/synchroniser?