I'm integrating an enterprise application using an existing pattern.
The main problem is how to synchronize data between my system and an external system through RPC (HTTP API calls, to be precise).
What I have done so far
The first thing I did was divide the interface into business methods, such that at the beginning of each method I have a pretty clear idea of the data I will want to work with. For example, the fictitious method call getIceCreamsOfColor("red") will probably make a remote call to fetch all the red ice creams.
    getIceCreamsOfColor($color) {
        $remoteClient->queryForColor($color);
        ...
    }

This is generally much better than

    getIceCreams($idArray) {
        foreach ($idArray as $iceCreamId) {
            $remoteClient->queryForId($iceCreamId);
            ...
        }
    }

because the first version fetches everything in a single round trip, while the second spends one remote call per id.

(a) Inside every business method, at some point I have to spend a second or so downloading the remote data and creating local stubs of the remote objects.
(b) Then I perform my own logic, which may mean manipulating the local stubs. The local stubs are really business entities: they are used to reason about the business domain and are queried by the front end to visualize data.
(c) Finally, I spend another second or so flushing the local stub pool. This means writing the stubs to disk for caching and writing all the changed data back to the remote server (see the sketch below).
Actions (a) and (c) are each optional: I can skip (a) if I know the local data is still in sync, and skip (c) if I only want to view the remote data without modifying it.
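To make the lifecycle concrete, here is a minimal sketch of the stub pool, assuming PHP 8. The `RemoteClient` interface, the `IceCream` entity, and the `bulkUpdate` bulk-write endpoint are names invented purely for illustration, not part of any real API.

    <?php

    // Hypothetical remote client; the method names here are assumptions.
    interface RemoteClient {
        /** @return array<int, array> raw records keyed by id */
        public function queryForColor(string $color): array;
        /** Single bulk write, to respect the low API call cap. @return array per-record errors */
        public function bulkUpdate(array $changedRecords): array;
    }

    // Local stub: a business entity that tracks whether it was modified.
    class IceCream {
        public bool $dirty = false;
        public function __construct(public int $id, public array $data) {}
        public function setColor(string $color): void {
            $this->data['color'] = $color;
            $this->dirty = true; // only dirty stubs are written back on flush
        }
    }

    class StubPool {
        /** @var array<int, IceCream> */
        private array $stubs = [];

        public function __construct(
            private RemoteClient $client,
            private string $cacheFile
        ) {}

        // (a) Download the remote data and hydrate local stubs.
        public function loadByColor(string $color): array {
            foreach ($this->client->queryForColor($color) as $id => $record) {
                $this->stubs[$id] = new IceCream($id, $record);
            }
            return $this->stubs;
        }

        // (c) Flush: cache all stubs to disk, push changed data in one remote call.
        public function flush(): array {
            file_put_contents($this->cacheFile, serialize($this->stubs));
            $changed = array_filter($this->stubs, fn (IceCream $s) => $s->dirty);
            if ($changed === []) {
                return []; // nothing changed, so no remote call is spent
            }
            $records = array_map(fn (IceCream $s) => $s->data, $changed);
            return $this->client->bulkUpdate($records);
        }
    }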
At this point, after inspecting the result of the remote call, the business method should return either the failures or an OK message.
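Building on the hypothetical StubPool above, a business method could surface that result like this (step (b) is the plain manipulation between load and flush):

    // Hypothetical business method: load, manipulate, flush, report.
    function paintPinkIceCreamsRed(StubPool $pool): array {
        foreach ($pool->loadByColor('pink') as $iceCream) {
            $iceCream->setColor('red'); // (b) local manipulation only
        }
        $failures = $pool->flush();     // (c) one disk write + one remote call
        return $failures === []
            ? ['status' => 'ok']
            : ['status' => 'error', 'failures' => $failures];
    }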
Does this architecture sound reasonable to you?
Do you think the local caching is dangerous or unnecessary for remote data sync?
How do you address the problem of keeping your data in sync with an external service that stores it?
The main reason there is only one call to write data is that the remote API has a very low call cap.
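If it helps to make that cap explicit in code, one option is a small wrapper that refuses to exceed an assumed per-window budget; the cap and window values below are placeholders, not the real API's limits:

    // Sketch of a guard around the remote client; fails fast when the budget is spent.
    class CappedClient implements RemoteClient {
        private int $callsMade = 0;
        private int $windowStart;

        public function __construct(
            private RemoteClient $inner,
            private int $capPerWindow = 10,   // assumed budget
            private int $windowSeconds = 3600 // assumed window length
        ) {
            $this->windowStart = time();
        }

        private function spend(): void {
            if (time() - $this->windowStart >= $this->windowSeconds) {
                $this->windowStart = time(); // new window, reset the budget
                $this->callsMade = 0;
            }
            if (++$this->callsMade > $this->capPerWindow) {
                throw new RuntimeException('Remote API call cap reached.');
            }
        }

        public function queryForColor(string $color): array {
            $this->spend();
            return $this->inner->queryForColor($color);
        }

        public function bulkUpdate(array $changedRecords): array {
            $this->spend();
            return $this->inner->bulkUpdate($changedRecords);
        }
    }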