The author here states that:

> Share state between processes. Run a long running batch job in one Python interpreter (say loading a few million lines of CSV in to a Redis key/value lookup table) and run another interpreter to play with the data that’s already been collected, even as the first process is streaming data in. You can quit and restart my interpreters without losing any data.
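For context, the quoted workflow works because Redis runs as a *separate server process*: each Python interpreter is only a client, so the data outlives any one interpreter. A minimal sketch of that same idea, using the stdlib `dbm` module as a stand-in for Redis (so no Redis server is needed to run it; with Redis you would use a client library such as `redis-py` instead, and the store would be a server rather than a file):

```python
import dbm
import os
import tempfile

# The data lives OUTSIDE the interpreter, in an external key/value
# store (here a dbm file; in the quoted example, a Redis server).
store_path = os.path.join(tempfile.mkdtemp(), "lookup")

# "Interpreter 1": the batch job streams rows into the store.
with dbm.open(store_path, "c") as db:
    for i, row in enumerate(["alice,30", "bob,25"]):
        db[f"row:{i}"] = row
# Interpreter 1 can now quit; the data persists in the store.

# "Interpreter 2": a separate session opens the same store and
# reads data it never loaded itself.
with dbm.open(store_path, "r") as db:
    print(db["row:0"].decode())
```

The design point is the same in both cases: because the lookup table is held by something external to the interpreters (a Redis server process, or here a file), any interpreter can stop and restart without losing data.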
- If the interpreter is stopped (quit), how does the sharing still work?

What is the underlying concept here? How would you explain it in simple terms, ideally with an example?