I'm evaluating Apache Spark for an application. I'm especially interested in Structured Streaming with temporary views and full SQL queries (for simplicity and low latency).
The application would need to run many queries (tens, possibly hundreds) against a single input stream of data. Is there a way to avoid having Spark re-read the input source once for each query?
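
For concreteness, here is a minimal sketch of the kind of setup I have in mind. The socket source and console sinks are just placeholders; the real application would use a different source and sinks. As written, I believe each `start()` creates an independent streaming query that reads the source on its own, which is exactly what I'd like to avoid:

```scala
import org.apache.spark.sql.SparkSession

object MultiQueryStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("multi-query-demo")
      .master("local[*]")
      .getOrCreate()

    // Placeholder source: a single input stream of text lines.
    val input = spark.readStream
      .format("socket")
      .option("host", "localhost")
      .option("port", "9999")
      .load()

    // Expose the stream to SQL as a temporary view.
    input.createOrReplaceTempView("events")

    // Each SQL query is started as its own streaming query with its own sink.
    // In the real application there would be tens or hundreds of these.
    val q1 = spark.sql("SELECT value, count(*) AS n FROM events GROUP BY value")
      .writeStream
      .outputMode("complete")
      .format("console")
      .start()

    val q2 = spark.sql("SELECT upper(value) AS v FROM events")
      .writeStream
      .outputMode("append")
      .format("console")
      .start()

    spark.streams.awaitAnyTermination()
  }
}
```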