At my company we are revising some backend architecture. I think I have identified a use case where event streaming (for example with Apache Kafka or RabbitMQ) makes sense.
Because I (and everyone else at my company) are pretty much noobs in that space, I'd like to validate my idea.
Business flow and prerequisites
We have a bunch of microservices that are responsible for importing an encrypted document from an external source:
- One for downloading it and saving it, acting as a sort of archive
- One for decrypting it
- Some others for associating the document with other business objects
The flow is roughly as follows (each color is a different microservice):

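For concreteness, here is a minimal sketch of the kind of data that gets handed from one service to the next. All field names here are assumptions on my part, since they depend on our actual document model:

```python
from dataclasses import dataclass

@dataclass
class DocumentImported:
    """Hypothetical payload passed from one service to the next.

    Field names are illustrative only; the real document model
    would look different.
    """
    document_id: str        # identifier assigned when the document is archived
    storage_uri: str        # where the (still encrypted) document was saved
    encrypted: bool = True  # flipped to False once the decryption service is done
```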
Current situation and problem
All communication happens via REST APIs exposed by the microservices over HTTP. This was very simple to implement, but it has since revealed a big problem: if one service in that chain is down, the import is interrupted and can't be continued without manual intervention.
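To make the failure mode concrete, here is a rough sketch of what one step of the current synchronous chain could look like. The service names, URLs, and payloads are made up for illustration; the point is that any downstream outage turns into a failed import:

```python
import requests

def import_document(source_url: str) -> None:
    # 1. Download and archive the encrypted document.
    response = requests.post(
        "http://archive-service/documents",   # hypothetical internal URL
        json={"source": source_url},
        timeout=10,
    )
    response.raise_for_status()
    doc = response.json()

    # 2. Call the decryption service directly over HTTP.
    #    If that service is down right now, this raises an exception and the
    #    import stops here until someone notices and restarts it manually.
    requests.post(
        "http://decryption-service/decrypt",  # hypothetical internal URL
        json={"document_id": doc["id"]},
        timeout=10,
    ).raise_for_status()

    # 3. ... further synchronous calls to the association services ...
```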
Proposed idea
I'd like to replace the direct communication over HTTP with message queues between the different services, like this:

This way, the service that downloads the document could simply push an event each time it has downloaded a document. If the decryption service is online, it decrypts the document right away; if not, decryption starts the moment the service comes back online.
The same goes for the other services: the decryption service in turn sends an event once it has finished decrypting a document, and so on down the chain.
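As a sketch of what I have in mind, here is roughly how the downloader and the decryption service could talk through a durable RabbitMQ queue using the pika client. The queue name and the event payload are assumptions; Kafka would look different in the details but follows the same idea:

```python
import json
import pika

# --- Downloader/archive service: publish an event after saving a document ---
def publish_document_downloaded(document_id: str, storage_uri: str) -> None:
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    # Durable queue so pending events survive a broker restart.
    channel.queue_declare(queue="documents.downloaded", durable=True)
    channel.basic_publish(
        exchange="",
        routing_key="documents.downloaded",
        body=json.dumps({"document_id": document_id, "storage_uri": storage_uri}),
        properties=pika.BasicProperties(delivery_mode=2),  # persistent message
    )
    connection.close()

# --- Decryption service: consume pending events whenever it is online ---
def consume_downloaded_documents() -> None:
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="documents.downloaded", durable=True)

    def handle(ch, method, properties, body):
        event = json.loads(body)
        # ... decrypt the document referenced by event["document_id"] ...
        # Acknowledge only after successful processing, so the message goes
        # back to the queue if this service crashes halfway through.
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue="documents.downloaded", on_message_callback=handle)
    channel.start_consuming()
```

Marking the queue and the messages as durable/persistent, and acknowledging a message only after it has been processed, is what keeps an event from being lost while the decryption service is offline.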
So my questions are:
- Is this a good use for events?
- Are there obvious things to improve in this design?
- Are there any design patterns that I should study or look up?
Thanks in advance for any answers and cheers :)
If one service in that chain is down, or the queue is down, or messages silently go missing, isn't the result the same? Why do all three stages have to happen at different moments?