The first option doesn't appear to be possible. In terms of automation options, I have a backfill script that I often use (I'd post it, but it's specific to our org).
If Apex is an option for you, consider creating a generic backfill script that runs an autolaunched flow, so the feature-specific logic lives in the flow. I've found backfill scripts particularly effective in this pattern:
1. Create a Queueable that traverses the data day by day. In the constructor, pass a start date, an end date, an SObjectType, a flow API name, and a batch size.
2. Select up to n records (where n is the batch size) from the current day, using an indexed date field such as CreatedDate, excluding any Ids already in a set of processed Ids.
3. If no records are returned, move the current day back one day and clear the set of processed Ids. If the current day is now before the start date, stop; otherwise re-enqueue the Queueable (going back to step 2).
4. Otherwise, pass the record Ids to the flow and run it, wrapped in a try/catch block.
5. Add the record Ids to the set of processed Ids and re-enqueue the Queueable (going back to step 2).
This runs the Queueable over every record of the given object between the start date and the end date. Going day by day keeps the set of processed Ids from growing too large to fit in memory. Wrapping the flow in a try/catch block prevents an error in the flow from killing the entire backfill; it loses only the current batch.
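The steps above might look roughly like this in Apex. This is an untested sketch: `FlowBackfillQueueable` and the flow input variable `recordIds` are names I made up for illustration, timezone handling on the CreatedDate bounds is glossed over, and I store the object name as a String since describe tokens don't always serialize cleanly across re-enqueues.

```apex
public class FlowBackfillQueueable implements Queueable {
    private Date startDate;
    private Date currentDay;   // walks backward from endDate to startDate
    private String objectName;
    private String flowName;
    private Integer batchSize;
    private Set<Id> processedIds = new Set<Id>();

    public FlowBackfillQueueable(Date startDate, Date endDate,
            SObjectType objectType, String flowName, Integer batchSize) {
        this.startDate = startDate;
        this.currentDay = endDate;
        this.objectName = String.valueOf(objectType);
        this.flowName = flowName;
        this.batchSize = batchSize;
    }

    public void execute(QueueableContext ctx) {
        // Copy state into locals so the dynamic SOQL binds resolve.
        Set<Id> seen = processedIds;
        Datetime dayStart = Datetime.newInstance(currentDay, Time.newInstance(0, 0, 0, 0));
        Datetime dayEnd = dayStart.addDays(1);
        String soql = 'SELECT Id FROM ' + objectName
            + ' WHERE CreatedDate >= :dayStart AND CreatedDate < :dayEnd'
            + ' AND Id NOT IN :seen LIMIT ' + batchSize;
        List<SObject> records = Database.query(soql);

        if (records.isEmpty()) {
            // Day exhausted: step back one day and reset the Id set.
            currentDay = currentDay.addDays(-1);
            processedIds.clear();
            if (currentDay < startDate) {
                return; // backfill complete
            }
            System.enqueueJob(this);
            return;
        }

        List<Id> recordIds = new List<Id>(new Map<Id, SObject>(records).keySet());
        try {
            // 'recordIds' is an assumed input variable on the autolaunched flow.
            Flow.Interview interview = Flow.Interview.createInterview(
                flowName, new Map<String, Object>{ 'recordIds' => recordIds });
            interview.start();
        } catch (Exception e) {
            // A flow error loses only this batch, not the whole backfill.
            System.debug(LoggingLevel.ERROR, 'Batch failed: ' + e.getMessage());
        }

        processedIds.addAll(recordIds);
        System.enqueueJob(this);
    }
}
```

You'd kick it off with something like `System.enqueueJob(new FlowBackfillQueueable(Date.newInstance(2020, 1, 1), Date.today(), Account.SObjectType, 'My_Backfill_Flow', 200));`. One caveat: queueable chain depth is capped in Developer Edition and Trailhead orgs, so long backfills need a production or sandbox org.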
It takes some time to set up the initial Apex, but once it works you can reuse it to backfill any kind of data by writing a quick flow that performs the necessary logic.