Unity already decouples gameplay from framerate. As long as you consistently use [`Time.deltaTime`][1] in your `Update` methods, the actual graphics framerate will not affect gameplay speed. So if you start the game and the audio track at the same time, they should stay synchronized.
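The principle behind `Time.deltaTime` can be shown outside Unity. This is a plain-Python sketch (not the Unity API; `SPEED` and the frame timings are illustrative values): scaling each movement step by the frame's delta time makes the result depend on elapsed time rather than on how many frames were rendered.

```python
SPEED = 2.0  # units per second (hypothetical value)

def simulate(frame_times):
    """Advance a position using per-frame delta times (seconds)."""
    position = 0.0
    for dt in frame_times:
        # Same idea as `position += speed * Time.deltaTime` in Unity.
        position += SPEED * dt
    return position

# One simulated second at 60 fps and at 30 fps cover the same distance,
# so gameplay speed is independent of framerate.
fast = simulate([1 / 60] * 60)
slow = simulate([1 / 30] * 30)
```

If you instead moved a fixed amount per frame, the 60 fps run would travel twice as far in the same second, and the game would drift out of sync with the audio.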

Regarding designing levels around music, there are two approaches:

* The manual approach. Meticulously hand-craft levels around specific audio tracks.
* The procedural approach. Use audio analysis algorithms to auto-generate levels from audio tracks. The simplest method is to look for changes in volume. A step further is detecting volume changes separately in different frequency bands. But that is still just scratching the surface: audio analysis is a wide and interesting field with a large body of scientific literature behind it.
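The simplest method above can be sketched in a few lines. This is a hedged, self-contained illustration, not a production beat detector: it computes the RMS volume of fixed-size sample windows and flags windows that exceed a threshold as candidate positions for level events. The window size, sample rate, and threshold are illustrative assumptions.

```python
import math

def rms_per_window(samples, window=1024):
    """RMS energy of each non-overlapping window of samples."""
    return [
        math.sqrt(sum(s * s for s in samples[i:i + window]) / window)
        for i in range(0, len(samples) - window + 1, window)
    ]

def loud_windows(samples, window=1024, threshold=0.5):
    """Indices of windows whose volume exceeds the threshold --
    candidate spots to place obstacles or beats in a generated level."""
    energies = rms_per_window(samples, window)
    return [i for i, e in enumerate(energies) if e > threshold]

# Toy signal: near-silence alternating with loud 440 Hz bursts
# (a stand-in for decoded audio samples at 44100 Hz).
quiet = [0.01] * 1024
loud = [math.sin(2 * math.pi * 440 * t / 44100) for t in range(1024)]
track = quiet + loud + quiet + loud

events = loud_windows(track)  # windows 1 and 3 are the loud bursts
```

A real implementation would read decoded PCM data (in Unity, for example, via `AudioClip.GetData`), and the frequency-band variant would run the same thresholding on the output of an FFT instead of raw sample energy.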

Many rhythm games use a hybrid approach: generate a first draft of the level with an algorithm, then tweak it by hand to make it more playable.


 [1]: https://docs.unity3d.com/ScriptReference/Time-deltaTime.html