I am working on an audio effect that passes its input through the Opus codec. I want the effect to be a VST, so I don't have control over the input buffer size or the sample rate. However, Opus requires a sample rate of 48 kHz and frames of 120, 240, 480, 960, 1920, or 2880 samples. What is the best way to convert from an arbitrary frame size and sample rate to a known one? A loss of fidelity is OK.
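For reference, I'm setting up the encoder and decoder at the fixed 48 kHz rate roughly like this (mono for now, error handling omitted):

```cpp
#include <opus.h>

int error = OPUS_OK;

// Fixed 48 kHz, regardless of what rate the host is actually running at.
OpusEncoder* encoder = opus_encoder_create (48000, 1, OPUS_APPLICATION_AUDIO, &error);
OpusDecoder* decoder = opus_decoder_create (48000, 1, &error);
```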
Here is the signal chain as I understand it:
- Audio comes from a VST host application such as Ableton Live.
- I'm using JUCE as the library for creating the plugin, so the audio is then passed to the internals of JUCE.
- JUCE calls the `processBlock` method on my plugin with an audio buffer. When testing I can force this audio buffer to be a certain size, but I want to support hosts with arbitrarily sized buffers.
- I need to somehow convert from the host buffer size to a frame size supported by Opus. This is where my question resides; I've put a rough sketch of what I'm imagining after this list.
- I call `opus_encode_float` on the buffer to compress the data. This introduces the sonic artifacts I want to keep.
- I call `opus_decode_float` on the compressed data. This gives me back a buffer the size of a valid Opus frame.
- I then need to convert this back to the size the host is expecting (the same size as the input buffer).
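Here is the kind of thing I'm imagining in `processBlock` (mono only; `inputFifo` and `outputFifo` are placeholder FIFO members I'd still have to write, and the resampling between the host rate and 48 kHz is left out). Is buffering like this, and reporting one Opus frame of latency, the right approach?

```cpp
// Untested sketch: inputFifo/outputFifo are hypothetical FIFOs with
// write()/read()/getNumReady(), and encoder/decoder are the Opus objects
// created earlier. Host-rate <-> 48 kHz resampling is omitted.
static constexpr int kOpusFrameSize = 960; // 20 ms at 48 kHz

void MyOpusProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
{
    const int numSamples = buffer.getNumSamples();

    // 1. Push the host's samples into a FIFO (channel 0 only for now).
    inputFifo.write (buffer.getReadPointer (0), numSamples);

    // 2. Every time a whole Opus frame has accumulated, round-trip it through the codec.
    while (inputFifo.getNumReady() >= kOpusFrameSize)
    {
        float frame[kOpusFrameSize];
        inputFifo.read (frame, kOpusFrameSize);

        unsigned char packet[4000];
        const opus_int32 packetBytes = opus_encode_float (encoder, frame, kOpusFrameSize,
                                                          packet, sizeof (packet));

        float decoded[kOpusFrameSize];
        opus_decode_float (decoder, packet, packetBytes, decoded, kOpusFrameSize, 0);

        outputFifo.write (decoded, kOpusFrameSize);
    }

    // 3. Give the host back exactly the number of samples it asked for. Until the
    //    first frame has been through the codec this outputs silence, so the plugin
    //    would report kOpusFrameSize samples of latency.
    if (outputFifo.getNumReady() >= numSamples)
        outputFifo.read (buffer.getWritePointer (0), numSamples);
    else
        buffer.clear();
}
```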