I am currently working with switched-capacitor circuits used as sample-and-hold stages, gain stages, and integrators.
It is obvious that switching capacitors to emulate resistors involves sampling voltages. That also means there is a sampling interval τ (tau) at which the samples are collected.
Imagine a gain stage in which the input is sampled during phase 1 and the output is provided during phase 2 (the output is 0 V during phase 1). A full clock cycle consists of phase 1 followed by phase 2 and has a duration of τ.
In the following simulation τ equals 2 microseconds (sampling frequency = 500 kHz) and the gain of the amplifier is g = 1.
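For reference, here is a minimal Python sketch of how I model this stage (the 50 kHz test tone, the 10 ns time grid, and sampling exactly at the end of phase 1 are my own assumptions, not fixed by the circuit):

```python
import numpy as np

tau = 2e-6            # clock period: 2 us  ->  f_s = 500 kHz
g = 1.0               # stage gain
f_in = 50e3           # test-tone frequency (my arbitrary choice)

t = np.arange(0.0, 200e-6, 10e-9)     # 10 ns simulation grid
n = np.floor(t / tau)                 # index of the current clock cycle
phase1 = (t % tau) < (tau / 2)        # True during phase 1, False during phase 2

# sample taken at the end of phase 1 of the current cycle (t = n*tau + tau/2)
sampled = np.sin(2 * np.pi * f_in * (n * tau + tau / 2))

# output: 0 V during phase 1, g times the held sample during phase 2
vout = np.where(phase1, 0.0, g * sampled)
```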
One can read off the diagram that the phase delay always lies in the range [τ/2, 3τ/2] in the time domain, neglecting that the output is 0 V during phase 1. If we transform this delay to the frequency domain, we get a linear relation between the phase delay and the input frequency, as shown below:
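In equation form this is just the usual delay-to-phase relation: a fixed time delay $t_d$ produces a phase lag proportional to the input frequency $f$, which is why the plot is a straight line:

$$\varphi_{\text{delay}}(f) = 2\pi f\, t_d, \qquad t_d \in \left[\tfrac{\tau}{2},\ \tfrac{3\tau}{2}\right]$$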
Now I am wondering about the magnitude error.
My thoughts:
I am sure there must be some kind of magnitude error caused by the sampling, but I am not able to prove it. In the z-domain, the stage above can be described by the following transfer function:
$$H(z) = g \cdot z^{-1}, \qquad z^{-1} = e^{-j\omega\tau/2}$$
Evaluated on the unit circle this gives |H| = g at every frequency, i.e. no magnitude error results, but I am sure there is some, maybe in the form of sin(x)/x.
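A quick numerical check of that claim, together with the sin(x)/x droop I have in mind (the hold duration T_h = τ/2 used there is my assumption, not something I have derived):

```python
import numpy as np

tau = 2e-6
g = 1.0
f = np.linspace(1e3, 250e3, 6)        # a few test frequencies up to f_s/2
w = 2 * np.pi * f

# H(z) = g * z^-1 with z^-1 = e^(-j*w*tau/2), as above
H = g * np.exp(-1j * w * tau / 2)
print(np.abs(H))       # -> all 1.0: the z-domain model predicts no magnitude error

# hypothesized sin(x)/x droop of a zero-order hold of duration T_h = tau/2
Th = tau / 2
droop = np.abs(np.sinc(f * Th))       # np.sinc(x) = sin(pi*x)/(pi*x)
print(droop)           # -> falls below 1 as f increases (about 0.90 at 250 kHz)
```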
Can someone please explain whether there is some sort of magnitude error, or correct me if I made a mistake in my calculations?