Please help me understand why this op amp integrator output has a DC offset. Why doesn't the coupling capacitor block it?
EDIT: Running the simulation for longer time resulted in the expected output behavior
A couple things to keep in mind.
In your setup, you have a gain of 20V/V at DC and below the cutoff frequency \$\frac{1}{2\pi R_fC_f}=723\text{Hz}\$. Above that frequency the op amp stops amplifying at that gain value, and the gain rolls off at 20dB/dec.
That cutoff is far below the frequency of your input signal (11.63kHz). Above 723Hz your circuit acts as an integrator, and that is why you get a triangular wave instead of the square wave.
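For concreteness, the 723Hz figure can be checked numerically. The sketch below assumes \$R_f=100\text{k}\Omega\$ (stated later in this answer) and \$C_f\approx 2.2\text{nF}\$, which is my back-calculation from the quoted cutoff rather than a value read off the schematic:

```python
from math import pi

# Assumed values: Rf = 100 kOhm (from the answer); Cf = 2.2 nF is a
# back-calculated guess that reproduces the quoted 723 Hz cutoff.
R_f = 100e3
C_f = 2.2e-9

# Cutoff of the shunt RC feedback network: fc = 1 / (2*pi*Rf*Cf)
f_c = 1 / (2 * pi * R_f * C_f)
print(f"cutoff ~ {f_c:.0f} Hz")  # ~ 723 Hz

# The 11.63 kHz input sits well above this cutoff, so the stage integrates.
f_in = 11.63e3
print(f"input is {f_in / f_c:.1f}x the cutoff")
```

Since the input is more than a decade above the cutoff, the square wave sees the integrator behavior, not the flat 20V/V gain.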
I ran a quick test on LTspice:
Notice I downsized the feedback capacitor so that I still get a square wave for an input frequency of 11.63kHz. The cutoff frequency above which the op amp integrates the signal instead of amplifying it is now about 200kHz. This may not be what you want to do with your circuit...

Another thing I think is important: run your simulation with the DC sources starting at zero volts. That way you get the real transient response, which is very useful when capacitors and inductors are involved. Just check the box that says 'Start external dc voltages at 0V'; it will be the same as having a step input.
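Sizing that smaller feedback capacitor is a one-line calculation. Again assuming \$R_f=100\text{k}\Omega\$; the resulting ~8pF is my back-calculation for a 200kHz target, not a value from the schematic:

```python
from math import pi

R_f = 100e3       # feedback resistor (from the answer)
f_target = 200e3  # desired cutoff, well above the 11.63 kHz input

# Solve 1/(2*pi*Rf*Cf) = f_target for Cf
C_f = 1 / (2 * pi * R_f * f_target)
print(f"C_f ~ {C_f * 1e12:.1f} pF")  # ~ 8.0 pF

# Sanity check: plug it back in
f_c = 1 / (2 * pi * R_f * C_f)
print(f"cutoff ~ {f_c / 1e3:.0f} kHz")
```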
Here is the input and output:
Also notice I added some resistance after the DC-blocking cap at the output. This forms a high-pass filter with a cutoff frequency set by the capacitor and the resistor at the output \$\big(\dfrac{1}{2\pi R_5C_3}\big)\$. The value of the resistor matters here: if it's too large (an open, for example), you may end up passing the DC component of the signal. Just choose C and R so that you get the cutoff frequency you want. 100K would have worked too.
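Picking that output RC pair is the same kind of quick calculation. The actual value of \$C_3\$ isn't stated here, so the 10\$\mu\$F below is purely a hypothetical placeholder to show the trade-off:

```python
from math import pi

# Hypothetical value: C3 is not given in the answer, 10 uF is illustrative only.
C3 = 10e-6

# High-pass cutoff fc = 1/(2*pi*R5*C3) for two candidate load resistors
for R5 in (10e3, 100e3):
    f_c = 1 / (2 * pi * R5 * C3)
    print(f"R5 = {R5 / 1e3:.0f} kOhm -> cutoff ~ {f_c:.2f} Hz")
```

Either resistor puts the cutoff far below the 11.63kHz signal, so the AC passes through while the DC offset is blocked.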
Why does it take longer to reach steady state? Because of the RC time constants. At the input you have a 47\$\mu\$F capacitor along with the 5k\$\Omega\$ resistance, and the feedback resistor (100k\$\Omega\$) also contributes to this time constant: \$\tau = (47\mu \text{F})(5\text{k}\Omega+100\text{k}\Omega)= 4.935\$ seconds.
Now, initially V+ is greater than V- (in the linear region V+ \$\approx\$ V-), so the output saturates at about +9V. This is the voltage that charges the input capacitor (your input signal is essentially 0Vdc). At 4.935 seconds the capacitor voltage would be about 0.63*9V = 5.67V. But as soon as V- reaches the 4.5Vdc offset (present at V+ through the divider), the op amp enters the linear region (now V+ \$\approx\$ V-) and the V- voltage stays there. So it takes less than \$\tau\$ to reach 4.5Vdc. The takeaway here is that the time constant can be reduced by downsizing the capacitor at the input; that way you don't sacrifice your DC gain.
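That charging behavior is just a first-order RC step response, so the numbers above can be checked directly (component values taken from this answer: 47\$\mu\$F, 5k\$\Omega\$ + 100k\$\Omega\$, ~+9V saturation, 4.5V divider point):

```python
from math import exp, log

# Values from the answer
C = 47e-6             # input capacitor
R = 5e3 + 100e3       # series resistance seen by the cap
V_sat = 9.0           # saturated output charging the cap
V_target = 4.5        # half-supply offset at V+

tau = R * C
print(f"tau = {tau:.3f} s")          # 4.935 s

# First-order charging from 0 V toward V_sat: v(t) = V_sat*(1 - exp(-t/tau))
v_at_tau = V_sat * (1 - exp(-1))
print(f"v(tau) ~ {v_at_tau:.2f} V")  # ~ 0.63 * 9 V

# Time to reach the 4.5 V point (half of V_sat): t = tau * ln(2)
t_half = tau * log(2)
print(f"t(4.5 V) ~ {t_half:.2f} s")  # ~ 3.42 s, less than tau as stated
```

The \$\tau\ln 2 \approx 3.42\$ s figure makes the "less than \$\tau\$" claim concrete: the op amp enters its linear region about a second and a half before a full time constant elapses.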
For example, take a look at how much time it takes the inputs to come together and not surprisingly, the output starts to behave at that point:
If you downsize the input capacitor to 1\$\mu\$F, this is what you get:
Look how much quicker the inputs (V+ and V-) get close to each other, and the op amp behaves linearly.