Why is the voltage amplification factor of an amplifier or a filter generally expressed by its gain in decibels
The decibel is a very convenient way of expressing gain, especially in filters. For instance, a simple low-pass filter has unity gain in the pass band and, above the pass band, its gain diminishes at 6 dB per octave. So, if the filter has its cut-off (-3 dB point) at 1 kHz, we can say that it will be nominally 6 dB down at 2 kHz, 12 dB down at 4 kHz, and so on.
This is exactly the same as saying that the output voltage (above the cut-off) falls at the same rate that the frequency increases: if the frequency doubles, the voltage halves; if the frequency rises ten times, the voltage drops ten times. This leads us also to say that a 1st order low-pass filter has an output "roll-off" of 20 dB per decade. It's exactly the same as saying 6 dB per octave.
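As a quick sanity check of the octave/decade equivalence, here's a short Python sketch (the 1 kHz cut-off is just the example value from above) that evaluates the first-order roll-off asymptote, attenuation = \$20\log_{10}(f/f_c)\$, at a few frequencies:

```python
import math

# First-order low-pass roll-off: above cut-off, the gain falls in
# proportion to frequency, so attenuation (dB) = 20*log10(f/fc).
fc = 1000.0  # example cut-off frequency in Hz

for f in [2000.0, 4000.0, 10000.0]:
    atten_db = 20 * math.log10(f / fc)
    print(f"{f / fc:>4.0f} x fc -> {atten_db:.1f} dB down")
```

This prints 6.0 dB at one octave (2 kHz), 12.0 dB at two octaves (4 kHz), and 20.0 dB at one decade (10 kHz), i.e. 6 dB/octave and 20 dB/decade are the same slope.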
and why is the bandwidth of this amplifier defined at −3 dB?
When the output of a low-pass filter (for example) drops to 3 dB below its nominal pass-band value, the output power has halved. This should be obvious to anyone who understands the decibel but, there's another observation. For a simple RC low-pass filter, when the magnitude of the output is 3 dB down, the value of the resistance equals the magnitude of the capacitive reactance.
Of course, -3 dB is only a close approximation to the half-power point. The true half-power point is \$10\log_{10}(0.5)\$ = -3.0103 dB but, we call it -3 dB for reasons of convenience.
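You can verify the half-power figure directly; a one-liner in Python:

```python
import math

# Half power relative to full power is a ratio of 0.5;
# in decibels that is 10*log10(0.5).
half_power_db = 10 * math.log10(0.5)
print(round(half_power_db, 4))  # -3.0103
```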
If we looked at the voltage transfer function of a simple RC filter we would get this: -
$$H(j\omega) = \dfrac{1}{1+j\omega RC}$$
And we know that the cut-off frequency is \$\omega_c = \frac{1}{RC}\$ so, at \$\omega = \omega_c\$, the above formula becomes: -
$$H(j\omega) = \dfrac{1}{1+j}$$
And that implies the voltage gain has dropped by a factor of \$\sqrt2\$ at the cut-off point. This also equals -3.0103 dB using this formula: \$20\log_{10}(\frac{1}{\sqrt2})\$.
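The last step can be checked numerically with Python's complex arithmetic, evaluating \$H\$ at the cut-off where it reduces to \$\frac{1}{1+j}\$:

```python
import math

# Transfer function of the RC low-pass at cut-off (w = 1/RC): H = 1/(1+j)
H = 1 / (1 + 1j)
magnitude = abs(H)                    # |H| = 1/sqrt(2), about 0.7071
gain_db = 20 * math.log10(magnitude)  # voltage gain in decibels
print(round(magnitude, 4), round(gain_db, 4))  # 0.7071 -3.0103
```

The magnitude comes out as \$\frac{1}{\sqrt2}\$ and the gain as -3.0103 dB, matching the half-power figure derived earlier.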