Short Answer
Just compute a Gaussian blur kernel for the given sigma, then perform a single 2D convolution, or two 1D convolutions (over rows, then over columns, or vice versa - the Gaussian kernel is separable).
Don't forget to normalize the kernel!
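A minimal sketch of the separable approach (function names are mine; NumPy only):

```python
import numpy as np

def gaussian_kernel_1d(sigma, radius=None):
    # Kernel is conventionally truncated at a radius of ceil(3*sigma).
    if radius is None:
        radius = int(np.ceil(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()  # normalize so overall image brightness is preserved

def gaussian_blur(image, sigma):
    k = gaussian_kernel_1d(sigma)
    # Separability: blur rows, then columns (the order does not matter).
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, image)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, out)
    return out
```

Note that `mode='same'` zero-pads at the borders; replicate or mirror the edges first if darkened borders are a problem.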
If the kernel comes out too large (it is typically truncated at a radius of $\lceil 3\sigma\rceil$), compute the convolution in the frequency domain, i.e. IFT(FT(image) *. FT(kernel)), where *. is point-wise multiplication and the transforms are done with an FFT.
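A sketch of the frequency-domain route (my own helper; note that plain FFT convolution has periodic, wrap-around boundaries - pad the image first if that matters):

```python
import numpy as np

def fft_blur(image, kernel):
    # Convolution theorem: conv(image, kernel) = IFT(FT(image) * FT(kernel)).
    # Pad the kernel to image size and roll its center to index [0, 0],
    # since np.fft assumes the kernel origin is at the first element.
    kh, kw = kernel.shape
    padded = np.zeros_like(image, dtype=float)
    padded[:kh, :kw] = kernel
    padded = np.roll(padded, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(padded)).real
```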
If you don't care about precision, you can apply several iterations of a box filter: each pass brings the result closer to a Gaussian shape (by the central limit theorem), but also increases the total amount of blurring, so choose the box size accordingly. You can find a nice recipe in the SVG specification, which defines its Gaussian blur as three successive box blurs. The box filter is separable (two 1D passes) and each pass can be implemented with a running sum, so its cost is independent of the filter size - this is probably the fastest approximate smoothing method possible.
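A sketch of the running-sum box blur (function names and the edge-replication padding are my choices):

```python
import numpy as np

def box_blur_1d(a, radius):
    # Running-sum (prefix-sum) box filter: O(n) regardless of radius.
    padded = np.concatenate([np.full(radius, a[0]), a, np.full(radius, a[-1])])
    csum = np.cumsum(np.concatenate([[0.0], padded]))
    window = 2 * radius + 1
    # Difference of prefix sums gives each window's sum in O(1).
    return (csum[window:] - csum[:-window]) / window

def iterated_box_blur(image, radius, iterations=3):
    # Three box passes already approximate a Gaussian quite well.
    out = image.astype(float)
    for _ in range(iterations):
        out = np.apply_along_axis(box_blur_1d, 1, out, radius)
        out = np.apply_along_axis(box_blur_1d, 0, out, radius)
    return out
```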
Long Answer
Since you have the kernel in integers, it is only an approximation to a Gaussian. To see what happens after repeated filtering, note that applying source_kernel 50 times is equivalent to applying this target_kernel once:
target_kernel = IFT( FT(source_kernel) ^. 50 )
where ^. 50 is the point-wise power function.
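This identity (the FT of an n-fold self-convolution is the n-th point-wise power of the FT) is easy to check numerically; a 1D sketch with a hypothetical `repeated_kernel` helper, using n = 3 for brevity:

```python
import numpy as np

def repeated_kernel(source_kernel, n, size):
    # target_kernel = IFT( FT(source_kernel) ^. n ), computed on a grid of
    # length `size`; `size` must be at least the support of the result
    # (n * (len(source_kernel) - 1) + 1) to avoid circular wrap-around.
    padded = np.zeros(size)
    padded[:len(source_kernel)] = source_kernel
    return np.fft.ifft(np.fft.fft(padded) ** n).real

# Sanity check: convolving a kernel with itself twice more (3 applications)
# matches the point-wise third power in the frequency domain.
k = np.array([0.25, 0.5, 0.25])
direct = np.convolve(np.convolve(k, k), k)       # support length 7
via_fft = repeated_kernel(k, 3, 16)[:7]
assert np.allclose(direct, via_fft)
```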
To learn more about the iterative approach, note that Gaussian blurring is equivalent to applying heat equation on an image (imagine how light "hot" pixels diffuse into darker, cooler pixels).
The heat equation is a partial differential equation (PDE); for images it is applied in two spatial dimensions.
You can apply a chosen amount of smoothing by specifying the simulated diffusion time and then integrating the heat equation in small time steps. For the standard heat equation $u_t = \Delta u$, diffusing to time $t$ corresponds to Gaussian blurring with $\sigma = \sqrt{2t}$; you can find more detail in any introductory material on diffusion-based image filtering with PDEs.
Each time step requires computing finite differences at every pixel and applying a small stencil - essentially a tiny convolution.
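A sketch of such a time step with the standard 5-point Laplacian stencil (my own helpers; `np.roll` gives periodic boundaries here - use reflecting boundaries for real images):

```python
import numpy as np

def heat_step(u, dt=0.2):
    # One explicit Euler step of u_t = laplacian(u) via the 5-point stencil.
    # Stable for dt <= 0.25 with unit grid spacing.
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    return u + dt * lap

def diffuse(image, t, dt=0.2):
    # Diffusing to time t approximates Gaussian blur with sigma = sqrt(2*t).
    u = image.astype(float)
    for _ in range(int(round(t / dt))):
        u = heat_step(u, dt)
    return u
```

Note that the stencil sums to zero, so total image brightness is conserved, just as with a normalized convolution kernel.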
This step-by-step approach is called explicit filtering and can give you a very precise amount of smoothing (the precision is limited by the size of your time steps).
An implicit formulation also exists; it allows much larger time steps but requires solving a very large system of equations at each step.
This kind of filtering is typically used for more complicated problems like anisotropic diffusion, inverse problems, total variation denoising, etc.