$C$ is a regularisation parameter: smaller values impose stronger regularisation and limit the model's capacity, while larger values relax that constraint and allow the model to fit the data more freely.
The plot you included traces $C$ over five orders of magnitude, from a heavily constrained model at $C=0.001$ up to a model flexible enough to capture much more complex patterns.
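To make the "capacity" intuition concrete, here is a minimal sketch (assuming scikit-learn's `LogisticRegression` on a synthetic stand-in dataset, not your actual data) showing that smaller $C$ shrinks the learned coefficients, i.e. constrains the model:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in dataset (assumption: your real data differs).
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Smaller C = stronger regularisation = smaller coefficients,
# i.e. a more constrained model.
norms = {}
for C in (0.001, 1.0, 100.0):
    clf = LogisticRegression(C=C, max_iter=2000).fit(X, y)
    norms[C] = np.linalg.norm(clf.coef_)

print(norms)  # coefficient norm grows as C grows
```

The same qualitative behaviour holds for any L2-regularised linear model: the penalty term dominates when $C$ is tiny, pulling the weights towards zero.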
The curve in the lower plot (strictly a validation curve rather than a learning curve, since it varies a hyperparameter rather than the training-set size) exhibits three regimes over that span.
Underfitting: From $C=0.001$ to $C\approx1$, the training and validation accuracies are relatively low, rise together as $C$ increases, and track each other closely. These are the typical traits of an underfitting model: it is not scoring highly, it makes good use of every bit of extra capacity you give it, and it does no better on the training data than on unseen validation data, as if it were hitting a performance ceiling.
Balanced: At $C\approx2$, the validation accuracy peaks. Before this point, the extra capacity was still being used to pick up patterns that transfer to unseen data; beyond it, the model starts memorising the training set and its generalisation performance degrades.
Overfitting: As $C$ increases further, the additional capacity is spent over-adapting to the particulars of the training set in ways that do not transfer to new data, so the validation score drops. The training score keeps climbing and could eventually reach 100%, but what matters is general performance (the validation score), not accuracy on the particular samples of the training set.
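A sketch of how these three regimes can be traced numerically, assuming a scikit-learn classifier with a `C` parameter (the dataset and model here are illustrative stand-ins, not the ones behind your plot):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import validation_curve

# Synthetic stand-in dataset (assumption: your real data differs).
X, y = make_classification(n_samples=600, n_features=20, n_informative=5,
                           random_state=0)

# Sweep C over five orders of magnitude, as in the plot.
C_range = np.logspace(-3, 2, 11)
train_scores, val_scores = validation_curve(
    LogisticRegression(max_iter=2000), X, y,
    param_name="C", param_range=C_range, cv=5,
)

# Average the cross-validation folds for each C.
mean_train = train_scores.mean(axis=1)
mean_val = val_scores.mean(axis=1)

# The "balanced" C is the one with the highest mean validation score.
best_C = C_range[np.argmax(mean_val)]
print(f"best C ~ {best_C:.3g}")
```

Plotting `mean_train` and `mean_val` against `C_range` on a log axis reproduces the shape you described: the two curves rise together, then diverge as the training score keeps climbing while the validation score falls away.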
Do I have underfitting?
The plot represents many models rather than a single one: each value of $C$ yields a different model, with a different capacity.
Depending on which $C$ you choose (which point along that curve), you end up at a different point on the underfitting-overfitting continuum: a small $C$ gives you an underfitting model, while a $C$ that is too large gives you an overfitting one.
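In practice you would usually let cross-validation pick that point for you rather than reading it off the plot by eye. A sketch with scikit-learn's `GridSearchCV` (again with an illustrative synthetic dataset and a logistic regression stand-in):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in dataset (assumption: your real data differs).
X, y = make_classification(n_samples=600, n_features=20, n_informative=5,
                           random_state=0)

# Search the same five orders of magnitude; cross-validation selects
# the C with the best mean validation accuracy, i.e. the balanced model.
search = GridSearchCV(
    LogisticRegression(max_iter=2000),
    param_grid={"C": np.logspace(-3, 2, 11)},
    cv=5,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

`search.best_estimator_` is then the refitted model at the chosen $C$, ready to use on new data.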
Relevant posts: bias and variance, how they relate to bagging and pasting.