Exercise 7.15 from Ciprian Foias, Michael Jolly, "Differential Equations in Banach Spaces" (with some edits).
Let $X = C([0, 1], \mathbb C)$ (the space of continuous functions from $[0, 1]$ to $\mathbb C$, equipped with the uniform norm) and let $A \in B(X)$ (a bounded linear operator from $X$ to $X$) be defined by
$$(A f)(t) = \int_0^t f(s)\, ds.$$
Determine the spectrum of $A$ (i.e. the set of $\lambda$ s.t. $A - \lambda I$ is not invertible).
I can show that $\|A^n\|^{1/n} \to 0$, so by Theorem 7.14 from the book the spectral radius satisfies $\max \{|\lambda| \colon \lambda \in \operatorname{Sp}(A)\} = 0$. Since the spectrum of a bounded operator on a complex Banach space is nonempty, this gives $\operatorname{Sp}(A) = \{0\}$.
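(For reference, the estimate behind this is the standard one for the Volterra operator: by induction,
$$(A^n f)(t) = \frac{1}{(n-1)!} \int_0^t (t-s)^{n-1} f(s)\, ds,$$
so $\|A^n\| \le \frac{1}{n!}$ and hence $\|A^n\|^{1/n} \le (n!)^{-1/n} \to 0$.)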
Question: is there a way to show this directly from the definition of the spectrum? Currently it's not clear to me at all why $A$ itself is not invertible, nor why $A - \lambda I$ is invertible for every $\lambda \ne 0$. Also, is there a way to guess the answer just by looking at the operator? Below is a numerical experiment I tried in that direction.
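For the "guessing" part, here is a quick sanity check (not a proof; the grid size $n = 200$ and the left-endpoint rule are arbitrary choices of mine). Discretizing $A$ by a left-endpoint Riemann sum gives a strictly lower-triangular matrix, which is nilpotent, so all its eigenvalues are $0$ — at least consistent with $\operatorname{Sp}(A) = \{0\}$:

```python
# Sanity check (not a proof): discretize A by a left-endpoint Riemann sum.
# The resulting matrix is strictly lower triangular, hence nilpotent, so all
# of its eigenvalues are 0 -- consistent with Sp(A) = {0}.
import numpy as np

n = 200
h = 1.0 / n
# (M f)[i] ~ integral_0^{t_i} f with t_i = i*h, i.e. h * sum_{j < i} f(t_j)
M = h * np.tril(np.ones((n, n)), k=-1)

print(np.abs(np.linalg.eigvals(M)).max())  # ~ 0

# The spectral-radius formula is also visible: ||M^k||^{1/k} decays roughly
# like (1/k!)^{1/k}, mirroring the bound ||A^k|| <= 1/k! for the operator.
for k in (1, 5, 20, 50):
    print(k, np.linalg.norm(np.linalg.matrix_power(M, k), np.inf) ** (1 / k))
```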
Motivation: While I can solve this particular problem, I'm interested in a general method for finding the spectrum of an operator.
I found this answer: https://math.stackexchange.com/a/199730/743044, but again it doesn't show how to do this from the definition.