The professor presents the fact that it is impossible to have a deterministic extractor. We first define a deterministic extractor as follows:
A function $Ext:\{0,1\}^n \rightarrow \{0,1\}^{\ell}$ is a $(k, \epsilon)$ deterministic extractor if for every distribution $X$ over $\{0,1\}^n$ with $H_{\infty}(X) \geq k$, $Ext(X)$ is $\epsilon$-close to the uniform distribution over $\{0,1\}^{\ell}$.
To show the impossibility, it suffices to show that for any candidate deterministic extractor $Ext$, there exists a distribution $X$ with high min-entropy (i.e., $H_{\infty}(X) \geq k$) such that $Ext(X)$ is not $\epsilon$-close to uniform for any $\epsilon < 1/2$. To construct such an $X$, we examine the preimages of $Ext$. We consider the setting that is easiest for the extractor: it outputs just a single bit (i.e., $\ell = 1$) while its input distribution $X$ has very high min-entropy, namely $k = n-1$. In other words, it is not even possible to deterministically extract a single bit when the input distribution has almost full min-entropy.
Let us consider the preimages $S_b = Ext^{-1}(b)$, where $b \in \{0,1\}$. Observe that $S_0, S_1 \subseteq \{0,1\}^n$ and $S_0 \cup S_1 = \{0,1\}^n$. Therefore, by the pigeonhole principle, $|S_b| \geq 2^{n-1}$ for some $b$; without loss of generality, assume it is $S_0$. Now, consider the distribution $X$ that is uniform over $S_0$. This means that
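As a sanity check, the pigeonhole step can be verified numerically for any concrete single-bit function. The sketch below uses the parity function as a stand-in for an arbitrary candidate extractor (the choice of parity and of $n = 4$ are illustrative assumptions, not from the notes):

```python
from itertools import product

n = 4  # small input length, chosen just for illustration

def ext(x):
    """Placeholder single-bit candidate extractor: parity of the bits."""
    return sum(x) % 2

# Partition {0,1}^n into the two preimages S_0 = ext^{-1}(0) and S_1 = ext^{-1}(1).
S = {0: [], 1: []}
for x in product((0, 1), repeat=n):
    S[ext(x)].append(x)

# Pigeonhole: |S_0| + |S_1| = 2^n, so the larger preimage has size >= 2^(n-1).
b = 0 if len(S[0]) >= len(S[1]) else 1
assert len(S[b]) >= 2 ** (n - 1)
```

The same check goes through for any function `ext`, since the two preimage sizes always sum to $2^n$.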
$$Pr[x \gets X] = \begin{cases} \frac{1}{|S_0|} & x \in S_0\\ 0 & x \notin S_0\end{cases}$$
Observe that for this distribution $X$, we have $$H_{\infty}(X) = \min_{x} \left(-\log Pr[x \gets X]\right) = \log |S_0| \geq n-1$$
Moreover, for this particular distribution $X$, $Ext(X)$ is the constant $0$: every $x$ in the support of $X$ lies in $S_0$, so $Ext$ outputs $0$ with probability $1$. The statistical distance between the constant $0$ and the uniform bit is $1/2$, so $Ext$ is not a $(n-1, \epsilon)$ deterministic extractor for any $\epsilon < 1/2$.
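Putting the pieces together, the output distribution of $Ext(X)$ can be computed explicitly and compared against the uniform bit (parity and $n = 4$ remain illustrative assumptions):

```python
from itertools import product

n = 4
ext = lambda x: sum(x) % 2  # placeholder single-bit extractor

# X is uniform over S_0 = ext^{-1}(0).
S0 = [x for x in product((0, 1), repeat=n) if ext(x) == 0]

# Output distribution of Ext(X): by construction, all mass is on 0.
p = {b: sum(1 for x in S0 if ext(x) == b) / len(S0) for b in (0, 1)}

# Statistical distance to the uniform bit (1/2, 1/2).
sd = 0.5 * (abs(p[0] - 0.5) + abs(p[1] - 0.5))
assert sd == 0.5  # not eps-close to uniform for any eps < 1/2
```

By construction $p[0] = 1$ and $p[1] = 0$, so the distance is exactly $1/2$, matching the conclusion above.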