
Limit of Integral About a Point

I'm wondering how one would go about rigorously proving the following statement. Suppose one has an infinitely differentiable (I'm not sure one needs this, but I'm thinking of the Taylor expansion of $f$), continuous, single-valued function $f:[a,b]\rightarrow \Bbb{R}$, where $[a,b] \subset \Bbb{R}$. For some $c \in [a,b]$, prove $$\lim_{\epsilon \rightarrow 0}\int_{c-\epsilon}^{c+\epsilon}f(x)\, dx = 0.$$ I should note that I'm a physicist and it's been some time since I took analysis as an undergraduate. Many thanks if anyone can help.

For a function $f$ that is bounded on a neighbourhood of $c$, say $|f(x)| \le M$ for all $x$ with $|x - c| \le \delta$, one has, for every $0 < \epsilon \le \delta$, $$\left|\int_{c-\epsilon}^{c+\epsilon} f(x)\, dx\right| \le \int_{c-\epsilon}^{c+\epsilon} |f(x)|\, dx \le 2\epsilon M,$$ and the right-hand side tends to $0$ as $\epsilon \rightarrow 0$.
Note that continuity of $f$ at $x=c$ already implies that $f$ is bounded on some neighbourhood of $c$, so the infinite differentiability is not needed; continuity at the single point $c$ suffices.
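
As a quick numerical illustration (not a substitute for the proof), here is a minimal Python sketch using SciPy's `quad`; the choices $f = \exp$ and $c = 1$ are arbitrary stand-ins for any function continuous at $c$. The integral shrinks roughly like $2\epsilon M$ with $M = \max |f|$ near $c$:

```python
import numpy as np
from scipy.integrate import quad

f = np.exp   # stand-in for any function continuous at c
c = 1.0

for eps in [1e-1, 1e-2, 1e-3, 1e-4]:
    # quad returns (value, estimated_error); we only need the value
    integral, _ = quad(f, c - eps, c + eps)
    # |integral| <= 2 * eps * M, so it vanishes linearly in eps
    print(f"eps = {eps:.0e}: integral = {integral:.6e}")
```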
