
When is an image produced by a lens sharp? (optics) Very basic question about optics: suppose I have a thin lens with a given focal length $f$, a screen and an object at distance $a$ from the screen. What is the mathematical relation that must be fulfilled so that the image of the object is sharp on the screen? Thank you very much in advance for your answers. Julien.

Use $${1\over x}+{1\over a-x}={1\over f},$$ where $x$ is the distance from the object to the lens, so $a-x$ is the distance from the lens to the screen. Clearing denominators gives the quadratic $x^2-ax+af=0$, whose solutions are $$x={a\pm\sqrt{a^2-4af}\over 2}.$$ A real solution exists only when $a\ge 4f$: if the screen is closer to the object than $4f$, no lens position produces a sharp image.
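As a quick numerical check, here is a small sketch (my own helper, not from the question) that solves the quadratic above for the lens position(s) $x$ given the object–screen distance $a$ and focal length $f$:

```python
import math

def lens_positions(a, f):
    """Return the two distances x from the object to the lens for which
    the thin-lens equation 1/x + 1/(a - x) = 1/f is satisfied,
    i.e. the roots of x**2 - a*x + a*f = 0.
    Raises ValueError when a < 4f (no real solution)."""
    disc = a * a - 4 * a * f
    if disc < 0:
        raise ValueError("screen too close to the object: need a >= 4f")
    r = math.sqrt(disc)
    return (a - r) / 2, (a + r) / 2

# Example: a = 100 cm, f = 20 cm gives two sharp lens positions,
# symmetric about the midpoint between object and screen.
x1, x2 = lens_positions(100, 20)
print(x1, x2)
```

Both roots satisfy the lens equation; the two positions correspond to a magnified and a reduced image (the classic "two positions of the lens" displacement method).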
