No, in general there is no way to do this (even if $P$ is invertible), except in some special cases. If we could, it would be easy to derive very efficient methods for many important convex optimization problems.
One special case where we can do this is when the matrix $P$ is orthogonal. To evaluate \begin{equation*} \arg \min_x f(Px) + \frac12 \|x - \hat{x} \|^2 \end{equation*} when $P$ is orthogonal, we can make the substitution $w = Px$, so that $x = P^T w$. Our problem is now to evaluate \begin{equation*} \arg \min_w f(w) + \frac12 \|P^T w - \hat{x} \|^2 = \arg \min_w f(w) + \frac12 \|w - P \hat{x} \|^2, \end{equation*} where the equality holds because multiplication by the orthogonal matrix $P$ preserves norms. So we need only evaluate the prox operator of $f$: the minimizer over $w$ is $w^\star = \operatorname{prox}_f(P \hat{x})$, and undoing the substitution gives $x^\star = P^T \operatorname{prox}_f(P \hat{x})$.
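Here is a minimal sketch of this recipe in Python (numpy). It uses soft-thresholding, the prox of the $\ell_1$ norm, purely as a stand-in for $\operatorname{prox}_f$; the function names are my own, not standard:

```python
import numpy as np

def prox_l1(v, t=1.0):
    # Prox of t*||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_f_of_Px(x_hat, P, prox_f):
    # Prox of g(x) = f(Px) for orthogonal P:
    # argmin_x f(Px) + 0.5*||x - x_hat||^2  =  P^T prox_f(P x_hat).
    return P.T @ prox_f(P @ x_hat)

# Example: a random orthogonal P from a QR factorization.
rng = np.random.default_rng(0)
P, _ = np.linalg.qr(rng.standard_normal((5, 5)))
x_star = prox_f_of_Px(rng.standard_normal(5), P, prox_l1)
```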
Another special case is when $f(x) = \frac12 \| x \|^2$ and $P$ is a (circular) convolution operator. In this case the prox operator of $g(x) = f(Px) = \frac12 \|Px\|^2$ can be evaluated efficiently: setting the gradient of $\frac12 \|Px\|^2 + \frac12 \|x - \hat{x}\|^2$ equal to $0$ gives the linear system $(I + P^T P)x = \hat{x}$, and since $P$ is circulant this system is diagonalized by the FFT, so it can be solved in $O(n \log n)$ time.
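As an illustration, here is a sketch of that FFT solve in Python (numpy), assuming $P$ performs circular convolution with a kernel $h$, so that the eigenvalues of $P$ are $\operatorname{fft}(h)$ and those of $I + P^T P$ are $1 + |\operatorname{fft}(h)|^2$; the function name and the sanity check are mine:

```python
import numpy as np

def prox_conv_quadratic(x_hat, h):
    # Prox of g(x) = 0.5*||h (*) x||^2, where (*) is circular convolution:
    # solve (I + P^T P) x = x_hat in the Fourier domain.
    H = np.fft.fft(h, n=len(x_hat))              # eigenvalues of the circulant P
    X = np.fft.fft(x_hat) / (1.0 + np.abs(H)**2) # elementwise solve
    return np.fft.ifft(X).real                   # x_hat real => solution is real

# Sanity check against a direct dense solve with the circulant matrix.
n = 8
rng = np.random.default_rng(0)
h = rng.standard_normal(n)
x_hat = rng.standard_normal(n)
P = np.array([[h[(i - j) % n] for j in range(n)] for i in range(n)])
x_direct = np.linalg.solve(np.eye(n) + P.T @ P, x_hat)
assert np.allclose(prox_conv_quadratic(x_hat, h), x_direct)
```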