At a basic level, projected gradient descent is just a more general method for solving a more general problem.
Gradient descent minimizes a function by moving in the negative gradient direction at each step. There is no constraint on the variable.
$$ \text{Problem 1:} \quad \min_x f(x) $$
$$ x_{k+1} = x_k - t_k \nabla f(x_k) $$
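To make the update rule concrete, here is a minimal NumPy sketch of gradient descent with a fixed step size $t_k = t$; the names `grad_f`, `step`, and `iters` are illustrative choices, not part of any particular library.

```python
import numpy as np

def gradient_descent(grad_f, x0, step=0.1, iters=100):
    """Minimize f by repeatedly stepping in the direction -grad_f(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad_f(x)  # x_{k+1} = x_k - t_k * grad f(x_k)
    return x

# Example: f(x) = ||x - b||^2 has gradient 2(x - b) and minimizer x = b.
b = np.array([1.0, -2.0])
x_star = gradient_descent(lambda x: 2 * (x - b), x0=np.zeros(2))  # ~ [1, -2]
```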
On the other hand, projected gradient descent minimizes a function subject to a constraint. At each step we move in the direction of the negative gradient, and then "project" onto the feasible set.
$$ \text{Problem 2:} \quad \min_x f(x) \text{ subject to } x \in C $$
$$ y_{k+1} = x_k - t_k \nabla f(x_k) $$
$$ x_{k+1} = \arg \min_{x \in C} \|y_{k+1} - x\| $$
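The same loop with a projection after each gradient step gives projected gradient descent. Below is a sketch assuming $C$ is the Euclidean unit ball, where the projection has the closed form $y / \max(1, \|y\|)$; `project_unit_ball` and the other names are again illustrative.

```python
import numpy as np

def projected_gradient_descent(grad_f, project, x0, step=0.1, iters=100):
    """Gradient step, then Euclidean projection onto the feasible set C."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = x - step * grad_f(x)  # y_{k+1} = x_k - t_k * grad f(x_k)
        x = project(y)            # x_{k+1} = argmin over x in C of ||y_{k+1} - x||
    return x

def project_unit_ball(y):
    """Closest point to y in the unit ball: y itself, or y scaled to the boundary."""
    norm = np.linalg.norm(y)
    return y if norm <= 1.0 else y / norm

# Example: minimize ||x - b||^2 subject to ||x|| <= 1. The unconstrained
# minimizer b lies outside the ball, so the solution is b/||b|| = [0.6, 0.8].
b = np.array([3.0, 4.0])
x_star = projected_gradient_descent(lambda x: 2 * (x - b),
                                    project_unit_ball, x0=np.zeros(2))
```

The projection step is cheap here because the unit ball admits a closed-form projection; for a general convex set $C$, the projection is itself an optimization problem, so the method is most practical when $C$ is simple (a ball, a box, a simplex, and so on).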