You left out a tiny detail. The Wikipedia article states that $\nabla\cdot$ is the divergence operator: $\nabla$ combined with the dot product and applied to a vector field yields the divergence, while $\nabla$ by itself applied to a scalar function gives the classical gradient vector.
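To make the distinction concrete, here is a small worked example (my own choice of functions, not from the article) in $\mathbb{R}^3$ with a scalar function $f$ and a vector field $\mathbf{F} = (F_1, F_2, F_3)$:

$$\nabla f = \left(\frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}, \frac{\partial f}{\partial z}\right), \qquad \nabla \cdot \mathbf{F} = \frac{\partial F_1}{\partial x} + \frac{\partial F_2}{\partial y} + \frac{\partial F_3}{\partial z}.$$

For instance, taking $f(x,y,z) = x^2 + y^2 + z^2$ gives the gradient $\nabla f = (2x, 2y, 2z)$, a vector; taking $\mathbf{F}(x,y,z) = (x^2, y^2, z^2)$ gives the divergence $\nabla \cdot \mathbf{F} = 2x + 2y + 2z$, a scalar. So the gradient turns a scalar into a vector, and the divergence turns a vector into a scalar.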