The intuition comes easily once you stop thinking of the matrix elements as _switches_ for the edges. Instead, **think of $(A)_{ij}$ as the number of distinct walks of length $1$ from $i$ to $j$**. Try it! ;)
Note that this also lets you account naturally for loops, directed edges, undirected edges and multiple edges, as long as you don't restrict yourself to a symmetric matrix with zero diagonal and entries in $\{0,1\}$.
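For concreteness, here is a minimal sketch (using `numpy`; the three-vertex directed multigraph is just a made-up example) where parallel edges show up as entries greater than $1$ and a self-loop as a nonzero diagonal entry:

```python
import numpy as np

# A small directed multigraph (hypothetical example): vertices 0, 1, 2.
# Edges: 0->1 (twice, parallel edges), a self-loop 1->1, 1->2, and 2->0.
# (A)[i, j] = number of length-1 walks (i.e. edges) from i to j.
A = np.array([
    [0, 2, 0],   # two parallel edges 0 -> 1
    [0, 1, 1],   # self-loop at 1 and edge 1 -> 2
    [1, 0, 0],   # edge 2 -> 0
])

print(A[0, 1])   # 2: two distinct length-1 walks from 0 to 1
print(A[1, 1])   # 1: the self-loop is a length-1 walk from 1 back to itself
```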
What happens if you take a power of the adjacency matrix? When you multiply the matrices you are essentially concatenating walks: the $(i,j)$ entry of $A\cdot A$ sums, over every intermediate vertex $m$, the number of ways to walk $i \to m \to j$. So...
> $(A^k)_{ij}$ is the number of distinct walks of length $k$ from $i$ to $j$.
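If you want to convince yourself, the following sketch (same made-up graph as above; `count_walks_bruteforce` is a hypothetical helper, not a library function) checks $(A^k)_{ij}$ against an explicit enumeration of all walks of length $k$:

```python
import numpy as np
from itertools import product

def count_walks_bruteforce(A, k, i, j):
    """Count walks of length k from i to j by enumerating every sequence
    of intermediate vertices and multiplying the edge multiplicities."""
    n = A.shape[0]
    total = 0
    for mids in product(range(n), repeat=k - 1):
        path = (i, *mids, j)
        w = 1
        for u, v in zip(path, path[1:]):
            w *= A[u, v]          # 0 if the step u -> v uses no edge
        total += w
    return total

A = np.array([
    [0, 2, 0],
    [0, 1, 1],
    [1, 0, 0],
])

k = 3
Ak = np.linalg.matrix_power(A, k)
for i in range(3):
    for j in range(3):
        assert Ak[i, j] == count_walks_bruteforce(A, k, i, j)
print(Ak)  # entry (i, j) = number of distinct walks of length 3 from i to j
```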
Transposing the matrix simply reverses the direction of the edges. So...
> $((A^T)^k A^l)_{ij}$ is the number of distinct walks of length $k+l$ that start at $i$, take $k$ steps against the direction of the edges and then $l$ steps following them, finally arriving at $j$. The meaning of $(A^k (A^T)^l)_{ij}$ is analogous.
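The same kind of brute-force check works for the mixed product. Here is a sketch, again with the made-up graph and a hypothetical helper that enumerates walks taking $k$ backward steps followed by $l$ forward steps:

```python
import numpy as np
from itertools import product

def count_back_then_forward(A, k, l, i, j):
    """Count walks that take k steps against edge directions starting at i,
    then l steps along edge directions, ending at j (brute force)."""
    n = A.shape[0]
    total = 0
    for mids in product(range(n), repeat=k + l - 1):
        path = (i, *mids, j)
        w = 1
        for step, (u, v) in enumerate(zip(path, path[1:])):
            # the first k steps go backwards: they use an edge v -> u
            w *= A[v, u] if step < k else A[u, v]
        total += w
    return total

A = np.array([
    [0, 2, 0],
    [0, 1, 1],
    [1, 0, 0],
])

k, l = 2, 1
M = np.linalg.matrix_power(A.T, k) @ np.linalg.matrix_power(A, l)
for i in range(3):
    for j in range(3):
        assert M[i, j] == count_back_then_forward(A, k, l, i, j)
print(M)  # (i, j) = walks of length 3: 2 steps backwards, then 1 forwards
```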