
Upper bound for the number of positive (and negative) eigenvalues of a certain symmetric matrix

Suppose we are given a finite set $S$. For $X,Y\subseteq S$ with $X\cap Y=\emptyset$, define $A^{X,Y}\in M_{S\times S}(\mathbb{C})$ by $(A^{X,Y})_{s_1,s_2}=1$ if $s_1\in X, s_2\in Y$ or $s_1\in Y, s_2\in X$, and $0$ otherwise. Suppose $A=\sum_{i=1}^n A^{X_i,Y_i}$ where $X_i,Y_i\subseteq S$ and $X_i\cap Y_i=\emptyset$ for all $1\le i\le n$. Prove that $A$ has at most $n$ positive (respectively, negative) eigenvalues. Note: it is easy to see that this is true for $n=1$, since the rank of $A^{X,Y}$ is at most $2$ and its trace is $0$. But I can't see why it holds for larger $n$.
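The $n=1$ observation in the note can be checked numerically. The sketch below (a sanity check only, with an arbitrarily chosen $S$, $X$, and $Y$) builds $A^{X,Y}$ from the definition and confirms its rank and trace:

```python
import numpy as np

S = range(6)
X, Y = {0, 1}, {3, 4}  # disjoint subsets of S

# Build A^{X,Y}: entry is 1 when one index lies in X and the other in Y
A = np.zeros((len(S), len(S)))
for s1 in X:
    for s2 in Y:
        A[s1, s2] = A[s2, s1] = 1

print(np.linalg.matrix_rank(A))  # 2 (X and Y are nonempty here)
print(np.trace(A))               # 0.0
```

Since $A^{X,Y}=\mathbf{1}_X\mathbf{1}_Y^t+\mathbf{1}_Y\mathbf{1}_X^t$, the rank is exactly $2$ whenever $X$ and $Y$ are both nonempty, and the zero diagonal forces the positive and negative eigenvalues to balance.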

This is basically the argument of Graham and Pollak, likely mentioned in any good introduction to spectral graph theory, which they used to show that the edge set of a complete graph of order $N$ cannot be partitioned into fewer than $N-1$ edge-disjoint complete bipartite subgraphs.

If the span of the eigenvectors with positive (respectively, negative) eigenvalues had dimension greater than $n$, it would intersect non-trivially the space of vectors orthogonal to the characteristic vectors $\mathbf{1}_{Y_1},\dots,\mathbf{1}_{Y_n}$, which has codimension at most $n$. A nonzero vector $x$ in this intersection would satisfy $x^tAx>0$ (respectively, $x^tAx<0$). On the other hand, $x^tA^{X_i,Y_i}x = 2\left(\sum_{s\in X_i}x_s\right)\left(\sum_{s\in Y_i}x_s\right)=0$ for each $i$, since $x$ is orthogonal to $\mathbf{1}_{Y_i}$; summing over $i$ gives $x^tAx=0$, a contradiction.
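The bound itself can also be stress-tested numerically. This sketch (an illustration, not part of the proof; the sizes and the random seed are arbitrary) sums $n$ random matrices $A^{X_i,Y_i}$ and counts eigenvalues of each sign:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 8, 3  # |S| = 8, a sum of n = 3 matrices

A = np.zeros((N, N))
for _ in range(n):
    # Random disjoint X, Y: assign each element to X, Y, or neither
    labels = rng.integers(0, 3, size=N)
    x = (labels == 0).astype(float)  # characteristic vector of X_i
    y = (labels == 1).astype(float)  # characteristic vector of Y_i
    A += np.outer(x, y) + np.outer(y, x)  # A^{X_i, Y_i}

eig = np.linalg.eigvalsh(A)
pos = int(np.sum(eig > 1e-9))
neg = int(np.sum(eig < -1e-9))
print(pos, neg)  # each is at most n = 3, as the argument above guarantees
```

Repeating with other seeds and sizes never produces more than $n$ eigenvalues of either sign, consistent with the codimension argument.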
