
Confusion regarding Burke's theorem. In an M/M/1 queue, arrivals occur at rate $\lambda$ according to a Poisson process and service times are exponentially distributed with mean $1/\mu$, where $\mu$ is the service rate and $\lambda < \mu$ (stability condition). According to Burke's theorem, if the arrivals form a Poisson process with rate $\lambda$, then the departure process of the M/M/1 queue is also a Poisson process with rate $\lambda$. Now, let's assume the arrival rate is 2 jobs/sec ($\lambda=2$) and the service rate is 3 jobs/sec ($\mu=3$). Then the departure rate should be 3 jobs/sec, since every job leaves the server after being served. But according to Burke's theorem, jobs should be leaving the server at 2 jobs/sec. What am I missing here?

OFRBG's comment is the key point: in the long run, the average departure rate cannot exceed the average arrival rate. But here is some more intuition:
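One way to convince yourself is to simulate. Below is a minimal sketch (the function name `simulate_mm1` and the job count are my own choices) that runs an M/M/1 queue with $\lambda=2$, $\mu=3$ and measures the long-run departure rate:

```python
import random

def simulate_mm1(lam, mu, n_jobs, seed=0):
    """Simulate an M/M/1 queue; return the time of the last departure.

    Jobs arrive with exponential(lam) interarrival times; each job's
    service takes exponential(mu) time and starts when the server is free.
    """
    rng = random.Random(seed)
    t_arrival = 0.0       # time of the current arrival
    server_free_at = 0.0  # time at which the server next becomes idle
    last_departure = 0.0
    for _ in range(n_jobs):
        t_arrival += rng.expovariate(lam)        # Poisson(lam) arrival stream
        start = max(t_arrival, server_free_at)   # wait if the server is busy
        server_free_at = start + rng.expovariate(mu)
        last_departure = server_free_at
    return last_departure

# lambda = 2, mu = 3: the measured departure rate comes out near 2, not 3.
n = 200_000
total_time = simulate_mm1(2.0, 3.0, n)
print(n / total_time)
```

The rate 3 only applies while the server is actually busy; averaged over busy and idle periods, jobs can only leave as fast as they arrive.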

Consider an $M/M/1$ queue with $0 < \lambda < \mu$. Define $\rho = \lambda /\mu$. The steady state distribution is $p_k = (1-\rho)\rho^k$ for $k \in \\{0, 1, 2, \ldots\\}$, where $p_k$ is the steady state probability of being in state $k$. Thus, $p_0=Pr[\mbox{idle}] = 1-\rho$. Now suppose we arrive at a time $t$ when the system is in steady state. Let $T$ be the remaining time to the next departure. Let's calculate $E[T]$ by conditioning on whether or not the system is busy at time $t$:

(When busy, the residual service time is exponential with mean $1/\mu$ by memorylessness; when idle, we must first wait for an arrival, mean $1/\lambda$, and then for its service, mean $1/\mu$.)

\begin{align} E[T] &= E[T|\mbox{busy}]\,\rho + E[T|\mbox{idle}]\,(1-\rho)\\ &= (1/\mu)\rho + (1/\lambda + 1/\mu)(1-\rho) \\ &= 1/\lambda \end{align}

So in steady state, the mean time to the next departure is $1/\lambda$: departures occur at rate $\lambda$, exactly as Burke's theorem says.
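As a quick sanity check, plugging in the question's numbers ($\lambda = 2$, $\mu = 3$, so $\rho = 2/3$):

```python
lam, mu = 2.0, 3.0
rho = lam / mu
# E[T] = E[T | busy] * rho + E[T | idle] * (1 - rho)
ET = (1 / mu) * rho + (1 / lam + 1 / mu) * (1 - rho)
print(ET, 1 / lam)  # both are 0.5 seconds between departures
```

That is one departure every half second on average, i.e. 2 jobs/sec, matching $\lambda$ rather than $\mu$.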
