Let's assume:
* Gaussianized data has a normal distribution with zero mean and unit variance
* The output layer uses a softmax activation
* The cost function is cross entropy, which measures how far the distribution produced by the output layer is from the desired output distribution
Now, if the desired outputs are samples from a Gaussian distribution and you run the back-propagation algorithm with the cross-entropy cost function, the network weights will be trained so that the output distribution is as close as possible to that normal distribution.
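As a minimal sketch of this idea (all names, bin edges, learning rate, and step count here are illustrative choices, not from the original): a single softmax layer's logits are trained by gradient descent on cross entropy against a discretized standard-normal target distribution, and the output probabilities converge toward that target.

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Target: a zero-mean, unit-variance Gaussian discretized into 11 bins
edges = np.linspace(-3, 3, 12)
centers = (edges[:-1] + edges[1:]) / 2
target = np.exp(-centers**2 / 2)   # unnormalized Gaussian density at bin centers
target /= target.sum()             # normalize into a probability distribution

logits = np.zeros_like(target)     # start from a uniform output distribution
lr = 1.0
for _ in range(5000):
    p = softmax(logits)
    # For softmax + cross entropy, the gradient w.r.t. the logits is (p - target)
    logits -= lr * (p - target)

# After training, the softmax output closely matches the Gaussian target
print(np.abs(softmax(logits) - target).max())
```

The simple form of the gradient, `p - target`, is exactly why softmax outputs pair so naturally with the cross-entropy cost: back-propagation pushes the output distribution directly toward the desired one.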