Whilst monitoring a Windows virtual server, I have noticed a slight difference in the reported memory figures.
On the virtual server itself, Perfmon was reporting 90% memory usage, yet for the same time period SCOM 2012 was reporting 86%. This is consistent throughout the day: an average difference of 4-5% between the server and SCOM.
Is this difference normal? What causes it, and is it anything to be concerned about?