In any packet loss sensor there is a graph showing downtime, ping time, minimum, maximum, and packet loss (%), where the range runs from 0 to 1%. I need to know how this packet loss is calculated and how it should be interpreted. My upper warning limit is 10% and the upper error limit is 20%, but in the graph all I see is 0.1% up to a maximum of 1% packet loss.
Any explanation please?
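For reference, the conventional definition of packet loss (an assumption about the sensor's internals, not confirmed by vendor documentation) is simply the fraction of probe packets sent that received no reply, expressed as a percentage. A minimal sketch:

```python
def packet_loss_percent(sent: int, received: int) -> float:
    """Return packet loss as a percentage of packets sent.

    Conventional formula: loss % = (sent - received) / sent * 100.
    """
    if sent <= 0:
        raise ValueError("at least one packet must be sent")
    return (sent - received) / sent * 100.0


# Example: 100 pings, 99 replies -> 1% loss.
# That would sit well below a 10% warning or 20% error limit,
# which may explain a graph that never rises above 1%.
print(packet_loss_percent(100, 99))
```

Under this interpretation, a graph peaking at 1% would simply mean that at most 1 in 100 pings went unanswered during any sampling interval, so the 10%/20% thresholds were never approached.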
Hello,
Thank you very much for your support ticket. May I ask which exact sensor type you are referring to? Perhaps upload some screenshots to avoid any misunderstanding.
Best regards.
Dec, 2013 - Permalink