OPNsense Forum
English Forums => General Discussion => Topic started by: moonman on March 21, 2019, 03:49:54 am
-
(https://i.imgur.com/1L5g9fO.png)
I understand that the latency for example is 15ms, but what is 400m for packet loss?
-
Loss is in percent; 400m out of 1 is 40% loss. The graph shares the y-axis with the millisecond values so it's a little hard to read indeed.
Cheers,
Franco
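rrdtool prints axis labels with SI prefixes, so "400m" means milli, i.e. 0.4; with loss stored as a fraction of 1 that reads as 40%. A minimal Python sketch of that conversion (the function names are illustrative, not part of OPNsense):

```python
# Interpret an rrdtool-style SI-suffixed axis label such as "400m".
# "m" is the SI prefix milli (1/1000), so "400m" is 0.4; on a loss
# axis that runs from 0 to 1 this corresponds to 40% packet loss.
SI_PREFIXES = {"m": 1e-3, "k": 1e3, "M": 1e6, "G": 1e9}

def si_to_float(label: str) -> float:
    """Convert a label like '400m' or '15' to a plain float."""
    if label and label[-1] in SI_PREFIXES:
        return float(label[:-1]) * SI_PREFIXES[label[-1]]
    return float(label)

def loss_fraction_to_percent(value: float) -> float:
    """Map a 0..1 loss fraction to 0..100 percent."""
    return value * 100.0

print(loss_fraction_to_percent(si_to_float("400m")))  # 40.0
```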
-
Does the graphical package offer a secondary y scale? Maybe it can be easily plotted in, making the graph easier to read?
-
I don't believe it does. RRD is relatively simplistic and static in these matters.
Cheers,
Franco
-
Loss is in percent; 400m out of 1 is 40% loss. The graph shares the y-axis with the millisecond values so it's a little hard to read indeed.
Cheers,
Franco
Thanks. Makes sense.
-
Loss is in percent
Can you please describe what exactly this percentage mean?
For example, I have a loss of up to 1.3 with a delay of 15, so 130% loss? How do I interpret this?
I'm a bit confused ;D
-
Loss is in percent; 400m out of 1 is 40% loss.
This seems rather self-explanatory? I don't know what "1.3" is. Screenshot please.
Cheers,
Franco
-
I do such graphs on a daily basis, and the y-axis is... improvable. I would put ms on the left and % loss on the right; then it would be fool-proof.
Or, if your graphing tool only supports one y-axis, put ms and % loss on the same scale, with no "400m", which is ambiguous at best.
-
Loss is in percent; 400m out of 1 is 40% loss.
This seems rather self-explanatory? I don't know what "1.3" is. Screenshot please.
Cheers,
Franco
See attached screenshot with a value of 1.9 up to 3.3
-
*bump*
Might this be a rendering bug in the graph and 1.9 is actually 1.9m or something?
Is there a way to see the raw values generated by dpinger?
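The raw values behind the graph can be dumped with `rrdtool fetch`, e.g. `rrdtool fetch /var/db/rrd/wan-quality.rrd AVERAGE -s -1h` (the RRD path and the data-source names below are assumptions, not confirmed in this thread). A hedged sketch of parsing that text output, which prints one "timestamp: value value" line per step, with values in scientific notation and "nan" for missing samples:

```python
# Hypothetical sketch: parse the text output of `rrdtool fetch`.
# Each data line looks like "1553212800: 1.5000000000e-02 4.0000000000e-01";
# the header line lists the data-source names and carries no timestamp.
def parse_fetch_output(text: str) -> dict[int, list[float]]:
    rows = {}
    for line in text.splitlines():
        if ":" not in line:
            continue  # skip blank lines
        ts, _, values = line.partition(":")
        if not ts.strip().isdigit():
            continue  # skip the DS-name header
        rows[int(ts)] = [float(v) for v in values.split()]
    return rows

sample = """            delay        loss

1553212800: 1.5000000000e-02 4.0000000000e-01
1553213100: 1.6000000000e-02 nan
"""
rows = parse_fetch_output(sample)
print(rows[1553212800])  # [0.015, 0.4]
```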
-
Thanks for the screenshot. Depending on the graph interval I get an aggregated loss of up to 18, basically higher with each interval increase. I think that either RRD is treating these values incorrectly, or they are actually set up incorrectly, or a mix of both (these are not packet/byte counters that can add up).
Does the value drop if you switch the resolution to "high"?
Cheers,
Franco
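One way aggregated loss could grow with coarser resolution: RRD collapses several primary samples into one point per archive using a consolidation function, and with MAX a single lossy sample dominates the whole interval, while with AVERAGE it gets diluted. An illustrative sketch of that effect (not rrdtool's actual code):

```python
# Sketch of RRD-style consolidation: a coarser archive collapses
# groups of `step` primary samples into one point via a consolidation
# function (CF). With MAX the worst sample wins the whole interval;
# with AVERAGE a spike is spread out.
def consolidate(samples, step, cf):
    return [cf(samples[i:i + step]) for i in range(0, len(samples), step)]

# One 30% loss spike in otherwise clean samples.
loss = [0.0] * 11 + [0.3]

print(consolidate(loss, 4, max))                        # [0.0, 0.0, 0.3]
print(consolidate(loss, 4, lambda s: sum(s) / len(s)))  # [0.0, 0.0, 0.075]
```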
-
Does the value drop if you switch the resolution to "high"?
No, it actually increases; see attached screenshots.
-
Just a quick follow-up on this:
On Monday there was a WAN outage, and during this time the loss peaked at 100 (see attached screenshot).
I think this confirms that the reported value is indeed the raw percentage as logged by dpinger ranging from 0 to 100 and not 0 to 1.
What do you think?
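The outage gives a simple sanity check on the scale: during a total outage, a 0..1 store would peak at 1 while a 0..100 store peaks at 100. A trivial, hypothetical sketch of that inference:

```python
# Hedged sketch: infer whether a loss series is stored as a fraction
# (0..1) or as percent (0..100) from its peak during a known full outage.
def loss_scale(peak_during_outage: float) -> str:
    if peak_during_outage > 1.0:
        return "percent (0..100)"
    return "fraction (0..1)"

print(loss_scale(100.0))  # percent (0..100)
print(loss_scale(1.0))    # fraction (0..1)
```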
-
...that should be quite easy to simulate, just pull the WAN cable. Why does nobody know what the numbers mean? :-O