I am trying to understand the graph "Reporting -> Health -> Quality". Unfortunately, I could not find anything in the documentation.
My current understanding is this: the x-axis corresponds to time in hours/minutes, and two different kinds of values are plotted on the y-axis. Delay and stddev are given in seconds. Loss, on the other hand, seems to be given in % per interval (10 min at standard, 5 min at medium, 2 min at high resolution), so "500m" would be 0.5%.
But I found this explanation: "Loss and standard deviation of loss is in percent which is a little hard to grasp at first. 100% equals 1, so 110m is 110 per mille, or 11% average loss." (https://forum.opnsense.org/index.php?topic=12605.0) If 1 equals 100% and the diagram shows a value of 7, does that mean 700% loss?
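If that quoted explanation is right, the only arithmetic involved is an SI prefix ("m" = milli = 1/1000) plus the "1 equals 100%" convention. Here is a minimal Python sketch of that reading; the function names are my own, nothing from OPNsense or RRDtool:

```python
# Sketch of the unit convention described in the quoted forum post:
# the graph stores loss as a plain fraction (1 == 100%), and axis
# labels use SI prefixes, so "110m" means 110/1000 = 0.11.
def si_to_fraction(label: str) -> float:
    """Parse a label like '110m' or '7' into a plain fraction."""
    prefixes = {"u": 1e-6, "m": 1e-3, "k": 1e3}
    if label and label[-1] in prefixes:
        return float(label[:-1]) * prefixes[label[-1]]
    return float(label)

def fraction_to_percent(value: float) -> float:
    """Apply the '1 equals 100%' convention."""
    return value * 100.0

print(round(fraction_to_percent(si_to_fraction("110m")), 3))  # prints 11.0, i.e. 11% loss
print(round(fraction_to_percent(si_to_fraction("500m")), 3))  # prints 50.0, not 0.5
```

Note that under this convention "500m" would be 50% loss, not 0.5%, which is exactly where my two interpretations contradict each other.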
Hm, what am I missing? Does anyone have an explanation for this?
Hi all,
I'm replying to this old thread because I have exactly the same question: why is the health graph so complicated to read?
The label of the y-axis reads "seconds/%". What does that even mean? How does it relate to the selected granularity?
If you set the granularity to 60 minutes or 24 hours, the x-axis becomes labeled with "days of the year" or "week of the year". I could convert that into something understandable myself, but why is it so hard in the first place?
I'd really appreciate a simplification there. Kind regards
Christian