You are querying an OID that returns 1 (up) or 2 (down), and multiplying by 100. So, now 100 is up and 200 is down.
You have MaxBytes set to 100, so now up = 100 and down = Undef (since 200 > 100).
When you get the CSV, Undef is shown as 0.
When you query the average, Undef values are ignored, so you are averaging over a set containing only 100s; hence the average is 100.
This is really an MRTG question... however, I would do this by making a script wrapper that gets the SNMP value and maps 2 -> 0 and 1 -> 100. Then you get 0 for down and 100 for up, and you can graph it the way you are trying to. Something like the sketch below would work.
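Here is a minimal sketch of such a wrapper, written in Python and calling the net-snmp snmpget command-line tool. The host, community string, and OID are placeholders, and it prints the four lines MRTG expects from an external script (value in, value out, uptime, target name); adapt it to your own setup.

#!/usr/bin/env python3
# Minimal sketch: map an SNMP up/down OID (1 = up, 2 = down) to 100/0 for MRTG.
# HOST, COMMUNITY, and OID are placeholders -- substitute your own values.
import subprocess

HOST = "192.0.2.1"              # placeholder agent address
COMMUNITY = "public"            # placeholder community string
OID = "1.3.6.1.2.1.2.2.1.8.1"   # e.g. ifOperStatus for interface index 1

# -Oqv prints only the value, -Oe forces enums to print numerically
out = subprocess.run(
    ["snmpget", "-v2c", "-c", COMMUNITY, "-Oqve", HOST, OID],
    capture_output=True, text=True, check=True,
).stdout.strip()

value = 100 if int(out) == 1 else 0   # 1 -> 100 (up), anything else -> 0 (down)

# MRTG external scripts return four lines: value in, value out, uptime, target name
print(value)
print(value)
print("")       # uptime string (left blank here)
print(HOST)

Point your Target at this script (e.g. Target[link]: `/path/to/wrapper.py`) and keep MaxBytes at 100, and the graph will sit at 100 while the device is up and drop to 0 when it is down.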
Steve

Posted by stevesh — Thu Nov 06, 2003 1:10 pm