Average time is so misleading: it means that half of the requests were faster than this, but half were slower, and how much slower is not known.
70% of the time I spend in the product is spent clicking on a request and looking at the max number to see if it needs attention. Seeing that number in any of the tables, such as the "slowest" table, would let me find and prioritize things so much faster.
The 90th percentile is also an extremely useful metric, because it describes the time that most users experienced. The Visual Studio load testing software has this and the 95th percentile, and it's helpful, but it does nothing for live systems with real humans.
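To make the point concrete, here is a minimal sketch (the durations and the percentile helper are made up for illustration) of the kind of summary such a "slowest requests" view could compute from raw request durations:

```python
# Hypothetical request durations (ms) for one endpoint; two pathological requests included.
import math
import statistics

durations_ms = [120, 95, 110, 105, 130, 98, 4200, 115, 102, 3900]

def percentile(values, p):
    """Nearest-rank percentile: smallest value with at least p% of samples at or below it."""
    ordered = sorted(values)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

print("average:", round(statistics.mean(durations_ms), 1), "ms")  # 897.5 ms
print("p50:", percentile(durations_ms, 50), "ms")                 # 110 ms
print("p90:", percentile(durations_ms, 90), "ms")                 # 3900 ms
print("p95:", percentile(durations_ms, 95), "ms")                 # 4200 ms
print("max:", max(durations_ms), "ms")                            # 4200 ms
```

The average (897.5 ms) sits nowhere near what most users actually experienced (around 110 ms), while the p90/p95/max columns immediately expose the slow tail.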
This was already suggested in 2018 and still not added? I just started to use Stackify and I'm really struggling to find bottlenecks in the application without knowing the percentiles and max times. I would say this is a must-have feature, or at least I would like to know how other customers are analyzing their requests without this information.
I think median time is what you describe as half of the requests being slower/faster. Average is still susceptible to outliers.
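A tiny illustration of that outlier point, with made-up numbers: a single slow request moves the average a lot but barely touches the median.

```python
import statistics

typical = [100, 105, 110, 115, 120]   # ms, five ordinary requests
with_outlier = typical + [5000]       # one pathological request added

print(statistics.mean(typical), statistics.median(typical))            # 110 110
print(statistics.mean(with_outlier), statistics.median(with_outlier))  # 925.0 112.5
```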