The 1-to-10 scale sucks as a tool because there are so many ambiguous interpretations of what the numbers mean. When someone asks me to rate my computer skills on a scale of 1 to 10, the answer is meaningless unless we agree on what each number stands for.
The way I see it, since the mean of the integers 1 through 10 is 5.5 ((1 + 2 + ... + 10) / 10 = 55 / 10 = 5.5), a 5 or a 6 should mean roughly average, with "average" depending on the group you're measuring against. The average computer skills of the developers over at Google would be a 10 compared to the entire population, but there's still a spectrum of competency within Google that could be measured on its own 1-to-10 scale.

But that doesn't seem to be how most people see it. Most apply the standard American grading ranges: a 5 is an E/F, meaning you suck; a 6 is a D, meaning you still suck, just a little less; a 7 means below average, even though a C is supposed to be average; an 8 is average; a 9 is good; and a 10 is outstanding.
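Just to make the ambiguity concrete, here's a quick Python sketch of the two readings. Every cutoff in it is my own assumption, picked to illustrate the point, not anyone's official standard:

# Two competing readings of the same 1-to-10 score.
# All cutoffs here are illustrative assumptions, not a standard.

def mean_reading(score):
    # Statistical reading: 5.5 is the midpoint, so 5-6 is "average".
    if score <= 4:
        return "below average"
    if score <= 6:
        return "about average"
    if score <= 8:
        return "above average"
    return "outstanding"

def grade_reading(score):
    # American-grading reading: anything under 7 "sucks".
    if score <= 5:
        return "failing (E/F)"
    if score == 6:
        return "barely passing (D)"
    if score == 7:
        return "below average (C)"
    if score == 8:
        return "average (B)"
    return "good to outstanding (A)"

for s in (5, 6, 7, 8):
    print(s, "->", mean_reading(s), "vs.", grade_reading(s))

Run it and a 6 comes back "about average" under one reading and "barely passing" under the other, which is exactly the problem.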
This whole topic is stupid. I'm angry that my anger about this makes me angry enough to start a thread about it.
Bottom line: the 1-to-10 scale is stupid, and it's probably your fault.
:rant: