Question

Mean square deviation of a distribution is least when deviations are taken about

A. mean
B. median
C. mode
D. none of these
Solution

The correct option is A (mean).

If $S$ denotes the root mean square deviation about some number $a$, i.e.
$S=\sqrt{\dfrac{1}{N}\sum_{i=1}^{n} f_i (x_i-a)^2}$, and $\sigma$ is the standard deviation,
then $S^2=\sigma^2+d^2$, where $d=\bar{x}-a$.
Clearly, $S$ (and hence the mean square deviation $S^2$) is least when $d=0$, i.e. when $a=\bar{x}$.
Thus, the mean square deviation is least when deviations are taken about the mean $\bar{x}$.
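The identity $S^2=\sigma^2+d^2$ is quoted above without proof. The short expansion below is an added illustration (not part of the original solution), using the same symbols $f_i$, $x_i$, $\bar{x}$ and assuming, as usual, $N=\sum_{i=1}^{n} f_i$:

\[
\begin{aligned}
S^2 &= \frac{1}{N}\sum_{i=1}^{n} f_i\,(x_i-a)^2
     = \frac{1}{N}\sum_{i=1}^{n} f_i\bigl[(x_i-\bar{x})+(\bar{x}-a)\bigr]^2 \\
    &= \underbrace{\frac{1}{N}\sum_{i=1}^{n} f_i\,(x_i-\bar{x})^2}_{\sigma^2}
     \;+\; \frac{2(\bar{x}-a)}{N}\underbrace{\sum_{i=1}^{n} f_i\,(x_i-\bar{x})}_{0}
     \;+\; \underbrace{(\bar{x}-a)^2}_{d^2}\,\frac{1}{N}\sum_{i=1}^{n} f_i
     = \sigma^2 + d^2,
\end{aligned}
\]

since $\sum_{i} f_i(x_i-\bar{x})=0$ by the definition of the mean and $\frac{1}{N}\sum_{i} f_i=1$.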
