The correct option is A 0.97
When the transmitted symbol is known to be x1, the output symbols occur with the conditional probabilities
P(y0|x1)=0.40
P(y1|x1)=0.60
Required entropy = H(Y|X=x1) = Entropy[0.40, 0.60]
= −[0.40 log2(0.40) + 0.60 log2(0.60)] bits/symbol
= −[0.40(−1.322) + 0.60(−0.737)] bits/symbol
≈ 0.529 + 0.442
≈ 0.97 bits/symbol
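As a quick numerical check, the same conditional entropy can be evaluated directly; below is a minimal Python sketch (the helper name entropy_bits is just illustrative, not from the problem statement):

```python
from math import log2

def entropy_bits(probs):
    # Entropy in bits of a discrete distribution given as a list of probabilities
    return -sum(p * log2(p) for p in probs if p > 0)

# Conditional probabilities P(y0|x1) = 0.40 and P(y1|x1) = 0.60 from the problem
print(entropy_bits([0.40, 0.60]))  # ≈ 0.971, i.e. 0.97 bits/symbol
```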