The correct option is
B 50 W lamp consumes more power
Assuming the bulbs were manufactured to operate at 220 volts, the rated current of each bulb follows from the power relation I = P/V, and its resistance from Ohm's law, R = V/I.
For the 100 W bulb: the current I = P/V = 100/220, that is, about 0.45 amps; the resistance R = V/I = 220/0.45, that is, about 489 ohms. The 100 watt bulb has about 489 ohms of resistance when burning.
For the 50 W bulb: the current I = P/V = 50/220, that is, about 0.23 amps; the resistance R = V/I = 220/0.23, that is, about 957 ohms. The 50 watt bulb has about 957 ohms of resistance when burning.
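As a rough cross-check of the rated values above, here is a minimal Python sketch; the function and variable names are my own, not part of the original answer, and a 220-volt rating is assumed.

```python
# Rated current and resistance of each bulb, assuming a 220-volt rating.
V_RATED = 220.0  # rated voltage in volts

def rated_current_and_resistance(power_watts, volts=V_RATED):
    """Return (I, R) for a bulb of the given rated power at the given voltage."""
    current = power_watts / volts      # I = P / V
    resistance = volts / current       # R = V / I (equivalently V**2 / P)
    return current, resistance

for p in (100, 50):
    i, r = rated_current_and_resistance(p)
    print(f"{p} W bulb: I = {i:.2f} amps, R = {r:.0f} ohms")
# Exact arithmetic gives 484 and 968 ohms; the worked answer rounds the
# currents to 0.45 and 0.23 amps first, which yields 489 and 957 ohms.
# The conclusion is unaffected either way.
```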
As these two bulbs are connected in series, the series resistances add directly. So the total resistance is 489 ohms + 957 ohms = 1446 ohms. In this 220 volt circuit, I = V/R = 220/1446, so I = 0.152 amps.
Hence, the total circuit current is 0.152 amps. Every point in a series circuit carries the same current, so the voltage drop across the 50 watt bulb is V = IR = 0.152 × 957 ≈ 145.5 volts, and the voltage drop across the 100 watt bulb is V = IR = 0.152 × 489 ≈ 74.3 volts. Apart from rounding, the two drops add back up to the 220 volt supply.
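The series combination, circuit current, and voltage drops can be checked the same way; this is only a sketch that reuses the rounded resistances from the working above (489 and 957 ohms), so the printed values differ slightly from exact arithmetic.

```python
# Series circuit: resistances add, and the same current flows through both bulbs.
R_100, R_50 = 489.0, 957.0   # ohms, rounded values from the working above
V_SUPPLY = 220.0             # volts

r_total = R_100 + R_50               # 1446 ohms
i_circuit = V_SUPPLY / r_total       # about 0.152 amps
v_drop_50 = i_circuit * R_50         # about 145.6 volts across the 50 W bulb
v_drop_100 = i_circuit * R_100       # about 74.4 volts across the 100 W bulb

print(f"I = {i_circuit:.3f} amps")
print(f"Drop across 50 W bulb: {v_drop_50:.1f} volts")
print(f"Drop across 100 W bulb: {v_drop_100:.1f} volts")
# The two drops add back up to the 220-volt supply.
```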
At the 50 watt bulb: P = V × I = 145.5 × 0.152 ≈ 22.1 watts.
At the 100 watt bulb: P = V × I = 74.3 × 0.152 ≈ 11.3 watts.
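The same per-bulb powers can be checked with P = I²R, which is equivalent to V × I here; again a minimal, self-contained sketch using the rounded resistances assumed above.

```python
# Power dissipated by each bulb in the series circuit: P = I**2 * R.
R_100, R_50 = 489.0, 957.0            # ohms, from the working above
i_circuit = 220.0 / (R_100 + R_50)    # about 0.152 amps through both bulbs

p_50 = i_circuit**2 * R_50            # about 22 watts
p_100 = i_circuit**2 * R_100          # about 11 watts

print(f"50 W bulb dissipates {p_50:.1f} watts")
print(f"100 W bulb dissipates {p_100:.1f} watts")
# The 50 W-rated bulb dissipates roughly twice the power of the 100 W bulb,
# because the same current flows through a larger resistance.
```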
More watts dissipated means brighter light, so the 50 watt bulb glows brighter here. Hence, the 50 W bulb consumes more power in this series circuit.