Consider a house that gets a 110V AC supply...
Then a 60W bulb will have a resistance of about 201.67 ohms (R = V²/P = 110²/60) and
a 100W bulb will have a resistance of 121 ohms (110²/100).
Inference: the lower the resistance, the more power is dissipated (at a fixed supply voltage)...
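Just to make the arithmetic concrete, here is a minimal sketch (my own numerical check, nothing more) that computes the resistance implied by each bulb's rating and the power each one dissipates across the fixed 110V supply:

```python
# Minimal check of the figures above using P = V^2 / R at a fixed 110 V supply.
V = 110.0  # supply voltage in volts

for rated_power in (60.0, 100.0):      # bulb ratings in watts
    R = V**2 / rated_power             # resistance implied by the rating
    P = V**2 / R                       # power dissipated when placed across 110 V
    print(f"{rated_power:.0f} W bulb: R = {R:.2f} ohm, dissipates {P:.0f} W")

# 60 W bulb: R = 201.67 ohm; 100 W bulb: R = 121.00 ohm.
# At a fixed voltage, the bulb with the LOWER resistance dissipates MORE power.
```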
Isn't this contradictory??
I have been taught from the beginning that resistance is something bad... we keep it to a minimum to avoid losses...
We go for copper wires because copper offers less resistance... etc etc...
Then why on earth do we go for less resistance if more power dissipates in a less resistive element?
Good or bad is something relative and application-dependent.
In your case, the power dissipated in the bulbs is transformed into light (and heat). The more power dissipated, the more light emitted. That's why you get more luminance from a 100W bulb than from a 60W one.
On the other hand, in any communication system you do your best to use low-resistance cables to avoid signal loss due to the cables' resistance. In that case the cable resistance is in series with the load, so the higher its value, the more power is dissipated in the cable and the more of the signal is lost.
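For contrast with the bulb case, here is a minimal sketch using made-up numbers (a hypothetical 10V source and 50-ohm load, nothing from any real system) showing why a series cable resistance is the opposite situation: whatever power it dissipates never reaches the load.

```python
# Hypothetical source driving a fixed load through a cable whose resistance
# is in SERIES with the load; power dissipated in the cable is simply wasted.
V_source = 10.0   # source voltage in volts (assumed)
R_load = 50.0     # load resistance in ohms (assumed)

for R_cable in (1.0, 10.0, 25.0):            # cable resistance in ohms
    I = V_source / (R_cable + R_load)        # series current
    P_cable = I**2 * R_cable                 # power wasted in the cable
    P_load = I**2 * R_load                   # power delivered to the load
    print(f"R_cable = {R_cable:5.1f} ohm: "
          f"wasted {P_cable:.3f} W, delivered {P_load:.3f} W")

# Unlike the bulb (a load placed directly across a fixed voltage), the cable
# is a series element: here, lower resistance means less wasted power and
# more power delivered to the load.
```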
So, it all depends on the application and the type of the load.
Maybe I wasn't clear.
Check the link below; it's about a type of coaxial cable. This is what I meant by "low-resistance" cables. **broken link removed**