12-20-2016, 10:28 PM
Lol - I used to do something like that when I was a kid. Put a low-ohm resistor in a switched outlet - then POP!
That was great until I tripped the circuit breaker and got scared. Had to explain it to Dad.
As for really determining the limit, here's what Ohmite says about their resistors:
The wattage rating of resistors, as established under specified standard
conditions, is defined as the “Free Air Rating” (“Full Rating” or
“Maximum Power Rating”).
[cut]
The relation of the “Free Air Watt Rating” of tubular type, vitreous
enameled resistors to the physical size, is to be set at such a
figure that when operated at their rated watts, the temperature rise
of the hottest spot shall not exceed 300°C (540°F) as measured by
a thermocouple when the temperature of the surrounding air does
not exceed 40°C (104°F). The temperature is to be measured at
the hottest point of a two-terminal resistor suspended in free still air
space with at least one foot of clearance to the nearest object, and
with unrestricted circulation of air.
Yikes that's hot!
I guess there are really two questions:
1. How much power can a repro resistor dissipate before it hits 100°C (under the same conditions as Ohmite lists)?
2. How much power does a given resistor actually need to dissipate?
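For #1, here's a rough back-of-envelope sketch. If Ohmite's spec means full rated power produces a 300°C hottest-spot rise over a 40°C ambient, and we assume (a big simplification - real heat transfer isn't linear) that the rise scales linearly with power, we can guess at the power level that keeps the part at 100°C. The numbers below besides Ohmite's are made up for illustration:

```python
# Back-of-envelope for question #1, assuming hottest-spot temperature
# rise above ambient scales linearly with dissipated power. Real
# resistors radiate and convect nonlinearly, so treat this as a rough
# guess, not a spec.

RATED_RISE_C = 300.0   # Ohmite: hottest-spot rise at full rated power
AMBIENT_C = 40.0       # Ohmite's reference ambient temperature

def max_power_for_temp(rated_watts, target_temp_c, ambient_c=AMBIENT_C):
    """Estimate power a resistor can dissipate before its hottest spot
    reaches target_temp_c, scaling linearly from the rated rise."""
    allowed_rise_c = target_temp_c - ambient_c
    return rated_watts * allowed_rise_c / RATED_RISE_C

# A nominal 1/2 W part held to a 100°C hottest spot:
print(max_power_for_temp(0.5, 100.0))  # 0.1 W - only a fifth of the rating
```

If that linear assumption is even roughly right, it suggests a repro resistor kept to 100°C should only be asked to handle a small fraction of its nominal rating.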
The easy answer to #2 would be whatever the rating of the original is - but I'd hazard a guess that many (especially small wattage ones) were just selected based on availability. I did a quick search for several 1/4W Philco part numbers in the service bulletins but didn't see them show up until around 1934 or so. It's just a rough guess, but maybe Philco was just using 1/2W resistors by default in 1931 when this model 50 was made.
Of course we could get a real-life number by measuring the voltage drop across a given resistor and then calculating P = V^2 / R. But we still don't know how much is too much because of question #1.
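That measurement idea is simple enough to sketch. The resistance and voltage values below are hypothetical examples, not readings from the Model 50:

```python
# Per the post: measure the voltage drop across a resistor in-circuit,
# then compute actual dissipation as P = V^2 / R.

def dissipated_watts(v_drop, resistance_ohms):
    """Actual power dissipated, from the measured drop and nominal value."""
    return v_drop ** 2 / resistance_ohms

# e.g. a hypothetical 10k plate load dropping 45 V:
print(dissipated_watts(45.0, 10_000))  # 0.2025 W
```

So a part like that would be running right at a 1/4 W rating, with no headroom - which is exactly why question #1 matters.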
Anyway, I think I'm going to stick with the "works-for-me" approach - but with a few caveats:
- Don't make reproduction resistors for higher wattage parts
- Burn in (hopefully not literally!) any radio for a while before calling it done (probably a good idea anyway)