08-22-2021, 07:24 PM
IF transformers are presumably designed to be as efficient as possible at their rated frequency. At least, I see maximum output at or near the IF when I apply a signal to the primary and attach a scope to the secondary. Why is that?
More specifically, what determines the frequency at which power transfer is at a maximum? For example, I had a transformer rated at something like 4.5 MHz, and it didn't look all that different from a 455 kHz transformer from an AA5 radio.
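For reference, my understanding is that the peak comes from the tuned circuit inside the can (the winding inductance plus a resonating capacitor), so the frequency of maximum transfer would be the LC resonance, f = 1 / (2π√(LC)). A quick sketch with made-up component values (the L and C figures below are just illustrative guesses, not measured from any real IF can):

```python
import math

def resonant_freq_hz(L_henry: float, C_farad: float) -> float:
    """LC resonant frequency: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

# Hypothetical values: ~680 uH with ~180 pF lands near 455 kHz
f_455 = resonant_freq_hz(680e-6, 180e-12)

# Much smaller L and C for a 4.5 MHz IF -- the can can look
# similar from outside while the resonance is 10x higher
f_4500 = resonant_freq_hz(25e-6, 50e-12)

print(f"{f_455 / 1e3:.0f} kHz")
print(f"{f_4500 / 1e6:.2f} MHz")
```

That would also explain why the two cans look alike: only the component values inside differ, not the basic construction.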