All About Wireless: Transmission Lines, Part I
Welcome to the sixth installment of All About Wireless. In this issue, we will focus on transmission lines, beginning with an explanation of coaxial cable impedance and why 50-ohms is the standard in RF systems, followed by a review of cable loss specifications.
Coaxial cable is an unbalanced transmission line manufactured in a range of impedance values. It is typically constructed with a copper core conductor and a braided copper or aluminum foil shield, separated by a dielectric material. The core conductor carries the current, while the shield is held at ground potential to prevent the cable itself from radiating or receiving RF signals as if it were an antenna.
The two most common impedance ratings encountered are 50-ohm and 75-ohm. 50-ohm cables are typically used for audio RF systems, and 75-ohm cables for video distribution. These two cable types look exactly the same, so how is one designed to present 50-ohms of impedance and the other 75-ohms? To understand this, we first need to examine the concept of 'characteristic impedance'.
The characteristic impedance of a coaxial cable may be defined as the impedance it would present to a source if it were infinitely long. This concept is best visualized as a theoretical electrical circuit of infinite series inductance and parallel capacitance, the values of which are determined by the dielectric material and the ratio of core conductor to shield diameter. The characteristic impedance of the cable is equal to the square root of the total distributed inductance divided by the total distributed capacitance.
A common 50-ohm RG8A/U cable, for example, may be specified as having an inductance of 241.9nH and a capacitance of 96.76pF per meter. 241.9nH divided by 96.76pF is equal to 2500, the square root of which is 50, so the characteristic impedance is 50-ohms. Moving away from this theoretical model, the equation for calculating actual impedance is slightly different, but the result is the same.
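The arithmetic above can be checked numerically. A minimal sketch, using the per-meter values quoted for the RG8A/U example (variable names are my own):

```python
import math

# Distributed inductance and capacitance per metre for the RG8A/U example
L = 241.9e-9   # henries per metre
C = 96.76e-12  # farads per metre

# Characteristic impedance of a lossless line: Z0 = sqrt(L / C)
z0 = math.sqrt(L / C)
print(round(z0, 1))  # → 50.0 ohms
```

Note that because both values are specified per unit length, the length term cancels, which is another way of seeing that characteristic impedance does not depend on how long the cable is.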
The same RG8A/U cable is constructed with a polyethylene dielectric, the relative permittivity of which is equal to 2.3. The inner diameter of the shield is equal to 7.24mm, and the outer diameter of the core conductor is equal to 2.03mm. Actual impedance is therefore equal to 138 divided by the square root of 2.3, multiplied by the log of 0.00724 divided by 0.00203. Again, the result is 50, so this is actually a 50-ohm cable.
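The same calculation from the physical geometry can be sketched as follows, using the standard approximation Z₀ = (138 / √εr) × log₁₀(D/d) described above; since D and d appear only as a ratio, they can be entered in millimeters directly:

```python
import math

# Coaxial impedance from geometry: Z0 = (138 / sqrt(er)) * log10(D / d)
er = 2.3   # relative permittivity of polyethylene dielectric
D = 7.24   # shield inner diameter, mm
d = 2.03   # core conductor outer diameter, mm (units cancel in the ratio)

z0 = (138 / math.sqrt(er)) * math.log10(D / d)
print(round(z0))  # → 50 ohms
```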
Impedance is therefore independent of cable length; rather, it is a characteristic of the cable materials and cross-sectional dimensions. This is why it is so important that coaxial cables are not physically compressed or bent in too tight a radius: doing so changes the ratio of core conductor to shield diameter, which alters the impedance at that point. The question now is, why did 50-ohms become the standard impedance value for RF circuits?
It can be proven mathematically that for an ideal coaxial transmission line with an air dielectric, an impedance of 77-ohms provides the lowest signal loss, while 30-ohms provides the highest power handling. 50-ohms is a good compromise between these 77-ohm and 30-ohm ideals. In a real cable, some materials are required to keep the core conductor physically separated from the shield. As is the case with the RG8A cable in our previous example, polyethylene is commonly utilized for this purpose. Conveniently, with a polyethylene dielectric, minimum signal loss can be achieved at roughly 50-ohms impedance if the ratio of core conductor to shield diameter is carefully calculated, and this is why 50-ohms has become the standard impedance value in RF systems.
Signal loss is typically specified as a dB value per unit length, per frequency. Notice the difference in the electrical characteristics of each cable type listed in the table. For example, over a 30m cable length, RG58 exhibits 6dB more loss at 200MHz and almost 15dB more loss at 800MHz compared to RG8! This is an important specification to be aware of, as excessive cable loss can severely impair system performance.
Unlike impedance, signal loss in coaxial cables is proportional to cable length. The purity of the copper used, and the cross-sectional area of the core, both contribute to the resistance of the conductor, resulting in some signal loss. The dielectric material can also cause some signal loss, although this only becomes a serious consideration at frequencies in the Gigahertz range. At frequencies most commonly used in professional audio applications, the 'skin effect' is the primary contributor to coaxial cable signal loss.
The skin effect is the tendency of a high-frequency AC signal to become distributed such that the current density is largest near the surface of the conductor. Quality cable made from high-purity copper is able to support greater current density at the conductor surface, resulting in less signal loss per meter. The severity of the skin effect increases with frequency, which is why we see greater loss values specified at higher frequencies.
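The frequency dependence described above can be illustrated with the standard skin-depth approximation δ = √(ρ / (π·f·μ)), which is not given in the article but follows from the same physics; a sketch, assuming annealed copper:

```python
import math

rho = 1.68e-8             # resistivity of annealed copper, ohm-metres
mu0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def skin_depth(freq_hz):
    """Depth at which current density falls to 1/e of its surface value."""
    return math.sqrt(rho / (math.pi * freq_hz * mu0))

# Current crowds into an ever-thinner surface layer as frequency rises
print(skin_depth(200e6))  # ~4.6 micrometres
print(skin_depth(800e6))  # ~2.3 micrometres
```

Quadrupling the frequency halves the skin depth, so the effective conducting cross-section shrinks and resistive loss per meter grows, which is consistent with the higher loss figures specified at higher frequencies.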
Cable loss of less than 3dB is usually acceptable. However, if more than 3dB loss is expected, either an active antenna such as the Shure UA874, or an in-line RF amplifier such as the Shure UA830, can be utilized to compensate. Belden model 9913 RG-8/U coaxial cable, for example, is specified as having 11.8dB loss per 100m at 700MHz. For a system using two Shure UA874 active antennas, if the A antenna is connected via a 100m 9913 cable and the B antenna is connected via 50m 9913 cable, the integrated RF amplifiers on the UA874 antennas should be set to +12dB and +6dB respectively.
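Since loss scales linearly with length, the compensation figures in this example can be worked out as follows (a sketch using the Belden 9913 attenuation figure quoted above; the helper name is my own):

```python
# Belden 9913 RG-8/U specified attenuation at 700 MHz
LOSS_DB_PER_100M = 11.8

def cable_loss_db(length_m):
    """Loss scales linearly with cable length."""
    return LOSS_DB_PER_100M * length_m / 100

# A antenna: 100m run -> compensate with roughly +12 dB of line gain
print(cable_loss_db(100))  # 11.8 dB
# B antenna: 50m run -> compensate with roughly +6 dB of line gain
print(cable_loss_db(50))   # 5.9 dB
```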
Note the difference in the recommended amplifier gain settings. As the cable connecting the B antenna is half the length, its loss is only 5.9dB at 700MHz. Line amplification should only be used to compensate for cable loss, so gain settings may differ between the A and B antenna lines, depending on the cable type and length used. Line amplifiers should not be relied upon to boost weak signals to usable levels, as this approach also amplifies noise, usually yielding undesirable results due to the poor signal-to-noise ratio.
Next month we will continue to focus on transmission lines, examining the importance of impedance matching and the effect of standing waves.
To stay updated about this and other educational content, subscribe to our email list here.