Why does PoE use 48V and why is PoE output 15.4W?

I’m glad you finally asked. First, 48V is just symbolic. In reality, the actual voltage range might not even include 48 volts. 


By nature, PoE is DC power. And historically, a DC power source meant batteries. We used to power devices with batteries (and sometimes still do). A typical lead-acid battery has a nominal voltage of "12V", but its actual voltage varies between 10.5V (fully discharged) and 12.7V (fully charged). Five batteries in series would be over the limit, because the high end may exceed 60V (5 × 12.7V = 63.5V).


Why not exceed 60V? Because the maximum Safety Extra Low Voltage (SELV) is 60V, a limit meant to ensure there is no risk of electrocution when users come into direct contact with the conductors. When the IEEE came to set the rules, they decided to leave a 5% safety margin, so the maximum acceptable voltage for all PoE applications is fixed at 57V (60V - 5% = 57V) in the 802.3af/at standards.


Therefore, we consider four batteries in series, which makes 12V × 4 = 48V (nominal). That’s where the “48V” came from. 


IEEE made it clear that the maximum voltage is 57V. What about the minimum? 10.5V (fully discharged) × 4 = 42V, and don't forget to add the 5% safety margin: 42V × 1.05 ≈ 44V. That is how 44V became the minimum voltage in the IEEE 802.3af standard, and why the full DC voltage range specified by IEEE 802.3af is 44 to 57 volts.
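
If you want to check the arithmetic yourself, here is a minimal Python sketch (the variable names are mine; the numbers are the ones above) that reproduces the 44-57V range:

    # Reproduce the IEEE 802.3af voltage range from the battery analogy above.
    SELV_MAX = 60.0           # V, Safety Extra Low Voltage ceiling
    SAFETY_MARGIN = 0.05      # the 5% margin the IEEE left
    CELL_NOMINAL = 12.0       # V, nominal lead-acid battery voltage
    CELL_EMPTY = 10.5         # V, fully discharged
    BATTERIES = 4             # four batteries in series

    v_nominal = BATTERIES * CELL_NOMINAL                  # 48 V, the "48V" label
    v_max = SELV_MAX * (1 - SAFETY_MARGIN)                # 57 V
    v_min = BATTERIES * CELL_EMPTY * (1 + SAFETY_MARGIN)  # 44.1 V, rounded to 44 V

    print(f"nominal {v_nominal:.0f} V, range {round(v_min)}-{v_max:.0f} V")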

If you are only curious about why PoE uses 48V, the story ends here. Next, I will tell you why the PoE power output is 15.4W.


Note that when IEEE 802.3af was written, the authors had CAT3 cable in mind and treated it as the worst case. 350 mA is the maximum continuous current considered safe for CAT3 cabling. Since the voltage may drop to 44V, the minimum guaranteed power entering the cable is 44V × 0.35A = 15.4W.

The IEEE standard fixes the overall PoE loop resistance of 100 meters of CAT3 cable at 20Ω. So the cable loss (I² × R) is 0.35A × 0.35A × 20Ω = 2.45W. The minimum guaranteed power reaching the powered device (PD) is then 15.4W - 2.45W = 12.95W.
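
Putting the 802.3af numbers together, here is a rough Python sketch of the Type 1 power budget, assuming the worst-case figures above (44V, 350mA, 20Ω):

    # IEEE 802.3af (Type 1) worst-case power budget
    V_MIN = 44.0     # V, minimum PSE output voltage
    I_MAX = 0.35     # A, maximum continuous current for CAT3
    R_LOOP = 20.0    # ohms, loop resistance of 100 m of CAT3

    p_pse = V_MIN * I_MAX          # 15.4 W entering the cable
    p_loss = I_MAX**2 * R_LOOP     # 2.45 W lost as heat in the cable (I^2 * R)
    p_pd = p_pse - p_loss          # 12.95 W guaranteed at the PD

    print(f"PSE {p_pse:.2f} W, cable loss {p_loss:.2f} W, PD {p_pd:.2f} W")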


Why is PoE+ power output 30W?


When the IEEE 802.3at standard was written, it was clear that lead-acid batteries were obsolete; most applications had moved to AC-DC power supplies. As a result, we were able to raise the minimum voltage to 50V.


The current was fixed at 600 mA to keep the temperature rise of the cable below 10°C; the cable considered this time was CAT5e. The minimum guaranteed power entering the cable now works out to a round 50V × 0.6A = 30W.


The loop resistance of 100 meters of CAT5e cabling was fixed at 12.5Ω in the 802.3at standard. So the cable loss is 0.6A × 0.6A × 12.5Ω = 4.5W. Which is to say, at the other end of the cable, the minimum guaranteed power available to the powered device (PD) is 30W - 4.5W = 25.5W. This is called Type 2 by the IEEE 802.3at standard. The IEEE 802.3af standard we just discussed is referred to as Type 1.
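
The same back-of-the-envelope calculation for 802.3at (Type 2), assuming the 50V / 600mA / 12.5Ω figures above:

    # IEEE 802.3at (Type 2 / PoE+) worst-case power budget
    V_MIN = 50.0     # V, minimum PSE output voltage
    I_MAX = 0.60     # A, current chosen to limit cable heating
    R_LOOP = 12.5    # ohms, loop resistance of 100 m of CAT5e

    p_pse = V_MIN * I_MAX          # 30.0 W entering the cable
    p_loss = I_MAX**2 * R_LOOP     # 4.5 W lost in the cable
    p_pd = p_pse - p_loss          # 25.5 W guaranteed at the PD

    print(f"PSE {p_pse:.1f} W, cable loss {p_loss:.1f} W, PD {p_pd:.1f} W")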

