I’m glad you finally asked. First, 48V is mostly symbolic. The actual working voltage range might not even include 48 volts.
By nature, PoE is DC power, and historically a DC power source meant batteries. We used to power devices from batteries (and sometimes still do). A typical lead-acid battery has a nominal voltage of "12V", but in practice it varies between 10.5V (fully discharged) and 12.7V (fully charged). Four batteries in series give a nominal 48V, which is where the label comes from. Five batteries in series go beyond the limit, because the high end could exceed 60V (5 × 12.7V = 63.5V).
Why not exceed 60V? Because 60V is the maximum Safety Extra Low Voltage (SELV), the level below which there is no risk of electrocution when a user comes into direct contact with the conductors. When the IEEE came to set the rules, it decided to keep a 5% safety margin, so the maximum acceptable voltage for all PoE applications was fixed at 57V (60 − 60 × 5%) in the 802.3af/at standards.
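To make the numbers concrete, here is a minimal Python sketch of the arithmetic. The battery figures and the 5% margin are the ones quoted above; the script itself is only an illustration, not part of any standard.

```python
# Check the voltage arithmetic: battery strings vs. the SELV-derived ceiling.
V_DISCHARGED = 10.5   # V per battery, fully discharged
V_NOMINAL    = 12.0   # V per battery, nominal
V_CHARGED    = 12.7   # V per battery, fully charged

SELV_LIMIT    = 60.0  # V, maximum Safety Extra Low Voltage
SAFETY_MARGIN = 0.05  # 5% margin kept by the IEEE

ceiling = SELV_LIMIT * (1 - SAFETY_MARGIN)
print(f"802.3af/at voltage ceiling: {ceiling:.0f} V")   # 57 V

for n in (4, 5):
    lo, nominal, hi = n * V_DISCHARGED, n * V_NOMINAL, n * V_CHARGED
    verdict = "fits" if hi <= ceiling else "exceeds the limit"
    print(f"{n} batteries in series: {lo:.1f}-{hi:.1f} V "
          f"(nominal {nominal:.0f} V) -> {verdict}")
# 4 batteries in series: 42.0-50.8 V (nominal 48 V) -> fits
# 5 batteries in series: 52.5-63.5 V (nominal 60 V) -> exceeds the limit
```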
The IEEE standard fixes the worst-case loop resistance of a 100-meter CAT3 cable at 20Ω. With the 350mA current limit, the cable loss is I²R = 0.35 × 0.35 × 20 = 2.45W, so the minimum guaranteed power reaching the powered device (PD) is 15.4W − 2.45W = 12.95W.
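The same budget as a short sketch, with the 350mA current limit, the 20Ω loop resistance, and the 15.4W PSE output plugged in:

```python
# 802.3af (Type 1) power budget: I^2 * R loss over the worst-case loop.
I_MAX  = 0.35   # A, 802.3af current limit
R_LOOP = 20.0   # ohm, worst-case loop resistance of 100 m of CAT3
P_PSE  = 15.4   # W, minimum power sourced by the PSE

p_cable = I_MAX ** 2 * R_LOOP   # power dissipated in the cable
p_pd    = P_PSE - p_cable       # power left for the powered device
print(f"cable loss {p_cable:.2f} W, power at the PD {p_pd:.2f} W")
# -> cable loss 2.45 W, power at the PD 12.95 W
```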
Why is PoE+ power output 30W?
When the IEEE 802.3at standard was written, it was clear that lead-acid batteries had become obsolete. Most applications had switched to AC-DC power supplies. As a result, the minimum voltage could be raised to 50V.
The current was fixed at 600mA to keep the temperature rise of the cable below 10°C. Note that the cable considered at that time was CAT5e. The minimum guaranteed power from the PSE is therefore a convenient round number: 50V × 0.6A = 30W.
The worst-case loop resistance of CAT5e cabling was fixed at 12.5Ω in the 802.3at standard, so the cable loss is 0.6 × 0.6 × 12.5 = 4.5W. In other words, at the far end of the cable, the minimum guaranteed power available to the powered device (PD) is 30W − 4.5W = 25.5W. This is called Type 2 by the IEEE 802.3at standard; the IEEE 802.3af scheme we just discussed is referred to as Type 1.
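Putting both types side by side in the same style of sketch. The 44V minimum PSE voltage for Type 1 is the 802.3af figure and is assumed here rather than quoted in this section; everything else comes from the numbers above.

```python
# Worst-case power budgets for PoE Type 1 and Type 2, computed as
# P_pse = V_min * I_max minus the I^2 * R cable loss.
def pd_power(v_pse_min, i_max, r_loop):
    p_pse   = v_pse_min * i_max
    p_cable = i_max ** 2 * r_loop
    return p_pse, p_cable, p_pse - p_cable

budgets = [
    ("Type 1 (802.3af)", 44.0, 0.35, 20.0),   # 44 V assumed from 802.3af
    ("Type 2 (802.3at)", 50.0, 0.60, 12.5),
]
for name, v, i, r in budgets:
    p_pse, p_cable, p_pd = pd_power(v, i, r)
    print(f"{name}: PSE {p_pse:.1f} W, cable loss {p_cable:.2f} W, "
          f"PD {p_pd:.2f} W")
# Type 1 (802.3af): PSE 15.4 W, cable loss 2.45 W, PD 12.95 W
# Type 2 (802.3at): PSE 30.0 W, cable loss 4.50 W, PD 25.50 W
```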