100 mA Is a Myth

A lot of electronics ends up being connected to USB - whether it needs a computer or not. USB, with its 5 V, has become the unofficial standard power supply for the world. And thus it makes me crazy when I hear from people that you can only draw 100 mA from USB without going through USB enumeration or else something horrible is going to happen. And that is bullshit.

Yes, there is such a thing as USB negotiation where each device can request a certain amount of current from its host. Older USB devices could ask for up to 500 mA (in units of 100 mA) while USB 3.0 devices max out at 900 mA (in units of 150 mA). If you go higher than that, you get into the category of USB Power Delivery 3.0 and the beauty of multiple voltages, which we'll conveniently ignore here and deal only with 5 V.
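
For reference, the whole "request" boils down to a single byte in the device's configuration descriptor. Here is a minimal sketch of that descriptor in C - the layout follows the USB 2.0 spec, while the values describe an imaginary device asking for the full 500 mA (the field itself counts in 2 mA units; the 100 mA granularity comes from the spec's unit loads):

```c
#include <stdint.h>

/* USB 2.0 configuration descriptor - the power "request" lives in bMaxPower.
   The field is expressed in 2 mA units, so 250 means 500 mA. */
typedef struct __attribute__((packed)) {
    uint8_t  bLength;             /* 9 bytes */
    uint8_t  bDescriptorType;     /* 0x02 = configuration descriptor */
    uint16_t wTotalLength;        /* size of the whole descriptor set */
    uint8_t  bNumInterfaces;
    uint8_t  bConfigurationValue;
    uint8_t  iConfiguration;
    uint8_t  bmAttributes;        /* bit 7 reserved (1), bit 6 = self-powered */
    uint8_t  bMaxPower;           /* maximum current in 2 mA units */
} usb_config_descriptor_t;

/* Example: bus-powered device declaring the old-school maximum of 500 mA. */
static const usb_config_descriptor_t example_config = {
    .bLength             = 9,
    .bDescriptorType     = 0x02,
    .wTotalLength        = 9,     /* just this descriptor, for illustration */
    .bNumInterfaces      = 1,
    .bConfigurationValue = 1,
    .iConfiguration      = 0,
    .bmAttributes        = 0x80,  /* bus-powered, no remote wakeup */
    .bMaxPower           = 250,   /* 250 * 2 mA = 500 mA */
};
```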

A misconception when it comes to USB negotiation stems from the ability of USB devices to self-report their maximum usage and, likewise, the ability of the chipset to say no if multiple devices go over some internal limit. Ideally, if you have four USB ports on the same power bus with a total of 1 A available and you connect four devices drawing 300 mA each, three would get a positive response and the fourth would have its request denied.
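
In other words, the host's side of the bargain is little more than bookkeeping. A toy sketch of that budget logic might look like this (the names and the 1 A figure are mine, not from any real USB stack):

```c
#include <stdbool.h>
#include <stdint.h>

/* Toy model of a host sharing one 5 V power budget across its ports. */
#define BUDGET_MA 1000u          /* e.g. 1 A shared by four ports */

static uint32_t allocated_ma = 0;

/* Called with the maximum current a freshly enumerated device reported.
   Returns true if the host "grants" the power, false if it denies it. */
static bool grant_power(uint32_t requested_ma)
{
    if (allocated_ma + requested_ma > BUDGET_MA) {
        return false;            /* the fourth 300 mA device lands here */
    }
    allocated_ma += requested_ma;
    return true;
}
```

Call grant_power(300) four times against that 1000 mA budget and the fourth call comes back false - and that is the whole "denial".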

What might not be completely clear from this story is that the bus has no means of either measuring current or enforcing a device's removal. The whole scheme depends on the device accurately reporting its (maximum) usage and actually turning itself off if it receives a "power denied" response. And yes, this self-reporting works as well as you can imagine. As there is no way for the computer to ensure either the accuracy of the data or the device's compliance, everybody simply ignores it.

Computer manufacturers decided to save some money and not verify a device's consumption. So what if a device that reported 300 mA is actually using 350 mA? Should you disconnect it just because it uses a lousy cable with a big loss? Why would you even care if that is the only device connected? What to do if that device jumps to 500 mA for a fraction of a second? What to do with devices reporting nothing (e.g. coffee heaters)? Is nitpicking really worth a bad user experience (my damn device is not working!), and is it worth the extra cost to implement (and test)?

While there were some attempts at playing "power cop" in the early days, with time all manufacturers decided to over-dimension their power bus to handle more power than the specification requires and simply placed a cheap polyfuse on it to shut things down in case of a gross overload.

Such friendly behavior has culminated in each port of any laptop made in the last five years being capable of 1 A minimum. And you can test it - just short the data lines together (or use UsbAmps' high-power option) - and you will see you can easily pull 1 A out of something officially specified for half of that. And that is without any power negotiation - courtesy of the USB Battery Charging specification.

This leniency from manufacturers in turn spawned a whole category of completely passive devices. Why the heck would a USB light have a chip more expensive than all its other components together just to let the computer know its power usage? That is just wasting money if you can get the power whether you announce yourself or not.

Devices that have to communicate with the computer over USB kept their self-reporting habits simply because they had to use somewhat smarter chips to interface with USB anyway. And all those chips had to have power negotiation built in to get certified. There was literally no extra cost in using the feature.

And even then they would fudge the truth. Quite often an external CD drive or hard disk would actually need more than 500 mA to spin up. And in the early days there was no way to specify more than 500 mA. So they lied.

Such (lying) behavior was later essentially blessed by the USB Battery Charging specification, which uses not a USB message but voltage levels as the limiting factor. It essentially says that you can pull as much as you want as long as the voltage stays high enough. Once the voltage starts dropping, ease off a bit. That point can come at 1 A, or 1.5 A, or 2 A - it all depends on the power source/computer you are using.
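
In firmware terms, "ease off when the voltage drops" is about as simple as it sounds. A rough sketch, assuming a hypothetical charger chip behind set_charge_current() and an ADC behind read_vbus_mv() - all names and thresholds here are invented for illustration:

```c
/* Hypothetical hardware hooks - on real hardware these would be charger-IC
   register writes and an ADC channel watching VBUS. */
extern void set_charge_current(unsigned ma);
extern unsigned read_vbus_mv(void);
extern void delay_ms(unsigned ms);

#define VBUS_MIN_MV 4400   /* back off below this - pick per your design */
#define STEP_MA      100

/* Ramp the input current up until VBUS starts sagging, then step back. */
static unsigned ramp_input_current(void)
{
    unsigned current_ma = 500;             /* safe starting point */
    set_charge_current(current_ma);

    while (current_ma < 2000) {
        set_charge_current(current_ma + STEP_MA);
        delay_ms(50);                      /* let the supply settle */
        if (read_vbus_mv() < VBUS_MIN_MV) {
            set_charge_current(current_ma);   /* too greedy - step back */
            break;
        }
        current_ma += STEP_MA;
    }
    return current_ma;                     /* whatever the source could sustain */
}
```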

Device manufacturers have become quite versed too. Since the USB Battery Charging specification was done in a hardware-friendly manner, there was no extra cost to bear. The only task was to determine whether the computer supports the (then new) specification. If yes, pull current until the voltage starts dropping. If not, limit yourself to 500 mA. How do they recognize it? You've guessed it - by checking for shorted data lines.
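
If you wonder what that recognition looks like from the device's side, BC 1.2 "primary detection" roughly boils down to putting a small voltage on one data line and checking whether the other one follows - which it will if the charger has them tied together. A sketch with made-up hardware hooks:

```c
#include <stdbool.h>

/* Made-up hooks - in practice this is usually a single bit in the USB PHY
   or charger IC rather than manual GPIO/ADC work. */
extern void drive_dp_mv(unsigned mv);   /* source a weak voltage onto D+ */
extern unsigned read_dm_mv(void);       /* measure the voltage on D- */
extern void release_dp(void);

/* If D- follows the voltage we put on D+, the data lines are shorted,
   so this is a charging port and we may pull more than 500 mA. */
static bool port_is_charger(void)
{
    drive_dp_mv(600);                    /* roughly the spec's VDP_SRC level */
    bool charger = (read_dm_mv() > 300); /* roughly the VDAT_REF threshold */
    release_dp();
    return charger;
}
```

On the charger's side there is even less to it: a dedicated charging port simply ties D+ to D- and never speaks USB at all.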

Thanks to the powers of backward compatibility, you can pull essentially as much current as you want from your PC. Yes, you are still limited by the fuse (quite commonly 2 A or more per port pair), you still have thin USB cables and their voltage drop (with accompanying losses) to deal with, and yes, devices - especially phones - will still self-limit to avoid tripping the aforementioned fuse. But otherwise it is the wild west.
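
Those cable losses are worth a quick back-of-the-envelope check. Assuming around 0.25 Ω of round-trip resistance for a thin, cheap cable (my guess for illustration, not a measured value), plain Ohm's law gives:

```c
#include <stdio.h>

int main(void)
{
    const double cable_ohm = 0.25;   /* assumed round-trip cable resistance */
    const double current_a = 2.0;    /* what a hungry phone might pull */

    double drop_v      = cable_ohm * current_a;   /* 0.5 V lost in the cable */
    double at_device_v = 5.0 - drop_v;            /* ~4.5 V left at the device */
    double wasted_w    = drop_v * current_a;      /* 1 W heating the cable */

    printf("drop=%.2f V, device sees %.2f V, %.2f W wasted\n",
           drop_v, at_device_v, wasted_w);
    return 0;
}
```

Half a volt and a full watt lost in the cable alone - not catastrophic, but far from negligible.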

Unless you are making a device that has to be certified, stop worrying about power negotiation. If your device already has a USB transceiver onboard and you need to program it anyhow, go for the standard and configure the current you need. But if you don't have such a chip, or programming it is just too much of a hassle, simply ignore it - the world is not going to self-destruct. I promise.