After closely examining the cut-away pack they had on display at the Leaf test-drive event, I don't believe the battery controller has any significant thermal dissipation capability. I seriously doubt it exceeds 100 mA of bypass current per cell. I'm not just guessing; allow me to explain my reasoning:
Here's an image from the service manual:
The service manual claims that the bypass function is built into an ASIC, and that the "bypass switch" is triggered by cell voltage. This means balancing can only occur during charging. Carefully note this diagram:
If that's the case, it's unlikely they can implement much of a balancing current: thermal dissipation in an ASIC with at least 4 other channels (see diagram) is going to be limited by its package. And since ASIC dissipation is limited, the charger must be set to a low constant current while the balancing phase is operating. The system cannot balance the cells at any point except by top balancing during charge, as that is the only time a cell's voltage would hit the bypass threshold.
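To illustrate the top-balancing behavior, here's a toy simulation in Python. Every number in it is my own assumption for illustration (a 4.1 V bypass threshold, ~100 mA bypass current, a crude linear voltage model), not anything from the service manual. The key point it demonstrates: near the end of charge, the charge current can't exceed the bypass current, or cells over the threshold would keep charging right through it.

```python
# Toy top-balancing simulation. All constants are illustrative
# assumptions, NOT Nissan's actual parameters.
BYPASS_THRESHOLD_V = 4.1   # assumed per-cell bypass trigger voltage
CHARGE_CURRENT_A   = 0.1   # must be <= bypass current near end of charge
BYPASS_CURRENT_A   = 0.1   # the ~100 mA per-cell estimate
CAPACITY_AH        = 33.0  # rough Leaf cell capacity

def cell_voltage(soc):
    """Crude linear voltage model: 3.0 V empty -> 4.2 V full."""
    return 3.0 + 1.2 * soc

def balance(socs, dt_h=0.01):
    """Charge until every cell has reached the bypass threshold.

    Cells over the threshold shunt the charge current through the
    bypass FET (net zero current); cells under it keep charging.
    Returns the final states of charge and the elapsed hours.
    """
    hours = 0.0
    while min(cell_voltage(s) for s in socs) < BYPASS_THRESHOLD_V:
        for i, s in enumerate(socs):
            i_net = CHARGE_CURRENT_A
            if cell_voltage(s) >= BYPASS_THRESHOLD_V:
                i_net -= BYPASS_CURRENT_A  # bypass shunts the charge current
            socs[i] = min(1.0, s + i_net * dt_h / CAPACITY_AH)
        hours += dt_h
    return socs, hours
```

Run it on a pack with a few percent of cell-to-cell imbalance and it converges, but slowly: shedding a few percent of a ~33 Ah cell at 100 mA takes on the order of half a day, which is why a weak bypass only works as a slow top-balance at the end of charge.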
They are probably using an ASIC similar to Linear's LTC6802. It supports up to 12 channels per device and has an onboard bypass FET with a typical on-resistance of 10 Ω. Do the math: if they drove the on-board FETs fully on, that would (in a perfect world) give a maximum possible discharge of ~350 mA. But at that point the FET would have to dissipate at least 1.25 W, and that's only one channel! Note this diagram from the LTC6802 datasheet:
It pretty much speaks for itself! Even if they are only using 4 channels per ASIC, thermal limits cap the discharge current at about 75 mA before thermal runaway. Even if they are using a much better part, I'll still bet anyone here that they aren't much over 100 mA per cell. At 100 mA, total balancing dissipation near the end of balancing would be somewhere close to 50 watts of heat. This heat would need to be disposed of in the battery controller somehow, which is substantial but feasible.
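For anyone who wants to check the arithmetic, here's the back-of-envelope version. The 3.5 V cell voltage behind the ~350 mA figure, the 96 series cell groups in the Leaf pack, and the ~4.1 V end-of-charge voltage are my assumptions; the 10 Ω on-resistance is the LTC6802's typical spec.

```python
# Back-of-envelope check of the figures above. Cell voltages and the
# 96-series-group pack layout are my assumptions, not measurements.
CELL_V       = 3.5    # assumed cell voltage during bypass
R_ON         = 10.0   # LTC6802 typical internal FET on-resistance (ohms)
SERIES_CELLS = 96     # assumed series cell groups in the Leaf pack
BALANCE_I    = 0.1    # the ~100 mA per-cell estimate
EOC_V        = 4.1    # assumed near-end-of-charge cell voltage

i_bypass = CELL_V / R_ON                      # Ohm's law: ~0.35 A
p_fet    = i_bypass ** 2 * R_ON               # I^2*R: ~1.2 W per channel
p_pack   = SERIES_CELLS * BALANCE_I * EOC_V   # pack-wide balancing heat

print(f"bypass: {i_bypass * 1000:.0f} mA")
print(f"per-channel FET dissipation: {p_fet} W")
print(f"pack-wide heat at 100 mA/cell: {p_pack} W")
```

The pack-wide number lands around 40 W, in the same ballpark as the rough 50 W figure above, so the conclusion holds either way: that heat has to leave the battery controller somehow.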