I've been trying to get my head around the principles of voltage and amps again. I know, I know, by now I should have a clue lol.
My understanding is that a device will only draw as many amps as it needs, i.e. a charger just supplies amps and doesn't push them into a device.
So, if I were to use a higher-rated charger, the BMS would only draw the amps it's rated for, or needs. Would this assumption be correct?
Maybe not.
Charging a battery is an unusual case.
In order for current to flow into the battery, the charging voltage has to be higher than the battery voltage. The battery itself has a small internal resistance, and this together with the voltage difference determines the current (I = V/R). In many cases this would result in too great a charge rate, so the charger restricts the maximum current (say 2 A) to prevent damage/fire/divorce.
As an example, a battery at 32 V connected to a 42 V source has a voltage difference of 10 V. If the battery's internal resistance is 1 ohm, 10 A would flow when connected, which is why the charger must limit the current.
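To put rough numbers on that, here's a minimal Python sketch of the same Ohm's law logic. The 42 V charger, 2 A limit and 1 ohm internal resistance are just the example figures above (real packs are usually well under 1 ohm, and real chargers do proper CC/CV control, so treat this as a toy model):

```python
CHARGER_V = 42.0    # charger output voltage (V)
LIMIT_A = 2.0       # charger's current limit (A)
R_INTERNAL = 1.0    # battery internal resistance (ohms), example figure only

def charge_current(battery_v):
    """Current into the battery, capped at the charger's limit."""
    unrestricted = (CHARGER_V - battery_v) / R_INTERNAL   # I = V / R
    return min(unrestricted, LIMIT_A)

print(charge_current(32.0))   # unrestricted would be 10 A; the limit holds it to 2.0
```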
The battery's internal resistance is an important factor, and it also governs the point at which the current begins to drop below the 2 A limit. In the above example, with 1 ohm, once the battery reaches 40 V the current begins to drop below 2 A, tailing off to 0 A as the battery approaches 42 V.
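Sweeping the battery voltage with the same toy model shows the shape of the charge: held at the 2 A limit until 40 V, then tapering to nothing at 42 V (again, the 1 ohm figure is just the example above):

```python
CHARGER_V, LIMIT_A, R_INTERNAL = 42.0, 2.0, 1.0

for v in [32, 36, 40, 41, 41.5, 42]:   # battery voltage as the charge progresses (V)
    amps = min((CHARGER_V - v) / R_INTERNAL, LIMIT_A)
    print(f"{v} V -> {amps:.1f} A")
# 32-40 V: held at 2.0 A, then 41 V -> 1.0 A, 41.5 V -> 0.5 A, 42 V -> 0.0 A
```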