I am pretty sure the answer to this is "Yes" but just going to double-check ....
Got a customer with an 800W array on a 12V system - made up of a pair of 400W panels in series (I need to check this), with a Voc of around 56V currently being presented to the controller.
They need a new controller for this. I am thinking the 100/50 can cope fine with the voltage the array could present, but as the controller is rated at 700W on a 12V system (so 100W under the potential maximum of the array), the only effect should be that harvesting is capped at around 700W.
So while some potential harvest will be wasted for a short time during the day, and only for part of the year, there is no physical detriment to the controller: current simply gets capped, but I do have to make sure the voltage limit is never exceeded.
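For what it's worth, here is a rough back-of-the-envelope sketch of the two checks that matter (the 100V / 50A limits are the 100/50's ratings; the Voc temperature coefficient, the coldest panel temperature and the 14.4V absorption voltage are assumed figures, so swap in the real panel datasheet numbers):

```python
# Rough sizing sketch for the scenario above - not official guidance.
ARRAY_W = 800          # installed PV power (two 400W panels)
VOC_STC = 56.0         # array Voc at 25 degC (as measured on site)
TEMP_COEFF = -0.0029   # Voc temperature coefficient per degC (assumed)
COLDEST_C = -10.0      # coldest expected panel temperature (assumed)

MAX_PV_V = 100.0       # MPPT 100/50 absolute max PV input voltage
MAX_CHARGE_A = 50.0    # MPPT 100/50 max charge current
ABSORB_V = 14.4        # typical 12V absorption voltage (assumed)

# 1) Voltage check: Voc rises as the panels get colder, so allow headroom.
voc_cold = VOC_STC * (1 + TEMP_COEFF * (COLDEST_C - 25.0))
print(f"Voc at {COLDEST_C} degC: {voc_cold:.1f} V (limit {MAX_PV_V} V)")

# 2) Power check: output is capped by the 50A charge limit, so anything
#    the array could produce above that is simply left unharvested.
max_output_w = MAX_CHARGE_A * ABSORB_V
print(f"Max harvest into the battery: ~{max_output_w:.0f} W "
      f"(array could supply up to {ARRAY_W} W)")
```

With those assumed numbers it comes out at roughly 62V cold Voc (well under the 100V limit) and a ceiling of about 720W into the battery, which matches the "capped around 700W" reasoning above.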
I could of course suggest the 150/60, but for the realistically slight overall increase in harvest, the extra £200+ is a bit too much of a jump IMO (it would even be quite a bit cheaper to fit a pair of 100/30s).
So to recap - is the 100/50 OK for an 800W array with a Voc of 56V?