Dear Community,
Can anyone explain to me how the VRM 'Time to Go' figure is calculated? In particular, how is it calculated when I am not using SOC but instead using battery voltage as the trigger for the generator start/stop functions configured on my Cerbo GX relay?
I read somewhere, in another person's response to a similar question, that Time to Go is calculated from the current SOC relative to the low-SOC threshold configured for the relay.
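If that is right, my understanding of the arithmetic is something like the sketch below. To be clear, this is just how I imagine it works, not anything from Victron's documentation, and all the names (capacity_ah, soc_floor_pct, etc.) are placeholders of my own:

```python
def time_to_go_hours(capacity_ah, soc_pct, soc_floor_pct, avg_discharge_a):
    """My guess at the Time to Go logic: remaining usable capacity
    (down to the configured SOC floor) divided by the averaged
    discharge current. All names and numbers are placeholders."""
    usable_ah = capacity_ah * (soc_pct - soc_floor_pct) / 100.0
    if avg_discharge_a <= 0:
        return float("inf")  # charging or idle: no meaningful countdown
    return usable_ah / avg_discharge_a

# Example: 800 Ah bank at 80% SOC, 10% floor, drawing 50 A
print(time_to_go_hours(800, 80, 10, 50))  # -> 11.2 hours
```

If that is roughly the calculation, then the figure depends entirely on the SOC estimate and the SOC floor, and my voltage-based generator settings would never enter into it.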
I have a large bank of LFP batteries, and SOC is NOT a good indication of their state, whereas DC voltage is an ideal indicator of battery state.
At present my VRM suggests much longer Time to Go periods than are realistic. It may say 12 hours when in reality I only have about 2 hours before my Cerbo GX is configured to start the generator.
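To put rough numbers on the gap (these figures are purely illustrative, not my actual readings or settings):

```python
# Illustrative only: if Time to Go counts down to an SOC floor,
# but the generator actually starts on a voltage threshold that
# is reached much sooner, the two horizons diverge badly.
capacity_ah = 600          # hypothetical bank size
soc, soc_floor = 70, 10    # hypothetical current SOC and relay floor (%)
load_a = 30                # hypothetical average draw

ttg_soc = capacity_ah * (soc - soc_floor) / 100 / load_a
print(f"SOC-based Time to Go: {ttg_soc:.1f} h")  # -> 12.0 h

# But if the bank's voltage hits my generator-start setpoint after
# only ~60 Ah more discharge, the real horizon is far shorter:
ah_to_voltage_trigger = 60
print(f"Voltage-trigger horizon: {ah_to_voltage_trigger / load_a:.1f} h")  # -> 2.0 h
```

That is exactly the kind of 12-hours-versus-2-hours mismatch I am seeing.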
If the LBCO (low battery cut-out) configured in the System Configurator, together with the generator start/stop voltages in the Cerbo, is not the basis of this calculation, then what is? What inputs are used to make the calculation?
I don't pay much attention to SOC readings; I always focus on battery voltage. But my client naturally looks at the SOC, because it is displayed far more prominently and continuously than the voltages, and for a lay person a percentage is much easier to understand than interpreting a voltage as a battery state. That can be dangerous if the SOC is inaccurate and is paired with a 'Time to Go' figure that is totally bogus.
Thank you in advance for an explanation of the Time to Go calculation!
Cheers,
Jim