Hi, I commissioned new battery banks last week: 12 × 6 V 250 Ah Full River AGM (1,500 Ah total).
I adjusted the MPPT 75/50 from the old wet-cell profile to AGM, moving from Preset 2 (Default) to Preset 3 (AGM Spiral), as this was the only preset with 14.7 V absorption (the battery manufacturer's spec).
I then manually adjusted float from the preset 13.8 V down to 13.65 V (again the manufacturer's spec), which moved the MPPT into custom programming. There is ample solar power being generated (4 × 327 W SunPower PV panels).
Attached are the VRM portal views for the past few days, a longer-term week view, and the voltage at the shunt. I have also attached a wiring diagram. I will be moving to DVCC shortly.
The maximum voltage output reported by the MPPTs is 14.4–14.5 V (VRM), and they float at 13.44 V (VRM).
The maximum voltage seen at the shunt is 14.16 V (VRM).
(Disregard the voltage max on the second BMV; it is historical, from the old wet cells and a faulty high-output (160 A) auxiliary alternator.)
I understand that the MPPT will default to an absorption time of 1 hour.
I have measured the voltage at the output of the MPPT with a multimeter (MM) and compared it with what VictronConnect reports: VictronConnect reads 0.08 V higher than the MM.
I also see a 0.05 V difference between my MM and the BMV-702, with VictronConnect again showing higher.
During one monitoring period in VictronConnect, the BMVs showed 12.97 V while the MPPT showed 13.15 V (a delta of 0.18 V).
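As a sanity check, the two calibration offsets measured above can be subtracted out to estimate the true terminal-to-shunt delta (a quick sketch; the variable names are mine, and it assumes both offsets apply in the same direction at the moment of the reading):

```python
# Correct both VictronConnect readings for the measured meter offsets
# to estimate the real voltage delta between MPPT output and shunt.

MPPT_VICCON = 13.15   # MPPT voltage as shown in VictronConnect
BMV_VICCON = 12.97    # BMV-702 voltage as shown in VictronConnect
MPPT_OFFSET = 0.08    # VictronConnect reads this much above the multimeter
BMV_OFFSET = 0.05     # same, for the BMV-702

mppt_true = MPPT_VICCON - MPPT_OFFSET   # ~13.07 V at the MPPT terminals
bmv_true = BMV_VICCON - BMV_OFFSET      # ~12.92 V at the shunt

print(f"offset-corrected delta: {mppt_true - bmv_true:.2f} V")  # ~0.15 V
```

So roughly 0.03 V of the observed delta is instrument offset, leaving about 0.15 V to be explained by the cabling and connections.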
I can accept that there will be some difference between the output of the MPPTs and what the BMV shunts see, but it should be minimal: the cable is 70 mm² (AWG 2/0) and the total run length is 6.48 m positive and 6.68 m negative.
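For reference, a back-of-the-envelope estimate of the expected cable drop (a sketch only: the 50 A figure is my assumption, taken as the 75/50's rated maximum output, and it ignores connector and fuse resistance):

```python
# Rough voltage-drop estimate for the 70 mm^2 cable run described above.
# Assumed: copper resistivity ~0.0172 ohm*mm^2/m at ~20 degC,
# worst-case 50 A charge current (the MPPT 75/50's rated output).

RESISTIVITY_CU = 0.0172   # ohm * mm^2 / m
AREA_MM2 = 70.0           # cable cross-section from the post
LEN_POS_M = 6.48          # positive run length
LEN_NEG_M = 6.68          # negative run length
CURRENT_A = 50.0          # assumed worst case

round_trip_m = LEN_POS_M + LEN_NEG_M
resistance = RESISTIVITY_CU * round_trip_m / AREA_MM2  # ~3.2 mOhm
drop_v = resistance * CURRENT_A

print(f"cable resistance: {resistance * 1000:.2f} mOhm")
print(f"voltage drop at {CURRENT_A:.0f} A: {drop_v:.2f} V")  # ~0.16 V
```

On that estimate the cable alone accounts for roughly 0.16 V at full current, so a noticeably larger drop would point at connection or fuse resistance rather than the cable itself.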
While moving to DVCC will align the voltage and temperature sensing with the MPPTs, I am a little concerned that, with the MPPTs programmed for 14.7 V absorption and 13.65 V float, the battery never gets above 14.1 V and floats at 13.44 V.
Thoughts?