Different Ah in a series...

Hi

We had a 120 Ah and a 75 Ah connected in parallel for the house battery, charged via a semiconductor splitter. They only lasted 12 years before one failed. I changed the battery tray to take two 115 Ah batteries when we replaced them, but I would have no problem with differing sizes (provided they were not too different) if the tray cannot be changed.

Good luck
 
The voltage you see during charge is generated by the battery, not the charge source.

Is that right? :confused:

Because it takes a voltage to drive the conversion, the alternator only limits the maximum voltage. In your case above, assuming a 10-hour recharge (a 300 amp-hour battery bank at 30 amps), you will initially see 13.0 V, rising to 14 V+ after 10 hours.
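
To put rough numbers on that, here is a toy model in Python. The resting-voltage curve and the internal resistance are assumptions picked only to illustrate the shape of the charge, not measured values:

CURRENT = 30.0                        # charging current into the bank, amps

def terminal_voltage(soc):
    # Assumed resting voltage and (rising) effective resistance vs state of charge.
    resting = 12.4 + 0.6 * soc
    r_internal = 0.02 + 0.015 * soc   # ohms
    return resting + CURRENT * r_internal

for hour in range(11):                # 300 Ah / 30 A is roughly 10 hours
    soc = hour / 10.0
    print(f"hour {hour:2d}: about {terminal_voltage(soc):.1f} V at the terminals")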

Brian

You haven't convinced me the voltage is coming from the battery, not the charging source. The alternator (or dynamo, or charger) is not just limiting the voltage, it's providing it, surely?
 
You haven't convinced me the voltage is coming from the battery, not the charging source. The alternator (or dynamo, or charger) is not just limiting the voltage, it's providing it, surely?

The alternator provides watts; the conversion in the battery acts as a resistive/capacitive circuit, with the conversion itself acting as a resistor. As the state of charge rises, the effective resistance increases, so the voltage needed to complete the conversion rises and the current falls.

It is a sad fact of life. I spent 20 years or more designing and building them and, out of interest, a transformer-based charger gives a better charge cycle than a switch-mode one.

Brian
 
The alternator provides watts; the conversion in the battery acts as a resistive/capacitive circuit, with the conversion itself acting as a resistor. As the state of charge rises, the effective resistance increases, so the voltage needed to complete the conversion rises and the current falls.

It is a sad fact of life. I spent 20 years or more designing and building them and, out of interest, a transformer-based charger gives a better charge cycle than a switch-mode one.

Brian

Yes, the battery is resistive, but the power comes from the charging source. (Only the values of the volts, watts and amps are products of the relationship between the charging source and the battery.) So it seems incorrect to say the charging source 'only limits the max voltage', or that 'the voltage that you see during charge is generated by the battery'.

That would be the equivalent of saying that, during discharge, the voltage that you see is generated by the load!
 
Yes, the battery is resistive, but the power comes from the charging source. (Only the values of the volts, watts and amps are products of the relationship between the charging source and the battery.) So it seems incorrect to say the charging source 'only limits the max voltage', or that 'the voltage that you see during charge is generated by the battery'.

That would be the equivalent of saying that, during discharge, the voltage that you see is generated by the load!

Yep, take charging: you see 13.6, 14.4 or 14.8 volts, for instance, during charge, and these voltages are limited by the battery charger or the alternator regulator.

You've got it: as you increase the load you decrease the voltage, for example with the anchor light on or when starting the engine.

Brian
 
Yes, the battery is resistive, but the power comes from the charging source.

The battery will have internal resistance (not sure what the other poster was on about with it having capacitance, but we'll ignore that). Remember, however, that the driving voltage when charging is only the difference between the charger's output voltage and the voltage to which the battery has already been charged. So say the charger is able to put out 14.4V (and of course it would only be able to do this if it can support the resultant current) and the battery is already charged to 13.8V; then you only have 0.6V driving the charging current through the battery. I think that may be what Halcyon was trying to say.
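
As a rough sketch of that in Python, assuming a simple internal-resistance model (the 0.02 ohm figure is an assumption, purely for illustration):

charger_volts = 14.4      # what the charger can put out
battery_volts = 13.8      # what the battery is already charged to
r_internal = 0.02         # assumed internal resistance, ohms

driving_volts = charger_volts - battery_volts   # 0.6 V of driving voltage
charge_current = driving_volts / r_internal     # Ohm's law across the internal resistance
print(f"{driving_volts:.1f} V drives roughly {charge_current:.0f} A into the battery")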
 
Remember, however, that the driving voltage when charging is only the difference between the charger's output voltage and the voltage to which the battery has already been charged. So say the charger is able to put out 14.4V (and of course it would only be able to do this if it can support the resultant current) and the battery is already charged to 13.8V; then you only have 0.6V driving the charging current through the battery.

You seem to suggest that a charger's output is defined by its maximum voltage, and that this voltage level, in relation to the battery's voltage level, drives the current into the battery?

My view of what is going on is totally different:
The output of the charger is defined by its maximum current (for instance 25A). The only voltage relevant during charging is the system voltage. The level of this voltage is a result of two things: the current being supplied to the battery and the battery's state of charge. The more amps, and the fuller the battery (its resistance increasing), the higher the voltage level gets (think: pressure).
The voltage set point of the charger (for instance 14.4V) is there to protect the battery. When this voltage level is reached, the charger starts to cut back the current in order not to overshoot this voltage, which would otherwise lead to overcharging.
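
A rough Python sketch of those two limits (the battery model and numbers are made-up assumptions, not any particular charger's behaviour):

CURRENT_LIMIT = 25.0      # A, the charger's maximum output current
VOLTAGE_SETPOINT = 14.4   # V, the protective voltage set point
R_INTERNAL = 0.02         # ohm, assumed internal resistance

def charger_current(resting_volts):
    # Current the voltage set point alone would allow through the internal resistance:
    at_setpoint = (VOLTAGE_SETPOINT - resting_volts) / R_INTERNAL
    # Early on the 25 A limit applies; as the battery fills, the voltage limit takes over.
    return min(CURRENT_LIMIT, max(at_setpoint, 0.0))

for resting in (12.2, 12.8, 13.4, 14.0, 14.3):
    amps = charger_current(resting)
    system_volts = resting + amps * R_INTERNAL
    print(f"resting {resting:.1f} V -> {amps:4.1f} A, system voltage {system_volts:.2f} V")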

So, in answer to the question raised in post #19, my view is that the voltage level seen during charging is a result of both the current from the charging source and the battery's state of charge.
Also, the word 'generate' can have several meanings.
 
You seem to suggest that a charger's output is defined by its maximum voltage, and that this voltage level, in relation to the battery's voltage level, drives the current into the battery?

My view of what is going on is totally different:
The output of the charger is defined by its maximum current (for instance 25A). The only voltage relevant during charging is the system voltage. The level of this voltage is a result of two things: the current being supplied to the battery and the battery's state of charge. The more amps, and the fuller the battery (its resistance increasing), the higher the voltage level gets (think: pressure).
The voltage set point of the charger (for instance 14.4V) is there to protect the battery. When this voltage level is reached, the charger starts to cut back the current in order not to overshoot this voltage, which would otherwise lead to overcharging.

So, in answer to the question raised in post #19, my view is that the voltage level seen during charging is a result of both the current from the charging source and the battery's state of charge.
Also, the word 'generate' can have several meanings.

OK. I'm assuming modern multi-stage charging systems and, in particular, the later stages where they are putting out higher voltages (the sort of voltages that cause gassing, which is the only time the different battery sizes will matter). Yes, in the early stages of charging the current is the limit, but that only persists until the battery is sufficiently charged that the charger starts limiting its own voltage.

I didn't use the word 'generate', so I'm not concerned about its meaning.
 