Ammeter switch

I am glad it's worked for you, but a battery monitor is much more useful if it can display and integrate the values from all the shunts.
That way you can see net amps in or out as well as amp-hours in or out.
A single battery monitor with the capacity to deal with the shunts you require is usually a better choice.
If it can be integrated with the regulators to provide intelligent charge termination, that is even better, but this is a more sophisticated system than the OP needs.
Not sure I understand. I have a single shunt and monitor on the negative side of each of my two battery banks. All current in and out flows through the shunt irrespective of which charging source it comes from. It therefore integrates all input and output, including amp-hours in and out, and displays it on the monitor. The only weakness is that if I want to know what, say, my Sterling charger, my fridge or my masthead light is providing or consuming, I have to switch it off and see what difference that makes to the amps at the battery. Both my Sterling and my engine have intelligent three-stage regulators, which basically means that whichever is most enthusiastic gets to do the charging if both are on (rare) and the battery bank is so near the top of its charge that it doesn't draw current from both. It has never failed me and my batteries seem to last forever.

But I agree, the OP doesn't want to faff with all this nonsense.
 
Actually I do have a battery controller which integrates everything going in and out of the battery. However I would like to be able to see the momentary contribution from my individual charging systems.
 
Assuming that the system has diodes in each line, so that the generators can't feed back into each other, you could possibly derive the current going through each diode by measuring the voltage across it. It would need a bit of electronics to convert the very non-linear diode voltage into a current reading!

Hi CreakyDecks. That is the thing about diodes in a power supply line: the voltage drop across them is nearly constant for all currents, i.e. it is very non-linear. Yes, the voltage drop does increase with more current, but only slightly, so you would need a look-up table in a microprocessor, calibrated against an ammeter, and then you would also have to correct for temperature variation. A silicon diode's volt drop varies so much with temperature that diodes have been used as thermometer sensors.
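For anyone who wants to try it, here is a minimal sketch in Python of that idea: the Shockley diode equation turns a measured forward-voltage drop into a rough current figure. The saturation current and ideality factor below are assumed, diode-specific values that would have to be calibrated against a real ammeter, exactly as described above, and real diodes drift further still because the saturation current is itself temperature dependent.

import math

def diode_current(v_forward, temp_c=25.0, i_sat=1e-7, ideality=1.5):
    """Rough current estimate from a diode's forward voltage drop.

    i_sat and ideality are assumed, diode-specific values; in practice
    they would be calibrated against a known ammeter.
    """
    k = 1.380649e-23                  # Boltzmann constant, J/K
    q = 1.602176634e-19               # electron charge, C
    v_t = k * (temp_c + 273.15) / q   # thermal voltage, about 26 mV at 25 C
    return i_sat * (math.exp(v_forward / (ideality * v_t)) - 1.0)

# The same 0.65 V drop gives a very different answer at 25 C and 50 C,
# which is why a temperature correction (or look-up table) is needed.
print(diode_current(0.65, temp_c=25.0))
print(diode_current(0.65, temp_c=50.0))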

If the OP has an ammeter with an external shunt, then adding additional shunts into the circuits is the logical way to go. However, I am appalled at the cost of shunts: for just a piece of metal mounted on insulation, the Chinese could make them for a few pence.
You can make your own shunts, especially if you have an ammeter with a separate shunt whose accuracy you are happy with. A shunt is simply a conductor in the current-carrying wire with just enough resistance to develop a voltage drop sufficient to drive a very sensitive meter.
You need a piece of stainless steel plate about 6 cm long, 2 cm wide and fairly thin. Put a hole at each end to accept your cable lug and bolt; these bolts can also fasten it down to a wood or plastic insulating base. Put two more holes, one at each end, to accept the wires and lugs going to the meter. Don't use the same holes and lugs.
Fit this shunt into a circuit carrying an appropriate current (lamps would be good) along with your ammeter and its original shunt. Measure the current on the meter, then move the ammeter connections to the new shunt. You will hopefully get a reading, but different from the real current. The reading is reduced by reducing the distance between the pairs of bolts (mainly the distance between the wire take-offs to the meter); this change is linearly related to distance. Or you can increase the width of the plate, or even its thickness.
The reading is increased by making the plate narrower or by wasting (thinning) the middle; making the plate thinner or longer does the same thing. Fine-adjust by wasting the middle.
Once the shunt is the correct resistance, i.e. it has the same effect as the original shunt, it will be stable forever and can be copied for the other circuits.
As said, use a rotary or multi-position switch to select the shunt you want to read the voltage drop from. You need to switch both wires, however.
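If you would rather start with a rough calculation than pure trial and error, the strip's resistance is just resistivity times length divided by cross-sectional area. Here is a quick sketch in Python using a typical resistivity for 304-type stainless steel; the 1 mm thickness is my assumption, since only "fairly thin" is specified above.

RHO_STAINLESS = 7.2e-7   # resistivity of 304-type stainless steel, ohm-metres (approximate)

def strip_resistance(length_m, width_m, thickness_m, rho=RHO_STAINLESS):
    """Resistance of a flat metal strip: rho * length / cross-sectional area."""
    return rho * length_m / (width_m * thickness_m)

# 6 cm x 2 cm strip, 1 mm thick (thickness assumed)
r = strip_resistance(0.06, 0.02, 0.001)
print(f"{r * 1000:.2f} milliohm")               # about 2.2 milliohm
print(f"{r * 50 * 1000:.0f} mV drop at 50 A")   # longer, narrower or thinner raises this

Whether that drop suits your meter depends on the meter's full-scale sensitivity, which is why the comparison against the original shunt described above is still the safest check.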

If your ammeter has an internal shunt, i.e. all the current goes to the meter head, you can open up the meter.
There are two types: one uses a moving-coil movement (a little box-shaped coil swinging between magnet poles), the other is a moving-iron type. The moving-coil type will have a shunt just inside the case, on the terminals. The moving-iron type has a heavy-wire coil of a few turns with a little plate, attached to the needle, that swings against a spring. The moving-iron type cannot be fiddled with or modified.
However, if there is a shunt in the case, it can be removed and wired up remote from the meter, and you can then also make more shunts and a switching arrangement, if you feel inclined.
There is another option where you use the actual resistance of the wire lead from the solar panel or wind gen as the shunt. You attach a sensitive voltmeter (or moving-coil milliamp meter) to each end of the wire carrying the current, i.e. the meter's +ve at the panel end and its -ve at the +ve battery terminal. In some cases there is enough volt drop to give a current indication. Connect across only part of the wire to reduce the reading, or fit a series resistor to the meter to reduce the reading.

If you have a digital ammeter then it is the same story, as it must have a shunt. However, this shunt is of higher resistance, as a DVM needs more voltage to reach its maximum reading (so measuring the volt drop along a wire won't work). Usually you will also get a choice of 2 amps max or 20 amps max (the numbers go up to 1999). If 2 amps is sufficient, you can buy wire-wound resistors very cheaply down to 0.1 ohm, and these can be paralleled to give the required resistance.
You can calculate without calibrating. At 2 amps you need 200 millivolts dropped, so you need (R = volts divided by amps) 0.2/2 = 0.1 ohm. For 20 amps, ten of these resistors in parallel, or a single 0.01 ohm resistor, would do the same thing.
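The same arithmetic as a tiny Python sketch, assuming a meter that reads full scale at 200 millivolts as in the example above (check your own meter's specification).

V_FULL_SCALE = 0.200   # assumed full-scale voltage of the digital meter, volts

def shunt_ohms(max_amps, v_full_scale=V_FULL_SCALE):
    """Shunt resistance so the meter reads full scale at max_amps (R = V / I)."""
    return v_full_scale / max_amps

print(shunt_ohms(2))    # 0.1 ohm for a 2 A range
print(shunt_ohms(20))   # 0.01 ohm for a 20 A range -- ten 0.1 ohm resistors in parallel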

It is all doable, just have a fiddle. Good luck, olewill
 
There seem to be numerous "experts" out there but no-one willing to answer the question raised in post 19 with reasons.
I'm still interested - any takers please:):)
 
May I please raise another point from curiosity.

If you are trying to charge batteries from several sources they will probably be generating different voltages.

My simple mind tells me that only the device generating the highest voltage will be charging the batteries. The other devices (through their controlling circuits) will be dormant. If so then only one ammeter will be giving a reading.

Could someone comment please.

Ignoring voltage drop in the wiring, all the generating devices (or at least their regulators) will be at the same voltage: the battery voltage.
The amount of current each device can generate at the battery voltage will vary as conditions change, and sometimes because the regulators turn the charging off when the charge parameters have been met. Each device will contribute some current if it can.
As the current into the battery changes, its voltage will change and a new equilibrium will be established.
 
There seem to be numerous "experts" out there but no-one willing to answer the question raised in post 19 with reasons.
I'm still interested - any takers please:):)

Generators are not ideal voltage sources, i.e. they do not have zero internal impedance. That means that each of the three in fact looks like an ideal voltage source in series with a resistor (which is then in series with a diode and then the battery). The voltages of the "ideal voltage sources" can be different, but where they all connect together is a single point at the battery voltage. Because each generator is (hopefully) higher in voltage than the battery, current will flow.
If, for example, the alternator was at 15 V and had an internal impedance of two ohms, then (ignoring the battery's own internal impedance) a current of (15 - 12 - 0.6)/2 = 1.2 A would flow out of that generator's wire. (The 0.6 is the forward voltage drop of the diode.)
The three currents coming out of the three wires HAVE to flow into the battery because they have nowhere else to go!
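Extending that single-alternator example to three sources, here is a minimal sketch of the arithmetic in Python; every open-circuit voltage and internal impedance below is an invented illustrative figure, not a measurement.

DIODE_DROP = 0.6    # forward drop of a silicon blocking diode, volts
V_BATTERY = 12.0    # battery terminal voltage, volts

# (name, open-circuit voltage, internal impedance in ohms) -- illustrative numbers only
sources = [
    ("alternator", 15.0, 2.0),
    ("solar",      17.0, 5.0),
    ("wind",       13.0, 1.5),
]

total = 0.0
for name, v_src, r_int in sources:
    # Current flows only if the source can overcome the battery plus the diode drop.
    amps = max(0.0, (v_src - V_BATTERY - DIODE_DROP) / r_int)
    total += amps
    print(f"{name}: {amps:.2f} A")

print(f"total into battery: {total:.2f} A")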
 
There seem to be numerous "experts" out there but no-one willing to answer the question raised in post 19 with reasons.
I'm still interested - any takers please:):)

That is basically true where you have regulated circuits; the back-and-forth it causes is known as hunting, and is common with two alternators.

Assuming the battery is well down on charge, we have the solar panel, wind gen and water gen all charging. The initial battery voltage will be low, something above 13 volts but below the regulation point of any of the regulators. As the battery is recharged, the voltage will rise as its state of charge rises.

We now reach the point where one regulator starts to regulate. If it turns off totally, the charge amps fall and the battery voltage falls slightly, so the regulator kicks back in. The voltage goes up, the regulator cuts in, the voltage drops, so the regulator hunts. This carries on until the voltage stabilizes above that regulation point, and the two remaining charge sources carry on charging.

The battery voltage carries on rising as the battery is charged, and the regulation point of the next highest regulator is reached. At this point we go through the same hunting until the voltage is stabilized, and the remaining charge source carries on charging until it too starts to regulate.

That is the basic story, but there is more detail if you want it: what happens with no regulators is another story again, then charge rates, and on and on.
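A toy simulation of that sequence in Python, purely to show its shape; the setpoints, currents and crude battery model are invented numbers, and real regulators behave far more gracefully.

# Each tuple: (name, amps supplied while charging, regulator setpoint in volts).
# All numbers are invented for illustration only.
sources = [("solar", 2.0, 13.8), ("wind", 4.0, 14.2), ("water", 3.0, 14.4)]

v_batt = 13.0   # starting battery voltage
for step in range(30):
    # A source charges only while the battery sits below its regulator's setpoint.
    active = [(name, amps) for name, amps, setpoint in sources if v_batt < setpoint]
    amps_in = sum(amps for _, amps in active)
    # Crude battery: voltage creeps up with net charge and sags without it, which
    # is what makes a source cut in and out (hunt) around its setpoint.
    v_batt += 0.02 * amps_in - 0.10
    print(f"step {step:2d}: {v_batt:5.2f} V, {amps_in:.0f} A from {[n for n, _ in active]}")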

Brian
 
noelex, CreakyDecks and halcyon,

Many thanks for your explanations. I think I will have to read them several times to fully understand them but I am certainly enlightened.
 
Halcyon's suggestion of using Hall effect sensors is the best, but if anything is a step in technology too far, it would be for me.

[attached sketch: scan0096.jpg]

I understand shunts

:)
 
noelex, CreakyDecks and halcyon,

Many thanks for your explanations. I think I will have to read them several times to fully understand them but I am certainly enlightened.

Halcyon is right, however I will try to put it into other words.
We have to get away from the concept that a 12 V source is just that. A large lead-acid battery is near a perfect source, in that if you draw 1 amp from it, it supplies 12 V. If you draw 10 amps, it still supplies 12 V. If, however, you draw 500 amps, the voltage falls to about 9 V. As a source it runs out of steam; the voltage sags.
Now a solar panel is just the opposite. It might give 20 volts with no load or a small load. If it were a 20 W panel, then at 1 amp drain the voltage will be down around 15 V, and if you tried to draw 1.25 amps the voltage would fall to near zero. Indeed, you can put a short across the panel output, giving no volts and a current flow of a bit over 1 amp, with no harm done. The panel can be described as having an internal impedance or resistance of about 20 ohms.
Now an alternator charging at full RPM and maximum field current will likewise give something like 12 V at its rated current (e.g. 60 amps). If you attempt to draw more current, the voltage will fall further until it is no use to you or the alternator melts. Note that in this example, at full field current the alternator might produce 20+ volts at quite a few amps. A battery connected across the alternator output will accept that 20 volts and pull it down to 14 V by accepting current as charge into the battery, the voltage and current depending on the state of charge. In practice we don't run the alternator at full field current but regulate it (throttle it back) to a voltage which will not do the battery (or alternator) any harm.
This voltage (14 V) will not charge a battery very quickly, but it has been satisfactory on cars for many years. (The smart or stepped alternator controllers improve on this by exceeding 14 V.)

You can see that any charge source will supply current until its voltage falls due to its own internal resistance, so each source will contribute what it can. If a solar panel or wind gen has a regulator, the effect is similar to that of the engine alternator: the voltage is limited or regulated, in some cases by actually wasting current and so pulling the voltage down. But once the battery voltage is lower than the regulated voltage, the regulator is effectively out of the game, and the current and voltage from the source depend on its internal resistance.
Note that the internal resistance, as I call it, consists of actual resistance plus the capability (or lack of it) of the source to provide power. So a solar panel in dim light has a very high resistance compared with bright sun, and likewise a wind gen turning slowly has more resistance than when the wind blows. However, it is not quite that simple, because a wind gen develops only a low voltage at low wind speed, which often means the voltage never rises enough to exceed the battery voltage, so there is no charge. A solar panel, by contrast, tends to give a high voltage even with little sun but sags as soon as current is drawn.
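To put rough numbers on how much each kind of source sags per amp drawn, here is a minimal Python sketch using the ballpark figures from the paragraphs above; a solar panel is so non-linear that the answer depends heavily on where on its curve you measure (the roughly 20 ohm figure quoted earlier comes from open-circuit volts over short-circuit amps).

def internal_resistance(v_unloaded, v_loaded, amps):
    """Thevenin-style estimate of a source: volts of sag per amp drawn."""
    return (v_unloaded - v_loaded) / amps

# Big lead-acid battery: 12 V lightly loaded, sagging to about 9 V at 500 A.
print(internal_resistance(12.0, 9.0, 500))    # roughly 0.006 ohm -- a very stiff source

# 20 W solar panel: about 20 V unloaded, about 15 V at 1 A.
print(internal_resistance(20.0, 15.0, 1.0))   # roughly 5 ohm at this point on its curve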

Does that make sense? olewill
 


IMHO best explanation so far.
 