Inverter, how many watts for laptop?

Have the people who dislike inverters tested them themselves?

Nick

Yes. Almost twice the drain compared with a Maplin DC/DC, and horribly RF-noisy, pulling over 0.5 A in standby. All in all, not a great way to go.

Edit:
Also, this is what a cheap inverter's sine wave looks like :eek: :eek:

[attached image: IMAGE020.jpg]
 
Hi-

An AVO inserted in series between the battery and the inverter or 12 V adapter, on the 10 A scale. A flat laptop battery would initially pull 3 A+ but would drop down to 1.5-2 A. A 3-year-old 85 Ah no-name caravan battery would run it for 20 hrs (down to 11.8 V; not recommended, I know).

N
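
For anyone wanting to sanity-check those runtime figures, here's a rough back-of-envelope sum in Python; the 1.75 A is just an assumed midpoint of the 1.5-2 A quoted:

# Sanity check of the runtime figures above (average draw assumed).
battery_capacity_ah = 85    # nominal capacity of the caravan battery, Ah
avg_current_a = 1.75        # assumed midpoint of the 1.5-2 A steady draw
hours_run = 20              # observed runtime, hours

ah_used = avg_current_a * hours_run           # = 35 Ah drawn
fraction = ah_used / battery_capacity_ah      # ~41% of nominal capacity
print(f"{ah_used:.0f} Ah used, {fraction:.0%} of nominal capacity")

Drawing roughly 40% of nominal from a three-year-old battery, whose real capacity will be well down, is consistent with it sagging to 11.8 V.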

If you've got an AVO I assume that you know what you are doing! (I'm not taking the P)
Your figures surprise me but, FWIW, I made a switched-mode power supply for a radio only to discover that, in some situations, it was less efficient than an ordinary regulator. :(
 
Have the people who dislike inverters tested them themselves?

Nick

Yes, and measured the current drawn. In my hands nearly twice as much for the inverter in the most extreme case.

One of my ancient laptops - much to my surprise - emitted a horrible rattling noise on a non-sine-wave inverter. I have no idea why.

I have only measured comparative current on three laptops - and always using the same DC/DC charger. The advantage was always over 30%. It's only a small sample.

If you do go the DC/DC route, do check the power capability of the adapter. The fact that it is the right voltage does not mean that it can deliver enough power. It's not rocket science, as all the relevant info will be written on your computer and its mains charger. However, you can take them to Maplin with you and let their geeks read it and point you in the right direction.
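
To make that concrete, a hypothetical worked example; the 19.5 V / 3.34 A label figures and the 85% efficiency are assumed values, not from any post here:

# Hypothetical label values read off a laptop's mains charger.
charger_volts = 19.5    # output voltage printed on the brick
charger_amps = 3.34     # output current printed on the brick

watts_needed = charger_volts * charger_amps   # ~65 W
print(f"The DC/DC adapter must be rated for at least {watts_needed:.0f} W")

# Battery-side draw at 12 V, assuming ~85% converter efficiency.
input_amps = watts_needed / (12 * 0.85)       # ~6.4 A
print(f"Expect roughly {input_amps:.1f} A from the 12 V supply at full load")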

I still have an inverter - I just never use it.
 
Spyro, it's a Dell Inspiron 1545 which I have borrowed in order to see if I can access synoptic charts etc. using HF (Nasa Target). I use a Mac laptop and didn't fancy spending $95 on the necessary software, as opposed to £19.90 from Amazon for a charger. Also, the laptop is surplus and the Mac expensive!
 
Many thanks for the advice. I didn't realise that a 19.5 V supply could be obtained from 12 V; this is much simpler and cheaper than a 230 V inverter. Is the 19.5 V charger in effect an inverter, or is other technology involved?

Regards, Gerry :cool:

NO, it is not an inverter, as the source is already DC. It is a type of step transformer, if you will. I prefer to have as much as possible DC-based. If you get a better-quality DC charger you can set the output and also charge a range of DC devices, from phones to laptops to torches to handheld VHFs etc...
 
NO, it is not an inverter, as the source is already DC. It is a type of step transformer, if you will. I prefer to have as much as possible DC-based...

That view is often expressed by uninformed users but you are fooling yourself (especially being adamant with a capital "NO") because the step-up voltage converters being referred to here do contain inverters. They convert the DC on the supply side to AC using the same technology as an inverter and then convert that back to DC at the required higher voltage.

So it's the same process as using an inverter to get AC and then converting that back to DC using the PC's (or whatever's) AC power supply. The differences, which may show up as better efficiency, come down to better matching of the power required (e.g. using an 80 W unit instead of a 1000 W inverter) and integration.
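
As an illustration, the ideal relation for the boost topology typically used in these step-up adapters can be sketched in a couple of lines (losses ignored; a simplification, not a design):

# The converter chops the input DC into a square wave with a switching
# element; the output voltage is set by the switch duty cycle D:
#     Vout = Vin / (1 - D)
v_in, v_out = 12.0, 19.5
duty = 1 - v_in / v_out
print(f"Duty cycle needed for {v_in} V -> {v_out} V: {duty:.0%}")  # ~38%

That internal square wave is the AC referred to above.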
 
Interesting arguments going on. It would be a good test to measure the current drawn from the battery with both methods using a decent ammeter; maybe someone has seen published evidence of such a test, as it would settle the argument. The one thing you can't trust is manufacturers' efficiency figures. I suspect the DC/DC converter may draw less because it is designed for a specific task and less energy is lost as heat, but I am willing to be persuaded.
 
Interesting arguments going on. It would be a good test to measure the current drawn from the battery with both methods using a decent ammeter...

Just done a check, using my BM1. The dedicated ASUS Eee power brick is drawing 300 mA with the battery fully charged and the computer set on power-saving mode.

Through the 350 W inverter, it's drawing around 1 amp.

This is only a rough test because I'm getting voltage/charge fluctuations with the solar panels and Aerogen running.
 
Just done a check, using my BM1. The dedicated ASUS Eee power brick is drawing 300 mA... Through the 350 W inverter, it's drawing around 1 amp...

So that's more than three times the current draw, although only 700 mA more. I wonder in what way that's proportional, or is there a nominal amount that inverters draw just to be switched on? The laptop battery charger I use at home is plugged in the whole time I use my laptop, and I'm surprised how hot it gets; this heat must be more than half of the energy used, maybe 15 watts or more. Of course, it could be that my laptop battery is s-**** and it's working too hard.
 
So that's more than three times the current draw, although only 700 mA more. I wonder in what way that's proportional, or is there a nominal amount that inverters draw just to be switched on?...
Inverters vary a bit, but a 300 W inverter typically draws around half an amp when it's turned on with nothing connected. People often quote inverter efficiencies of typically 97%, but this is at full power, where the inverter is most efficient, and often excludes the small permanent power draw.
Smaller inverters are generally more efficient, if they will handle the load.

Running netbooks or tablets with medium-sized inverters is very inefficient. The difference would be less with a full-sized laptop or when charging the computer battery, but the DC-to-DC converter is always better.
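
The arithmetic behind that is easy to sketch. A minimal illustration, assuming the 0.5 A quiescent draw mentioned above (6 W at 12 V) and a flat 90% conversion efficiency:

# Overall efficiency of an inverter with a fixed quiescent draw.
def battery_watts(load_w, quiescent_w=6.0, efficiency=0.90):
    """Total power drawn from the battery to deliver load_w to the load."""
    return quiescent_w + load_w / efficiency

for load in (5, 20, 100, 300):
    drawn = battery_watts(load)
    print(f"{load:>3} W load -> {drawn:6.1f} W from battery "
          f"({load / drawn:.0%} overall)")

A netbook sipping 5 W comes out under 50% overall, while the same inverter near full load gets close to its headline efficiency.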
 
That view is often expressed by uninformed users but you are fooling yourself (especially being adamant with a capital "NO") because the step-up voltage converters being referred to here do contain inverters...

Please check your facts - they seem to be quite out of date. :mad:

Electronic switch-mode DC to DC converters convert one DC voltage level to another, by storing the input energy temporarily and then releasing that energy to the output at a different voltage. The storage may be in either magnetic field storage components (inductors, transformers) or electric field storage components (capacitors). This conversion method is more power efficient (often 75% to 98%) than linear voltage regulation (which dissipates unwanted power as heat). The efficiency has increased since the late 1980s due to the use of power FETs, which are able to switch at high frequency more efficiently than power bipolar transistors, which incur more switching losses and require a more complicated drive circuit.

Another important innovation in DC-DC converters is the use of synchronous rectification, replacing the flywheel diode with a power FET...
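
To put a number on that last point, a rough comparison of conduction losses, with assumed (typical but purely illustrative) part values:

# Loss in the rectifying element while it conducts: flywheel diode
# vs synchronous power FET (duty factor scales both equally, so it
# is omitted; all part values assumed).
i_out = 3.0         # amps through the rectifier
v_f = 0.5           # Schottky diode forward drop, volts
r_ds_on = 0.010     # FET on-resistance, ohms

p_diode = v_f * i_out          # ~1.5 W lost in the diode
p_fet = i_out**2 * r_ds_on     # ~0.09 W lost in the FET
print(f"Diode: {p_diode:.2f} W, synchronous FET: {p_fet:.2f} W")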
 
Interesting arguments going on. It would be a good test to measure the current drawn from the battery with both methods using a decent ammeter...
I don't know about any of the others who have asserted that the DC-to-DC method is more efficient, but I certainly would never have had the gall to make such a statement if I had NOT actually done the test and measured the current. As I said, for four different laptops (set to different voltages) the saving was always more than 30%, and in my hands the best saving was about 50%. I note that another contributor saw a factor-of-three saving on measured current.

It's quite likely to be a wee bit complicated - the relationship between amps in and amps out could vary as a function of output voltage and current. However, I have casually measured it in diverse circumstances and always found the DC-to-DC system to be at least 30% better. I have no theories, just data.
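
A minimal sketch of how such a comparison reduces to numbers, using hypothetical meter readings rather than anyone's actual figures:

# Hypothetical ammeter readings at the 12 V battery, same laptop load.
i_dcdc_a = 3.0        # amps drawn through the DC/DC adapter
i_inverter_a = 4.5    # amps drawn through inverter + mains brick

saving = 1 - i_dcdc_a / i_inverter_a
print(f"DC/DC saving on battery current: {saving:.0%}")  # ~33%

At a fixed battery voltage, a saving in current is the same as a saving in power.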
 
Please check your facts - they seem to be quite out of date. :mad:

Electronic switch-mode DC to DC converters convert one DC voltage level to another...

Well, a nice exact quote from Wikipedia, which is correct but wrongly interpreted by you. Unfortunately, as with most who parrot something out of Wiki rather than explaining the matter in their own informed words, you do not have the background to understand what you are reading.

That page covers all types of converter, and the quote you take from it is a generalisation covering them all, whether step-up, same-to-same, or step-down. If you take the trouble to look further in that same Wiki DC-DC Converter page, at the table listing the types, and follow the links in it to "Buck-Boost" and "Step-Up Boost" under the flyback types (which are the non-linear step-up ones we are all talking about), then look at the waveforms that are produced, you will see that they are AC (of square-wave form in the simple illustrating examples), just as I said.

You may also like to do a bit of study on how inverters work, and maybe you will then come to see that the statements I made are valid.
 
Well, a nice exact quote from Wikipedia, which is correct but wrongly interpreted by you...

Each to his own. I recommend a DC-to-DC converter. END of discussion in my book.
 
Each to his own. I recommend a DC-to-DC converter. END of discussion in my book.

Then why add the pseudo-scientific explanation?
In fact you are probably quite correct to say that going from 12 V DC to 19 V DC is more efficient than going from 12 V DC to 240 V AC and then 240 V AC to 19 V DC, but your reasoning was way out! DC-DC converters, inverters, switching regulators, switch-mode power supplies etc. all work on the same principle. The 12 V to 19 V converter still converts the power to a high-frequency switched signal internally.
The efficiency gain comes because you can buy one designed to operate at the required power level. An inverter designed to supply a kilowatt is very inefficient when supplying only a few watts (because MOSFETs take the same power to switch on and off whether they are on for 2 microseconds or 20 microseconds).
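
A rough model of that fixed switching cost; the figures below (gate charge, drive voltage, switching frequency, number of devices) are purely assumed, illustrative values:

# Gate-drive loss is a fixed cost per switching cycle, paid whether the
# MOSFET then conducts for 2 us or 20 us (all values assumed).
q_gate = 200e-9   # gate charge moved per on/off cycle, coulombs
v_gate = 12.0     # gate drive voltage, volts
f_sw = 50e3       # switching frequency, Hz
n_fets = 8        # parallel devices in a 1 kW-class inverter

p_drive = q_gate * v_gate * f_sw * n_fets   # ~1 W, load or no load
print(f"Gate-drive loss: ~{p_drive:.1f} W regardless of output power")

Against a kilowatt that watt is negligible; against a few watts of laptop load it is a large slice, which is exactly the point being made.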
 