I've tried to answer my own question but...

The generally accepted figures for battery state of charge based on voltage are:

12.63 volts = 100%
12.54 volts = 90%
12.45 volts = 80%
12.36 volts = 70%
12.27 volts = 60%
12.18 volts = 50%
12.09 volts = 40%
12.00 volts = 30%
11.91 volts = 20%
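
If you want to script that lookup instead of eyeballing the table, here's a rough Python sketch that interpolates between the rows above. Same caveats apply: it assumes a resting voltage, and the numbers are approximate.

```python
# Rough state-of-charge estimate from resting voltage, using the table above.
# Only meaningful for a battery at rest (not charged or discharged recently).

SOC_TABLE = [  # (volts, percent)
    (12.63, 100), (12.54, 90), (12.45, 80), (12.36, 70), (12.27, 60),
    (12.18, 50), (12.09, 40), (12.00, 30), (11.91, 20),
]

def soc_from_voltage(volts):
    """Linearly interpolate an approximate state of charge (%) from voltage."""
    if volts >= SOC_TABLE[0][0]:
        return 100.0
    if volts <= SOC_TABLE[-1][0]:
        return 20.0  # the table only goes down to 20%
    for (v_hi, soc_hi), (v_lo, soc_lo) in zip(SOC_TABLE, SOC_TABLE[1:]):
        if v_lo <= volts <= v_hi:
            return soc_lo + (volts - v_lo) / (v_hi - v_lo) * (soc_hi - soc_lo)

print(soc_from_voltage(12.22))  # roughly 54%
```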

Frankly a $10 plug in analog meter ain't gonna be accurate enough to tell you anything - if it reads anywhere between 12 and 13 volts, you could be anywhere between 50 and 100%.  You'll need a good digital meter to get any kind of semi-meaningful reading.

I say semi-meaningful because the above figures are only an approximation.  They can vary depending on battery temperature, and other factors. 

Regards
John
 
Please don't try to charge your Golf Cart 6 volt deep cycle battery using a cigarette lighter outlet. You will blow fuses or melt wires. You will not recharge your battery.
 
SternWake said:
In this case GC batteries stand not for gel cell, but golf cart, and most golf cart batteries are 220 to 232 amp hours

GOLF CART! Well NOW I feel a little silly. Yes, that not only answers one of my questions (capacity of the batteries), but invalidates my assertion that you didn't have enough information to be recommending 4 gauge wire. Apologies for that.

SternWake said:
even the best designed ciggy plugs and receptacles use spring loaded steel contacts and heat up greatly passing  just 7 amps.  It might be a convenient and ubiquitous connector, but it is a very poor electrical connection that wastes battery or alternator power as heat, to the detriment of the device and battery and the person depending on them.

I agree, and I never did and never would recommend using cigar lighter sockets for connecting anything more demanding than a cell phone charger to them. You're beating a dead horse. All I was objecting to was your blasting Off grid just for saying what works for him based on his long experience.

SternWake said:
The starting battery on a healthy fuel injected engine is depleted so very little during engine starting that time delaying prioritizing its recharge is not a factor, if the starting battery, which is not designed to ever be discharged lower than 80%, is indeed fully charged to begin with.   If there is a house battery system, then there is absolutely no reason to discharge the engine starting battery, and such a battery has an easy life indeed remaining at 95%+ state of charge.

This, on the other hand, is just wrong. Auto makers do not put 100 Ah batteries into vehicles just so they can have an easy life. Every pound of lead costs them money, and they don't oversize batteries just to be nice. They put big batteries in for cold cranking Amps, and also because sometimes it DOES take a while to start any engine, and often, people DO have other loads on their vehicles' 12V systems. You mention a "healthy fuel injected engine". You should be aware that a 1995 Dodge is at this point twenty years old, so healthy is not a good assumption.

But you're also leaving out another factor: when you connect a discharged house battery to the electrical system of a vehicle, the house battery draws current from not just the alternator, but the vehicle battery as well. This is avoided if you use a battery isolator, but this just makes things worse because it drops the charging voltage by about 0.7V to both batteries.

SternWake said:
Voltage is a piss poor method for determining battery state of charge.

My recommendation for adding a voltmeter was to monitor discharge to avoid getting down into the danger zone. I do agree that monitoring how much charging current the house battery is accepting is a better way of monitoring the state while charging. But from gypsychic's posts, my sense is that she wants to ensure that the batteries will be charged during her normal driving, rather than driving around until the batteries are fully charged, so this would be just an additional verification measure.

Jim
 
Yes, GC stood for golf cart. Sorry. I thought I was using correct vernacular that I had seen before. I didn't mean to make it more complicated. I typed out a long post last evening & in it mentioned this, but when I went to link the hydrometer & voltmeter I ordered off Amazon, my post vanished. (Typing on a small phone, must have hit a wrong key.) I was too tired to retype it.

This evening I'll reread this + the new stuff bbj just wrote & ask whatever questions come up. Thanks a bunch for your input.
 
Optimistic Paranoid said:
Frankly a $10 plug in analog meter ain't gonna be accurate enough to tell you anything - if it reads anywhere between 12 and 13 volts, you could be anywhere between 50 and 100%.  You'll need a good digital meter to get any kind of semi-meaningful reading.

I wasn't talking about a $10 analog meter. I was talking about the digital ones now available that have the readout right on the cigar lighter plug, like this: http://www.amazon.com/Neewer®-Displ...im_auto_6?ie=UTF8&refRID=030YXWSSWCCD24PZ799D. This one costs six bucks. The year is 2015. These days, you don't need a "good" meter to get the kind of accuracy needed to indicate battery charge state in a semi-meaningful way.

Jim
 
Okay, things are getting out of hand so I guess I need to do something. First, we're going off on little tangents of "he said, she said" and that never ends well.

Worse, we've gotten so fixated on the details of this one debate that we aren't answering Gypsychic's question. So we're either going to get back on track or I'll close the thread and start over again and try to do it right.

Here is the post that is the heart of the debate:
Off Grid 24/7 said:
I have a single 12v deep cycle house battery that I charge while driving via a cigarette lighter to cigarette lighter jumper cable.

Unless I'm traveling, I rarely drive more than maybe 20-30 miles a week, which has successfully kept my house battery charged for over a year now. I never plug into shore power.

Off Grid, I'm sorry to have to say this but anybody who knows anything about 12 volt knows this is a ridiculous claim. An hour of driving and charging through a cigarette plug will not give enough electricity to a house battery to last a week unless you use nearly zero power. There is no mumbo jumbo involved; it really is just the laws of physics, and someone swearing it's true isn't going to make it true. I can swear to you that I flapped my arms and flew to the moon and it really is made of Swiss cheese, but that isn't going to negate the laws of physics that say I'm wrong.

There is no debating this, Off Grid's claim simply isn't true. That means Gypsychic needs to find another way to keep her battery charged that will comply with the laws of physics and common sense.

That discussion is closed. We've all had our say and made our points and the readers can reach their own conclusion. Posts furthering that debate will be deleted.

Now, let's put our minds together and find Gypsychic another solution.
Bob
 
There is a LOT of variation in advice I hear, when it comes to how low you should let your batteries get.  In an earlier post I claimed that you shouldn't let lead-acid batteries get below 12.6V.  This was technically correct for batteries with no load on them, but I now realize that this is a useless figure since we don't generally use our batteries without any load.  And as SternWake pointed out, this would be for a battery "at rest", meaning it hasn't been charged or discharged for some time, which makes it even more impractical.  What would be much more useful would be some indication of what the charge state of a battery is while in use.

Battery manufacturers aren't that great when it comes to publishing information about their products, but here's some data I DID find from some of them:

This is a discharge curve from Yuasa, for their sealed lead-acid batteries:
[Attached image: np-discharge-characteristic.jpg]


Note that there are a number of different curves, representing different discharge rates.  For RV/vandwelling, what's probably most applicable is the 20-hour rate, since most of us charge our batteries daily as we drive from one place to another, then spend the rest of the day discharging it in various ways.  By 20-hour rate, they mean a load that's reasonably constant and will discharge the battery in 20 hours. For a 100 Ah battery, this would be a 5 Amp load. For a 200 Ah golf cart battery, it would be 10A.

The 20-hour curve on this graph is the top one.  It is marked "0.05CA", which means the drain current is .05 x the battery capacity, i.e., 1/20th of the capacity.  As you can see, the curve ends at 20 hours.  The graph is a little weird, in that it has a non-linear time scale, so if we want to see where the 50% discharged point is, for a 20-hour discharge time we have to look at the 10 hour point, NOT the horizontal center of the graph.  The voltage at this point is about 12.1V, so if you wanted to be nice to your battery and not discharge it below 50% on a regular basis, you would want to stop discharging it when it got down to 12.1V.

Here's a 20-hour discharge graph from Trojan, for a 6V battery:
[Attached image: 292271-pmst_ev_fig7.jpg]


Here the 50% discharge point is 6.05V, but if you're using two of these in series for a 12V system, again the 50% point during discharge is 12.1V.

Here's a discharge graph from Concorde Battery:
[Attached image: Lifeline-Discharge3.png]

At the 10-hour point of a 20-hour discharge cycle (the 50% point), this one shows a voltage of 12.15V.

I'm starting to see a pattern.  Note that this is all for a 20-hour discharge rate; for faster discharges the voltage will be lower at the 50% discharge point.  You can look at the charts to determine where the voltage will be based on your current consumption.  For example, if you are using a 100 Ah battery at 20 Amps, you would need to look at the 5-hour (or 0.2C) rate curve.  On the Yuasa graph, you would use the 0.2C curve and look at the voltage at the 2.5 hour point, which, reading between the lines, my eyeball says is 11.8V.
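
If it helps, here is that curve-picking arithmetic as a small Python sketch. It only tells you which C-rate curve to look at and which time point corresponds to 50% discharged; the voltage at that point still has to be read off the manufacturer's graph, and real capacity shrinks a bit at higher discharge rates, so treat it as a rough guide.

```python
# Rough arithmetic for choosing which manufacturer discharge curve to read.
# The voltage at the 50% point still has to be read off the graph itself.

def discharge_curve_info(capacity_ah, load_amps):
    c_rate = load_amps / capacity_ah          # e.g. 0.05 for the 20-hour rate
    hours_to_empty = capacity_ah / load_amps  # nominal, ignoring rate effects
    hours_to_50_percent = hours_to_empty / 2
    return c_rate, hours_to_50_percent

# 100 Ah battery with a 5 A load: use the 0.05C curve, read the 10-hour point
print(discharge_curve_info(100, 5))    # (0.05, 10.0)

# 100 Ah battery with a 20 A load: use the 0.2C curve, read the 2.5-hour point
print(discharge_curve_info(100, 20))   # (0.2, 2.5)
```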

I didn't pick-and-choose this data - these were just the three discharge curves I could find that actually came from battery manufacturers.

I also found another chart, which I think is useful regarding how low you should let your batteries get.  It shows the relationship between depth of discharge and the expected lifetime of the battery (from www.mpoweruk.com):
[Attached image: dod.gif]

This shows us that for a typical lead-acid battery that can be expected to deliver around 1000 charge/discharge cycles when discharged 50% at each cycle, discharging 75% (down to around 11.6V) will reduce the lifetime to about 700 cycles.  But that's if you do this consistently - occasionally draining your system down to 11.6V is NOT going to have the drastic effects that some claim.  In fact, according to this chart, even if you drain it all the way down to 10.5V every day, it only cuts the number of charge/discharge cycles by half.  Which if you think about it, means that if you only charge your batteries every other day instead of every day, yes, you only get half as many cycles out of it, but that's the same number of days, so the lifetime in years remains the same!
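
Here is a quick back-of-the-envelope sketch of that "same number of days" point, using the rough cycle-life figures I read off the chart and assuming, just for illustration, a 100 Ah bank with a fixed 50 Ah daily draw.

```python
# Lifetime comparison using the approximate cycle-life figures quoted above
# (about 1000 cycles at 50% DoD, 700 at 75%, 500 at 100%).
# Assumes a fixed daily draw, so deeper cycles simply happen less often.

CYCLE_LIFE = {0.50: 1000, 0.75: 700, 1.00: 500}   # depth of discharge -> cycles

def days_of_service(dod, daily_draw_ah=50, capacity_ah=100):
    days_per_cycle = (dod * capacity_ah) / daily_draw_ah
    return CYCLE_LIFE[dod] * days_per_cycle

for dod in sorted(CYCLE_LIFE):
    print(f"{int(dod * 100)}% DoD: about {days_of_service(dod):.0f} days")
# 50% DoD: about 1000 days
# 75% DoD: about 1050 days
# 100% DoD: about 1000 days
```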

My conclusions from this are:
1) If you want to know when your house battery is down to 50% charge, measure it under a moderate load (when there aren't any heavy loads on it like a running refrigerator or microwave oven) and the magic number is about 12.1V. Look at the graphs if your discharge rate is different.
2) If you want your battery to last a long time, IT DOESN'T REALLY MATTER that much how deep you discharge them (within reason).

The one thing that everybody seems to agree on is that lead acid batteries wear out fastest when they are left discharged for a long time, even partially.  I don't have any manufacturer's curves to back this one up, but all of the manufacturers say to store them fully charged.

According to the lore, a phenomenon called "sulfation" occurs. Throughout the discharge of a lead acid battery, lead sulfate naturally forms on the negative plates.  The more a battery discharges, the more sulfate on the plates.  This is just part of the electrochemical process, and normally the sulfate is reabsorbed in the next recharge cycle, but this lead sulfate crystallizes over time, excluding it from the chemical reaction when the battery is recharged and lowering the overall battery capacity.

This means that there are two very bad things you can do to lead acid batteries: 1) leave them discharged for weeks before recharging, and 2) not fully charge them when you DO charge them.  Both of these leave sulfates on the negative plates for long periods of time, leading to crystallization, leading to decreased capacity.  See this page http://batteryuniversity.com/learn/article/sulfation_and_how_to_prevent_it for an in-depth explanation.  In fact, http://batteryuniversity.com/ is a great resource for many questions about rechargeable batteries of all kinds.  Another good resource is http://www.mpoweruk.com/leadacid.htm.

Enjoy your travels,
Jim
 
With a mid-90's Dodge, even if the engine is not super healthy, in my experience they will still start very easily, requiring very little of the starting battery capacity.  Newer, modern vehicles have much higher parasitic loads, and such a vehicle sitting unused for 3 weeks can deplete the engine battery to the level where it might not have enough juice to start the engine.  To extend the time these newer vehicles can sit unused and still start, manufacturers install larger capacity batteries, which inherently have more cold cranking amps - much higher CCA ratings than are actually required to start the engine.

There are lots of complaints on regular automotive forums of extremely short battery lives on newer vehicles that sit unused often. Most of the population does not know that a lead acid battery needs to stay as close as possible to fully charged to have a respectable lifespan, and most act like the alternator is a magical instant battery charger. So these starting batteries in rarely driven or very short trip driven vehicles are in fact being deep cycled, then chronically undercharged, and they fail at 2 years or less.  Since automotive designers can't really eliminate these parasitic loads on newer vehicles, the best compromise is to install larger capacity batteries so that the vehicle has a better chance of starting after being left at the airport for 3 weeks.  Lots of newer vehicles also have much larger electrical requirements just to run the accessories.  Larger batteries are needed not only to give some buffer for alternator failure, but also to account for lower alternator output when hot and idling at low rpms.  However, the starter motors on modern vehicles largely use gear reduction, and the actual amp draw during engine cranking is still not a huge load.  Look at all these tiny lithium jump starters capable of only 200 or perhaps 400 CCA which can easily crank and start engines over and over.  Engines just do not require all that much to start.

However this is off topic and not really pertinent to the OP's question; I was only replying to Jim's claim that I am wrong.  So I will not bother arguing anymore on this particular topic, and I apologize to the OP and to Bob.  Debating and nitpicking small details is not helping the OP.

There are multiple ways to get the alternator to charge house batteries quickly and effectively (to 80% state of charge or so) and keep the engine battery isolated with the engine off.  A continuous duty solenoid, rated for at least 100 amps continuous and triggered by the blower motor circuit, is simple to wire up and effective.  On Dodges, the blower motor circuit is not live during engine cranking, so any delicate electronics hooked to the house batteries will not be damaged by surges when the starter is disengaged.  There are many ways to trigger a solenoid to parallel the batteries.

There are many different methods to parallel the batteries with the engine running and separate them with the engine off.  There is no one right way. Each has its advantages and disadvantages.  I chose a manual 1/2/Both/Off switch, but I must remember to turn the switch, and also be sure not to switch it to OFF with the engine running, or POOF go the diodes in the alternator.

When the engine is running, and the alternator is producing current and has battery voltage over 12.8 or so, there is NO amp flow between the batteries themselves.  The blower motor circuit on Dodges is live with the key turned to ON, but not when cranked to Start.  So for the short duration that the key is turned to ON before engine cranking, the house battery will be drawing from the engine battery, but as soon as the engine starts the alternator will be feeding both sets of batteries, and the house batteries will take the majority of the current if the engine battery was fully charged.

One can also put an illuminated manual switch inline on the blower motor solenoid trigger circuit and choose when to parallel the batteries at the flip of said switch, if one is worried about the engine battery feeding the house batteries with the engine OFF but the key turned to ON (not Start).  Also, depleted batteries can ask for so much current that, on a cold engine, they present a rather heavy load.  Some sites claim that each 25 amps an alternator produces requires 1 engine HP. Others claim it is much less, but I don't think they are accounting for inefficiencies.

Often I will let my cold engine idle with only the load of the starting battery on the alternator (not including the juice required to run the fuel pump and ignition) before turning my manual switch to 'BOTH', and my ammeter, which reads amps into the batteries, not total alternator amps, will go from about 7 amps right up to 64. The engine note changes as soon as the switch is flipped.  If it is also wet out and my battery is depleted, rpms above 1300 with a cold alternator and depleted battery will cause my single V belt to start slipping and squealing as amperage approaches the 85 range, so often I choose not to feed the depleted battery until the moisture has been burned off and the belt has proper traction.  This is with a single group 31 12v battery rated at 130AH capacity.  A pair of depleted GC batteries can draw even more amps, so this option of choosing when to allow the alternator to feed the house batteries, via an illuminated switch on the solenoid trigger, can be a very desirable option.

I'd recommend moving the interior lights, and the dashboard Ciggy plug receptacle and stereo to the house battery fuse block, so that these devices draw from house battery(s), and not the engine battery.  I personally like to keep the Ignition/engine battery at full charge at all times, with ALL other loads on the house battery.  Some choose other methods.  There is more than one way to skin a cat.

Those considering AGM batteries as house batteries would do well to read this article by Mainesail.

http://forums.sbo.sailboatowners.com/showthread.php?t=124973

In short, he stresses that AGMs don't do well unless there is Solar/wind power to top them off, and that they have Minimum recommended charge rates, benefit from High recharge rates, and protest at slower recharge rates when deeply cycled.

As always it is better, cheaper and easier to use less electricity than to create gobs of it to replenish depleted batteries.  I am guesstimating the OP will be requiring 45 to 65 amp hours each night and will have 115 or so total to use so as to not use more than 50% of the battery capacity.  If 65 amp hours are consumed, then returning 45 of those can be done in an hour of driving, IF the charging circuit is thick (the thicker the better) and engine rpms remain above ~800.  Mine idles hot at 525 rpm; not sure about a 94 with a 3.9 V6.  A clamp-on ammeter here can be invaluable to see what the alternator is actually capable of producing.  Guessing is no fun.
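
If you want to put rough numbers on the driving time, here is a simple sketch. The average charge current is only a guess (that is exactly what the clamp-on ammeter tells you for real), and this only covers the bulk stage; the tail end of the charge is much slower, as noted below.

```python
# Rough driving-time estimate to put back what was used overnight.
# Only valid for the bulk stage; the absorption stage tapers off and takes
# much longer. The average charge current here is a guess, not a measurement.

def driving_hours_needed(ah_used, avg_charge_amps, charge_efficiency=0.90):
    ah_to_return = ah_used / charge_efficiency   # ~10% extra for inefficiency
    return ah_to_return / avg_charge_amps

# 65 Ah used overnight, guessing a 45 A average into the bank
# (thick cabling, rpms kept above ~800):
print(f"{driving_hours_needed(65, 45):.1f} hours")   # about 1.6 hours
```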

I own this one:
http://www.amazon.com/Craftsman-Digital-400A-Clamp-On-Ammeter/dp/B003TXUZDM 

It reads very closely with my Shunted meter.

The other 20 amp hours and the extra 10% or so required to return the batteries to a true full charge would take another 4 hours of driving, and as that is not going to happen in the OP's stated usage, the batteries will NOT be returned to full charge each day.  If they were subjected to 2 weeks of  cycling from 50 to 80% and back to 50%, they would only have about 80% of their original capacity to deliver.  But the OP states that they will have 48 hours to plug in every 5 days.

This ability to not only give the batteries a break from being cycled, but also enough time to reach full charge via a grid powered charging source,  is really Key to keeping them happy, and healthy enough so that they can be ridden hard the 5 other days.  While they might not be brought down to 50% the first night, by night 5 they could very well be brought down below 50% if using the same amount each night but not returning to full charge each day.

There is a somewhat respected battery charger over on RV.net that can do 40 amps, will do a 14.7v absorption voltage, and has the manual option of forcing a 15.7v equalization cycle, which is a forced overcharge that restores a battery to its maximum remaining available capacity and is important to do when heavily cycling flooded batteries. I have no personal experience with this charger, but there are some battery nerds over there who respect its abilities, and it is a good match for a pair of 6v golf cart batteries.  However, it is not capable of powering 12v loads whilst charging the batteries.  It will likely sense the changing loads, suspect a problem, and shut down.

http://www.amazon.com/Black-Decker-...bs_auto_1?ie=UTF8&refRID=00MP37VKANYCF5S16QFS

If the vehicle is to be lived in those 48 hours, then an RV converter is a better option.  The Iota DLS-45 will do 14.8V when the batteries are depleted and the cabling to the batteries is short and thick.  There are lots of reports that these converters will not go into 'boost' mode or will not apply their maximum amp rating when desired.  Usually this is because of too long and too thin wiring between converter and batteries.  With RV converters there are no alligator clamps; one must provide their own cabling, and RV manufacturers notoriously underwire wherever possible for their bottom line.

'Surface charge' is also responsible for many problems when living off battery power.  When the alternator has been feeding the batteries with 20+ amps for an hour or two and then the engine is shut off, even though the batteries are far from fully charged, a voltage reading will show they are up in the 13.4 volt range.

Far too many vandwellers see this 13.4 volts of the surface charge, and declare their batteries fully charged, or even more than fully charged, when in fact the batteries are far from fully charged.  It can take 24 hours or more for the surface charge to dissipate.

Please read the following article, Also by Mainesail.

http://www.pbase.com/mainecruising/battery_state_of_charge

When a vehicle is recently driven and then plugged into the grid, the charging source, either an automatic battery charger or an RV converter, will see this surface charge, assume the batteries are fully charged (when they are NOT), and then only apply enough amperage to hold float voltages. In such cases it is possible that 48 hours at 13.6 volts will NOT fully charge the batteries.  Two months ago I took my group 31 to 50% state of charge, plugged into the grid, set my power supply to 13.6v, and 5 days later the specific gravity revealed it was NOT fully charged.  It required 2 more hours at 14.9v before the SG climbed to 1.280 or higher on all cells.

One must be smarter than "smart" charging sources.  One needs to trick them by applying enough of a load to bring the battery voltage below 12.8v, and only then plug in the charging source.  This tricks the smart charger into bringing the batteries up to the mid 14 volt range, and the charging source will apply its maximum output until it nears this voltage range.  If the cabling between charging source and battery is too thin, the charging source output terminals reach the mid 14s well before the batteries themselves reach that point, and charging is not as fast as it could be, nor as effective as it could be, depending on the time plugged into the grid.

In short, any charging source should be wired short and thick to the battery bank.   It is not just about the ampacity of the wire used, but about making sure the battery voltage is as close as possible to the output terminal voltage of the charging source.  Very few charging sources have separate voltage sense wires, which carry no load, suffer no voltage drop, and allow the charger to compensate for voltage drop on the circuit.

Any and all charging sources should be used whenever possible to keep lead acid batteries as close to fully charged as possible.  Making sure those charging sources can do their job properly is achieved by short lengths of thick copper between charging source and battery.  Overkill is possible, but unlikely, given the price of copper and the wire ampacity charts that mislead people into thinking inadequate wiring is adequate for the task of recharging batteries.
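
To put some numbers on "short and thick", here is a quick voltage drop sketch using standard copper resistance figures (round trip, positive plus negative runs). The gauge values are textbook numbers at room temperature; terminations, fuses and connectors add more drop on top of this.

```python
# Voltage drop over a copper charging circuit, round trip (positive + negative).
# Resistance values are standard ohms-per-1000-ft figures for copper wire;
# real circuits also lose voltage in terminations, fuses and connectors.

OHMS_PER_1000FT = {4: 0.2485, 6: 0.3951, 8: 0.6282, 10: 0.9989, 12: 1.588}

def voltage_drop(awg, one_way_feet, amps):
    round_trip_ft = 2 * one_way_feet
    return amps * OHMS_PER_1000FT[awg] * round_trip_ft / 1000.0

# 40 A of charge current over a 15-foot run, various gauges:
for awg in sorted(OHMS_PER_1000FT):
    print(f"{awg} AWG: {voltage_drop(awg, 15, 40):.2f} V drop")
# 4 AWG drops about 0.3 V; 10 AWG drops about 1.2 V at the same current,
# which is the difference between the battery seeing mid-14s or low-13s.
```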

When batteries are cycled deeply and daily, and their ability to power the required devices for the required time matters to one's health and is not just a convenience, then one needs to take measures to prevent bad things from happening.  The easiest way to make sure any charging source is doing all it can is via large amounts of copper, properly terminated and fused, and the awareness that it is rather difficult and time consuming to actually fully charge a lead acid battery.

With the proper tools, such as a Hydrometer, and an Ammeter along with a voltmeter, one can gain a  lot of experience with how their batteries are responding to their discharging loads, and how well the charging sources are doing to return what was used, along with about 10% more to account for inefficiencies.

Without these tools one is blind, and while it works "just fine" for a while, at some point the batteries will not have enough capacity for the system to work "just fine".  And then panic sets in, and then research, and then one might take the steps required to prevent prematurely killing the next set of batteries.

My intentions on this forum, are to prevent that premature killing of that 'learner' set of batteries and their replacements.  I despise arguing, but not as much as I despise misleading information on a subject with which I am intimately familiar.

One need not strive for absolute perfection regarding the treatment of their batteries - they are, of course, only batteries, and only rented - but one should at least know what ideal perfection is, draw a line in the sand somewhere, and strive to reach that line rather than burying their head in the sand well short of it and demanding company there too.
 
By far the easiest solution is to run a 4 gauge wire from the positive post of the starting battery to the positive post of the house battery. But you are at risk of accidentally draining your starting battery.

The easiest solution to that is to put an on-off switch in the line and turn it off whenever the engine isn't running. But, there is still the risk of accidentally forgetting to turn it on or off leaving you with a dead starting battery or dead house battery.

The best solution is a 150 amp continuous duty solenoid. It will break the connection between the batteries when the engine is off and restore it when the engine is running. Perfect!! You won't be left stranded with a dead battery, and the house battery will be charged every time you start the engine!
Bob

PS (The house battery does need to be grounded, preferably to the frame but you can run a line back to the negative post of the starting battery.)
 
The best location for the house battery ground is at the alternator(-) output, or, if the alternator is grounded through its casing, at an Alternator mounting bolt.

Frame grounds tend to deteriorate quickly if the mating surfaces are not cleaned to bare metal before bolting and then covered with grease to prevent oxygen intrusion. Dissimilar metals - the copper of the ring terminal and the steel of the frame - can also accelerate corrosion at the mating point. House battery frame grounds are usually easiest and reduce the amount of copper bought, but they are prone to issues down the road at some point.
If a frame ground is used for the house bank, and the goal is still maximum alternator contribution to the house battery bank, then the engine battery to engine ground should also be upgraded, or a frame to alternator(-) ground should be added, preferably the latter.


With Dodges, it is wise to add a frame to engine ground anyway. The original battery to firewall ground is problematic, and bad/weak grounds are the culprits in most strange electrical issues that crop up, so making sure they are clean and tight is good preventative maintenance.
 
akrvbob said:
Now, let's put our minds together and find Gypsychic another solution.
Bob

I'm trying to figure out a lot of the same things as Gypsychic, and there is some good info in this thread, but a lot of it is going over my head.  This is what I got out of this thread and reading all (some of) the old threads.

AGM batteries are awesome, but they need to be part of a system, with someone controlling the system who knows what they are doing.  (I was going to buy AGM batteries because the off gassing and maintenance of normal batteries sounded complicated, but after reading the link to the boat guy's post I am definitely not going to buy AGM.)

Short big copper is best.  Clean grounds, or a ground to the alternator.  Solenoid on accessory so you don't drain the starting battery.  If you do it wrong you can POOF the diodes in the alternator.  

Alternator is only going to charge a battery to 80%

And that's about all I know.  I haven't learned the math on how much juice a device uses, so I can't determine how much coach battery I need, but I know I can figure it out.  It's just math.  I can get everything I need on 12v; I was looking at step up transformers, and if a device has a power brick I should be able to make it run on DC.  The questions I still have are about the battery installation: what kind of battery should I use (not size), how the battery should be treated, how the battery should be charged, how not to screw up the chassis electrics, and whether a battery in the cabin will liquefy my lungs with acid gas.

I just need general info, but some of you might think in specifics, so here are the parameters of my van.
OEM alternator on a 4.8L 2006 Express
4 devices that need 19v 3A (buy/build step up)  (samsung tv, monitor, big laptop, tiny laptop)
Light
2 Fan-tastic fans
1000w cheapo inverter because I already own it and never want to use it because I understand conservation of energy and how it doesn't work in the real world with cheapo components that don't have heatsinks.  
1000w Honda generator because I already own it and if I screw up my electrics I don't want to ask for a jump.  
I don't want to do solar.  
I don't want to be a slave to charging, but I don't want to abuse a battery by incorrectly charging it.
I don't want to be an expert.  

Hank
 
Reading my own post I found another question, relating to the fact that I haven't calculated my energy use. If my use per hour is X, and I want 20 hours, should I get a battery that only has 20 hours of capacity at X per hour and run it from fully charged to depleted, or should I get a battery that has 40 hours at X per hour that will take twice as long to charge when depleted? Or 160 hours at X per hour? The bigger the capacity the less likely the battery will get fully charged; a 160-hours-at-X battery would rarely be depleted or charged to full.
 
I just wrote a huge wordy response to your question hank, clicked submit, and it disappeared.

Now I am extremely angry, and tired.

Sorry hank.
 
907KHAM687 said:
Reading my own post I found another question, relating to the fact that I haven't calculated my energy use.  If my use per hour is X, and I want 20 hours, should I get a battery that only has 20 hours of capacity at X per hour and run it from fully charged to depleted, or should I get a battery that has 40 hours at X per hour that will take twice as long to charge when depleted? Or 160 hours at X per hour?  The bigger the capacity the less likely the battery will get fully charged; a 160-hours-at-X battery would rarely be depleted or charged to full.

If on average you are drawing 5 Amps and you want to be able to use it for 20 hours, your total usage is 5 Amps x 20 hours = 100 Amp hours. For every Amp hour you use, you have to replace that from somewhere - either solar panels or an AC-to-DC converter, or the vehicle alternator, or a separate generator. If you have a battery system with a capacity of LESS than 100 Amp hours, you will run out of power. Simple as that. If you have a battery system with a capacity of MORE than 100 Amp hours, you won't run out of power, but your charging system, whatever it is, has to replace that 100 Amp hours, no matter how big the battery is. So if your charging system puts out an average of 50 Amps throughout the charging cycle, it will take somewhat more than two hours to recharge the battery, since 50 Amps x 2 hours is 100 Amp hours. I say "somewhat more than", because batteries are not 100% efficient - you have to put back a little more than you took out. How much more depends on the charging rate - the faster the charge, the less efficient it is, generally.
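
Here is that arithmetic as a small Python sketch, in case it helps to plug in your own numbers. These are ball-park figures; the 10% charging overhead is a rough allowance, not a precise constant.

```python
# Ball-park amp-hour budget and recharge time.
# The 10% charging overhead is a rough allowance for charging inefficiency.

def daily_amp_hours(avg_load_amps, hours_per_day):
    return avg_load_amps * hours_per_day

def recharge_hours(ah_used, avg_charge_amps, overhead=1.10):
    return ah_used * overhead / avg_charge_amps

used = daily_amp_hours(5, 20)       # 5 A average for 20 hours = 100 Ah
print(used)                         # 100
print(recharge_hours(used, 50))     # 2.2 hours at a 50 A average charge rate
```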

This does NOT mean that if you have a 50 Amp alternator in your vehicle, then driving for two hours will fully charge that battery. This is because the ratings on alternators are the MAXIMUM current, not the average over the charging cycle. The average charging current depends on the battery, and all other things being equal, the larger the battery, the more charging current it will draw.

There are two points I want to make sure are clear here:
1) You only have to put back a little more charge than you took out of your batteries, REGARDLESS of how big the batteries are. In fact, the bigger the batteries, like I said, the more charging current they will draw from your charging system. There's really no such thing as too big of a battery, except that too big means heavier and takes up more space.
2) What doesn't seem like a lot of average current adds up over time. 5 Amps is not a lot of current - about 50 Watts of load on an inverter will draw that much, and a standard-sized laptop will use that much power all by itself. And if you use that 50-Watt laptop for one hour, it's using 5 amp hours of battery capacity, so if your charging system can deliver 50 Amps, it will replace that in about six minutes. BUT, if you leave that laptop on (or an equivalent load) for 20 hours, you use 100 Amp hours, and that same charging system will take two hours. It's one thing to drive around for six minutes, it's quite another to have to drive for two hours just to be able to play computer games all day and night.

The key to an economical house electrical system is NOT to use loads that run for 20 hours a day. If you want to use a small microwave oven, yes, you're going to need a hefty inverter, something on the order of 1500 Watts, from most reports. And when that microwave is running, it consumes about 1000 Watts. Because inverters in the real world are about 80% efficient, that will draw (1000 Watts / 12V) / (.8) = 104 Amps. OUCH! That sure sounds like a lot! But I don't know about you, I seldom use my microwave for more than 5 minutes at a time. That's 5/60 = 0.083 hours x 104 Amps = about 9 Amp hours. Your charging system still has to replace that, but suddenly it doesn't sound so bad.

Now compare that with a small 120V compressor type refrigerator. Again you need the beefy inverter because of the high starting current of compressors, so that part doesn't change, but a typical small refrigerator consumes about 40 Watts on average, over the 24 hours a day. Those 40 Watts, using the same formula, are going to draw (40 Watts / 12V) / (.8) = 4.2 Amps. But that's 24 hours a day, so the total drain on your battery is 4.2 Amps x 24 hours = 100 Amp hours! So a relatively small load, if it's continuous, can actually require a bigger battery and bigger charging system than a very large load used only for short periods.
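
Here are those two examples worked in code, using the same rough 80% inverter efficiency and wattage figures as above.

```python
# Amp-hours per day for loads run through an inverter.
# The 80% inverter efficiency and the wattages are rough, typical figures.

def battery_amps(load_watts, inverter_eff=0.80, battery_volts=12.0):
    return (load_watts / battery_volts) / inverter_eff

def daily_amp_hours(load_watts, hours_per_day):
    return battery_amps(load_watts) * hours_per_day

print(battery_amps(1000))            # ~104 A while the microwave is running
print(daily_amp_hours(1000, 5 / 60)) # ~9 Ah for 5 minutes of microwave use
print(daily_amp_hours(40, 24))       # ~100 Ah for a 40 W fridge running 24/7
```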

So the short answer to your question is, there is no penalty for having a larger battery other than the higher initial cost and more space and weight. And a bigger battery will give you more "reserve" power for those days when you don't have a chance to charge it, and it will ACCEPT a charge faster than a small battery from the same charging system. What is more important is that however many Amp hours you use, your charging system has to be capable of replacing them, which to me is the bigger challenge.

I hope this helps.
Jim
 
Battery capacity and charging source should be a good match. It is very possible to have too much battery capacity for ones charging sources.

Recharging at too slow a rate can be detrimental to the battery. AGM batteries especially are negatively affected by not meeting the manufacturer recommended minimum rates when deeply cycled.

Some people like to get huge capacity battery banks and then rely solely on too little solar to feed them. While the solar might be enough to replace the AH consumed, plus a little more to account for inefficiency, when one puts a hydrometer into the cells, one realizes that this too-low recharge rate is not fully charging the battery and that the flashing green lights of the charging sources are lying.

Any recharging is better than no recharging, but meeting the battery manufacturer recommendations should be strived for. So 100 watts of solar to replace 40Ah daily into a pair of GC batteries is possible, but it is far below the manufacturer recommended rate, and a hydrometer will reveal, after many such cycles, wide disparity between the cells of the batteries. The battery capacity will be lost faster than if more charging current were available, even if more is not needed to replace what the person used the night before.
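
To put rough numbers on that, here is a quick sketch. The "10% of capacity" comparison figure is just a commonly cited ballpark, not something from a spec sheet in this thread; check your own battery maker's recommendation for the real minimum charge rate.

```python
# Rough check of solar charge current against battery bank capacity.
# Assumes ~14.4 V charging and ~80% combined panel/controller/wiring losses;
# the 10% of capacity "recommended" figure is only an illustrative placeholder.

def charge_rate(panel_watts, bank_capacity_ah, charge_volts=14.4, losses=0.80):
    peak_amps = panel_watts / charge_volts * losses
    return peak_amps, peak_amps / bank_capacity_ah

amps, fraction = charge_rate(100, 230)   # 100 W panel, pair of ~230 Ah GC-2s
print(f"~{amps:.1f} A peak, {fraction:.1%} of capacity")   # ~5.6 A, 2.4%
# Compare with a commonly cited ~10% of capacity (23 A for a 230 Ah bank).
```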

It is not just about replacing what was used plus a little more, but about coming as close as possible to maxing out the specific gravity each recharge cycle, and too little recharge current feeding too much capacity is detrimental to battery life. There is a penalty. How much of a penalty is hard to say.

Also, while a healthy AGM battery might require only 103% of what was taken from it to fully recharge, an unhealthy flooded battery can require up to 150%. That is certainly not 'just a little more'.

Also not mentioned in this thread is the Peukert effect. A 100 amp hour battery can provide 5 amps of current for 20 hours before battery voltage drops to 10.5v, which is considered 100% discharged. Currents over 5 amps on this 100 AH battery reduce the available capacity of the battery. The higher the current, the less capacity the battery has to give; likewise, at currents under 5 amps the battery has more capacity to give.

All batteries have different Peukert exponents, and good luck getting a figure from most battery manufacturers.

So a 100 amp load on a 100 amp hour battery for an hour is not at all equal to a 5 amp load for 20 hours. Not even close.

Simple arithmetic with regards to loads on lead acid batteries cannot yield accurate real world results unless the load is exactly the load at which the battery was rated. And even then it only applies when the battery still has its full capacity, is healthy, and was fully charged in the first place.
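
For anyone curious what the Peukert effect does to the numbers, here is a small sketch. The exponent k = 1.2 is only an assumed, typical-ish value for a flooded battery; as noted above, real values vary and most manufacturers do not publish them.

```python
# Peukert's law: t = H * (C / (I * H)) ** k
# C = rated capacity (Ah) at the H-hour rate, I = actual load in amps,
# k = Peukert exponent (assumed 1.2 here purely for illustration).

def runtime_hours(capacity_ah, rated_hours, load_amps, k=1.2):
    return rated_hours * (capacity_ah / (load_amps * rated_hours)) ** k

for amps in (2.5, 5, 20, 100):
    t = runtime_hours(100, 20, amps)  # 100 Ah battery rated at the 20-hour rate
    print(f"{amps:5.1f} A load: {t:4.1f} h runtime, {amps * t:5.1f} Ah delivered")
#   2.5 A load: 45.9 h runtime, 114.9 Ah delivered
#   5.0 A load: 20.0 h runtime, 100.0 Ah delivered
#  20.0 A load:  3.8 h runtime,  75.8 Ah delivered
# 100.0 A load:  0.5 h runtime,  54.9 Ah delivered
```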
 
one thing I would like to point out: Jensen makes 12v TVs.  these TVs are severe duty, made to take the vibrations, dust, and moisture of vehicles.  mockturtle probably has one; that's because Tiger uses them in their 4x4 motor homes.  no inverter or step up needed, and you can also use it as a monitor.  then get 12v adapters for the laptops, and it's basically plug and play.  highdesertranger
 
SternWake said:
It is not just about replacing what was used plus a little more, but about coming as close as possible to maxing out the specific gravity each recharge cycle, and too little recharge current feeding too much capacity is detrimental to battery life.  There is a penalty.  How much of a penalty is hard to say.
...
All batteries have different Peukert exponents, and good luck getting a figure from most battery manufacturers.
So a 100 amp load on a 100 amp hour battery for an hour is not at all equal to a 5 amp load for 20 hours.  Not even close.

Simple arithmetic with regards to loads on lead acid batteries cannot yield accurate real world results unless the load is exactly the load at which the battery was rated.  And even then it only applies when the battery still has its full capacity, is healthy, and was fully charged in the first place.

I gave simple arithmetic examples to illustrate the basic principles using ball-park calculations. I would not expect that somebody is going to measure how many amp hours they suck out of their battery and stop charging as soon as they've put exactly that many back into it, since there's no practical way of doing that anyway. The previous poster (907KHAM687) appeared to be under the impression that given the same daily load, it would take more current to recharge large batteries than small ones. I was giving simple examples to show that this is not the case.

Do you have any PRACTICAL means of calculating these things that you say can't be calculated simply? Or do you have some way of measuring the specific gravity in sealed batteries? Do you have a way of determining how big a battery is too big? Because if not, then I don't see how this is useful. I'm trying to explain these things in the "Batteries 101" sense, and you are bringing up the "Special Considerations for Batteries 400" level exceptions.

The idea here is to give answers that people can use, even if the answers are not 100.000% precise, not to simply say that it's all too complex and therefore you can't really know what to do.

Jim
 
I don't understand the question. It seems like you have it figured out pretty well.

1) Charge a house battery off the starting battery with a solenoid in-between
2) Minimize your power use and make everything 12 volt if possible
3) Charge with the Honda if you run out.

What exactly is your question?
Bob

PS Either a 100 watt or 200 watt Renogy kit would make the most sense to me: quiet, trouble-free power that never costs anything after you buy and install it. No down-sides.
 
highdesertranger said:
one thing I would like to point out: Jensen makes 12v TVs.  these TVs are severe duty, made to take the vibrations, dust, and moisture of vehicles.  mockturtle probably has one; that's because Tiger uses them in their 4x4 motor homes.  no inverter or step up needed, and you can also use it as a monitor.  then get 12v adapters for the laptops, and it's basically plug and play.  highdesertranger

Good call HDR.

I have a 12v 13.3 inch LED backlit flat screen tv with built in DVD player that I keep on a 4 foot swing arm with a double elbow to position most anywhere for viewing comfortably.

Unfortunately, just recently, I realized the DVD portion no longer works, but the TV portion is fine.  Whereas it would draw about 1.5 amps playing a DVD(0.9 to 1.1 amps just playing TV) , I had to play the DVD on my laptop which required 3.8 amps to do this same task.

I have forgotten in the past to properly secure the TV when driving, and I believe this is perhaps why the DVD function no longer works. I am trying NOT to open it up to see if I can spot anything obvious as to why it no longer functions, but I do not know if I will succeed in resisting.

This is my TV, I think I've owned it for close to 4 years now, and paid about $160 for it then.

http://www.amazon.com/RCA-DECK13DR-13-3-DVD-Combo/dp/B005GNR9G4/ref=cm_cr_pr_product_top?ie=UTF8

Regarding TVs, over-the-air broadcast stations, and inverters: inverters are electrically noisy and can knock out reception of some weaker stations.

However my Laptop DC to DC converter knocks out one of my strongest TV stations, and I have some LED's which also can cause issues on some TV stations when their signals are marginal.

Bypassing the inverter is a good way to save battery power, if 12v options are available and one does not already own the 120V versions.
 
akrvbob said:
I don't understand the question. It seems like you have it figured out pretty well.

1) Charge a house battery off the starting battery with a solenoid in-between
2) Minimize your power use and make everything 12 volt if possible
3) Charge with the Honda if you run out.

What exactly is your question?
Bob

Question is how not to fry my battery. And what the best practices for charging a battery are so that it doesn't prematurely fail.

Apparently the 12v output on the Honda is unregulated and will continue to charge a fully charged battery, so it is better to hook up a battery charger to the AC output rather than use the 12v output on the generator.

Hank
 