Gypsychic:
It surprises me greatly to see so many expert answers here, since it is not possible to answer your question without some additional information!
If you want the short answer, go straight to the last paragraph.
There are three things that must be known in order to calculate how many hours (or what fraction of an hour) you must drive to keep your house battery charged:
1) The capacity of the house battery,
2) The average load on the house battery, and
3) The average current your alternator can deliver to the house battery.
One thing I can say definitively without any of this information, though, is that you are without question better off using a separate battery to run your house electrical devices (lighting, appliances, charging of battery-powered devices) than you are trying to power these off your vehicle battery. The main reason is that while it may be inconvenient to run out of power for your living needs, this inconvenience gets radically worse when running down the battery also takes away your ability to start the vehicle engine. Having done this the wrong way before, I can state from personal experience that there are plenty of electrical devices (laptop computers, lighting, 120V inverters) that will happily continue operating and draining your battery after your battery's charge is down to a level where the engine won't start in the morning, and every time this happens it reduces the life expectancy of your vehicle battery. And then you have to get it jump-started, which is both inconvenient and embarrassing.
Using a separate battery and either a solid-state isolator (which I don't recommend, because it reduces the voltage to BOTH batteries), a switch, a solenoid, a relay, or even an 18-gauge wire with cigar lighter plugs on both ends means that no matter how badly you treat your house battery, it will never leave you stranded, as long as you ensure that the two electrical systems are disconnected from each other when you shut off your engine. And as an added bonus, if you should happen to leave your headlights on and run the vehicle battery down, you have the house battery to jump-start it from!
Now on to the other things:
1) The capacity of the house battery is the easy one: all of the gel-cell batteries I've used have their capacities printed right on the label, in "Amp hours", or Ah. Note that this is NOT the same as Amps/hour or Amps per hour - there is no such thing as Amps/hour, and as a guy with a degree in electrical engineering and decades of engineering experience, I can back that up if you want to discuss it elsewhere. Saying "Amps per hour" makes no more sense than saying "Miles per hour per day", and I cringe every time I see the term. Most people who say "Amps per hour" really mean "Amp hours per hour", which intuitively just seems like it must be wrong, but is the same thing as "Amps", which is what they SHOULD be saying. I only bring this up because in my experience it causes a GREAT deal of confusion between Amp hours and Amps when people say Amps/hour but really mean Amps.
Okay. The next part is a little more controversial: for most lead-acid batteries, the rule of thumb is not to use more than 50% of the Amp hour rating in any given charge/discharge cycle, but for "deep cycle" batteries you can safely use more than that. How much more depends on the manufacturer's specifications for the battery. In any case, for maximum battery life, you should never let the voltage get below about 12.6V (or for 6V batteries, 6.3V each) when the battery is unloaded.
That's controversial because many devices are set to "protect" your battery by shutting off when they get down to somewhere in the range of 11.5V to 11.8V, at which point you may already be damaging your battery. The reason they set this threshold so low is that when you plug things into a cigar lighter, there is enough voltage drop between the battery and the device that 11.8V at the device MAY indicate 12.3V at the battery. So they set the cutoff at 11.8V when they really want the circuit to cut off when the battery gets down to 12.3V. Or more, or less, depending on what length and gauge of wire connects the cigar lighter to the battery. If they set the limit to 12.6V, it would cut off the current way before your actual battery voltage was that low, and people's satisfaction with 12V devices drops off pretty abruptly if they shut off too soon. (Off grid: this is probably why you were able to start your engine even though the measured voltage was 11.8V.)
So to be on the safe side, let's just assume you want to keep your house battery more than half charged at all times. Since you said you're using 6V gel-cells, I'm assuming you're using two of them in series, in which case the total Amp hours is the same as the Amp hours for either battery. Batteries wired in series should have the same Amp hour capacity.
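If it helps, here's that capacity arithmetic as a little Python sketch. The 200 Ah rating is a made-up number for illustration; substitute whatever is printed on your labels:

```python
# Usable house-battery capacity, assuming two 6V gel-cells in series.
# The 200 Ah rating here is hypothetical - use the number on your label.
capacity_ah_each = 200.0   # Ah rating of each 6V battery (assumed)

# Series wiring adds voltage, not capacity: two 6V x 200 Ah batteries
# in series make one 12V x 200 Ah bank.
total_capacity_ah = capacity_ah_each

# Rule of thumb from above: use no more than 50% per cycle.
usable_ah = total_capacity_ah * 0.5
print(f"Usable capacity per cycle: {usable_ah:.0f} Ah")  # -> 100 Ah
```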
2) Average load (current consumption) from the house battery. It sounds like you've already done the work for most of your devices, but I want to make one correction: when you say that 0.34A @ 120V is 3.4A @ 12V, you are assuming that your inverter is 100% efficient. At best, inverters are about 90% efficient, and that's when they're running near their full rated continuous power. At lower power levels, typical efficiency is closer to 80%, so it takes about 25% more current on the 12V side, which works out to about 4.25A @ 12V. It seems like a small thing, but these small things add up.
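Here's that correction worked out in Python, using the figures from the paragraph above:

```python
# Converting an AC load to actual 12V battery draw through an inverter.
ac_amps = 0.34               # load measured on the 120V side
ac_volts = 120.0
dc_volts = 12.0
inverter_efficiency = 0.80   # typical at light load; ~0.90 near full rating

ac_watts = ac_amps * ac_volts               # 40.8 W drawn by the device
dc_watts = ac_watts / inverter_efficiency   # 51.0 W pulled from the battery
dc_amps = dc_watts / dc_volts
print(f"Actual 12V draw: {dc_amps:.2f} A")  # -> 4.25 A, not 3.4 A
```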
3) Average current from the alternator into the house battery. This one is tricky, because knowing the current rating of your alternator will not help at all. Why not? Because that's the maximum current the alternator will provide rather than the typical value, and because there are plenty of devices on the vehicle already that consume much of it, such as lights, ignition, fuel pump, heater fan, and charging of the vehicle battery. Also, if the alternator delivered its full current all the time, it would boil your vehicle's battery dry in a matter of days, so the output is reduced as the batteries charge; the charging current depends more on the battery's charge level than it does on the alternator's rating. The only time you'll ever see the full current coming out of an alternator is right after starting the engine, when the vehicle battery is at its lowest charge level.
The only way to know the charging current is to buy, borrow, or steal an ammeter (a device that measures DC Amps) and put it in the circuit between your alternator and your house battery. Do this with the house battery partially discharged, like at a starting voltage of around 13.0V. When you start your engine and rev it a bit, say to 1500 RPM or higher, the charging current will start out relatively high, then taper off slowly and settle to a steady current, and that's your average charging current.
Off grid is correct - you can charge through an 18-gauge wire between cigar lighter plugs and still get a full charge. BUT, SternWake is also correct - this will only work if the devices connected to the house battery aren't consuming much current at all, because if they are, it will take more time to charge the house battery than you will be willing to drive. Using small wire like that restricts the current that can go from your alternator to the house battery, and as SW says, this causes a voltage drop. BUT, as the house battery charges up, it accepts less and less current, the voltage drop decreases, and as the battery (slowly) approaches a full charge, it rises to the full alternator voltage, so you still get a full charge. It just takes longer. This is according to the laws of physics, SW, and I got pretty good grades in physics. A good thing to remember about electrical sources and loads: the source primarily determines the voltage, and the load primarily determines the current.
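To make that small-wire taper concrete, here's a toy Ohm's-law model in Python. Both inputs are assumptions for illustration (a 14.4V regulated alternator, and roughly 0.13 ohms for a 20-foot round trip of 18-gauge wire), and it ignores the battery's internal resistance, but it shows the shape of the curve:

```python
# Toy model: charging current through a small wire tapers as the battery
# voltage rises toward the alternator voltage. Numbers are assumptions.
ALTERNATOR_V = 14.4   # typical regulated alternator output (assumed)
WIRE_OHMS = 0.13      # ~20 ft round trip of 18-gauge wire (assumed)

for battery_v in (12.4, 13.0, 13.6, 14.0, 14.3):
    # Ohm's law: the wire only sees the voltage difference across it.
    amps = (ALTERNATOR_V - battery_v) / WIRE_OHMS
    print(f"battery at {battery_v:.1f} V -> charge current ~{amps:.1f} A")

# Current falls from ~15 A to under 1 A as the battery fills: slower than
# a fat cable, but the battery still ends up at full alternator voltage.
```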
Once you have all three of these figures, then it's a simple matter: you can calculate how many hours you can use your house battery with all of your devices by taking the total capacity of the battery (in Amp hours) divided by 2 (because remember, you only want to use half of that capacity) and dividing that by the total average consumption (in Amps). The answer will be in hours, so if you want to know how many days, just divide that by 24. If you want your house battery to live a long life, you should not use it longer than this between full recharges. To know how long it will take the vehicle's alternator to fully recharge the house battery from that half charge, divide the amount of charge you need, which again is half the house battery capacity, by the average current the alternator supplies to the house battery (again in Amps). Again that is in hours, so if you want minutes, multiply by 60. If you DO mess up and discharge the house battery more than that, you must extend the charging time by a corresponding amount to get back to full charge.
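Here's the whole calculation in one Python sketch, with made-up example numbers (swap in your own three figures):

```python
# Discharge time and recharge time for the house battery.
# All three inputs below are hypothetical examples.
capacity_ah = 200.0      # house battery capacity from the label (assumed)
avg_load_amps = 4.25     # total average consumption (assumed)
avg_charge_amps = 25.0   # measured average charging current (assumed)

usable_ah = capacity_ah / 2   # only use half the capacity per cycle

runtime_hours = usable_ah / avg_load_amps
recharge_hours = usable_ah / avg_charge_amps
print(f"Run time:  {runtime_hours:.1f} h ({runtime_hours / 24:.2f} days)")
print(f"Recharge:  {recharge_hours:.1f} h ({recharge_hours * 60:.0f} min)")
```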
There's also a somewhat quicker way to do this that may be more practical at times, such as determining how long you need to drive to recharge on a particular day: if you divide the average current consumption from the house battery by the average charging current to the house battery, you will get a number that indicates the ratio of discharge time to charging time. For example, if your average current consumption is 5 Amps and the average charging current is 50 Amps, this ratio is 1:10, which means that for every 10 hours you discharge the battery, you need to charge it for 1 hour. But remember that this is just an example - your results will be different, based on YOUR average charging and consumption currents.
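And the shortcut in Python, using the 5A/50A example from the paragraph above:

```python
# Ratio of charging time to discharging time.
avg_load_amps = 5.0      # example consumption from above
avg_charge_amps = 50.0   # example charging current from above

ratio = avg_load_amps / avg_charge_amps   # 0.1, i.e. 1:10
hours_of_use = 10.0
hours_of_driving = hours_of_use * ratio
print(f"{hours_of_use:.0f} h of use needs ~{hours_of_driving:.0f} h driving")
```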
Just keep in mind that because all of these average currents can vary in actual operation, these calculations only give you an approximate answer, so you should treat the charging time as a minimum and the discharging time as a maximum. Also, most alternators produce very little current when the engine is idling (this is a design feature, to reduce the fuel consumption at idle), so driving time does not include time spent waiting for the engine to warm up, waiting at traffic lights, or any of the "stop" time in stop-and-go traffic.
So that's the correct answer - that you need to take some measurements and do the calculations before you will know the true answer. The rest of what I'm seeing in this thread is all guesswork. So in the spirit of the thread, I'll throw in a little guesswork of my own: if you're driving 500 miles/month at a guessed 25 miles/hr average (i.e., driving around town), that's 20 hours/month, or about 0.67 hours/day, which SHOULD be enough to fully charge your house batteries unless they're huge and you're using a wimpy little wire like Off grid's, but no promises from me until the charging current measurement is taken.
And if you're plugged into the grid 48 hours/week, it matters whether that means two days in a row, like every weekend, so it goes five days between charges, OR, for example, home every Tuesday and Friday, so it never goes more than three full days without being on a charger. What's significant isn't the number of hours plugged in, but the maximum time between plug-ins. Here I'm guessing that your charger has enough output current to fully charge the house battery overnight, so there's no additional benefit to having it plugged in two days in a row.
And just to be TOTALLY guessing, I would say that driving 500 miles/month AND plugging in two nights per week should be enough to keep your batteries happy. But since we're just guessing here, buy a cheap digital voltmeter and make sure your house batteries never go below 6.3V each when everything is turned off - that is, engine off, house battery disconnected from the vehicle battery, and inverter and other high-current loads shut off.
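For completeness, that back-of-the-envelope driving estimate in Python (both inputs are guesses, as stated above):

```python
# Back-of-the-envelope: monthly miles to daily driving hours.
miles_per_month = 500.0   # from the original question
avg_speed_mph = 25.0      # my guess for around-town driving

hours_per_month = miles_per_month / avg_speed_mph   # 20 h/month
hours_per_day = hours_per_month / 30.0              # ~0.67 h/day
print(f"~{hours_per_month:.0f} h/month, or ~{hours_per_day:.2f} h/day")
```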
Thanks for getting all the way to the end of such a long, painfully detailed, and sometimes nit-picky reply.
Jim