Is good enough good enough?

OK – I screwed up.  I missed implementing 2 features in the original board design, and now I’m paying for it trying to figure out how to hack the boards into doing what I need.

The first missing feature was just stupid on my part:  The tester MUST monitor total battery voltage so it knows when to turn off the discharge relay.  How could I possibly have missed that?

The second one is scope creep:  I put a 0.1 ohm current sense resistor in the discharge path so I could put a voltmeter across it to measure the actual discharge current non-disruptively.  But it didn’t occur to me that I could read that voltage from the Arduino (doh!).  Not essential, but sure would be nice to have.

I’m willing to spend one of my 16 channels to monitor the total battery voltage.  Since each channel watches one 1.2V cell, that limits the device to 18V packs instead of 19.2V, but I guess I can afford that.  But if I spend another channel on the current sense, I cut 18V packs out of my capabilities, and that’s really not acceptable.

I can’t just use a couple more analog inputs on the Arduino, since all Arduino A/D measurements are with respect to ground, and the bottom of the battery is usually not connected to ground.  (If I had measured cells that way, the 10-bit resolution of the A/Ds spread across a 25V range would put single-cell voltage measurements at the very ragged edge of the resolution I’d like.  The boards are laid out to use two 16:1 analog switches: one connects the A/D input to the top of a cell, and the other connects ground to the bottom of that same cell.  I get an order of magnitude better resolution that way.)

I was thinking about adding a third analog switch chip to pick those 2 measurements, but that’s a hassle on several grounds.

So the latest thinking is:  Permanently reallocate one channel to the voltage monitor.  Put in a physical DPDT switch to use the next highest channel (on each of the 2 analog switches) for either monitoring that last cell of an 18V pack OR measuring discharge current.  Most of the packs will be less than 18V, so most of the time I get my automated current measuring.  When I need to do an 18V pack, I’m back to manually hooking a voltmeter up to read the current.  (And I can no longer get automated per-sample current readings for best accuracy of cell capacity.)  But I can live with that.

The downside is that I’m at the ragged edge of keeping track of all the hacks I need to make to the boards to implement this.  The two boards are very similar, but the analog switch inputs are off by one between them:  input 0 of one switch goes to the bottom of a cell, while input 0 of the other goes to the bottom of the next cell (which is the top of the first).  That just makes for more brain overload keeping track of which hack wire goes where.

The latest worry is whether I should simplify all the way back to using a single analog switch.  That would dump the second board and make the hacks much simpler.  The bottom of the battery would be always tied to ground.

Life would be much simpler, but I’m back to the resolution problem:  Since max battery voltage is ~25V (for an overcharged 19.2V pack), I have to scale the input voltage to 25V full scale.  The 10 bits give me about 0.025V per step.  By subtracting the absolute voltage at one cell top from the absolute voltage at the next cell top, I get the voltage of one cell.  The range of voltages of interest for one cell is ~1.3V-1.0V.  That’s about 12 A/D steps covering the whole range.  Considering the usual +/- 1 LSB, I have maybe 10 steps to cover hot-off-the-charger to end-of-service.  Is that enough?  Well, I suppose so, but it’s pretty coarse.

So the real question is:  Is (barely) “good enough” good enough to let me do the major hack of simplifying the design back to a single board?  Or is the challenge of making the more elegant design work worth the extra hassle?  Ugh.

This entry was posted in Discharge tester - PCB build.
