As you point out, it either works or doesn't work with digital solutions. When you are integrating a truly analog instrument like a pH probe into some form of readout, it is nice to be precise. But it still requires frequent calibration in order to assure that reading. I contend that employing an analog meter with an analog probe doesn't really cost you anything in terms of accuracy, since both have to be calibrated prior to use.
Flow meter, density meter, scale, thermometer, gravimeter, sound pressure level meter, flue gas analyzer, spectrophotometer, barometer/altimeter, ORP meter, conductivity meter, magnetometer, radiation detector, light meter etc. are all the same. An analogue signal from a sensor is digitized, and standards (of pressure, temperature, pH, conductivity, weight, CO content, SPL, absorption etc.) are used to develop the parameters of a calibration curve in accordance with a model. When we had no alternative but to do these things in analogue circuitry we did. We are no longer so limited and can thus
1) Use more sophisticated models
2) Simplify the calibration process (from the operator's POV; the actual calibrations and parameter-estimate computations are more complex, but the digital implementation permits this while placing less burden on the operator).
The advantage of digital is that an additional source of potential error or drift is REDUCED, but not eliminated.
It is, for all practical purposes, eliminated. The gain of the temperature compensation amplifier may change with time and backlash may interfere with precise setting of the temperature compensation potentiometer (and slope and offset pots as well) but when you multiply by 1.0234 in a microprocessor and that number has been calculated from accurate temperature measurements you always have a temperature adjustment of 1.0234. Yes, that number is quantized but quantized well beyond the point it needs to be in order to have no significant effect on the performance of the instrument.
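As an aside on where a multiplier like that can come from (this is my illustration; the 1.0234 above is just an example number): the Nernst slope is proportional to absolute temperature, so automatic temperature compensation reduces to multiplying by a ratio of absolute temperatures, computed rather than dialed in on a pot.

```python
# Sketch: ATC as a pure ratio of absolute temperatures (Nernst slope ~ T).
# Reference temperature assumed to be 25 degC, as is conventional.

T_REF_C = 25.0

def atc_factor(temp_c):
    """Ratio of the Nernst slope at temp_c to the slope at 25 degC."""
    return (273.15 + temp_c) / (273.15 + T_REF_C)

print(round(atc_factor(32.0), 4))  # -> 1.0235 for a 32 degC sample
```

However many times you apply it, the computed factor is exactly the same number; there is no pot to drift or stick.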
An additional advantage is that you can program all sorts of compensations and checks to help improve accuracy.
Not only do they improve accuracy but they make the calibration process much easier, quicker and more robust. You do not have to read the temperature of the first buffer, set the ATC gain pot for that temperature, look up the pH of the buffer at that temperature, adjust the offset pot to read that pH, measure the temperature of the second buffer, adjust the ATC gain pot, adjust the slope pot, read the sample temperature, readjust the ATC pot (or alternatively have both buffers and sample in a water bath) and finally read the pH. It is all done automatically. You don't even have to pay attention to which buffer you are in. This allows you to focus more on the task at hand - measuring the pH of your wort or beer.
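The "don't have to pay attention to which buffer you are in" part might be implemented along these lines (a hedged sketch, with invented buffer set and ideal slope; real firmware will differ): the meter guesses the buffer from the raw mV reading itself.

```python
# Sketch of automatic buffer recognition: pick the standard buffer whose
# nominal pH best explains the electrode's mV reading.

IDEAL_SLOPE = -59.16          # mV per pH unit at 25 degC (Nernst ideal)
STANDARD_BUFFERS = [4.01, 7.00, 10.01]

def recognize_buffer(mV, offset_mV=0.0):
    """Return the nominal pH of the standard buffer nearest the reading."""
    apparent_ph = 7.0 + (mV - offset_mV) / IDEAL_SLOPE
    return min(STANDARD_BUFFERS, key=lambda b: abs(b - apparent_ph))

print(recognize_buffer(175.0))   # -> 4.01
print(recognize_buffer(-175.0))  # -> 10.01
```

Since electrode offsets and slopes don't wander far from ideal, a nearest-buffer guess like this is unambiguous in practice, which is what lets the operator dunk the probe in any buffer in any order.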
But in the case of some digital pH meters, those checks and balances can force you to make that hard decision to replace a probe earlier than truly necessary. When you have freshly calibrated your meter and probe in the range of interest, you can get by with SOME level of accuracy. The question is...how accurate is your meter when the probe is starting to go south?
That's a valid question whether the implementation be digital or analogue. In either case it is easy enough to verify with a quick cal check or if in doubt a full stability test.
In some ways, having the meter force you into replacing an out of spec probe is a good thing.
It is for the manufacturers.
But if the criteria used by the meter to assess probe life is too stringent, you are going to be buying a lot of probes. In the case of those proprietary probes, the manufacturer has you right where they want you...buying more replacement parts.
Not until you want to. You can keep going past the point where the meter's firmware won't give you a cal because the slope is below 95% or the offset greater than X mV. You may want to start looking at response time and stability at this point to see if maybe you should replace that electrode, but it is easy enough to continue to calibrate the meter based on its mV (or, if those are not available, pH) readings in two buffers. If you have made all the preparations for brewing today (let's make it Christmas Day so there is absolutely no chance of getting a new electrode), go to calibrate, and your meter says "Sorry, no cal, slope < 95%", you will definitely want to do this, as the choice is between the inconvenience of the manual cal and no pH measurements. All brewers should have the manual cal procedure in a handy spreadsheet (sounds like a good side-calculator for Brun Water) for this eventuality. I actually do this as a matter of course in my lab work. I haven't calibrated that meter in years. I don't mean that I don't measure buffers. I just do the pH calculation based on comparison of buffer and sample mV readings (and their temperatures). This is actually not a bad way to go if you have, for example, a Hanna pHEP meter which, while it is quite stable, doesn't calibrate properly because it takes the cal readings too soon.
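The manual-cal arithmetic described above fits in a few lines. This is a minimal sketch of my own (not the actual spreadsheet), assuming you have already looked up each buffer's pH at its measured temperature from the manufacturer's table; all numbers in the example are illustrative.

```python
# Manual two-buffer cal: interpolate the sample's mV reading against the
# mV readings taken in two buffers of known (temperature-corrected) pH.

def manual_ph(mv_sample, mv_b1, ph_b1, mv_b2, ph_b2):
    """pH of the sample by linear interpolation between two buffer points."""
    slope = (mv_b2 - mv_b1) / (ph_b2 - ph_b1)   # electrode slope, mV per pH
    return ph_b1 + (mv_sample - mv_b1) / slope

# Buffers read pH 4.00 and 7.01 at their bath temperatures (from the table);
# the electrode gave 172 mV and -5 mV in them, and 80 mV in the sample:
print(round(manual_ph(80.0, 172.0, 4.00, -5.0, 7.01), 2))  # -> 5.56
```

Note that this works no matter how far the slope has fallen below the firmware's 95% cutoff; the electrode may be slow and noisy at that point, but the calculation itself doesn't care.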
While you do still have to assess the stability and accuracy of your probe and its ability to function properly, I believe it is still valid to employ analog meters to measure pH.
Yes, of course. If you have an analogue meter that passes a cal check and/or stability check it is fine.
I don't believe that digital meters are going to definitely provide higher accuracy or stability when both instruments are freshly calibrated.
Stability is not at issue right at calibration. It is what happens after calibration that is important. The digital implementation is going to be more stable than the analogue because digital circuits don't drift while analogue ones do. In engineering the goal is to get 'noise' down to at least 10 dB below the signal. The noise limit in pH measurement is the buffer tolerance (±0.02 pH in most cases) and we therefore want drift to be less than ±0.006 pH. We want the electronics noise to be 10 dB less than that, i.e. ±0.002 pH, so that the electrode can be the limiting factor in the equipment-related part of the error budget. This is easily done in a digital implementation. If the electrode is the limiting factor then it determines how often we must recalibrate. Not the electronics. A more sophisticated argument in favor of an analogue approach would be that the contribution of an analogue circuit to the error budget is not appreciably greater, with modern techniques, than that of a digital one. But I think the big advantage for the digital implementation is the reduction in demand on the user and the robustness of the calibration process. It is clear from the marketplace that high performance meters are less expensive to produce using digital means and that this is what consumers desire.
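For anyone checking the arithmetic in that error budget: 10 dB on an amplitude is a factor of sqrt(10) ≈ 3.16, which is where the ±0.006 and ±0.002 figures come from.

```python
# The 10 dB error-budget steps, taken as amplitude ratios
# (dB = 20*log10(ratio), so 10 dB corresponds to sqrt(10) ~ 3.16).

import math

buffer_tol = 0.02                                   # pH, typical buffer tolerance
drift_budget = buffer_tol / math.sqrt(10)           # 10 dB below -> ~0.006 pH
electronics_budget = drift_budget / math.sqrt(10)   # 10 dB lower still -> ~0.002 pH

print(round(drift_budget, 3), round(electronics_budget, 3))  # -> 0.006 0.002
```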
I am reminded of the days (40 years ago) of the arguments that digitally recorded music couldn't sound as good as a vinyl record.