## What most ADCs have in common

SAR and Sigma-Delta ADCs measure the input voltage in very different ways, but they both have one thing in common:

**It takes time to complete a reading.**

If the input voltage changes between the instant a reading starts and the instant it ends, it hurts the accuracy of the reading.

The SAR architecture is most sensitive to that kind of problem because it makes its largest decisions first. A worst-case example would be a signal that starts just below VREF/2 when the SAR sets its most significant bit, then rises above VREF/2. The SAR's output will be 01111 regardless of how far above VREF/2 the input goes, because the DAC's output can never cross VREF/2 after the MSB has been set to 0.
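That stuck-MSB behavior is easy to demonstrate with a small simulation. This is a sketch, not any vendor's implementation: `sample(step)` is a hypothetical stand-in for whatever the comparator sees during each bit trial, and the 5-bit resolution and 5 V reference are assumed for illustration.

```python
from typing import Callable

def sar_convert(sample: Callable[[int], float], vref: float, bits: int = 5) -> str:
    """Successive-approximation conversion where the input is allowed
    to move between bit trials."""
    code = 0
    for step in range(bits):
        trial = code | (1 << (bits - 1 - step))  # tentatively set the next bit
        dac = trial * vref / (1 << bits)         # DAC voltage for that trial code
        if sample(step) >= dac:
            code = trial                         # input above DAC: keep the bit
    return format(code, f'0{bits}b')

# Worst case from the text: just below VREF/2 while the MSB is decided,
# then well above VREF/2 for every later bit trial.
rising = lambda step: 2.4 if step == 0 else 3.5  # volts seen at each trial
print(sar_convert(rising, vref=5.0))             # -> 01111
```

Once the MSB comes out 0, every remaining trial compares against DAC codes below VREF/2, so the best the converter can do is set all the lower bits: 01111.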

Sigma-Delta ADCs are better at tracking a moving signal, but are sensitive to changes near the end of the counting cycle. The worst-case example there is to have the summing capacitor still trying to reach the input voltage when the counting cycle ends.

ADC designers are painfully aware of those problems, and have found a solution that makes life better for both kinds of ADC:

## The sample-and-hold buffer

A sample-and-hold circuit uses the same 'store voltage in a capacitor' trick as a Sigma-Delta ADC. The usual S&H circuit has a capacitor, a set of analog switches, and a voltage buffer:

The sample-and-hold process starts with S1 closed and the other two switches open. The capacitor charges to known voltage V_{ref}.

The next step is to open S1 and close S2, letting the capacitor charge to V_{in}.

A capacitor's voltage can't change instantly, and in most cases the rate of change is controlled by the size of the capacitor and the resistance of the signal source charging it. The product of those two values is called the **RC time constant**, and has the units of seconds. The capacitor's voltage gets 63% closer to Vin every RC time constant.
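The exponential charging curve behind that 63%-per-time-constant rule can be sketched in a few lines. The 1 kΩ and 10 pF values here are assumptions for illustration, chosen to match the figures used later in the text.

```python
import math

def cap_voltage(v_start: float, v_target: float, t: float, r: float, c: float) -> float:
    """Capacitor voltage after charging toward v_target for time t
    through resistance r: the standard RC exponential."""
    tau = r * c
    return v_target + (v_start - v_target) * math.exp(-t / tau)

r, c = 1e3, 10e-12                  # assumed: 1 kOhm source, 10 pF sampling cap
tau = r * c                         # 10 ns per time constant
v = cap_voltage(0.0, 1.0, tau, r, c)
print(f"after one time constant: {v:.3f} of the way to Vin")  # ~0.632
```

Each additional time constant closes 63% of whatever gap remains, which is why the gap shrinks by a fixed *factor* (e ≈ 2.72) per τ rather than by a fixed amount.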

For an ADC to work accurately, the sample-and-hold circuit needs to let the capacitor charge to within 1LSB of Vin. Here's a table showing the minimum number of RC time constants necessary to bring the capacitor voltage within 1LSB of Vin for various ADC resolutions:

| Bits | 1LSB (fraction of full scale) | RC time constants |
|------|-------------------------------|-------------------|
| 8    | 0.0039                        | 5.5               |
| 10   | 0.00097                       | 6.9               |
| 12   | 0.00024                       | 8.3               |
| 16   | 0.000015                      | 11                |
| 24   | 0.000000059                   | 16.6              |
| 32   | 0.00000000023                 | 22                |

To get the capacitor voltage to within the recommended 1/6th of 1LSB, you'd need to add another 1.7 RC time constants to the values above.
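The table's entries fall out of the charging equation directly: the remaining error after t/τ time constants is e^(-t/τ), so settling to within 2^-N of full scale takes N·ln(2) time constants, and settling to 1/6 of that adds ln(6) ≈ 1.79 more. A quick check:

```python
import math

for bits in (8, 10, 12, 16, 24, 32):
    lsb = 2.0 ** -bits                  # 1LSB as a fraction of full scale
    taus = bits * math.log(2)           # solve e^(-t/tau) = 2^-bits for t/tau
    taus_sixth = taus + math.log(6)     # settle to 1/6 LSB instead
    print(f"{bits:2d} bits: 1LSB = {lsb:.1e}, "
          f"{taus:4.1f} taus ({taus_sixth:4.1f} for 1/6 LSB)")
```

Each extra bit of resolution costs another ln(2) ≈ 0.69 time constants of settling time, which is why 32-bit settling takes four times as long as 8-bit settling.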

Sampling capacitors are small (usually around 10pF), so with 1kΩ of resistance from the signal source, one RC time constant is only 10 nanoseconds. Even so, the capacitor needs to stay connected to V_{in} long enough to charge properly.
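Putting the time-constant counts together with those example component values (10pF and 1kΩ, as above) gives concrete minimum sampling times:

```python
import math

r, c = 1e3, 10e-12                   # example values from the text
tau = r * c                          # 10 ns per time constant

for bits in (12, 16, 24):
    # settle to 1/6 LSB: bits*ln(2) time constants, plus ln(6) more
    t = (bits * math.log(2) + math.log(6)) * tau
    print(f"{bits} bits: keep S2 closed for at least {t * 1e9:.0f} ns")
```

Even at 24 bits the required window is under 200 ns with this source impedance, but a higher-impedance source scales those times up proportionally.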

The capacitor also needs to start from V_{ref} every time. If it starts at a different voltage (like the voltage of the previous sample), there will be some carry-over from one sample to the next. The rule for precision work is to try to do things the same way every time.

Once the capacitor voltage is sufficiently close to V_{in}, the sample-and-hold circuit can open S2 and close S3, connecting the capacitor to the input of a voltage buffer. The buffer protects the sampling capacitor from any disturbances caused by the ADC itself.

Ideally the capacitor voltage would stay constant until the sample-and-hold circuit opened S3 and closed S1, taking the capacitor voltage back to V_{ref} again. Unfortunately, all circuits have some leakage current, and it doesn't take much to create significant errors in a 10pF capacitor. The ADC's designers have to make sure the voltage remains stable long enough for the ADC to collect its reading.
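A back-of-the-envelope calculation shows how little leakage it takes. The 1nA leakage figure below is an assumption for illustration (real values vary widely with temperature and process); the droop rate follows from dV/dt = I/C.

```python
c = 10e-12            # 10 pF hold capacitor
i_leak = 1e-9         # assumed 1 nA total leakage current
droop = i_leak / c    # droop rate in volts per second

vref, bits = 5.0, 16  # example: 16-bit ADC on a 5 V scale
lsb = vref / 2 ** bits
t_1lsb = lsb / droop  # time until the held voltage has drooped by 1LSB
print(f"droop rate: {droop:.0f} uV/us, 1LSB lost in {t_1lsb * 1e6:.2f} us")
```

At 100 µV/µs of droop, a 16-bit reading is off by a full LSB in under a microsecond, so the conversion has to finish well inside that window or the hold capacitor has to be buffered and guarded far more carefully.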