Friday, June 09, 2006
New calibration standards
Calibration standards have to be made so the mass spec can correlate the intensity of the ion signal to concentration. You can't just turn it on, stick a sample in, and get a result of 50ppb Cd like on CSI. (A mass spec finds atoms of interest in big piles of random atoms.) If you did that, all you would get is a raw result such as 564,000 CPS (counts per second). When I "calibrate" the instrument by analyzing a 25ppb (10,000 CPS), a 50ppb (20,000 CPS), and a 100ppb (40,000 CPS) standard at the beginning of the run, then the instrument knows what 564,000 CPS means. Yes, if you graphed intensity (CPS) vs. concentration (ppb), it would be linear; that's how it works. Of course, 564,000 CPS is obviously over the calibration range in this example, so the sample would have to be diluted and rerun.
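The calibration step above amounts to fitting a line through the standards and using it to convert CPS back into concentration. Here's a minimal sketch in Python using the example numbers from this post (the function names are mine, and a real instrument's software does more than this):

```python
# Hypothetical calibration points from the post: (concentration ppb, intensity CPS)
standards = [(25, 10_000), (50, 20_000), (100, 40_000)]

# Least-squares slope for a line through the origin: CPS = slope * ppb
slope = sum(c * s for c, s in standards) / sum(c * c for c, _ in standards)

def cps_to_ppb(cps):
    """Convert a measured intensity back to concentration using the calibration."""
    return cps / slope

print(slope)                 # 400.0 CPS per ppb
print(cps_to_ppb(564_000))   # 1410.0 ppb -- far above the 100 ppb top standard
```

The 564,000 CPS sample works out to 1410 ppb, well past the highest standard, which is exactly why it would need to be diluted and rerun.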
Back to the calibration standards, or multi-element solutions: these are a pain to make. You take single-element stock bottles off the shelf, such as 10,000ppm Sb, which is some compound like Sb2O3 dissolved in acidified D.I. water. You pull about 20 to 30 of these bottles, including elements like Tl, Pb, Mn, and Ag. The tricky part is dispensing a very small and precise amount of each element into a 250 or 500mL volumetric flask to arrive at a solution where each element is exactly 50ppb. There is a quite simple calculation to find out how much 10,000ppm Sb to add to a 500mL flask to get 50ppb: M1*V1 = M2*V2
(10,000ppm Sb)*(x mL) = (0.050ppm)*(500mL)
x = 0.0025mL or 2.5uL
In this case I would have to make an intermediate solution of, say, 10ppm and then dilute that down to 50ppb, because our pipettes don't go down to 2.5uL.
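The M1*V1 = M2*V2 math, including the two-step intermediate dilution, is easy to script. A small sketch in Python (function and variable names are mine; the stock, target, and flask values come from the example above):

```python
def spike_volume_ml(stock_ppm, target_ppm, final_volume_ml):
    """Volume of stock to add so the final solution hits the target concentration.
    From M1*V1 = M2*V2:  V1 = M2*V2 / M1."""
    return target_ppm * final_volume_ml / stock_ppm

# Direct dilution: 10,000 ppm Sb stock to 50 ppb (0.050 ppm) in a 500 mL flask
v = spike_volume_ml(10_000, 0.050, 500)
print(v * 1000)  # 2.5 microliters -- below what the pipettes can dispense

# Two-step route: make a 10 ppm intermediate, then dilute that to 50 ppb
step1 = spike_volume_ml(10_000, 10, 500)   # 0.5 mL of stock -> 10 ppm intermediate
step2 = spike_volume_ml(10, 0.050, 500)    # 2.5 mL of intermediate -> 50 ppb
```

Both pipetting volumes in the two-step route (0.5 mL and 2.5 mL) are comfortably within range, which is the whole point of making the intermediate.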
Then there are the basic quality control guidelines, and if the solution I made doesn't "pass," it has to be remade. One can see how this gets frustrating. And beyond the calibration standards, I need to make a second-source calibration check, an interference check solution, and all of the spiking solutions used in the lab.