Abstract: |
A theoretical model was developed to describe the loss of analyte atoms in graphite furnaces during atomization. The model was based on two functions, one describing the supply of analyte by vaporization and another describing the removal of analyte by diffusion. Variation in working pressure was shown to affect the competition between these two processes. Optimal atomization efficiency was predicted to occur at a pressure where the supply of analyte was maximized and gas-phase interactions between the analyte and the matrix were minimized. Experiments to test the model included the direct determination of phosphorus and tellurium in nickel alloys and of cobalt in glass. In all cases, a reduction in working pressure from atmospheric pressure to 7 Pa decreased sensitivity by two orders of magnitude but improved the temporal peak shape. For the atomization of tellurium directly from a solid nickel alloy, and for the atomization of cobalt from an aqueous solution, no change in sensitivity was observed as the working pressure was reduced from atmospheric pressure to approximately 70 kPa. Had a reduction in working pressure affected only the diffusion of the analyte, poorer sensitivity would have been obtained; only a commensurate increase in analyte vaporization could account for the maintained sensitivity at lower working pressures. Overall, analyte vaporization was not dramatically improved at reduced working pressures, and maximum atomization efficiency was found to occur near atmospheric pressure.
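A minimal numerical sketch of the two-function model may help make the pressure dependence concrete. The sketch below assumes a Gaussian vaporization (supply) pulse and a first-order diffusional loss whose residence time scales linearly with pressure (reflecting a diffusion coefficient proportional to 1/P); all parameter values and function names are illustrative assumptions, not values or code from the study. With the supply held fixed, it reproduces the diffusion-only prediction that lowering the pressure shortens the residence time and reduces the peak signal.

```python
import numpy as np

# All parameter values below are illustrative assumptions, not from the study.
P_ATM = 101_325.0   # atmospheric pressure, Pa
TAU_ATM = 0.5       # assumed diffusional residence time at 1 atm, s

def supply(t, t0=1.0, width=0.3):
    """Gaussian vaporization (supply) pulse, arbitrary units."""
    return np.exp(-((t - t0) / width) ** 2)

def simulate(pressure, dt=1e-3, t_end=5.0):
    """Integrate dN/dt = S(t) - N / tau(P) by implicit Euler.

    Diffusional loss is taken as first order, with a residence time tau
    that scales linearly with pressure (D ~ 1/P).
    """
    tau = TAU_ATM * pressure / P_ATM
    t = np.arange(0.0, t_end, dt)
    n = np.zeros_like(t)
    for i in range(1, len(t)):
        # Implicit step keeps the integration stable even for tiny tau.
        n[i] = (n[i - 1] + supply(t[i]) * dt) / (1.0 + dt / tau)
    return t, n

for p in (P_ATM, 0.7 * P_ATM, 7.0):  # 1 atm, ~70 kPa, 7 Pa
    t, n = simulate(p)
    print(f"P = {p:9.1f} Pa  peak signal = {n.max():.3e}")
```

In this diffusion-only picture the peak signal falls roughly in proportion to pressure, so the reported observation that sensitivity was unchanged down to about 70 kPa implies that the supply term itself must grow as pressure drops; that scenario could be explored in the sketch by letting the pulse amplitude in supply() depend on pressure.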