Determination of Antimony in Lead-Antimony Alloys by Atomic Absorption Spectroscopy Using Indium as an Internal Standard

Author: Kramer, Gary W.

Source: Applied Spectroscopy, Volume 33, Issue 5 (September/October 1979), pp. 468-470

Publisher: Society for Applied Spectroscopy

Abstract:

A rapid, precise analytical method has been developed by the United States Department of the Interior, Bureau of Mines, for the determination of antimony in lead-antimony alloys. The alloy (0.10 to 30.0 wt.% Sb) is rapidly dissolved in a 10 vol.% nitric acid solution containing ~5 g of tartaric acid. Antimony is determined by atomic absorption using indium as an internal standard. The relative standard deviation for this internal standard method, based on the analysis of three National Bureau of Standards (NBS) Standard Reference Materials, is approximately 0.50%, an improvement by a factor of 5 to 8 over conventional atomic absorption and a volumetric method. The internal standard procedure gives values in better agreement with the NBS certified values than analyses by the alternative methods. The analytical time required is less than 30 min.
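As a rough illustration of the internal-standard calculation implied by the abstract (not part of the paper itself), the sketch below ratios a hypothetical Sb absorbance to the In absorbance, fits a linear calibration to the ratios of standards, and reports the relative standard deviation of replicate results. All concentrations and absorbance values are assumptions for illustration only.

    # Minimal sketch of an internal-standard AAS calculation (hypothetical data).
    import statistics

    def fit_line(x, y):
        """Least-squares slope and intercept for a simple linear calibration."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                 / sum((xi - mx) ** 2 for xi in x))
        return slope, my - slope * mx

    # Calibration standards: wt.% Sb vs. Sb/In absorbance ratio (hypothetical values).
    std_conc = [0.10, 1.0, 5.0, 10.0, 30.0]
    std_ratio = [0.004, 0.041, 0.205, 0.410, 1.230]
    slope, intercept = fit_line(std_conc, std_ratio)

    def sb_wt_percent(abs_sb, abs_in):
        """Convert a sample's Sb/In absorbance ratio to wt.% Sb via the calibration."""
        return ((abs_sb / abs_in) - intercept) / slope

    # Replicate sample readings (hypothetical) and their relative standard deviation.
    replicates = [sb_wt_percent(a_sb, a_in)
                  for a_sb, a_in in [(0.412, 1.001), (0.409, 0.997), (0.414, 1.003)]]
    mean_sb = statistics.mean(replicates)
    rsd_pct = 100 * statistics.stdev(replicates) / mean_sb
    print(f"Sb = {mean_sb:.2f} wt.%, RSD = {rsd_pct:.2f}%")

Ratioing to the internal standard cancels much of the drift in nebulization and flame conditions, which is the likely source of the precision gain the abstract reports over conventional atomic absorption.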

Keywords: Analysis, for antimony; Atomic absorption, internal standard; Methods, analytical

Document Type: Research Article

DOI: http://dx.doi.org/10.1366/0003702794925228

Affiliations: Bureau of Mines, Avondale Metallurgy Research Center, Avondale, Maryland 20782

Publication date: September 1, 1979
