Proposal of Analytical Work

Background:

Analysis of WTC dust, focusing on red/gray chips and their properties, composition and reaction products.

Proposal:

Sample Preparation:

- Red/gray chip separation using optical microscopy and magnetic attraction to assist in isolation of particles of interest.

- Optical images of collected particulates, taken at appropriate magnifications, to record their as-collected condition.

Sample Analysis:

- SEM/EDX with elemental quantification of red/gray chips, both red and gray layers.

- FTIR analysis of organic components of red/gray chips, both red and gray layers.

- ESCA small-spot technique with argon-ion sputtering for depth profiling, to definitively establish the presence of elemental aluminum within the red layer of the red/gray chips. Scans of the gray layer will also be taken to add to the information base.

- DSC analysis of red/gray chips, focusing on exothermic/endothermic reactions near 400 degrees C. Some chips will be scanned in an inert atmosphere and some in air or an oxygen-containing gas stream.

- SEM/EDX with elemental quantification of residual products of DSC analysis of red/gray chips.

- Optical images of reaction products after DSC experiments.
 

Analytical Costs:

The following work needs funding and is to be run at independent facilities.

- DSC costs are $190 per scan, and an estimated 5 to 20 scans are desired to examine the following materials in both air and inert atmospheres:

2 samples each of known WTC primer paint

2 samples each of red chips of suspected primer from WTC dust

5 samples each of red/gray chips, or the red layer only from red/gray chips

- ESCA costs are $330 per hour, and a total of 4 to 8 hours is desired. This should allow evaluation of at least two known thermitic red/gray chips, with some sputtering for depth-profile information as well.
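
For reference, the per-unit costs above imply the following funding envelope. This is a minimal Python sketch of the arithmetic only, assuming straight minimum/maximum multiplication with no facility setup, shipping or handling fees:

# Rough funding range implied by the per-unit costs above (assumption:
# simple min/max multiplication; no setup, shipping or handling fees).
dsc_cost_per_scan, dsc_scans = 190, (5, 20)
esca_cost_per_hour, esca_hours = 330, (4, 8)

dsc_range = tuple(dsc_cost_per_scan * n for n in dsc_scans)      # $950 to $3800
esca_range = tuple(esca_cost_per_hour * h for h in esca_hours)   # $1320 to $2640
total_low = dsc_range[0] + esca_range[0]
total_high = dsc_range[1] + esca_range[1]
print(f"DSC:   ${dsc_range[0]} - ${dsc_range[1]}")
print(f"ESCA:  ${esca_range[0]} - ${esca_range[1]}")
print(f"Total: ${total_low} - ${total_high}")                    # $2270 to $6440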


Appendix I

Analytical Techniques Information

SEM/EDX- this acronym means scanning electron microscopy (SEM) with energy-dispersive X-ray spectroscopy (EDX).

From Wikipedia with a bit of editing to correct some typos and to tailor the description to the SEM.

The technique known as energy-dispersive X-ray spectroscopy (EDS or EDX or EDAX) is an analytical technique used for the elemental analysis or chemical characterization of a sample. It relies on the interaction between a source of X-ray excitation and the sample. Its characterization capabilities are due in large part to the fundamental principle that each element has a unique atomic structure, so the X-rays characteristic of one element can be distinguished from those of another.

To stimulate the emission of characteristic X-rays from a specimen, a high-energy beam of charged particles such as electrons or protons (see PIXE), or a beam of X-rays, is focused into the sample being studied. In the SEM/EDX system the charged particles are high-energy electrons, with energies of 20-30 keV. At rest, an atom within the sample contains ground-state (or unexcited) electrons in discrete energy levels or electron shells bound to the nucleus. The incident beam may excite an electron in an inner shell, ejecting it from the shell and leaving an electron hole where the electron was. An electron from an outer, higher-energy shell then fills the hole, and the difference in energy between the higher-energy shell and the lower-energy shell may be released in the form of an X-ray. The number and energy of the X-rays emitted from a specimen can be measured by an energy-dispersive spectrometer. Because the energies of the X-rays are characteristic of the difference in energy between the two shells, and of the atomic structure of the element from which they were emitted, this allows the elemental composition of the specimen to be measured.
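
To make the shell-energy picture concrete, here is a minimal Python sketch (not part of the proposal) that estimates characteristic K-alpha X-ray energies from approximate tabulated electron binding energies; the values are illustrative, not measurements from WTC samples:

# Approximate electron binding energies in eV (illustrative tabulated values)
shells = {
    "Al": {"K": 1560.0, "L": 73.0},   # aluminum
    "Fe": {"K": 7112.0, "L": 708.0},  # iron
}

# A K-alpha X-ray is emitted when an L-shell electron fills a K-shell hole,
# so its energy is roughly the difference between the two binding energies.
for element, e in shells.items():
    print(f"{element} K-alpha ~ {e['K'] - e['L']:.0f} eV")
# Al K-alpha ~ 1487 eV and Fe K-alpha ~ 6404 eV; these are the characteristic
# peak positions an EDX spectrum uses to identify the elements present.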

The Wikipedia link for more information is

http://en.wikipedia.org/wiki/SEM-EDX


FTIR- this acronym means Fourier transform infrared spectroscopy.

From Wikipedia with a bit of editing to correct some typos and to tailor the description to our interests.

FTIR is a particular form of the more general technique of infrared spectroscopy (IR spectroscopy), which is the spectroscopy that deals with the infrared region of the electromagnetic spectrum, that is, light with a longer wavelength and lower frequency than visible light. It covers a range of techniques, mostly based on absorption spectroscopy. As with all spectroscopic techniques, it can be used to identify and study chemicals. A common laboratory instrument that uses this technique is a Fourier transform infrared (FTIR) spectrometer.

The infrared portion of the electromagnetic spectrum is usually divided into three regions: the near-, mid- and far-infrared, named for their relation to the visible spectrum. The higher-energy near-IR, approximately 14000–4000 cm−1 (0.8–2.5 μm wavelength), can excite overtone or harmonic vibrations. The mid-infrared, approximately 4000–400 cm−1 (2.5–25 μm), may be used to study the fundamental vibrations and associated rotational-vibrational structure. The far-infrared, approximately 400–10 cm−1 (25–1000 μm), lying adjacent to the microwave region, has low energy and may be used for rotational spectroscopy. The names and classifications of these subregions are conventions, and are only loosely based on the relative molecular or electromagnetic properties.
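
Wavenumber (cm−1) and wavelength (micrometers) are reciprocal quantities, which is how the paired ranges above are related. A minimal Python sketch of the conversion:

# wavelength in micrometers = 10000 / wavenumber in cm^-1
def wavenumber_to_micrometers(nu_cm):
    return 1.0e4 / nu_cm

for nu in (14000, 4000, 400, 10):
    print(f"{nu:>6} cm^-1  ->  {wavenumber_to_micrometers(nu):g} um")
# 4000 cm^-1 -> 2.5 um and 400 cm^-1 -> 25 um, reproducing the approximate
# near-, mid- and far-infrared boundaries quoted above.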

Infrared spectroscopy exploits the fact that molecules absorb specific frequencies that are characteristic of their structure. These absorptions are resonant frequencies, i.e. the frequency of the absorbed radiation matches the frequency of the bond or group that vibrates. The energies are determined by the shape of the molecular potential energy surfaces, the masses of the atoms, and the associated vibronic coupling.

Fourier transform infrared spectroscopy (FTIR) is a technique which is used to obtain an infrared spectrum of absorption, emission, photoconductivity or Raman scattering of a solid, liquid or gas. An FTIR spectrometer simultaneously collects spectral data over a wide spectral range. This confers a significant advantage over a dispersive spectrometer, which measures intensity over a narrow range of wavelengths at a time. FTIR has made dispersive infrared spectrometers all but obsolete (except sometimes in the near infrared), opening up new applications of infrared spectroscopy.

The term Fourier transform infrared spectroscopy originates from the fact that a Fourier transform (a mathematical algorithm) is required to convert the raw data into the actual spectrum.


The goal of any absorption spectroscopy (FTIR, ultraviolet-visible ("UV-Vis") spectroscopy, etc.) is to measure how well a sample absorbs light at each wavelength. The most straightforward way to do this, the "dispersive spectroscopy" technique, is to shine a monochromatic light beam at a sample, measure how much of the light is absorbed, and repeat for each different wavelength. (This is how UV-Vis spectrometers work, for example.)

Fourier transform spectroscopy is a less intuitive way to obtain the same information. Rather than shining a monochromatic beam of light at the sample, this technique shines a beam containing many different frequencies of light at once, and measures how much of that beam is absorbed by the sample. Next, the beam is modified to contain a different combination of frequencies, giving a second data point. This process is repeated many times. Afterwards, a computer takes all these data and works backwards to infer what the absorption is at each wavelength.

The beam described above is generated by starting with a broadband light source—one containing the full spectrum of wavelengths to be measured. The light shines into a certain configuration of mirrors, called a Michelson interferometer, that allows some wavelengths to pass through but blocks others (due to wave interference). The beam is modified for each new data point by moving one of the mirrors; this changes the set of wavelengths that pass through.

As mentioned, computer processing is required to turn the raw data (light absorption for each mirror position) into the desired result (light absorption for each wavelength). The processing required turns out to be a common algorithm called the Fourier transform (hence the name "Fourier transform spectroscopy"). The raw data is sometimes called an "interferogram."
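
As a concrete illustration of that last step, the following is a minimal numerical sketch (using NumPy, with a synthetic two-line source rather than real instrument data) of how a Fourier transform turns an interferogram into a spectrum:

import numpy as np

# Optical path difference axis: 4096 points at 1 micrometer (1e-4 cm) steps
N, d_delta = 4096, 1.0e-4
delta = np.arange(N) * d_delta                       # cm

# Synthetic source made of two spectral lines (wavenumber in cm^-1: intensity)
lines = {1600.0: 1.0, 2900.0: 0.5}
interferogram = sum(a * np.cos(2 * np.pi * nu * delta) for nu, a in lines.items())

# The Fourier transform of the interferogram gives intensity vs wavenumber
spectrum = np.abs(np.fft.rfft(interferogram))
wavenumber = np.fft.rfftfreq(N, d=d_delta)           # cm^-1 axis

# The two input lines reappear as peaks near 1600 and 2900 cm^-1
for nu in lines:
    i = np.argmin(np.abs(wavenumber - nu))
    print(f"peak near {wavenumber[i]:.0f} cm^-1, relative height {spectrum[i]:.0f}")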

Links to the full articles for those wanting more information.

http://en.wikipedia.org/wiki/FTIR

http://en.wikipedia.org/wiki/Infrared_spectroscopy


ESCA- this acronym means electron spectroscopy for chemical analysis, an older name for X-ray photoelectron spectroscopy (XPS).

From Wikipedia with a bit of editing to correct some typos and to tailor the description to our interests.

X-ray photoelectron spectroscopy (XPS) is a quantitative spectroscopic technique that measures the elemental composition, empirical formula, chemical state and electronic state of the elements that exist within a material. XPS spectra are obtained by irradiating a material with a beam of X-rays while simultaneously measuring the kinetic energy and number of electrons that escape from the top 1 to 10 nm of the material being analyzed. XPS requires ultra high vacuum (UHV) conditions.
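
The measured kinetic energies are converted to binding energies, which is what identifies each element and its chemical state. A minimal Python sketch of that conversion, using an Al K-alpha source and an illustrative (hypothetical) peak position rather than real data:

# binding energy = photon energy - measured kinetic energy - work function
photon_energy = 1486.7    # eV, Al K-alpha X-ray source commonly used in XPS
work_function = 4.5       # eV, assumed spectrometer work function
kinetic_energy = 1409.4   # eV, hypothetical measured photoelectron peak

binding_energy = photon_energy - kinetic_energy - work_function
print(f"binding energy ~ {binding_energy:.1f} eV")
# ~72.8 eV is near the Al 2p line of metallic aluminum; an oxidized surface
# (Al2O3) shifts this peak a few eV higher, which is how XPS distinguishes
# chemical states of the same element.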

XPS is a surface chemical analysis technique that can be used to analyze the surface chemistry of a material in its "as received" state, or after some treatment, for example: fracturing, cutting or scraping in air or UHV to expose the bulk chemistry; ion beam etching to clean off some of the surface contamination; heating to study changes due to heating; exposure to reactive gases or solutions; ion beam implantation; or exposure to ultraviolet light.

Link to the full article for those wanting more information.

http://en.wikipedia.org/wiki/X-ray_photoelectron_spectroscopy
 


DSC- this acronym means differential scanning calorimetry.

From Wikipedia with a bit of editing to correct some typos and to tailor the description to our interests.

Differential scanning calorimetry or DSC is a thermoanalytical technique in which the difference in the amount of heat required to increase the temperature of a sample and reference is measured as a function of temperature. Both the sample and reference are maintained at nearly the same temperature throughout the experiment. Generally, the temperature program for a DSC analysis is designed such that the sample holder temperature increases linearly as a function of time. The reference sample should have a well-defined heat capacity over the range of temperatures to be scanned.

The basic principle underlying this technique is that when the sample undergoes a physical transformation such as a phase transition, more or less heat will need to flow to it than to the reference to maintain both at the same temperature. Whether less or more heat must flow to the sample depends on whether the process is exothermic or endothermic. For example, as a solid sample melts to a liquid it will require more heat flowing to the sample to increase its temperature at the same rate as the reference. This is due to the absorption of heat by the sample as it undergoes the endothermic phase transition from solid to liquid. Likewise, as the sample undergoes exothermic processes (such as crystallization) less heat is required to raise the sample temperature. By observing the difference in heat flow between the sample and reference, differential scanning calorimeters are able to measure the amount of heat absorbed or released during such transitions. DSC may also be used to observe more subtle phase changes, such as glass transitions. It is widely used in industrial settings as a quality control instrument due to its applicability in evaluating sample purity and for studying polymer curing.
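
For the red/gray chips, the quantity of interest from such a scan is the area of the exotherm near 400 degrees C, which gives the energy released per gram of sample. The following is a minimal Python sketch of that peak integration, using a synthetic trace with made-up numbers rather than DSC data from actual samples:

import numpy as np

# Synthetic DSC trace: heating at 10 C/min with an exothermic peak near 400 C
rate = 10.0 / 60.0                                  # heating rate, C per second
T = np.linspace(100.0, 700.0, 6001)                 # temperature axis, C
t = (T - T[0]) / rate                               # corresponding time axis, s

baseline = np.zeros_like(T)                         # flat baseline, W/g
heat_flow = baseline + 2.0 * np.exp(-((T - 400.0) / 15.0) ** 2)   # W/g

# Integrating heat flow above the baseline over time gives the specific
# enthalpy of the reaction in J/g
enthalpy = np.trapz(heat_flow - baseline, t)
print(f"exotherm near {T[np.argmax(heat_flow)]:.0f} C, enthalpy ~ {enthalpy:.0f} J/g")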

Link to the full article for those wanting more information.

http://en.wikipedia.org/wiki/Differential_scanning_calorimetry
 



MarkBasile.org