When I tell the software that my samples are coated with 0.6 nm of iridium instead of the default 20 nm of carbon, is the software able to handle this? I have noticed that the numbers are exactly the same (down to the third decimal place) whether I process the data with the C or the Ir coating. This seems a little suspect to me... but if the standards and the unknown have the same coating, then maybe it is real? Thoughts or ideas?
Quote from: pgopon on September 22, 2015, 09:10:46 AM
When I tell the software that my samples are coated with 0.6 nm of iridium instead of the default 20 nm of carbon, is the software able to handle this? I have noticed that the numbers are exactly the same (down to the third decimal place) whether I process the data with the C or the Ir coating. This seems a little suspect to me... but if the standards and the unknown have the same coating, then maybe it is real? Thoughts or ideas?
The coatings you specify in the Standard menu (for standards) and the Calculation Options dialog (for unknowns) are there for documentation only, until you turn on the analysis options as described here:
http://smf.probesoftware.com/index.php?topic=23.msg1258#msg1258
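For a rough sense of why the coating should matter once that option is enabled: the emitted x-rays are attenuated on their way out through the coating, approximately as

I / I_0 = exp[ -(mu/rho)_coat * rho_coat * t / sin(psi) ]

where t is the coating thickness, rho_coat its density, (mu/rho)_coat the coating's mass absorption coefficient for the x-ray line of interest, and psi the take-off angle. That is just the generic textbook absorption term, not necessarily the software's exact implementation, and the option also covers electron transmission (energy loss) in the coating. Since iridium's density and absorption are very different from carbon's, 0.6 nm of Ir and 20 nm of C should not give identical answers once the correction is actually applied.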
Wingo!! Thanks John, I didn't have the "use conductive coating for absorption/transmission" option checked in Analysis Options. Works now.
phil