I'm starting a new topic on the history of EPMA instruments and whatever else (software, people, etc.) that we want to talk about.
I'll start with some comments I made in response to Paul's mention of the shaw.dat k-ratio measurements which were apparently made on a MAC EPMA instrument long, long ago (fortunately the glass materials seem to still be available for new measurements).
Quote from: Paul Carpenter on April 28, 2017, 08:42:14 AM
Secondly. These measurements were made either on an ARL or the MAC probe and are subject to discussion regarding the instrumental stability in the case of the ARL (and takeoff angle not directly comparable to all other measurements made at 40 deg), and in the case of the MAC, non-normal beam incidence.
Since the ARL (SEMQ) was known for its relatively high 52.5 deg takeoff angle (it was my first EPMA instrument when I arrived at UC Berkeley), I'm guessing that the 38.5 degrees in the shaw.dat file means these measurements were done on a MAC probe? I never saw one of these instruments myself, but if I google "MAC EPMA 38.5", the first search result returned is a talk by a guy named Paul Carpenter, who gave a presentation at UofO in 2007 entitled "Electron-Probe Microanalysis: Instrumental Calibration, Standards, Quantitative Analysis, and Problem Systems":
http://epmalab.uoregon.edu/Workshop2/Carpenter_Oregon_Workshop_2007.pdf
It's only 118 slides long, so just a brief overview! But in scanning through it I find results on slide 33 from the "Shaw" dataset measured on a MAC probe with a takeoff angle of 38.5 degrees. I also found a link to a talk by John Fournelle that mentions the ARL EMX and MAC instruments from 1960 on slide 27:
www.geology.wisc.edu/~johnf/g777/ppt/10_Historical_Development.ppt
So I guess the EMX and MAC instruments were both made by ARL prior to their SEMQ model?
john
Then Paul responded:Quote from: Paul Carpenter on May 01, 2017, 03:53:43 PM
Thanks for reminding me about the MAC takeoff angle of 38.5 degrees. ARL did not mfg. the MAC probe. I think it stands for Materials Analysis Corporation.
Cheers,
Paul
Interesting. It makes sense because I seem to remember now that the EMX probe was what ARL manufactured before the SEMQ. I think I even saw one once "in the wild", but can't remember where. Does anyone know what takeoff angle the EMX had?
Oregon may have had a MAC probe at one time. After I arrived here I found a standard mount labeled "MAC standards". I attach a drawing of the unusual layout below.
I'm more familiar with the subsequent history of the ARL SEMQ. ARL was eventually bought by Bausch & Lomb, then split and sold off to Shimadzu and Advanced Microbeam. Shimadzu still sells new EPMA instruments in Japan (and China also). And they all have a 52.5 degree takeoff angle!
john
Then Anette responded:
Quote from: Paul Carpenter on May 01, 2017, 03:53:43 PM
Thanks for reminding me about the MAC takeoff angle of 38.5 degrees. ARL did not mfg. the MAC probe. I think it stands for Materials Analysis Corporation.
Paul
Yes, ARL did not manufacture the MAC probe, but both companies apparently came out of California.
Wittry (2001, M&M) has much more information on the MAC probes. MAC was started in 1960 by Macres, who studied under Ogilvie at MIT: "The competitive pressure forced on the industry by ARL with its high take-off angle, dictated that new instruments also have a high take-off angle. However, ARL's inverted lens design was patented, so the MAC 400 achieved its 38.5-degree take-off angle by inclining the sample. Many of the leaders in the microprobe community strongly objected to the use of an inclined sample, as all the quantitative algorithms were either developed or substantiated using normal electron beam incidence on the specimen."
The University of Minnesota had a MAC probe at one time and I still have one spectrometer. For fun, here is a list of commercial sources for EPMA, coming out of an ASTM booklet from 1972.
Anette
Anette also wrote:
Quote from: Probeman on May 04, 2017, 01:13:26 AM
Does anyone know what takeoff angle the EMX had?
It was also 52.5. I attached the Wittry paper that goes into all the details of the various spectrometer designs very nicely.
Quote from: Probeman on May 04, 2017, 01:13:26 AM
Oregon may have had a MAC probe at one time. After I arrived here I found a standard mount labeled "MAC standards". I attach a drawing of the unusual layout below.
Thank you for that layout! I have the exact same standard block! Except that the Be was killed before my time, and then I unfortunately killed the Mg and Mn metal (and I am still looking for advice on how best to re-polish this mount with its wide range of hardnesses).
Quote from: Probeman on May 04, 2017, 01:13:26 AM
Shimadzu still sells new EPMA instruments in Japan (and China also). And they all have a 52.5 degree takeoff angle!
john
They are also now on the American market. They have a "show room" Shimadzu Lab in Arlington, Texas (http://www.uta.edu/sirt/cefms/equipment/EPMA/EPMA.php) and at least one has also been bought in Brazil (?).
Anette
Hi Anette,
How right you are about polishing a standard block with material hardnesses that range from Al2O3 to Au!
The only person I can think to ask is Tim Teague at UC Berkeley. He can polish anything. He's close to retirement so don't hesitate to contact him!
john
Hi John, could your MAC standard block also be from these guys:
http://www.macstandards.co.uk
Quote from: Karsten Goemann on October 24, 2017, 02:09:47 AM
Hi John, could your MAC standard block also be from these guys:
http://www.macstandards.co.uk
Hi Karsten,
Interesting.
I'll bet you are correct!
john
You learn something new every day!
One of my students asked why the element fluorine has the same root as fluorescence. So we did some wiki searches and found that the word fluorescence comes from the Latin word "fluo" which means "to flow".
https://en.wikipedia.org/wiki/Fluorescence
Now why would that be? Well, it turns out that the word fluorescence originates from the fact that the mineral fluorite was one of the first minerals in which fluorescence was observed! Due to REEs apparently. But why "to flow"? Well, because fluorite was used as an early "flux" material (note the same root!) to remove oxides and lower the melting point when smelting ores and brazing metals.
https://en.wikipedia.org/wiki/Fluorite
And what does this have to do with fluorine? Well, of course, fluorite was the first material from which isolation of fluorine was attempted (dangerous work!).
https://en.wikipedia.org/wiki/Fluorine
I wish I could have said that my high school Latin came in handy but I've forgotten more than I remember!
john
Quote from: John Donovan on October 24, 2017, 10:26:42 AM
Quote from: Karsten Goemann on October 24, 2017, 02:09:47 AM
Hi John, could your MAC standard block also be from these guys:
http://www.macstandards.co.uk
Hi Karsten,
Interesting.
I'll bet you are correct!
john
I don't think so. I have the exact same standard block (elements and layout) and it definitely predates MAC Ltd (founded in 1981 according to their webpage).
To my knowledge, it was a standard block that came with the MAC400 electron microprobe.
When I was hunting for some old papers I found ads for various electron microprobes among the digitized content. Maybe someone else finds them as enjoyable as I do. Mostly from the '60s and '70s.
First comes some ARL....
and some more ARL....
and then some Cameca. If anyone has anything else (JEOL, MAC etc) I would be very interested.
Hi Anette,
These ads are a "blast from the past". Thanks for posting them.
I actually started on an ARL SEMQ at UC Berkeley in the mid 1980s as a mechanical technician. It was an "interesting" instrument to say the least. After a student managed to implode the vacuum chamber (let's call it an explosive depressurization, a long story), the electronics tech and I rebuilt it completely. It was an education in EPMA I can tell you. It was after that when I realized the software needed to be improved. The rest is history as they say.
This isn't an ad, but it's the first Cameca MS85 EPMA (MS = MicroSonde) built in 1956 (see attached below- remember to login to see attachments). We're both the same age...
john
I forgot that I have one ad for JEOL too, unfortunately not for a microprobe but SEM. Still fun.
Quote from: Anette von der Handt on February 15, 2018, 04:26:29 PM
I forgot that I have one ad for JEOL too, unfortunately not for a microprobe but SEM. Still fun.
100 angstroms resolution is 10 nm, so not bad for almost 50 years ago!
This video is described as showing an electron probe magnifier from the 1960s. It might be an SEM though. Can anyone recognize the make/model? What is that rotating turret at the end?
https://www.youtube.com/watch?v=i2oyzt98oc0
Hello,
A snapshot of the early history of electron microprobe manufacturers may be found in Table 18 of Beaman, D.R. & Isasi, J.A. (1972) Electron Beam Microanalysis. ASTM Special Technical Publication 506. This table (attached) lists the various manufacturers, their common acronyms, and then-current U.S. addresses.
A handy figure that shows the various takeoff angles for X-rays in many early instruments can be found in Figure 2 of Smith, D.G.W. & Rucklidge, J.C. (1973) Electron microprobe analysis in the earth sciences. Advances in Geophysics 16, 57-154. I have also attached this file to this post.
Cheers,
Andrew
Further early history: 11 manufacturers (including MAC) are listed in the table given in Wittry, D.B. (1969) Recent Advances in Instrumentation for Microprobe Analysis. In: Möllenstedt G., Gaukler K.H. (eds) Vth International Congress on X-Ray Optics and Microanalysis. Springer, Berlin, Heidelberg. A copy in PDF format of the table is attached.
I'm still enjoying reading "The Disappearing Spoon- and other true tales of madness, love, and the history of the world from the periodic table of the elements"- whew, long title.
There's the story of how Monte Carlo calculations came to be (named). Basically Stanislaw Ulam, who played solitaire card games, started wondering what the chances were of winning any randomly dealt hand. That led to discussions with John von Neumann about such calculations, when they realized that this idea could be applied to all sorts of problems with lots of random variables. After all, quantum mechanics is strictly probabilistic. And computers were just becoming able to handle these types of calculations.
The naming of this calculation method is not completely clear according to the author, but he states that "historically, the science of probabilities has roots in aristocratic casinos, and Ulam liked to brag that he named it in memory of an uncle who often borrowed money to gamble on the "well known generator of random integers (between zero and thirty six) in the Mediterranean principality"".
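Just to make the idea concrete, here is a tiny toy sketch (written in the same Visual Basic dialect as the Probe for EPMA code posted later in this thread) of the Monte Carlo approach: estimate a probability by running many random trials and counting successes. The two-card suit question below is purely illustrative and is not Ulam's actual solitaire problem.

' Toy illustration of the Monte Carlo idea (not Ulam's actual solitaire problem):
' estimate a probability by simulating many random trials and counting successes.
Sub MonteCarloSuitMatch()
    Dim i As Long, hits As Long, trials As Long
    Dim c1 As Integer, c2 As Integer
    trials& = 100000
    Randomize
    For i& = 1 To trials&
        c1% = Int(Rnd * 52)            ' first card, 0..51 (suit = card \ 13)
        Do
            c2% = Int(Rnd * 52)        ' second card, must differ from the first
        Loop While c2% = c1%
        If (c1% \ 13) = (c2% \ 13) Then hits& = hits& + 1
    Next i&
    Debug.Print "Estimated probability of a suit match: "; hits& / trials&   ' exact value is 12/51, about 0.235
End Sub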
Quote from: Probeman on February 14, 2018, 04:27:39 PM
I actually started on an ARL SEMQ at UC Berkeley in the mid 1980s as a mechanical technician. It was an "interesting" instrument to say the least. After a student managed to implode the vacuum chamber (let's call it an explosive depressurization, a long story), the electronics tech and I rebuilt it completely. It was an education in EPMA I can tell you. It was after that when I realized the software needed to be improved. The rest is history as they say.
Looking at the new Shimadzu literature on their EPMA-1720, it looks like the spectrometers accept xrays from above the objective lens. Were the ARL probes like that? It's a very interesting design. Does anybody else do it that way?
Quote from: Jacob on October 14, 2018, 05:48:04 PM
Quote from: Probeman on February 14, 2018, 04:27:39 PM
I actually started on an ARL SEMQ at UC Berkeley in the mid 1980s as a mechanical technician. It was an "interesting" instrument to say the least. After a student managed to implode the vacuum chamber (let's call it an explosive depressurization, a long story), the electronics tech and I rebuilt it completely. It was an education in EPMA I can tell you. It was after that when I realized the software needed to be improved. The rest is history as they say.
Looking at the new Shimadzu literature on their EPMA-1720, it looks like the spectrometers accept xrays from above the objective lens. Were the ARL probes like that? It's a very interesting design. Does anybody else do it that way?
Hi Jacob,
Yes, it is an interesting design. Everyone else seems to have settled on a 40 degree takeoff angle accepting x-rays from underneath the objective lens. The ARL/Shimadzu was designed that way to obtain a very high takeoff angle of 52.5 degrees (if you've ever seen this angle and wondered where it comes from, it's from the ARL/Shimadzu design). One advantage is smaller matrix corrections, for example F Ka in CaF2 at 40 degrees:
SAMPLE: 32767, TOA: 40, ITERATIONS: 0, Z-BAR: 14.64681
ELEMENT ABSCOR FLUCOR ZEDCOR ZAFCOR STP-POW BKS-COR F(x)u Ec Eo/Ec MACs
Ca ka 1.0024 1.0000 1.0134 1.0158 1.0505 .9646 .9606 4.0390 3.7138 153.760
F ka 3.2329 .9998 .9795 3.1661 .9449 1.0366 .2318 .6870 21.8341 6822.03
ELEMENT K-RAW K-VALUE ELEMWT% OXIDWT% ATOMIC% FORMULA KILOVOL
Ca ka .00000 .50536 51.335 ----- 33.333 1.000 15.00
F ka .00000 .15371 48.665 ----- 66.667 2.000 15.00
TOTAL: 100.000 ----- 100.000 3.000
And here at 52.5 degrees:
SAMPLE: 32767, TOA: 52.5, ITERATIONS: 0, Z-BAR: 14.64681
ELEMENT ABSCOR FLUCOR ZEDCOR ZAFCOR STP-POW BKS-COR F(x)u Ec Eo/Ec MACs
Ca ka 1.0019 1.0000 1.0134 1.0153 1.0505 .9646 .9679 4.0390 3.7138 153.760
F ka 2.7944 .9998 .9795 2.7366 .9449 1.0366 .2825 .6870 21.8341 6822.03
ELEMENT K-RAW K-VALUE ELEMWT% OXIDWT% ATOMIC% FORMULA KILOVOL
Ca ka .00000 .50559 51.335 ----- 33.333 1.000 15.00
F ka .00000 .17783 48.665 ----- 66.667 2.000 15.00
TOTAL: 100.000 ----- 100.000 3.000
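To see where the smaller correction comes from, here is a minimal sketch (in the same Visual Basic dialect as the Probe for EPMA source posted later in this thread) of just the geometric part of the absorption term: the parameter chi = (mu/rho)/sin(psi) shrinks as the takeoff angle psi increases, so less of the generated F Ka is absorbed on the way out. This is only the path-length factor, not the full ZAF/phi-rho-z calculation shown above; the MAC value is copied from the output tables.

' Minimal sketch of the absorption path-length factor chi = (mu/rho) / Sin(psi).
' This is NOT the full ZAF calculation above, just the geometric term that makes
' the F Ka absorption correction smaller at a higher takeoff angle.
Sub CompareTakeoffAngles()
    Const DegToRad# = 3.14159265358979 / 180#
    Dim mac As Single, chi40 As Single, chi52 As Single
    mac! = 6822.03                            ' MAC of F Ka in CaF2 (cm^2/g), from the tables above
    chi40! = mac! / Sin(40# * DegToRad#)      ' ~10600 cm^2/g at a 40 degree takeoff
    chi52! = mac! / Sin(52.5 * DegToRad#)     ' ~8600 cm^2/g at 52.5 degrees, roughly 19% smaller
    Debug.Print "chi at 40.0 deg takeoff: "; Format$(chi40!, "0")
    Debug.Print "chi at 52.5 deg takeoff: "; Format$(chi52!, "0")
End Sub

The drop in chi from 40 to 52.5 degrees goes in the same direction as the ABSCOR values above (3.23 vs. 2.79); the exact size of the correction also depends on the depth distribution of X-ray generation, which is what the full calculation handles.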
This originally American designed/made instrument was innovative in several ways, but it also had some significant issues, for example very small Bragg crystals, which were only partially compensated for by using a very small (127 mm) focal circle. But the ability to have six tunable spectrometers is very attractive from an analytical perspective; in fact that is why Probe for EPMA still has the capability of handling six spectrometer instruments!
I'm compiling a list of (dead) people that should be mentioned in the history of EPMA.
I have:
Hans Bethe (for electron energy loss)
J. J. Thomson (for discovery of the electron)
Wilhelm Röntgen (for the discovery of x-rays)
Henry Moseley (for discovery of the atomic number and wavelength relationship)
Raimond Castaing (for invention of EPMA instrument and physical basis of matrix corrections)
Any others come to mind?
john
Quote from: Owen Neill on November 30, 2018, 02:41:38 PM
John - your list is going to be a long one, but for starters:
William Lawrence and William Henry Bragg (Braggs' Law)
Andre Guinier (Guinier XRD cameras and Castaing's PhD supervisor)
August Beer (X-ray attenuation)
Frans Michel Penning (cold-cathode vacuum gauges)
Max Planck (Planck's law)
Ernst Ruska/Max Knoll/Manfred von Ardenne (scanning electron beams)
Walter Schottky (Schottky effect)
Doh! Of course the Bragg father/son duo is sorta obvious. Thanks!
Quote from: Owen Neill on November 30, 2018, 02:41:38 PM
John - your list is going to be a long one, but for starters:
William Lawrence and William Henry Bragg (Braggs' Law)
Andre Guinier (Guinier XRD cameras and Castaing's PhD supervisor)
August Beer (X-ray attenuation)
Frans Michel Penning (cold-cathode vacuum gauges)
Max Planck (Planck's law)
Ernst Ruska/Max Knoll/Manfred von Ardenne (scanning electron beams)
Walter Schottky (Schottky effect)
OK, we added some more birthdays of famous people in EPMA to the "special greeting" list in Probe for EPMA. There are several in December. Maybe you will find out who they are!
I am saddened to receive this notice on the passing of Dr. Robert (Bob) Tracy:
QuoteDear Faculty, Staff, and Students:
With great sadness, I must inform you that Dr. Bob Tracy passed away late last night.
Bob has been a stalwart of our department for many years, having served as both a professor and department head. He was a generous colleague, a connoisseur of wine and monazite, a peerless microprobe-whisperer, and a selfless mentor to students and junior faculty alike. He leaves a void that will be impossible to fill.
At the end, Bob's wife Pat was with him, and he had been visited by several close friends from the department, including Mark and Kristie Caddick, Nancy Ross, and Maddy Schreiber.
While this is a sad day for our department, at an appropriate point in the future we will hold a memorial event that will celebrate Bob's life and his many contributions to the Department of Geosciences, the College of Science, and Virginia Tech.
Steve
____________________________
W. Steven Holbrook
Professor and Head
Department of Geosciences
Virginia Tech
926 West Campus Drive
4044 Derring Hall MC 0420
Blacksburg, VA 24061
wstevenh@vt.edu
http://www.steveholbrook.com
540-231-6521 (voice)
Begin forwarded message:
From: Mark Caddick <caddick@VT.EDU>
Subject: Bob Tracy
Date: January 10, 2019 at 12:09:24 PM CST
To: <GEO-METAMORPHISM@JISCMAIL.AC.UK>
Reply-To: Metamorphic Studies Group <GEO-METAMORPHISM@JISCMAIL.AC.UK>, Mark Caddick <caddick@VT.EDU>
Dear All,
It is with profound sadness that we write to inform you of the loss of our friend and colleague Robert (Bob) Tracy, who died at his home in Blacksburg early on Sunday morning. Bob is survived by his wife of 50 years, Patricia.
Bob was a metamorphic petrologist whose thinking always stemmed from the deepest understanding of mineral structure and chemistry, and from a delight in the beauty of phase equilibria. He made important contributions to our understanding of the measurement and significance of chemical zoning in metamorphic minerals, of metamorphic phase equilibria, of the processes of crustal melting and the mineralogy of residual rocks, of the behavior of sulfur during metamorphism, of microprobe dating of monazite, and of the tectono-metamorphic evolution of New England. More importantly, Bob was a generous colleague, an enthusiastic teacher, and a fiercely loyal friend to the petrologic community.
Bob was born in Washington, D.C., in 1944. He obtained his A.B. degree from Amherst College in 1967 before receiving an M.S. from Brown University in 1970 for work that first introduced him to the Cortlandt Complex of New York State – rocks that he would continue to work on throughout his career. Bob's Ph.D. (1975) was from the University of Massachusetts at Amherst under the supervision of Peter Robinson, focusing on metamorphic reactions and partial melting in pelitic schists of the Quabbin Reservoir Area, MA. He then moved to Harvard as a research fellow, during which time he published influential work with Alan Thompson on anatexis in pelitic rocks and inferring metamorphic histories from chemical zoning in garnet. Bob moved to Yale in 1978 as an assistant and then associate professor and moved to Virginia Tech in 1986 as full professor. Bob was a great departmental citizen, serving as Department Chair from 2005 to 2008 and as Associate Chair from 2012 to 2018, and devoted much time to looking after and nursing along aging electron microprobes from which he was able to extract phenomenal data. He was also extremely active in professional societies, in particular the Geological Society of America, in which he assumed numerous leadership roles.
Despite Bob's substantial and diverse research and service contributions, he will be known to many students because of his co-authorship with Harvey Blatt on the second edition of the textbook "Petrology: Igneous, Sedimentary, and Metamorphic" (published 1985). This landmark textbook was updated with Brent Owens and published in its 3rd edition as Blatt, Tracy and Owens in 2006. It is still in common use today.
Those who knew Bob well, or who have subscribed to this listserv for some time, will be aware of both the depth and the breadth of his knowledge. He was a font of information, a walking encyclopedia of mineralogy, petrology, optical and electron microscopy, sample preparation, wine, French cuisine, European and American political history, fishing, and barbecue (amongst many other topics). Bob liked nothing more than using this information to help people, and students in particular, generally espousing his wisdom with a carefully pointed wit. His generosity in terms of sharing his time, knowledge, ideas, data and opinion is difficult to equal.
He will be greatly missed,
Mark Caddick & Nancy Ross,
Department of Geosciences, Virginia Tech
########################################################################
I think it's Peter Duncumb's birthday today:
(https://smf.probesoftware.com/gallery/395_26_01_19_12_31_59.png)
Here's a photo I found recently that shows the ARL SEMQ that UC Berkeley purchased in the late 1970s at a trade show. Yes, the actual instrument which they got a discount on because it was a "prototype".
(https://smf.probesoftware.com/gallery/395_14_03_21_3_58_36.png)
My advice: never buy anything with a low serial number!
After the instrument had been destroyed by an errant airlock exchange valve decapitating the stage and imploding the diffusion pump, thus spraying Fomblin fluid all over the electron column, light optics and WDS spectrometers, where it then polymerized, forming a hard plastic coating on everything, the electronics tech (George Engeman) and mechanical tech (yours truly) were called in to disassemble, clean (in boiling methylene chloride!), and rebuild the instrument. Here is a photo of myself testing the new instrument still running on the trusty PDP-11 with 64K RAM:
(https://smf.probesoftware.com/gallery/395_14_03_21_4_14_38.png)
The PDP-11 and the (removable) 10 MB disk packs are seen in the lower left.
It was then we realized that we needed a better computer so here is the same mechanical technician after writing new software for the then recently released IBM AT PC with 10 times the memory (640K)!
(https://smf.probesoftware.com/gallery/395_14_03_21_4_15_37.png)
Remember "640K ought to be enough for anybody" sometimes attributed to Bill Gates. The IBM PC can be seen behind the technician.
Also note the monocular light optics, which was a modification, because the original binocular optics produced images of slightly different sizes, which caused most users "wanging" headaches because the brain kept trying to see them as the same size FOV.
Cracking pictures - that ARL looks like how I imagined the future! And that's no technician!
I don't know if anyone has pics, but the old Camebax we had in Bristol was held together by vacuum alone (there was a bet that someone, no names, might knock the 'power off' and the whole column would collapse around them with comedy effect).
I was quite fond of this old banger until I found an old replacement green CRT screen in a box behind it. The screen (the imaging screen) had been made in a factory that catastrophically blew up whilst I was working there. My mate had got me the job when I left school and, luckily for me, I was a bit hungover one morning and had sneaked off to the canteen for a bacon roll rather than checking on a vapour deposition process. What my chum didn't tell me was that he had been blown up two years previously checking the same process... Anyway, that's why, if you needed a replacement CRT screen for a Camebax you couldn't get one for love nor money - I was hungover.
Our Cameca SX50 came with special Sony "overlay" monitors that were originally built for the film industry. They were used to display the number of feet of film remaining, etc. Cameca utilized them for displaying the WDS crystals and mag info.
I was happy to replace these old vacuum tube monitors with flat screens (using a homemade video overlay circuit), but at least the old Sony monitors never blew up on us!
https://smf.probesoftware.com/index.php?topic=173.msg752#msg752
Login to see the attachments which include our video display modifications in order to utilize flat screen monitors for the overlay video.
Who reported the first Earth Science material compositions by EPMA?
Anything older than this one?
STUMPFL, E.F. (1961). Some new platinum-rich minerals identified with the electron microanalyser. Mineralog Mag 32, 833–847
Many thanks
Eric
Quote from: ericwgh on July 22, 2021, 01:55:45 AM
Who reported the first Earth Science material compositions by EPMA?
Anything older than this one?
STUMPFL, E.F. (1961). Some new platinum-rich minerals identified with the electron microanalyser. Mineralog Mag 32, 833–847
Many thanks
Eric
Looking at the Stumpfl article, it references Castaing's 1960 chapter Electron Probe Microanalysis in Advances in Electronics and Electron Physics (https://doi.org/10.1016/S0065-2539(08)60212-7). In that, there's a section on mineralogy that references some early work on earth science topics:
Guillemin, C. and Capitant, M., (1960) Utilisation de la microsonde électronique de Castaing pour des études minéralogiques, a report from the 21st international geological congress
Castaing, R., and Fredriksson, K., Analyses of cosmic spherules with an X-ray microanalyser, Geochim. et Cosmochim. Acta 14, 114 (1958).
https://doi.org/10.1016/0016-7037(58)90099-1
Plus a couple from 1957 that I couldn't access.
Quote from: JonF on July 22, 2021, 04:23:01 AM
Quote from: ericwgh on July 22, 2021, 01:55:45 AM
Who reported the first Earth Science material compositions by EPMA?
Anything older than this one?
STUMPFL, E.F. (1961). Some new platinum-rich minerals identified with the electron microanalyser. Mineralog Mag 32, 833–847
Many thanks
Eric
Looking at the Stumpfl article, it references Castaing's 1960 chapter Electron Probe Microanalysis in Advances in Electronics and Electron Physics (https://doi.org/10.1016/S0065-2539(08)60212-7). In that, there's a section on mineralogy that references some early work on earth science topics:
Guillemin, C. and Capitant, M., (1960) Utilisation de la microsonde électronique de Castaing pour des études minéralogiques, a report from the 21st international geological congress
Castaing, R., and Fredriksson, K., Analyses of cosmic spherules with an X-ray microanalyser, Geochim. et Cosmochim. Acta 14, 114 (1958).
https://doi.org/10.1016/0016-7037(58)90099-1
Plus a couple from 1957 that I couldn't access.
A useful source of some of the early literature is:
B. Banerjee, "Classified Bibliography on Electron Probe X-Ray Microanalysis," in Symposium on Advances in Electron Metallography and Electron Probe Microanalysis, edited by Committee E-4 (West Conshohocken, PA: ASTM International, 1962), p. 207. https://doi.org/10.1520/STP43687S (ISBN 978-0-8031-5971-6).
See "Microprobe Analysis, 3. Applications".
Birks & Brooks (1957) mention analysis of a copper-iron mineral and its inclusions, which is subsequently elaborated upon in Birks et al. (1959):
Birks, L.S. and Brooks, E.J., 1957. Electron Probe X‐Ray Microanalyzer. Review of Scientific Instruments, 28(9), pp.709-712.
https://aip.scitation.org/doi/pdf/10.1063/1.1715982
Birks, L.S., Brooks, E.J., Adler, I. and Milton, C., 1959. Electron probe analysis of minute inclusions of a copper-iron mineral. American Mineralogist: Journal of Earth and Planetary Materials, 44(9-10), pp.974-978.
https://pubs.geoscienceworld.org/msa/ammin/article/44/9-10/974/541549/Electron-probe-analysis-of-minute-inclusions-of-a
So maybe I'm dating myself a bit, but I cut my teeth on an ARL EMX-SM in grad school and got more into it (literally) at USGS. Later, the SEMQ, several JEOL models and an SX50. Everything but a MAC just about. I think the early Shimadzu's were a take-off (ha!) on the ARL EMX, EMX-SM.
PS-I am new to this forum; am finally getting around to running PFE. And my name here may show up as "probe ogre" in deference to a title I once had.
Jim McGee
Quote from: mcgeejj on July 27, 2021, 06:07:06 PM
So maybe I'm dating myself a bit, but I cut my teeth on an ARL EMX-SM in grad school and got more into it (literally) at USGS. Later, the SEMQ, several JEOL models and an SX50. Everything but a MAC just about. I think the early Shimadzu's were a take-off (ha!) on the ARL EMX, EMX-SM.
PS-I am new to this forum; am finally getting around to running PFE. And my name here may show up as "probe ogre" in deference to a title I once had.
Jim McGee
Hi Jim,
Welcome to our EPMA user forum!
Very pleased to hear you are finally getting an opportunity to run Probe for EPMA on a modern EPMA instrument. Just so you know we do offer remote training modules for Probe for EPMA (and EPMA in general) as described in this topic:
https://smf.probesoftware.com/index.php?topic=1297.0
I'll post more on remote training in that topic, but for now you might also want to check out the Shimadzu topic here:
https://smf.probesoftware.com/index.php?topic=1275.0
Obituary for David Joy who passed away about 3 months ago (see attached).
I am writing a paper where the dates of commercial introduction of EPMAs are important. Does anyone know the dates for the ARL SEMQ-II and EMX-SM and the JEOL JXA-50A?
Quote from: rickard on December 01, 2022, 09:12:15 AM
I am writing a paper where the dates of commercial introduction of EPMAs are important. Does anyone know the dates for the ARL SEMQ-II and EMX-SM and the JEOL JXA-50A?
Check this post from John Fournelle:
https://smf.probesoftware.com/index.php?topic=1332.0
Quote from: rickard on December 01, 2022, 09:12:15 AM
I am writing a paper where the dates of commercial introduction of EPMAs are important. Does anyone know the dates for the ARL SEMQ-II and EMX-SM and the JEOL JXA-50A?
The JEOL JXA-50A came out in 1971.
History of JEOL EPMA's (From a talk to honor Dr. Hideyuki Takahashi when he received the MAS Presidential Science Award in 2021)
1959: Kick off (The column was a modified JEM-5A)
1960: Prof Castaing was invited to Japan
1961: JXA-2 (Prototype)
1962: JXA-3 (1st commercial type)
1968: JXA-5
1969: Lunar rocks were analyzed
1971: JXA-50A
1978: JXA-733 (mini-lens)
1982: JCMA-733 (LDE, stage mapping, DEC)
1986: JXA-8600, JXA-8621 (WD/ED)
1994: JXA-8800, JXA8900 (WS, H-type)
2000: JXA-8100, JXA-8200 (Phase, Particle...)
2003: JXA-8500F (1st FE-EPMA)
2009: JXA-8230, JXA-8530F
2019: JXA-iSP100, JXA-iHP200F
Quote from: rickard on December 01, 2022, 09:12:15 AM
I am writing a paper where the dates of commercial introduction of EPMAs are important. Does anyone know the dates for the ARL SEMQ-II and EMX-SM and the JEOL JXA-50A?
This paper could potentially be helpful, although it does not give absolute years (but maybe I missed them):
Eklund, R.L., 1981. Bausch & Lomb-ARL: Where We Come From, Who We Are. Applied Spectroscopy, 35(2), pp.226-235. (see attached)
Found a bit more in a SEMQ brochure:
Partial History of ARL microprobes:
1953: Introduction of the first ARL X-ray quantometer
1960: Electron Microprobe X-ray Analyzer (EMX)
1963: AMX
Quote from: Anette von der Handt on December 06, 2022, 11:05:51 AM
Found a bit more in a SEMQ brochure:
Partial History of ARL microprobes:
1953: Introduction of the first ARL X-ray quantometer
1960: Electron Microprobe X-ray Analyzer (EMX)
1963: AMX
At UC Berkeley we had an ARL SEMQ (#0004). It arrived a bit before I did, but I think it was in 1978. I know Dave Wittry helped to design it.
So how does this table look?
Model               Manufacturer   Year introduced
SEMQ-II             ARL            1978
SX-100              CAMECA         1987
JXA-50A             JEOL           1971
EMX-SM              ARL            1960
CAMEBAX MICROBEAM   CAMECA         1974
Quote from: rickard on December 11, 2022, 04:31:15 AM
So how does this table look?
Year introduced
SEMQ-II ARL 1978
SX-100 CAMECA 1987
JXA-50A JEOL 1971
EMX-SM ARL 1960
CAMEBAX MICROBEAM 1974
You are leaving out a lot of older EPMA models, e.g., SX-50, SX-51, JEOL 733, JEOL 8900.
Yes, but these were the ones used in the studies I'm reviewing.
For history buffs I might add that my first probe was the Cambridge Microscan at Imperial College which I used in 1964. It was a green machine which worked if you kicked the lower left module with your toe about 20 cm from the floor. We then got the Cambridge Stereoscan but lowly students were not allowed to use it. I worked with an operator and he didn't seem to need to kick it (often) so it was a step up. D.
Hi,
the Camebax Microbeam came out in 1982. It was the microprocessor-controlled version of the Camebax (released in 1974).
I don't think the Cameca SX100 came out as early as 1987?
The SX50 here in Tasmania was installed in 1989 and in Clausthal in Germany we had an "early" SX100 (#654, with Unix system) installed in 1995.
Quote from: Karsten Goemann on December 12, 2022, 03:56:19 PM
I don't think the Cameca SX100 came out as early as 1987 ?
The SX50 here in Tasmania was installed in 1989 and in Clausthal in Germany we had an "early" SX100 (#654, with Unix system) installed in 1995.
That's right. Our Berkeley SX-51 was installed in 1992 or so, so the SX100 was after that.
Cameca SX100 was officially launched in 1994 according to de Chambost (2011), History of Cameca. This paper also states that "..the SX100, the study of which started in 1987, had a new electronics system.." which is confusing wording IMO.
Cameca SX50 came out in 1984.
For the record, it was officially confirmed by Kim Epps from JEOL that the JXA-50A was introduced in 1971.
Revised listing after inputs from Forum:
Model               Manufacturer   Year introduced
SEMQ-II             ARL            1978
SX-100              CAMECA         1994
JXA-50A             JEOL           1971
EMX-SM              ARL            1960
CAMEBAX MICROBEAM   CAMECA         1982
Quote from: Anette von der Handt on December 12, 2022, 04:55:56 PM
Cameca SX100 was officially launched in 1994 according to de Chambost (2011), History of Cameca. This paper also states that "..the SX100, the study of which started in 1987, had a new electronics system.." which is confusing wording IMO.
Cameca SX50 came out in 1984.
As the operator of an SX100 (manufactured 1998) and an SXFiveFE (manufactured 2014), I had also found that sentence absolutely confusing, though not because of the quoted part but because of the part which follows it.
Actually, I am not confused by the dates. On the contrary, I think they reveal a really interesting development history (and the excellent practice of the company behind it). The SX50 came out in 1984 (officially); like the SX100, it had probably been in development for many years prior. Indeed, I find it incredible that only three years after launching the SX50 they started to look to the future and began developing the SX100. It is interesting that it took seven years to get the SX100 stable enough to launch as a new product. However, it gets even more interesting when tracking the introduction dates of the components used in the finally launched product: it reveals a flexibility and adaptability to changing market opportunities and advancing technology that existed at Cameca at the time.
Anyway, as I mentioned, the real confusion is actually not "having a new electronics system from 1987", but comes further on in that paragraph, where the FPGAs are considered:
Quote from: de Chambost(2011)
The SX100, the study of which started in 1987, had a new electronics system that used programmable logic devices known as field-programmable gate arrays (FPGAs), which can often replace an entire printed circuit board (PCB) by a single circuit.
I think de Chambost got confused himself, since SX100s in 2011 were being rolled out of the factory with newer hardware than the prototypes from 1987 or the instruments at launch in 1994. Indeed, in 2011 FPGAs were replacing lots of large circuit boards, and the SX100 was being sold with such new boards (the same hardware as the later-launched SXFive, which by then was probably already in development). Actually, the modern FPGA-based implementation achieves an even more impressive effect than what de Chambost stated: it takes three fully populated VME boards (6U extended-size eurocards, namely the Scanning, Acquisition and Visualisation boards) and squeezes their logic into a single board. Three into one! This also eliminated lots of complications, as the logic of these three boards needs to work tightly together, and previously the signalling had to cross the VME motherboard, with lots of delay and timing issues. In a single FPGA chip it is much easier to synchronize all three blocks of logic, and it also consumes much less power and produces less heat. Similar FPGA-introduced improvements are present for the stage and WDS boards, where the stage board shrank to half the size and the WDS board became extremely uncluttered and streamlined. The thing is, this huge improvement came only with late SX100s, or as an upgrade option for older SX100s.
Now, the confusion on my side originated from the fact that the exact FPGA model on these new boards is the Altera Cyclone (first generation), which hit the market only in mid-2003! However, while that particular FPGA appeared in 2003, the first mass-produced FPGA device was manufactured in 1984 (the Altera EP300), so experimentation with this kind of device could realistically have started at Cameca R&D in 1987. Those first FPGAs were much less capable, though, and the claim of being able to "replace an entire printed circuit board" is a huge overstatement given the capabilities of FPGAs up to the mid '90s, and hence for the SX100 officially launched in 1994. However, the initially launched SX100 (with the old hardware boards, thus also including ours manufactured in 1998) already had a few FPGAs replacing a very limited part of the circuitry. The only place where they were introduced was the WDS board, where two FPGAs (Actel A1280A) working in parallel controlled the counting of pulses from the five spectrometers. The Actel A1280A hit the market in 1990 and was one of the highest-density FPGAs available at the time, so it is actually only four (not seven) years from final prototype to final product for the SX100.
The old WDS board: a single FPGA could not replace the whole PCB, since the densities and I/O pin counts of FPGAs in the early '90s were so limited that more than one FPGA was needed to do something like that. These FPGAs probably made it possible to avoid the need for an (additional) piggy-back card, something which is seen on the other, older VME boards:
(https://smf.probesoftware.com/gallery/1607_17_01_23_4_22_39.jpeg)
Note that this is a picture of only half of the old WDS board. Also note the stickers with the FPGA gateware at v1.01, dated 12th January 1995, while the board was produced in 1998. Does that mean they managed to write proper gateware that required only a single fix half a year after launch, and needed no more fixes after that? That indeed is impressive.
Anyway, a detailed analysis of the electronics evolution reveals a very interesting, healthy development environment at Cameca. De Chambost clearly oversimplified the historical description (introduced some mental shortcuts), which created some confusion; the most important aspect of his message is actually not the introduction of the SX100, but that Cameca was an early adopter of FPGA technology and mastered its use. I personally think Cameca was closely tracking FPGA improvements and adopting the relevant advantages in its designs. Because they started with simple FPGAs in 1987, Cameca R&D could easily keep up with the changing FPGA landscape and adapt its advantages for their own benefit (getting into FPGAs nowadays is an extremely steep learning curve, but tracking FPGA evolution from the '80s at the company level would give a much shallower curve for mastering FPGA usage). So currently, large parts of the most complicated logic on the latest SXFive(FE) are indeed gateware*, and if Cameca decided to get back into production of a non-shielded EPMA, they could pull that off very easily, despite the changing availability of the electronic components used in the current generation of boards. I guess they apply their FPGA experience to other instrumentation (SIMS, LEAP) where tight timing and integration are even more important than on an EPMA. Probably Cameca started moving to gateware on different FPGA models in prototype iterations which we have never seen. I find it mind-blowing that Cameca had the Acquisition-Scanning-Visualisation board mentioned above ready only a single year after the Altera Cyclone (I) was released to the market! Adopting FPGAs is an extremely strong competitive advantage; hopefully Cameca will not waste this potential.
P.S. Actually, what is mind-blowing is the mere three months for the stage board FPGA adaptation, which they got working the same year the Altera Cyclone FPGA was released. So Cameca had indeed achieved mastery of FPGAs, and I think they could one day surprise us with a new EPMA.
P.P.S. Even more mind-blowing is that the new type of WDS board was developed before the Altera Cyclone officially hit the market, which means its development had to have started earlier. Very likely Cameca was not just using mature, off-the-shelf FPGAs; they were clearly actively probing pre-market sources for these FPGAs - now that is an absolutely different level of engineering (in an absolutely positive way).
*Gateware. We know more or less what hardware and software are. There is also the word "firmware", which could partially be used instead of gateware. But firmware is more the software which runs on the hardware, whereas gateware can return just intermediate products (bits) of processing and may not be exposed to any software-ready I/O interface at all, only to other intermediate hardware. One thing is clear: gateware, like software, can be coded, digitally analysed and simulated before being instantiated as a combination of gate connections inside the FPGA. Another important aspect is that it can be modified, and the FPGA can be reflashed with a newer version. The FPGA can also be switched to a different model from a different vendor, and if that FPGA has enough capability, the gate connections described by the gateware code can be realised on the other FPGA with very few modifications.
Ed Vicenzi calls this the "unearthed image the government doesn't want you to see":
(https://smf.probesoftware.com/gallery/395_01_02_24_8_50_26.png)
Because he has such a beautiful baby face from 1993 in front of his PGT UNIX system...
That lab did have some paranormal activity, mainly confined to one spectrometer. :)
Quote from: qEd on February 03, 2024, 12:55:20 PM
That lab did have some paranormal activity, mainly confined to one spectrometer. :)
You mean the "Ed" spectrometer? :D
As many of you know, we at Probe Software have released many updates to Probe for EPMA over the years, the current version being 14.0.2.
In fact version 1.0 (originally called Probe for Windows) was released in March 1996. This was when we went from the DOS version, then called Probe (which was written in FORTRAN!), to Windows (written in Visual Basic). But let's talk about backwards compatibility for a moment, because I think people don't talk about its importance enough.
In science we'd like to be able to access our old data and re-process it again if necessary. Ideally without the bother of keeping an ancient computer around with an ancient application, running on an ancient operating system. There are three main aspects to backwards compatibility:
1. The data file.
Our raw data files (for all fields in science) should be able to be opened by every subsequent version of the applications which created them. This is a key aspect of data compatibility going forward. I personally can't speak about other commercial software applications, but below we will look at the data file backwards compatibility in Probe for EPMA, which goes back almost 30 years!
2. The operating system.
The application should run on all current (and future) operating systems. Fortunately Microsoft has done an excellent job maintaining the Win32 API and allowing 32-bit applications that stay within the Win32 "sandbox" to keep running on all OS updates.
3. The API (application programming interface).
In addition to keeping operating system calls backward compatible, it is important to maintain backwards compatibility with application API extensions. For example the Thermo, Bruker, JEOL, Cameca instrument and software APIs. Unfortunately the various vendors have not always maintained backwards compatibility with new API releases.
One key to this is having a "version number" function call within the API (and data files!). This call would be made initially to determine what older functions may no longer be supported and also what new functions may be available. Unfortunately this is rarely (if ever) done.
In fact, because Thermo kept adding so many new functions to their API, we had to keep an NSS/PathFinder version number keyword in our Probewin.ini file!
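As a minimal sketch of the kind of version handshake being described (the vendor function names below are hypothetical, made up purely to illustrate the pattern, and do not refer to any actual vendor API), the calling code would query the interface version once and then branch on it:

' Hypothetical sketch of the version-handshake idea described above. None of these
' vendor calls are real; the stubs below only stand in for an actual instrument API.
Sub AcquireWithVersionCheck()
    Dim apiVersion As Single
    apiVersion! = VendorGetInterfaceVersion()        ' ask the (hypothetical) API what it supports
    If apiVersion! >= 3.2 Then
        Call VendorAcquireSpectrumEx(1, 10#)         ' newer call with extra options (hypothetical)
    Else
        Call VendorAcquireSpectrum(1)                ' original call (hypothetical)
    End If
End Sub

Function VendorGetInterfaceVersion() As Single
    VendorGetInterfaceVersion = 3.2                  ' stub only: a real API would report its own version
End Function

Sub VendorAcquireSpectrumEx(spec As Integer, countTime As Double)
    ' stub for illustration only
End Sub

Sub VendorAcquireSpectrum(spec As Integer)
    ' stub for illustration only
End Sub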
Anyway, here's the first few updates from the version.txt file from when Probe (for Windows) was created:
QuotePROBE for Windows Version Changes
Last Updated 3/2000
3/2/96 Initial beta version
v. 1.0
3/10/96 Add sample setup feature to allow user to save and load
v. 1.01 analytical sample setups to their database. See new
"Load Sample Setup" buttons in FormNEW, FormGETELM and
"Sample Setups" button in FormDIGITIZE. New code module
SETUPSAM.BAS. Setup number in FormDIGITIZE will be used
to automatically load sample setups during automated
sample acquisition. Modify FormDIGITIZE and FormAUTOMATE
to display position sample setup number.
Remove beep from MiscDelay routine and add beep to Real-
Time GetProbeStatus routine if motor position is out of
bounds and different from previous position.
3/14/96 Add Published values to Analyze grid. Add Set PHA button
to FormPHA (Startwin). Disable FormNEW options if sample
setup from another source is loaded.
3/16/96 Add option buttons for selecting automation default for
v. 1.02 starting new samples in FormAUTOMATE. Add check box in
FormPLOT to select analyzed or analyzed and specified
element list loading. Fix spectrometer peak center bug
when backlash is used. Improve automation loop performance
in AcquireDoAutomateNext. Add warning if user selects peak
center pre-scan and runs automation. Add option to append
or overwrite existing save log to disk file. Add function
return help context ID for future context sensitive help.
And here, I found a v. 2.1.1 Probe (for Windows) MDB file (created in October, 1996), and was able to open it with version 13.9.8 of Probe (for EPMA) running in Windows 11, view the raw data, and even analyze a sample:
(https://smf.probesoftware.com/gallery/1_08_02_25_8_57_11.png)
That's almost 30 years ago my friends! Pretty cool for a 29 year old relational database of EPMA intensities! I will have to give credit to Microsoft for maintaining the Win32 (and Access MDB file format) backwards compatibility in every new version of the Windows operating system since then... in fact I think we started with Windows NT. Remember that OS?
Quote from: John Donovan on February 08, 2025, 01:58:49 PMIn science we'd like to be able to access our old data and re-process it again if necessary. Ideally without the bother of keeping an ancient computer around with an ancient application, running on an ancient operating system . There are three main aspects to backwards compatibility:
1. The data file.
Our raw data files (for all fields in science) should be able to be opened by every subsequent version of the applications which created them. This is a key aspect of data compatibility going forward. I personally can't speak about other commercial software applications, but below we will look at the data file backwards compatibility in Probe for EPMA, which is going back for almost 30 years!
I've wondered why Probe for EPMA was still using .MDB files, but agree the backwards compatibility makes sense.
It does create a slight headache for programming in the newer .NET platform as Microsoft doesn't seem to have included JET compatibility, meaning we're limited to using either .NET Framework 4.8 or a third-party plugin* for .NET if we want to access the PfE MDB databases via code.
*See EntityFrameworkCore.JET Github (https://github.com/CirrusRedOrg/EntityFrameworkCore.Jet) or NuGet (https://www.nuget.org/packages/EntityFrameworkCore.Jet/#readme-body-tab)
Quote from: JonF on February 18, 2025, 01:25:34 AMQuote from: John Donovan on February 08, 2025, 01:58:49 PMIn science we'd like to be able to access our old data and re-process it again if necessary. Ideally without the bother of keeping an ancient computer around with an ancient application, running on an ancient operating system . There are three main aspects to backwards compatibility:
1. The data file.
Our raw data files (for all fields in science) should be able to be opened by every subsequent version of the applications which created them. This is a key aspect of data compatibility going forward. I personally can't speak about other commercial software applications, but below we will look at the data file backwards compatibility in Probe for EPMA, which is going back for almost 30 years!
I've wondered why Probe for EPMA was still using .MDB files, but agree the backwards compatibility makes sense.
It does create a slight headache for programming in the newer .NET platform as Microsoft doesn't seem to have included JET compatibility, meaning we're limited to using either .NET Framework 4.8 or a third-party plugin* for .NET if we want to access the PfE MDB databases via code.
*See EntityFrameworkCore.JET Github (https://github.com/CirrusRedOrg/EntityFrameworkCore.Jet) or NuGet (https://www.nuget.org/packages/EntityFrameworkCore.Jet/#readme-body-tab)
Yes, backwards compatibility does have a cost, but most of our users aren't writing software to access their MDB files, so being able to read one's 30-year-old databases with the current software is probably the best thing for most people.
I had not known of the EntityFrameworkCore.JET package previously. Have you tried using it yourself?
I do know that some people have used Microsoft Access to view/edit their MDB files (always make a backup copy of your MDB files before editing them in Access as it's easy to break things!), though I'm not sure how long Microsoft will support this in the latest versions of Access.
And of course one can always install the Visual Basic 6.0 development environment as it is distributed for free now. Then one will have the JET database API.
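For example, here is a minimal VB6/DAO sketch of reading a Probe for EPMA MDB file directly (the file path is a placeholder; the Count table and its CountToRow, ElementOrder, OnCount and OnCountTime fields are the same ones used in the PFE source shown further below):

' Minimal VB6 sketch using the DAO (Jet) API to read raw on-peak intensities
' from a Probe for EPMA MDB file. The path is a placeholder for your own run file.
Sub DumpOnPeakCounts()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Set db = OpenDatabase("C:\ProbeData\MyRun.mdb", False, True)   ' non-exclusive, read-only
    Set rs = db.OpenRecordset("SELECT Count.* FROM Count WHERE Count.CountToRow = 1", dbOpenSnapshot)
    Do Until rs.EOF
        Debug.Print rs("ElementOrder"), rs("OnCount"), rs("OnCountTime")
        rs.MoveNext
    Loop
    rs.Close
    db.Close
End Sub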
Quote from: John Donovan on February 18, 2025, 09:40:58 AMI had not known of the EntityFrameworkCore.JET package previously. Have you tried using it yourself?
No, not yet - whenever I'm writing something that will interact with the MDB file directly, I'll revert back to the last .NET Framework (4.8.1) as it's still (as of writing!) supported, and then use the System.Data.OleDb namespace to access the MDB using ACE rather than JET.
We could do with some more acronyms...
Quote from: JonF on February 18, 2025, 01:25:34 AMIt does create a slight headache for programming in the newer .NET platform as Microsoft doesn't seem to have included JET compatibility, meaning we're either limited to using either .NET Framework 4.8 or a third party plugin* for .NET if we want to access the PfE MDB databases via code.
Since MDB is an old format and official support was sunset a while ago, there are some really good unofficial tools lying about. One of the tools I have contributed to and use for accessing MDBs with Probelab ReImager is MDBTools (https://github.com/mdbtools/mdbtools).
MDBTools (under LGPL and GPL licenses) is a standalone, open-source SQL engine (JET 3 or 4) with a command-line interface that can run on any operating system, made to interface with MDB files. It would be easy to write a wrapper that spawns the program with whatever query you want to run on the database, allowing more languages, and newer versions of languages, to be supported.
I currently have a NodeJS version of that wrapper here: node-mdb-sql (https://github.com/Bob620/node-mdb-sql) and could look at making more wrappers if there is interest.
I would look at this issue from a different perspective. Why not drop Jet completely? PfS is using SQL to interact with Jet, so why not make the SQL I/O platform/database agnostic and make it possible to use any free SQL database, be it file-based (i.e., SQLite) or a real distributed database (PostgreSQL, MariaDB, MySQL, MSSQL, Oracle...)?
Quote from: sem-geologist on March 27, 2025, 03:07:29 AMI would look to this issue from different perspective. Why not to drop jet completely? PfS is using SQL to interact with jet? why not make platform/db agnostic SQL IO and make it possible to use any free SQL databases, be it on files (i.e. SQLite) or real distributed databases (postgreSQL, mariadb, MySQL, MSSQL, Oracle....)?
By PfS do you mean Probe for EPMA?
PFE uses a mixture of Jet and SQL calls. I am not an expert on Jet database access and only use it as a data storage method. It's nice because the MDB file automatically handles the file format and also includes transaction processing methods. This is used because if an error occurs during a database write operation (e.g., the computer crashes), the database automatically rolls back to the previous state. Similar to how money is transferred from one account to another!
Remember, these database calls were first written in the early 1990s! The fact that we can read these databases in Windows 11 is amazing to me as I described in my post above on backwards compatibility:
https://smf.probesoftware.com/index.php?topic=924.msg13203#msg13203
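The transaction pattern itself is simple. Here is a minimal VB6/DAO sketch of the idea (the file path and the UPDATE statement are placeholders for illustration, not the actual PFE write routines): wrap the writes in BeginTrans/CommitTrans so that a failure rolls everything back.

' Minimal sketch of the Jet/DAO transaction pattern described above: if any write
' fails, the workspace rolls back and the database returns to its previous state.
' The file path and UPDATE statement are placeholders, not actual PFE operations.
Sub UpdateWithRollback()
    Dim ws As DAO.Workspace
    Dim db As DAO.Database
    Set ws = DBEngine.Workspaces(0)
    Set db = ws.OpenDatabase("C:\ProbeData\MyRun.mdb")
    ws.BeginTrans
    On Error GoTo RollbackIt
    db.Execute "UPDATE Count SET OnCountTime = 10 WHERE CountToRow = 1", dbFailOnError
    ws.CommitTrans
    db.Close
    Exit Sub
RollbackIt:
    ws.Rollback                                  ' undo everything since BeginTrans
    db.Close
End Sub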
If anyone is interested in looking at the MDB database code and seeing what is utilized, see the open source code for CalcZAF/Standard on GitHub:
https://github.com/openmicroanalysis/calczaf
For example, the Standard compositional database code is in general pretty similar to the database code used in Probe for EPMA. And of course we are happy to share database code examples in Probe for EPMA for anyone to examine.
Here is a code example for reading a sample in Probe for EPMA:
Sub DataGetMDBSample(samplerow As Integer, sample() As TypeSample)
' This routine reads the user data file (*.MDB) to obtain the sample data for the indicated sample row
ierror = False
On Error GoTo DataGetMDBSampleError
Dim n As Integer, m As Integer
Dim chan As Integer, row As Integer
Dim temp As Single, fuzz As Single
Dim SQLQ As String
Dim PrDb As Database
Dim PrRs As Recordset
' Update status
'Call IOStatusAuto("Loading sample " & SampleGetString$(samplerow%) & "...")
'Call AnalyzeStatusAnal("Loading sample " & SampleGetString$(samplerow%) & "...")
'DoEvents
' Load sample parameters only (no intensity data)
Call DataGetMDBSampleOnly(samplerow%, sample())
If ierror Then Exit Sub
' Allocate count and background arrays
ReDim sample(1).CorData(1 To MAXROW%, 1 To MAXCHAN1%) As Single
ReDim sample(1).BgdData(1 To MAXROW%, 1 To MAXCHAN1%) As Single
ReDim sample(1).ErrData(1 To MAXROW%, 1 To MAXCHAN1%) As Single
ReDim sample(1).OnTimeData(1 To MAXROW%, 1 To MAXCHAN1%) As Single ' for aggregate intensity calculations
ReDim sample(1).HiTimeData(1 To MAXROW%, 1 To MAXCHAN1%) As Single ' for aggregate intensity calculations
ReDim sample(1).LoTimeData(1 To MAXROW%, 1 To MAXCHAN1%) As Single ' for aggregate intensity calculations
ReDim sample(1).OnBeamData(1 To MAXROW%, 1 To MAXCHAN1%) As Single ' for aggregate intensity calculations (average aggregate beam)
ReDim sample(1).OnBeamDataArray(1 To MAXROW%, 1 To MAXCHAN1%) As Single ' for aggregate intensity calculations (average aggregate beam)
ReDim sample(1).AggregateNumChannels(1 To MAXROW%, 1 To MAXCHAN1%) As Integer ' for aggregate intensity calculations (number of aggregate channels)
' Open the database using error trapping for databases already open for exclusive use by another app
Screen.MousePointer = vbHourglass
Call DataOpenDatabase("DataGetMDBSample", PrDb, ProbeDataFile$, ProbeDatabaseNonExclusiveAccess%, dbReadOnly)
If ierror Then Exit Sub
' Get Count data for specified sample from probe database (raw cps with no beam current correction)
SQLQ$ = "SELECT Count.* FROM Count WHERE Count.CountToRow = " & Str$(samplerow%)
Set PrRs = PrDb.OpenRecordset(SQLQ$, dbOpenSnapshot)
' Use "ElementOrder" and "LineOrder" fields to load element and lines in order (if no data, just drops through)
Do Until PrRs.EOF
chan% = PrRs("ElementOrder")
row% = PrRs("LineOrder")
If chan% < 1 Or chan% > MAXCHAN% Then GoTo DataGetMDBSampleBadChannel
If row% < 1 Or row% > MAXROW% Then GoTo DataGetMDBSampleBadRow
sample(1).OnPeakCounts!(row%, chan%) = PrRs("OnCount") ' on peak intensites
sample(1).OnPeakCounts_Raw_Cps!(row%, chan%) = sample(1).OnPeakCounts!(row%, chan%) ' save for statistics calculations!!!!!!!
sample(1).HiPeakCounts!(row%, chan%) = PrRs("HiCount") ' high off peak intensities for stds/unks or spectrometer position for wavescans
sample(1).HiPeakCounts_Raw_Cps!(row%, chan%) = sample(1).HiPeakCounts!(row%, chan%) ' save for statistics calculations!!!!!!!
sample(1).LoPeakCounts!(row%, chan%) = PrRs("LoCount") ' low off peak intensities for stds/unkns or angstrom values for wavescans
sample(1).LoPeakCounts_Raw_Cps!(row%, chan%) = sample(1).LoPeakCounts!(row%, chan%) ' save for statistics calculations!!!!!!!
sample(1).OnCountTimes!(row%, chan%) = PrRs("OnCountTime")
sample(1).HiCountTimes!(row%, chan%) = PrRs("HiCountTime")
sample(1).LoCountTimes!(row%, chan%) = PrRs("LoCountTime")
' Get unknown count factors
If ProbeDataFileVersionNumber! > 2.44 Then
sample(1).UnknownCountFactors!(row%, chan%) = PrRs("UnknownCountFactor")
End If
' Get unknown max counts
If ProbeDataFileVersionNumber! > 4# Then
sample(1).UnknownMaxCounts&(row%, chan%) = PrRs("UnknownMaxCount")
End If
' Add beam count arrays (combined samples only)
If ProbeDataFileVersionNumber! > 5.01 Then
sample(1).OnBeamCountsArray!(row%, chan%) = PrRs("OnBeamCountsArray") ' use for faraday beam
End If
' If no beam current then use default beam current (for old wavescan samples)
If ProbeDataFileVersionNumber! <= 8.28 And sample(1).Type% = 3 Then
If sample(1).OnBeamCountsArray!(row%, chan%) = 0# Then sample(1).OnBeamCountsArray!(row%, chan%) = DefaultBeamCurrent!
End If
' Get volatile element acquisition times
If ProbeDataFileVersionNumber! > 6.46 Then
sample(1).VolCountTimesStart(row%, chan%) = PrRs("VolCountTimesStart")
sample(1).VolCountTimesStop(row%, chan%) = PrRs("VolCountTimesStop")
If IsNull(sample(1).VolCountTimesStart(row%, chan%)) Then sample(1).VolCountTimesStart(row%, chan%) = 0#
If IsNull(sample(1).VolCountTimesStop(row%, chan%)) Then sample(1).VolCountTimesStop(row%, chan%) = 0#
End If
' Get volatile element faraday delay time
If ProbeDataFileVersionNumber! > 8.26 Then
sample(1).VolCountTimesDelay!(row%, chan%) = PrRs("VolCountTimesDelay")
End If
' Add beam count arrays (combined samples only)
If ProbeDataFileVersionNumber! > 10.65 Then
sample(1).AbBeamCountsArray!(row%, chan%) = PrRs("AbBeamCountsArray") ' use for absorbed beam
End If
' Add beam count arrays (combined samples only)
If ProbeDataFileVersionNumber! > 10.68 Then
sample(1).OnBeamCountsArray2!(row%, chan%) = PrRs("OnBeamCountsArray2") ' use for second faraday beam
sample(1).AbBeamCountsArray2!(row%, chan%) = PrRs("AbBeamCountsArray2") ' use for second absorbed beam
End If
PrRs.MoveNext
Loop
PrRs.Close
' Get integrated intensities
If ProbeDataFileVersionNumber! > 5.22 Then
SQLQ$ = "SELECT Inte.* FROM Inte WHERE Inte.InteToRow = " & Str$(samplerow%)
Set PrRs = PrDb.OpenRecordset(SQLQ$, dbOpenSnapshot)
' Zero arrays
If sample(1).Datarows% > 0 And sample(1).LastElm% > 0 Then
ReDim sample(1).IntegratedPoints(1 To sample(1).Datarows%, 1 To sample(1).LastElm%) As Integer
ReDim sample(1).IntegratedPeakIntensities(1 To sample(1).Datarows%, 1 To sample(1).LastElm%) As Single ' loaded in DataCorrectData
ReDim sample(1).IntegratedPositions(1 To sample(1).Datarows%, 1 To sample(1).LastElm%, 1 To 1) As Single
ReDim sample(1).IntegratedIntensities(1 To sample(1).Datarows%, 1 To sample(1).LastElm%, 1 To 1) As Single
ReDim sample(1).IntegratedCountTimes(1 To sample(1).Datarows%, 1 To sample(1).LastElm%, 1 To 1) As Single
' Use "InteElementOrder" and "InteLineOrder" fields to load element and lines in order
m% = 0
Do Until PrRs.EOF
chan% = PrRs("InteElementOrder")
row% = PrRs("InteLineOrder")
n% = PrRs("InteInteOrder")
If chan% < 1 Or chan% > sample(1).LastElm% Then GoTo DataGetMDBSampleBadIntegratedChannel
If row% < 1 Then GoTo DataGetMDBSampleBadIntegratedRow
If row% > sample(1).Datarows% Then GoTo DataGetMDBSampleBadIntegratedRowSkip ' sample is being acquired!
If n% < 1 Then GoTo DataGetMDBSampleBadIntegratedIncrement
' Check for invalid integrated sample (do not exit)
If sample(1).IntegratedIntensitiesUseIntegratedFlags%(chan%) <> True Then
Screen.MousePointer = vbDefault
msg$ = "Warning: " & SampleGetString$(samplerow%) & " channel " & Format$(chan%) & " is not flagged as an integrated intensity channel in " & ProbeDataFile$
MsgBox msg$, vbOKOnly + vbInformation, "DataGetMDBSample"
'ierror = True
Exit Do
End If
' Check array dimensions and increase if necessary
If n% > m% Then
ReDim Preserve sample(1).IntegratedPositions(1 To sample(1).Datarows%, 1 To sample(1).LastElm%, 1 To n%) As Single
ReDim Preserve sample(1).IntegratedIntensities(1 To sample(1).Datarows%, 1 To sample(1).LastElm%, 1 To n%) As Single
ReDim Preserve sample(1).IntegratedCountTimes(1 To sample(1).Datarows%, 1 To sample(1).LastElm%, 1 To n%) As Single
m% = n%
End If
' Check if x-axis data is identical to the previous position (if so, offset it by a small random fuzz so the positions remain unique)
If n% > 1 Then
If sample(1).IntegratedPositions!(row%, chan%, n% - 1) = PrRs("IntePositions") Then
fuzz! = Rnd() * 0.5
sample(1).IntegratedPositions!(row%, chan%, n% - 1) = sample(1).IntegratedPositions!(row%, chan%, n% - 1) - fuzz!
End If
End If
' Store largest value for this row and channel
If n% > sample(1).IntegratedPoints%(row%, chan%) Then
sample(1).IntegratedPoints%(row%, chan%) = n%
End If
sample(1).IntegratedPositions!(row%, chan%, n%) = PrRs("IntePositions")
sample(1).IntegratedIntensities!(row%, chan%, n%) = PrRs("InteIntensities")
sample(1).IntegratedCountTimes!(row%, chan%, n%) = PrRs("InteCountTimes")
' Stored integrated intensities are raw counts
If sample(1).IntegratedCountTimes!(row%, chan%, n%) <> 0# Then
sample(1).IntegratedIntensities!(row%, chan%, n%) = sample(1).IntegratedIntensities!(row%, chan%, n%) / sample(1).IntegratedCountTimes!(row%, chan%, n%)
End If
DataGetMDBSampleBadIntegratedRowSkip:
PrRs.MoveNext
Loop
End If
PrRs.Close
End If
' Get multi-point intensities
If ProbeDataFileVersionNumber! > 8.31 Then
SQLQ$ = "SELECT MultiPoint.* FROM MultiPoint WHERE MultiPoint.MultiPointToRow = " & Str$(samplerow%)
Set PrRs = PrDb.OpenRecordset(SQLQ$, dbOpenSnapshot)
' Use "ElementOrder" and "LineOrder" fields to load element and lines in order
Do Until PrRs.EOF
chan% = PrRs("ElementOrder")
row% = PrRs("LineOrder")
m% = PrRs("MultiPointOrder")
If chan% < 1 Or chan% > MAXCHAN% Then GoTo DataGetMDBSampleBadChannel
If row% < 1 Or row% > MAXROW% Then GoTo DataGetMDBSampleBadRow
If m% < 1 Or m% > MAXMULTI% Then GoTo DataGetMDBSampleBadMultiPoint
If m% <= MAXMULTI_OLD% Or ProbeDataFileVersionNumber! >= 12.84 Then
sample(1).MultiPointAcquireCountTimesHi!(row%, chan%, m%) = PrRs("HiCountTimes")
sample(1).MultiPointAcquireCountTimesLo!(row%, chan%, m%) = PrRs("LoCountTimes")
sample(1).MultiPointAcquireCountsHi!(row%, chan%, m%) = PrRs("HiCounts")
sample(1).MultiPointAcquireCountsLo!(row%, chan%, m%) = PrRs("LoCounts")
sample(1).MultiPointProcessManualFlagHi%(row%, chan%, m%) = PrRs("HiManualFlag")
sample(1).MultiPointProcessManualFlagLo%(row%, chan%, m%) = PrRs("LoManualFlag")
End If
PrRs.MoveNext
Loop
PrRs.Close
End If
' Close the probe database
Screen.MousePointer = vbDefault
PrDb.Close
' Make sure objects are deallocated
If Not PrRs Is Nothing Then Set PrRs = Nothing
If Not PrDb Is Nothing Then Set PrDb = Nothing
' Check for count overwrite intensity data
If ProbeDataFileVersionNumber! > 8.28 And UseCountOverwriteIntensityDataFlag Then
Call DataCountOverwriteGet(samplerow%, sample())
If ierror Then Exit Sub
End If
' If fiducial data, load from Fiducial table
If sample(1).FiducialSetNumber% > 0 Then
Call DataFiducial(Int(1), sample(1).FiducialSetNumber%, sample(1).FiducialSetDescription$, sample(1).fiducialpositions!())
If ierror Then Exit Sub
End If
' Set background type flags (AllMANBgdFlag = true if all MAN, MANBgdFlag = true if any MAN)
sample(1).AllMANBgdFlag = True
sample(1).MANBgdFlag = False
For chan% = 1 To sample(1).LastElm%
If sample(1).BackgroundTypes%(chan%) = 1 Then ' 0=off-peak, 1=MAN, 2=multipoint
sample(1).MANBgdFlag = True
Else
sample(1).AllMANBgdFlag = False
End If
Next chan%
' Load last arrays if data present and old version data file (defaults loaded in InitElement)
If ProbeDataFileVersionNumber! <= 4.82 Then
If sample(1).Datarows% > 0 Then
For chan% = 1 To sample(1).LastElm%
sample(1).LastOnCountTimes!(chan%) = sample(1).OnCountTimes!(sample(1).Datarows%, chan%)
sample(1).LastHiCountTimes!(chan%) = sample(1).HiCountTimes!(sample(1).Datarows%, chan%)
sample(1).LastLoCountTimes!(chan%) = sample(1).LoCountTimes!(sample(1).Datarows%, chan%)
' If unknown, modify for unknown count factors
If sample(1).Type% = 2 Then
If sample(1).UnknownCountFactors!(sample(1).Datarows%, chan%) = 0# Then
sample(1).UnknownCountFactors!(sample(1).Datarows%, chan%) = 1#
End If
sample(1).LastOnCountTimes!(chan%) = sample(1).OnCountTimes!(sample(1).Datarows%, chan%) / sample(1).UnknownCountFactors!(sample(1).Datarows%, chan%)
sample(1).LastHiCountTimes!(chan%) = sample(1).HiCountTimes!(sample(1).Datarows%, chan%) / sample(1).UnknownCountFactors!(sample(1).Datarows%, chan%)
sample(1).LastLoCountTimes!(chan%) = sample(1).LoCountTimes!(sample(1).Datarows%, chan%) / sample(1).UnknownCountFactors!(sample(1).Datarows%, chan%)
End If
sample(1).LastWaveCountTimes!(chan%) = DefaultWavescanCountTime!
sample(1).LastPeakCountTimes!(chan%) = DefaultPeakingCountTime!
sample(1).LastQuickCountTimes!(chan%) = DefaultQuickscanCountTime!
sample(1).LastCountFactors!(chan%) = sample(1).UnknownCountFactors!(sample(1).Datarows%, chan%)
sample(1).LastMaxCounts&(chan%) = sample(1).UnknownMaxCounts&(sample(1).Datarows%, chan%)
Next chan%
End If
End If
' Load the kilovolts array for normal samples
If ProbeDataFileVersionNumber! <= 4.89 Then
For chan% = 1 To sample(1).LastElm%
sample(1).TakeoffArray!(chan%) = sample(1).takeoff!
sample(1).KilovoltsArray!(chan%) = sample(1).kilovolts!
sample(1).BeamCurrentArray!(chan%) = sample(1).beamcurrent!
sample(1).BeamSizeArray!(chan%) = sample(1).beamsize!
sample(1).ColumnConditionMethodArray%(chan%) = sample(1).ColumnConditionMethod%
sample(1).ColumnConditionStringArray$(chan%) = sample(1).ColumnConditionString$
Next chan%
End If
' Calculate peak offsets (no warnings) for all samples
Call XrayGetOffsets(Int(1), sample())
If ierror Then Exit Sub
' Fix v. 5.21 bug for wavescan angstroms (data was saved without offset applied)
If ProbeDataFileVersionNumber! < 5.22 And sample(1).Type% = 3 Then
For chan% = 1 To sample(1).LastElm%
If sample(1).MotorNumbers%(chan%) > 0 Then
temp! = MotUnitsToAngstromMicrons!(sample(1).MotorNumbers%(chan%)) * (sample(1).Crystal2ds!(chan%) * (1# - sample(1).CrystalKs!(chan%))) / LIF2D!
For row% = 1 To sample(1).Datarows%
sample(1).LoPeakCounts!(row%, chan%) = (sample(1).HiPeakCounts!(row%, chan%) + sample(1).Offsets!(chan%)) * temp!
Next row%
End If
Next chan%
End If
' Load defaults for mag
If ProbeDataFileVersionNumber! <= 7.03 Then
If sample(1).magnificationanalytical! = 0# Then sample(1).magnificationanalytical! = DefaultMagnificationAnalytical!
If sample(1).magnificationimaging! = 0# Then sample(1).magnificationimaging! = DefaultMagnificationImaging!
End If
' Load defaults for aperture
If ProbeDataFileVersionNumber! <= 8.05 Then
If sample(1).ApertureNumber% = 0 Then sample(1).ApertureNumber% = DefaultAperture%
End If
' Load defaults for image shift
If ProbeDataFileVersionNumber! <= 8.11 Then
If sample(1).ImageShiftX! = 0# Then sample(1).ImageShiftX! = DefaultImageShiftX!
If sample(1).ImageShiftY! = 0# Then sample(1).ImageShiftY! = DefaultImageShiftY!
End If
' Load defaults for UnknownCountTimeForInterferenceStandardChanFlag for backward compatibility
If ProbeDataFileVersionNumber! <= 8.62 Then
If sample(1).Type% = 1 And sample(1).UnknownCountTimeForInterferenceStandardFlag Then
For chan% = 1 To sample(1).LastElm%
If AcquireIsUseUnknownCountTimeForInterferenceStandardFlag(chan%, sample()) Then
sample(1).UnknownCountTimeForInterferenceStandardChanFlag(chan%) = True
End If
Next chan%
End If
End If
' Load MPB last manual flags for backward compatibility (only if MPB acquisition or "shared" MPB (off-peak using MPB fit method))
If ProbeDataFileVersionNumber! < 11.39 Then
For chan% = 1 To sample(1).LastElm%
If sample(1).BackgroundTypes%(chan%) = 2 Or sample(1).OffPeakCorrectionTypes%(chan%) = MAXOFFBGDTYPES% Then
For m% = 1 To MAXMULTI%
If m% <= MAXMULTI_OLD% Or ProbeDataFileVersionNumber! >= 12.84 Then
If sample(1).Datarows% = 0 Then
If m% <= sample(1).MultiPointNumberofPointsAcquireHi%(chan%) Then
sample(1).MultiPointProcessLastManualFlagHi%(chan%, m%) = 0
End If
If m% <= sample(1).MultiPointNumberofPointsAcquireLo%(chan%) Then
sample(1).MultiPointProcessLastManualFlagLo%(chan%, m%) = 0
End If
Else
If m% <= sample(1).MultiPointNumberofPointsAcquireHi%(chan%) Then
sample(1).MultiPointProcessLastManualFlagHi%(chan%, m%) = sample(1).MultiPointProcessManualFlagHi%(sample(1).Datarows%, chan%, m%)
End If
If m% <= sample(1).MultiPointNumberofPointsAcquireLo%(chan%) Then
sample(1).MultiPointProcessLastManualFlagLo%(chan%, m%) = sample(1).MultiPointProcessManualFlagLo%(sample(1).Datarows%, chan%, m%)
End If
End If
End If
Next m%
End If
Next chan%
End If
' Load EDS spectral intensity data
If sample(1).EDSSpectraFlag Then
Call DataEDSSpectraGetData(samplerow%, sample)
If ierror Then Exit Sub
End If
' Load CL spectral intensity data
If sample(1).CLSpectraFlag Then
Call DataCLSpectraGetData(samplerow%, sample)
If ierror Then Exit Sub
End If
' Update status
'Call IOStatusAuto(vbNullString)
'Call AnalyzeStatusAnal(vbNullString)
'DoEvents
Exit Sub
' Errors
DataGetMDBSampleError:
Screen.MousePointer = vbDefault
MsgBox Error$ & ", reading sample " & SampleGetString$(samplerow%), vbOKOnly + vbCritical, "DataGetMDBSample"
ierror = True
Exit Sub
DataGetMDBSampleBadChannel:
Screen.MousePointer = vbDefault
msg$ = "Sample " & SampleGetString$(samplerow%) & " has an invalid element channel in " & ProbeDataFile$
MsgBox msg$, vbOKOnly + vbExclamation, "DataGetMDBSample"
ierror = True
Exit Sub
DataGetMDBSampleBadRow:
Screen.MousePointer = vbDefault
msg$ = "Sample " & SampleGetString$(samplerow%) & " has an invalid sample row in " & ProbeDataFile$
MsgBox msg$, vbOKOnly + vbExclamation, "DataGetMDBSample"
ierror = True
Exit Sub
DataGetMDBSampleBadIntegratedChannel:
Screen.MousePointer = vbDefault
msg$ = "Sample " & SampleGetString$(samplerow%) & " has an invalid element channel in " & ProbeDataFile$
MsgBox msg$, vbOKOnly + vbExclamation, "DataGetMDBSample"
ierror = True
Exit Sub
DataGetMDBSampleBadIntegratedRow:
Screen.MousePointer = vbDefault
msg$ = "Sample " & SampleGetString$(samplerow%) & " has an invalid sample row in " & ProbeDataFile$
MsgBox msg$, vbOKOnly + vbExclamation, "DataGetMDBSample"
ierror = True
Exit Sub
DataGetMDBSampleBadIntegratedIncrement:
Screen.MousePointer = vbDefault
msg$ = "Sample " & Format$(SampleNums%(samplerow%)) & " " & SampleNams$(samplerow%) & " has an invalid sample increment in " & ProbeDataFile$
MsgBox msg$, vbOKOnly + vbExclamation, "DataGetMDBSample"
ierror = True
Exit Sub
DataGetMDBSampleBadMultiPoint:
Screen.MousePointer = vbDefault
msg$ = "Sample " & Format$(SampleNums%(samplerow%)) & " " & SampleNams$(samplerow%) & " has an invalid sample multi-point in " & ProbeDataFile$
MsgBox msg$, vbOKOnly + vbExclamation, "DataGetMDBSample"
ierror = True
Exit Sub
End Sub
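For anyone who wants to poke around in these MDB files outside of Probe for EPMA, here is a minimal standalone sketch using DAO from VB6/VBA. It is not how Probe for EPMA itself does it (the full routine is above); it simply opens the file read-only and lists the on-peak counts for one sample row, reusing the Count table and field names shown above (CountToRow, ElementOrder, LineOrder, OnCount). The file path and sample row number are hypothetical, and there is no version checking or error handling:
Sub ReadOnPeakCountsSketch()
' Minimal sketch: open a Probe for EPMA MDB file read-only and list
' the on-peak count rates for sample row 1 (path and row number are hypothetical)
Dim db As DAO.Database
Dim rs As DAO.Recordset
Dim SQLQ As String
' Open the database non-exclusive and read-only
Set db = DBEngine.OpenDatabase("C:\UserData\MyProbeData.mdb", False, True)
' Same Count table and field names used in DataGetMDBSample above
SQLQ = "SELECT Count.* FROM Count WHERE Count.CountToRow = 1"
Set rs = db.OpenRecordset(SQLQ, dbOpenSnapshot)
Do Until rs.EOF
Debug.Print "Channel", rs("ElementOrder"), "Row", rs("LineOrder"), "On-peak cps", rs("OnCount")
rs.MoveNext
Loop
rs.Close
db.Close
End Sub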
Quote from: John Donovan on March 27, 2025, 06:33:17 AM
By PfS do you mean Probe for EPMA?
Yes, sorry for the misnomer.
Quote from: John Donovan on March 27, 2025, 06:33:17 AM
This is used because if an error occurs during a database write operation (e.g., computer crashes), the database automatically rolls back to the previous state. Similar to how money is transferred from one account to another!
Transactional writes are not unique to the Jet database. They are one of the basic features of most SQL-based relational database management systems, whether single-file (like SQLite) or client-server (like PostgreSQL, MariaDB, MySQL, Oracle Database, MS SQL...). Compared with Jet, these do not depend on the file system (Jet makes some fishy write calls through NTFS, which is the main obstacle to reverse engineering and implementing a fully functional open source Jet database engine), and they can be easily migrated and cleaned up (the database can not only grow, it can also be shrunk safely). The open source tools for Jet mentioned earlier (i.e. MDBTools) work with a Jet database only if no data have been removed from it... which Jet does not do cleanly.
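To make the comparison concrete, here is a minimal sketch of what a transactional write looks like with DAO against a Jet MDB. The BeginTrans/CommitTrans/Rollback calls are standard DAO Workspace methods, but the file path, table, and field names are made up for the example; in SQLite or PostgreSQL the equivalent is simply BEGIN ... COMMIT around the same statements:
Sub TransactionalWriteSketch()
' Minimal sketch of a transactional write with DAO against a Jet MDB.
' If anything fails before CommitTrans, Rollback restores the previous state.
Dim ws As DAO.Workspace
Dim db As DAO.Database
Set ws = DBEngine.Workspaces(0)
Set db = ws.OpenDatabase("C:\UserData\MyProbeData.mdb") ' hypothetical path
On Error GoTo TransactionalWriteSketchError
ws.BeginTrans
db.Execute "UPDATE SomeTable SET SomeField = 1 WHERE SomeKey = 42", dbFailOnError
db.Execute "INSERT INTO SomeLog (SomeKey) VALUES (42)", dbFailOnError
ws.CommitTrans ' both statements become permanent together
db.Close
Exit Sub
TransactionalWriteSketchError:
ws.Rollback ' undo both statements if either one failed
db.Close
End Sub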
Quote from: sem-geologist on March 28, 2025, 02:38:48 AM
Quote from: John Donovan on March 27, 2025, 06:33:17 AM
This is used because if an error occurs during a database write operation (e.g., computer crashes), the database automatically rolls back to the previous state. Similar to how money is transferred from one account to another!
Transactional writes are not unique to the Jet database. They are one of the basic features of most SQL-based relational database management systems, whether single-file (like SQLite) or client-server (like PostgreSQL, MariaDB, MySQL, Oracle Database, MS SQL...).
I never claimed that transactional writes were a unique feature of Jet, only that this was one of several reasons why we liked using a database management system for data storage in Probe for EPMA. I'm sure that some other database management system could be utilized in Probe for EPMA, but switching to a different one would be an enormous amount of work, and it would make all existing Jet MDB databases unreadable.
But since Jet still works in the latest Windows 11 operating system, we currently have backward compatibility going back to the early 1990s. That was my main point.
As for future compatibility, will Jet continue to be supported? I do not know, but there are a lot of Jet MDB databases out there in the world, so I'm hoping the answer is yes.
Writing software has always been "stepping into the stream" and hoping the current carries you with it. We've made lots of decisions about which components to utilize, e.g., our graphics library.
For example, we started with a product called Graphics Server (you can see it in the older screen shots in this forum), but it was never updated to run properly in Windows 7 and later, so we had to "jump horses" in mid-stream to the Pro Essentials graphics library. Fortunately, the choice of graphics library did not affect the data file management.
Another example is the choice of Internet tools. We started with Catalyst Tools (for downloading CalcZAF and Probe for EPMA updates), and it's been a great choice, but late last year (as some of you may have noticed):
https://smf.probesoftware.com/index.php?topic=40.msg13206#msg13206
we started seeing many institutions locking down their Internet security even further, so we had to update to a more recent version of Catalyst that supports these newer protocols.
The "stream" never stops! ;D