In the early 1940s, many prominent scientists were involved in an undertaking that would change war as we knew it — the Manhattan Project. The project led to the development of the first atomic bombs, which were dropped on the Japanese cities of Hiroshima and Nagasaki on August 6 and 9, 1945, respectively.

The Manhattan Project also forced scientists to develop ways of measuring an individual’s ionizing radiation dose. One of the research sites for the project, the Metallurgical Laboratory at the University of Chicago, created a Health Division whose sole purpose was keeping the project’s workers safe. One of the physicists in the Health Division, Ernest Wollan, was tasked with finding a reliable way to track a worker’s cumulative radiation dose.

At the time, workers used pocket ionization chambers — tubes the size of a fountain pen that held an applied electrical charge, which ionizing radiation caused to leak away. By comparing the initial charge with the final charge, one could determine how much radiation exposure had occurred. The pocket chambers were awkward to use and only measured individual doses at a specified time — not the cumulative dose over a period of time. The pocket ionization chamber evolved into the pocket dosimeter, invented by Charles Lauritsen in 1937 and more widely used during the Manhattan Project. The pocket dosimeter still required charging, but wearers could read a built-in dose scale through magnifying lenses.
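The reading principle described above — exposure inferred from how much charge leaked away between two measurements — can be sketched in a few lines. The calibration factor here is a made-up illustrative value, not a historical one:

```python
# Sketch of the pocket-ionization-chamber reading principle:
# radiation discharges the chamber, so exposure is estimated
# from the charge lost between two readings.

def estimated_exposure(initial_charge_nc: float,
                       final_charge_nc: float,
                       calibration_r_per_nc: float = 0.05) -> float:
    """Estimate exposure in roentgens from charge lost.

    initial_charge_nc / final_charge_nc: chamber charge in nanocoulombs.
    calibration_r_per_nc: assumed (hypothetical) roentgens per
        nanocoulomb of charge lost.
    """
    charge_lost = initial_charge_nc - final_charge_nc
    if charge_lost < 0:
        raise ValueError("final charge cannot exceed initial charge")
    return charge_lost * calibration_r_per_nc

# A chamber charged to 100 nC that reads 92 nC at day's end lost
# 8 nC, i.e. an estimated 0.4 R under this assumed calibration.
print(estimated_exposure(100.0, 92.0))
```

This also makes the article's complaint concrete: each reading is a single snapshot, and the chamber must be recharged and re-read for every measurement period, so cumulative tracking requires careful bookkeeping outside the device.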

Several years after Lauritsen invented his pocket dosimeter, Wollan invented the film dosimetry badge, which used photographic film to record exposure over days or weeks. He designed it to attach to the back of an identification badge, so workers would be more likely to wear it every day. A drawback of film was that it had to be developed before it could provide a reading, but it was a cheap and easy way to monitor the project’s many workers.
