Assembling and Characterizing the Parallel Detector

DSC_1006-1024

The revised high energy particle detector boards arrived, and I’ve had the chance to put one together and verify that it works over the weekend — and snapped some pictures of the assembly process along the way. This (long!) post details the assembly process, and (towards the bottom) describes some initial characterization of the detector, including a histogram of detector variability.

For more details, please see the detailed OpenCT2 project build logs!

DSC_0993-1024

Developing a parallel detector for the Open Source CT 2

I’ve been thinking about tomographic imaging a lot lately in between nursing sore fingers from learning how to hand-wind electromagnetic coils for my attempt at a small prepolarized coded-field MRI, and I thought I’d take a moment to share some progress.

DSC_0742

The open source computed tomography (CT) scanner that I put together last year was a lot of fun to design, full of atypical design problems, from the mechanical design of the rotary gantry to pairing an appropriate radioisotope source with a modified extra-sensitive radiation sensor. Something about it being essentially a radioactive desktop Stargate that lets you see inside of things also seems to get people very excited, and so I’ve received an eclectic bunch of e-mails asking about the scanner, from radiology professors and biomedical researchers, to makers, to folks hoping I’d open up Dr. Jansen’s back-alley CT scans to have a look at some strange bump they have (please go see your doctor!). But for all the excitement, to quote Feynman, the current open CT design feels a bit like a dog that walks on two legs — it’s not that it does it well, it’s that it does it at all. It’s essentially a working model of the first generation of CT scanners, and so it takes a very long time to get even a single slice of an image. I’ve been wondering what it would take to move it from a proof of concept into something that does the job well, or at least substantially better.

dsc_0531

The previous design makes use of a 10uCi Barium-133 radioisotope check source, which is the strongest radioisotope source that’s available without a licence. Barium-133 has strong x-ray peaks around 31keV and 81keV, which are low enough in energy that organic water-containing materials (like apples and bell peppers) are reasonably contrastive for imaging. The silicon PIN photodiode detector in the Radiation Watch Type 5 high energy particle detector that I used is also most efficient at low (~10keV) energies, meaning that we would need less exposure time to generate an image with sufficient signal to make things out, although there are technical challenges in detecting these lower energy photons. Imaging under these circumstances is essentially a constant battle to detect signal over the noise, and one general way to increase SNR is to increase the measurement time — although of course this increases how long it takes to create an image. In the extreme case, when you’re this signal starved, it can take all night just to capture a single slice of a bell pepper.

pepper_overlay1

Another alternative that’s been suggested is to increase the intensity of the x-ray source. For a variety of safety reasons this isn’t something that I’m willing to explore. I’ve heard of folks experimenting with surplus x-ray tubes and equipment, and I believe that your health is just too precious to risk for such endeavors. Pragmatically, for an open source project, readily available radioisotope check sources are also much more repeatable than found radiography parts.

And so we’re left with a few alternatives:

  • Decrease the scanning volume: The current design has a source-detector distance of about 12cm. Following the inverse square law, each halving of this distance should increase the number of counts at the detector by a factor of 4.
  • Detect in parallel: Starting with second-generation CT scanners, parallel arrays of detectors were used to dramatically decrease scan time. This should give linear speed gains — e.g. N detectors should reduce the scan time by a factor of N.
  • Increase sensitivity: Detecting the signal from single photons requires a genuinely incredible amount of amplification, and this process is noisy. While previously recalibrating the Type 5 detection threshold to just above the noise floor yielded many more counts, it appears as though much of the signal is still buried below the noise floor of the detector.

It’s likely that a solution will end up with some combination of each of these alternatives.
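
As a rough sanity check on the scaling arguments above, here is a small back-of-envelope sketch (mine, not from the project source) of how source-detector distance and detector count trade off, assuming the same total signal is needed per pixel:

```python
# Counts scale as 1/r^2 with source-detector distance (inverse square law),
# and total scan time scales as 1/N for N parallel detectors.

def relative_count_rate(distance_cm, baseline_cm=12.0):
    """Count rate relative to the current 12cm source-detector baseline."""
    return (baseline_cm / distance_cm) ** 2

def relative_scan_time(distance_cm, n_detectors=1, baseline_cm=12.0):
    """Scan time relative to the single-detector, 12cm baseline."""
    return 1.0 / (relative_count_rate(distance_cm, baseline_cm) * n_detectors)

if __name__ == "__main__":
    for d, n in [(12, 1), (6, 1), (12, 20), (6, 20)]:
        print(f"{d}cm, {n} detector(s): "
              f"{relative_scan_time(d, n):.3f}x the baseline scan time")
```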

notes2b

Parallel Detectors
Where the first generation of medical CT scanners contained only a single photodetector, I’ve read that each subsequent generation of CT scanner has increased the number of detectors by about an order of magnitude over the generation before it — moving from ~30 detectors in a second-generation system, to ~300 in third-generation fan-beam systems, to ~2000 in modern static ring systems. Each of these improvements allowed scan times to drop accordingly, from about half an hour with first-generation systems, to about a minute, to only a few seconds today.

Above we can see how the current system might be parallelized with more Type 5 detectors, each arranged across an arc such that there is a constant distance between the source and each detector. The Type 5 is one of my favorite sensors of recent years, and I’ve gotten a lot of mileage out of it, but unfortunately because it’s about an inch across, it would be difficult to parallelize the desktop CT with more than about 6 detectors using this design. Additionally, the fantastic First Sensor X100-7 that the Type 5 uses has a 10mm x 10mm active area, which makes it great for collecting and detecting high energy particles, but a little large for the kind of spatial imaging that we’re doing, so the images will be a little blurred. A smaller active area will increase our spatial resolution and shrink each detector, so that (ideally) we’ll be able to pack many more detectors in (as shown below), and decrease the scan time.

notes2b1

Designing an inexpensive parallel detector

High-energy particle detectors are extremely challenging to design and build, in large part because you’re amplifying the signal from a single subatomic particle by many orders of magnitude so that it’s just barely detectable by a microcontroller. Because of this they’re extremely sensitive to noise, layout, and component selection, and generally must also be shielded from external interference. Given the design challenge, I’ve been reading whatever I can find on the topic over the last few months, and meditating on layout guidelines to reach the zen-master level of low-noise layout required for such a task, or at least to get a prototype to a point that I can work from.

radiation_sensor_rev0

The design that I arrived at is heavily based on the Maxim Appnote 2236: Gamma-Photon Radiation Detector, which uses a reverse-biased silicon PIN photodiode as the detector, connected to an extremely sensitive first-stage amplifier and followed by a series of cascaded amplifier stages to increase the signal amplitude. The photodiode is the popular BPW34, which has been used in a number of other radiation sensor projects. The active area of the BPW34 is only 7.5mm^2 (a little less than 3mm x 3mm), about 13 times smaller than the 100mm^2 of the X100-7, so this should increase the spatial resolution a good deal.

schematic_rev0

While many of the hobbyist designs for radiation sensors operate at effective noise thresholds of around 100keV or more, this is a specialized application where most of the business photons arrive at 31keV or 81keV, so keeping the noise floor low enough to detect these photons is critical. A number of academic project writeups for radiation detectors I’ve found suggested that the Texas Instruments LMP7721 would be an ideal choice for keeping the first-stage amplifier as low-noise as possible, both because of its incredibly low input bias current (measured in femtoamps), and because of an uncommon footprint that includes guard pins to further reduce layout noise.

DSC_0764

To make the boards modular and digitally addressable, I’ve added a PIC24FV32KA301 microcontroller, which sports a 12-bit analog-to-digital converter (~1.2mV resolution @ 5V) and plenty of RAM to store a histogram of pulse widths, to experiment with very crude spectroscopy as I’ve done with the Radiation Watch Type 5. Both the raw analog output of the photodiode amplifier and the digital output of a comparator feed into the PIC, and a digipot allows the comparator threshold to be calibrated dynamically based on the noise level at runtime. I’ve also included a small LED for debugging.
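
To make the pulse-width idea concrete, here is a small illustration in Python on synthetic data (the real version would live in the PIC firmware; the threshold, pulse shape, and bin sizes below are invented for the example): threshold the amplified signal, measure how long each pulse stays above threshold, and histogram those widths as a very crude energy proxy.

```python
import numpy as np

def pulse_width_histogram(samples, threshold, n_bins=32, max_width=64):
    """Crude time-over-threshold spectroscopy: bin the widths of pulses."""
    above = samples > threshold
    edges = np.diff(above.astype(int))
    starts = np.where(edges == 1)[0]          # rising edges
    ends = np.where(edges == -1)[0]           # falling edges
    if len(starts):
        ends = ends[ends > starts[0]]         # ignore a pulse already in progress
    widths = [e - s for s, e in zip(starts, ends)]
    hist, _ = np.histogram(widths, bins=n_bins, range=(0, max_width))
    return hist

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sig = rng.normal(0.0, 0.04, 100_000)               # ~40mV noise floor
    for i in rng.integers(0, len(sig) - 50, 200):      # inject 200 fake pulses
        sig[i:i + 20] += 0.3 * np.exp(-np.arange(20) / 5.0)
    print(pulse_width_histogram(sig, threshold=0.12))
```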

The PIC currently exposes an I2C interface (SDA/SCL) for external communication, though in retrospect, while this makes communication easy, it also requires programming each detector with a unique I2C address — so future revisions might move to an SPI interface.
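
For a sense of why the unique addresses matter, here is a purely hypothetical host-side sketch of polling a handful of these boards over I2C (say from a Raspberry Pi used as a test master, with the smbus2 library); the addresses and the count register are invented for illustration, since the real register map lives in the still-unfinished PIC firmware.

```python
from smbus2 import SMBus

DETECTOR_ADDRESSES = [0x30, 0x31, 0x32]   # each board needs its own address
COUNT_REGISTER = 0x00                     # hypothetical "counts since last read"

def read_counts(bus_number=1):
    """Poll each detector board on the bus and return its count value."""
    counts = {}
    with SMBus(bus_number) as bus:
        for addr in DETECTOR_ADDRESSES:
            counts[addr] = bus.read_word_data(addr, COUNT_REGISTER)
    return counts

if __name__ == "__main__":
    print(read_counts())
```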

In terms of assembly, the memory of spending four days assembling a set of fine-pitched Arducorder boards is still very fresh, and so I’ve designed these to be very easy to put together — a single-sided design with 0603 parts or larger for all the passives, and SOIC pitches on each of the ICs with the exception of the SOT-8 footprint for the digipot. So it’s comparatively easy to build a bunch of these and pop them in the reflow oven, and with a low-quantity BOM of $20-$30 (about half of which is the TI LMP7721), they’re relatively inexpensive.

DSC_0838

A series of guard vias on the ground plane divides the analog and digital sections. I’ve found that shielding the analog section and grounding the shield (as the Type 5 does) is absolutely essential for operation, much as the appnotes prescribe. I’m not sure what folks tend to use for the shielding tape, but I found that a wrap of electrical tape (to shield the photodiode from light and provide a non-conductive base) followed by a wrap of standard aluminum duct tape from the hardware store seems to work well as a first pass.

detection1a

As an aside, for the low-field MRI project I’ve been looking for a USB oscilloscope or DAQ that would allow recording at a rate of at least 100 kSps (ideally 1 MSps), with 12+ bits of resolution, for several seconds. This is a little unusual — scopes that are fast usually have very short sample memories. I’d seen an article in Make about building a sound card oscilloscope that would easily allow for such long recordings, as well as folks selling oscilloscope probes modified for sound card use, so I thought I’d give this a try before using a benchtop oscilloscope.

Above is a recording from the sound card oscilloscope with the Ba133 radioisotope source near the detector — and the detections were clearly above the noise floor. Wonderful news! The units on the axes aren’t entirely clear here, and with such a slow sample rate we’re right on the edge of being able to detect this signal, so on to a benchtop scope to better characterize things.
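
For what it’s worth, here is roughly how I’d pull counts out of a recording like this in Python (the filename and threshold are placeholders, and this assumes the recording was saved as a WAV file): normalize the samples, threshold well above the noise floor, and count rising edges.

```python
import numpy as np
from scipy.io import wavfile

def count_pulses(path, threshold=0.2):
    """Count threshold crossings in a sound-card recording and report cpm."""
    rate, data = wavfile.read(path)
    if data.ndim > 1:                       # keep one channel if stereo
        data = data[:, 0]
    x = data.astype(float)
    peak = np.max(np.abs(x))
    if peak > 0:
        x = x / peak                        # normalize to +/- 1
    above = np.abs(x) > threshold
    rising_edges = np.count_nonzero(np.diff(above.astype(int)) == 1)
    minutes = len(x) / rate / 60.0
    return rising_edges, rising_edges / minutes

if __name__ == "__main__":
    counts, cpm = count_pulses("ba133_near_detector.wav")
    print(f"{counts} pulses, ~{cpm:.1f} counts per minute")
```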

DSC_0772

Here I’ve taped the 10uCi Ba-133 radioisotope source to the side of a 2 inch diameter acrylic cylinder. In this configuration I can easily rotate it to see the number of photons detected when the source is directly beside the detector, and compare this to when the source is 2 inches away, with its emissions passing through the thin (1/16 inch) wall of the acrylic sample container.

DSC_0795

The noise floor generally appears to be around +/- 40mV, which is very decent for a first try, although it did appear to increase to nearly +/-60mV after being on for a few hours. I was also doing this at Xerocraft beside a metal shop, and the noise would periodically go a little crazy when a large piece of equipment (like the mill) was turned on, so I’m not entirely confident in the noise characterization — but it’s a good start.

DS1Z_QucikPrint5

The firmware for the PIC isn’t complete, so I was using the scope to trigger on the raw analog output instead of using the PIC (either with the analog output, or the digital comparator output). With the Ba133 source very near to the detector there were far too many detections to count, and with it 2 inches away going through the acrylic cylinder, I counted approximately 20-30 counts per minute. That’s not incredible, but it’s also workable, especially if the noise floor can be lowered, and we gain access to more signal.

To help ground this, the bell pepper image shown earlier was captured at about 22×22 pixel resolution, with about 60 seconds of integration time per pixel, for a total of about 9 hours of acquisition. Using a parallel array of about 20 of these BPW34 detectors, the rows of such an image could be captured in parallel, so we’d only have to scan in one dimension. Assuming it takes 5 minutes of integration for the BPW34 to collect a baseline signal (say 100 counts) similar to 60 seconds of integration with the Type 5, we could capture a similar (likely sharper) image in about 20 measurement cycles using the parallel detector. At 5 minutes per measurement cycle, this would reduce the acquisition time to under two hours, or roughly a factor of five faster than the original device. Were we to significantly improve the noise threshold, this could further decrease the acquisition time, and we could really start to get low resolution images in under an hour, and complete low resolution tomographic scans (at 10 degree increments, or 36 angles) in under a day. That’d be a substantial improvement over the current system.
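
Written out explicitly (these are planning numbers rather than measurements, and the nominal 22×22×60s arithmetic comes out a little under the ~9 hours the actual scan took, presumably due to motion overhead):

```python
# Back-of-envelope acquisition time: single Type 5 detector vs. ~20 BPW34s.

original_pixels = 22 * 22
original_dwell_s = 60
original_total_h = original_pixels * original_dwell_s / 3600.0   # ~8.1 hours

parallel_cycles = 20        # ~20 detectors, one image row per measurement cycle
parallel_dwell_s = 5 * 60   # assume 5 minutes to match the Type 5's baseline signal
parallel_total_h = parallel_cycles * parallel_dwell_s / 3600.0   # ~1.7 hours

print(f"single detector: {original_total_h:.1f} h, "
      f"parallel: {parallel_total_h:.1f} h, "
      f"speedup: {original_total_h / parallel_total_h:.1f}x")
```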

DS1Z_QucikPrint17

openct2a

Mechanical Design
Given the requirement for long acquisition times (though much shorter than before), I’ve been sketching up a simplified mechanical design that could be entirely self-contained, without any moving parts exposed, and placed on the edge of one’s desk to scan away for the duration. I’d also like it to be much more compact than the original design so that it’s unobtrusive, while being attractive and interesting enough that it would be a welcome addition to one’s desk. The basic design would be a cylinder approximately 6-8 inches in diameter and 8-12 inches high, with a 2-3 inch diameter aperture in the top to place in a cylindrical acrylic sample container.

DSC_0865

Moving to a parallel detector removes the need to linearly scan both the source and detector, which eliminates two of the four axes and greatly simplifies the mechanical design. In this prototype, the idea is that a small sample container slides in through the top, and the sample itself (rather than the source and detectors) rotates, which simplifies things further.

DSC_0867

The source and detectors would be mounted on a moving Z-axis (shown above) that translates them up and down to capture different slices of the sample. While here I’ve included two stepper motors on nylon lead screws for this purpose, in practice this doesn’t appear to provide enough support to prevent the Z stage from walking, especially with an unbalanced load — and so this will likely change to three lead screws in the next revision. The drive motors are currently NEMA14 steppers, with a NEMA17 footprint for the sample cylinder support and rotation.

DSC_0869

So far a great first set of experiments and sketches, and it’ll be interesting to better characterize the detector design, make improvements and revisions, and see how it all ultimately affects acquisition time.

Thanks for reading!

Source Release — Open Source CT Alpha

openct-source

I’m very happy to announce the first release of the Open Source Computed Tomography (CT) scanner project. This is an early alpha release, and contains all of the source at the project’s current stage, including the laser cutter design files for the machine structure, EAGLE source files, and the sample Arduino sketch.

The source is available for download here [zip], and is also available on GitHub. For potential contributors, the TODO file also includes near term project goals at a variety of skill levels, from adding end-stops and designing the official Arduino shield, to designing parallel detectors that decrease scan time, and developing a new source/detector pair for different wavelengths of interest.

I’m excited to see what folks do with this project, both now and as it matures. If you build one, want to contribute to the project, or encounter any issues, please send me a note.

In other news, the Bay Area Maker Faire is coming up in a short two months. With a good amount of progress on the Open Source CT Scanner, I’m going to switch gears for a while back to the Mark 5 Arducorder — I’d love to have the firmware and basic functionality working and demonstrable by then.

Open Source CT in MAKE Magazine

makemagazine_ctscanner400

Very exciting news — the Open Source Desktop CT Scanner is featured in this month’s MAKE Magazine Homebrew Section. I’ve been a great fan of MAKE for years and presented the Science Tricorders at their first Hardware Innovation Workshop, and so it’s very exciting to see the project in this issue.

Source Files: There’s been a lot of interest in having the source files for the alpha version of the scanner, and so I’ll endeavor to have these up within a week or so. I’m in the process of collecting and packaging the source, as well as moving everything to GitHub (including TODO lists) so that it’s much easier for folks to contribute.

I think that the best thing for an open source project is to bootstrap an initial community of users that can grow into a community of contributors, and so I’d like to cut out a few sets of the laser cut parts to send to one or two folks who are interested in building the project (and ideally contributing to it). If you’re interested, please send along a note with your background and how you’d like to contribute, to peter at tricorderproject dot org.

thanks!

Dr. Jansen, or: How I Stopped Worrying and Learned to Love the Barium

pepper_overlay1

After a marathon build session, the first images from the open source CT scanner are here! The story…

The Detector
dsc_0425

Recall that in the last update, the stock Radiation Watch Type 5 silicon photodiode high energy particle detector was found to be calibrated for Cesium, with a detection threshold likely somewhere near 80keV. This was too high to detect the ~22keV emissions of the Cadmium-109 source, and so I put together an external comparator that could adjust the threshold down to just above the noise floor. After testing the circuit on a protoboard, I designed a tiny board that sits on the back of the Type 5 and, through the use of a 10-turn potentiometer, allows you to recalibrate the threshold.

dsc_0417

I designed some mounting plates that could mount to the linear carriages for the source and detector.

dsc_0436

Here, the detector is mounted onto an offset mounting plate, which in turn connects to the detector carriage. The wiring harness breaking out all the detector pins feeds through the center of the carriage to a fixed mount point on the bore that acts as a strain relief. Looks great!

The Source

Even with the upgraded extra-sensitive detector, I was still seeing many fewer detections than I was expecting — albeit about an order of magnitude more than without the enhancement. A kind fellow on the Radiation Watch Facebook group made a SPICE simulation model based on the helpful schematics that the folks at Radiation Watch make available, and his simulations suggested that the noise floor for this circuit is around 30keV. This means that with Cadmium-109, whose primary emissions are around 22keV, I was likely still missing the majority of the emissions, and getting many fewer counts than I was expecting.

Enter the Barium-133. There are a number of radioisotope check sources that are commonly available, but many of them have very high energy emissions in the many hundreds (or thousands) of keV — likely far too energetic to be usefully absorbed by everyday objects. The emission spectra I’ve seen for the tubes in commercial CT scanners tend to be broad and centered around 60-70keV, and the datasheet for the silicon photodiode suggests it’s most sensitive from 10keV to 30keV, with the sensitivity dropping off after that. A higher detection efficiency means that we can get by with a less intense source, and with check sources that are barely detectable over background a foot away, it’s a battle for signal, and every photon counts.

Barium-133 has primary emissions around the 31keV range, and seems to be one of the few commonly available radioisotopes (aside from Cadmium-109) with such low-energy emissions. To give the system the best possible chance of working, I ordered a 10uCi Ba133 source (up from the 1uCi Cd109 source I was using previously). With the source 10cm away from the detector and a background rate of 20 counts per minute, the Cd109 source reads about 70 cpm (a delta of 50), while the Ba133 source reads around 1500 cpm (!), so we’re definitely detecting many more of the lower energy emissions. This should give a much better signal-to-noise ratio, and decrease the acquisition time required for collecting good data.

dsc_0465

The Ba133 source also comes as a sealed 25mm disc. I designed a sandwich mount for these source discs that contains between 3 and 6mm of lead shielding, depending on the angle, and a very rough approximation of a lead collimator with a 3mm hole drilled in the front to give some directionality to the source. Testing out a few angles, this appears to have brought the reading down to about 60 cpm at 15cm away, except for directly ahead, where the intensity is about 550 cpm at the same distance. Sounds great!

dsc_0477

Putting it all together

dsc_0482

I have to confess, I’m a bit of a late sleeper (and a night owl), but I was so excited about finally putting everything together and collecting the first data that I woke up early Saturday. After a marathon 13-hour build session, I finished designing and fabricating the source and detector mounts, and put the bore back together.

dsc_0482

With one of the bore covers removed, these pictures make it a little easier to see the complete linear axis mechanisms that are contained within the bore. You can thank my dad for discouraging my rampant hot glue use at a young age, and encouraging me to design things that were easily serviceable. I’ve given a few students the same talk when I see them wielding a hot glue gun for one of their projects… 😉

dsc_0488

openct2

Putting it all back together — looks beautiful!

And now, the data!

openct1

After the marathon build session, I took the very first data from the instrument — a quick absorption image straight up the center of this apple. The data was low resolution and noisy, but fantastic for a first result from the instrument.

open_ct_xerocraft2

A very tired, but very pleased person after collecting the first data off the scanner around 1am.

rsz_avocado_picture

I had some time Monday evening to write some basic firmware for collecting images, storing them to an SD card, specifying the size and resolution parameters, the integration time for the detector, and so forth. In probably one of the strangest things I’ve ever done, and feeling very much like Doc Brown, I went to the grocery store and found a few vegetables that have internal structure and might be interesting to scan. I decided to start with the avocado…

I’d previously determined empirically that the optimal integration time for this setup is about 90 seconds per pixel — that tends to give a stable count of around 550 cpm +/- 4 cpm. Lower integration times give noisier data, but are much quicker to scan.

The avocado is about 10cm by 12cm, and so to capture a first test image I set it to a 5mm resolution with a relatively fast 10 second integration time per point (bringing the total acquisition time to 20 x 24 x 10 seconds, or about an hour and twenty minutes).
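
As a quick sanity check on that arithmetic, here is a tiny helper (mine, not part of the scanner firmware) for estimating total scan time from the object size, pixel size, and integration time:

```python
def scan_time_hours(width_cm, height_cm, pixel_mm, dwell_s):
    """Total 2D absorption-image scan time for a raster of square pixels."""
    cols = round(width_cm * 10 / pixel_mm)
    rows = round(height_cm * 10 / pixel_mm)
    return cols, rows, cols * rows * dwell_s / 3600.0

# The avocado scan above: 10cm x 12cm at 5mm resolution, 10 s per point.
cols, rows, hours = scan_time_hours(10, 12, 5, 10)
print(f"{cols} x {rows} pixels, {hours:.2f} hours")   # 20 x 24 pixels, 1.33 hours
```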

avacado1b_log10

And it worked! The image is certainly a bit noisy (as expected), but it looks great. The table and the avocado are clearly visible, and the seed might also be in there, but we’ll need a bit higher integration time to see if that’s real structure, or just noise.

avocado_overlay1

Overlaying the scan atop the picture, the scan is a perfect fit!

avocado2_60sec

The integration time for the first image was only 10 seconds per pixel, and so I set up a longer scan with an integration time of 60 seconds per pixel. Beautiful! This still isn’t quite at the empirically determined sweet spot of 90 seconds, but it really cleaned up the noise compared to the first image.

avocado2_log10_60sec

The same data, with log scaling rather than linear scaling. I’m not entirely certain whether avocado pits are more or less absorptive to 31keV photons than the surrounding avocado, so it’s not clear whether we’re seeing lots of absorption at the center because of the seed, or because there’s 10cm of fruit between the source and detector…

rsz_dsc_0606

But I’d love to see some internal structure. So tonight I put the bell pepper on, which is about the same size as the avocado, and set the integration time to 20 seconds per pixel.

pepper1_20sec

And the result! It definitely looks like a bell pepper, and you can clearly see the seed bundle inside. Incredibly cool!

pepper1_log10_20sec

The same image, log scaled instead of linear scaled.

pepper_overlay1

And the overlay. Looks beautiful!

What a fantastic few days for the open source CT scanner, and the initial data looks great. There’s still plenty to do — now that the source and detector are working, I can finish designing the Arduino shield with four stepper controllers (two for the linear axes, one for the table, and one for the rotary axis). The source is also currently collimated in only the most liberal of senses, and in practice the detection volume for a given pixel is likely a pyramid that starts from the ~3mm source aperture and meets the ~1cm square detector — so the images should sharpen up a good deal with better control of the beam shape. Once all of that is working, and I add an accelerometer to the rotational axis to sense its angle, I should be able to scan from angles spanning 180 degrees around the sample, and test the instrument in computed tomography mode, backing out the internal structure of a given slice from a bunch of 1D images. Very exciting!

Thanks for reading!

Update on CT scanner: calibrating the radiation detector

rsz_20140118_164737

I thought I’d post a quick update on the open source CT scanner, especially given that there’s been a lot of interest lately. Life has been a little busy over the past few months — between paper writing in the lab, visiting home for the holidays, and my mom very recently passing away after her 18 year battle with cancer — so I haven’t been as good at writing updates as I’d planned.

Before getting to the CT scanner, I’d like to put together a proper update on the Mark 5 Science Tricorder shortly, but here’s a quick preview — in November I redesigned the Mark 5 from scratch to be less expensive, more modular, easier to modify, and quicker to program — and I call the result the Arducorder. The Mark 5 is now Arduino Due compatible, so the hundred thousand folks out there who can program an Arduino should be able to comfortably pick up and modify the code quickly, while others looking to start writing code for the Arduino could do so over a weekend using the large existing library of books and resources available. The sensor board now attaches using a standard 0.1 inch header, so one should be able to prototype new sensor boards quickly. The system includes a separate graphics processor (the FT800), so the graphics capabilities should be similar to something like a Game Boy Advance, and the motherboard also includes a WiFi module (the CC3000) for wireless connectivity. The boards arrived just as I flew back in from the holidays, and I’ve been building them and writing the drivers and firmware over the past few weekends.

A source and detector for the CT scanner
As I mentioned in my first post, for safety I’ve designed the CT scanner to use a radioisotope x-ray source that’s barely above background levels — the tradeoff for this safety being acquisition time. There are a number of low-intensity radioisotope check sources commercially available, each with different emission spectra. I ended up deciding to use a Cadmium-109 source, which has the lowest energy photons I could find — its primary emissions are 22keV x-rays, with a small secondary 88keV emission. I’ve read that 22keV photons are about 50% absorbed by 2cm of tissue, so this seemed like a usefully contrastive figure — too much or too little absorption and you’ll need many more samples to make a useful image, which increases acquisition time. Higher energy emissions from other radioisotopes (in the many hundreds of keV) might be useful for imaging metals, but are generally absorbed much less by the non-metallic materials that I’d like to image first (like vegetables).
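
As a quick back-of-envelope check (mine, not from the original post), the "50% absorbed by 2cm of tissue" figure pins down a simple Beer-Lambert attenuation coefficient, which gives a feel for how transmission falls off with thickness:

```python
import math

# Beer-Lambert attenuation: I = I0 * exp(-mu * x).
# Solving 0.5 = exp(-mu * 2cm) gives mu ~= 0.35 per cm for 22keV in tissue.
mu = math.log(2) / 2.0

for x_cm in [1, 2, 5, 10]:
    transmitted = math.exp(-mu * x_cm)
    print(f"{x_cm:2d} cm of tissue: {transmitted:5.1%} transmitted, "
          f"{1 - transmitted:5.1%} absorbed")
```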

The detector datasheet from the folks who make the Radiation Watch Type 5 also suggests that lower energy photons are much easier to detect, with a detection efficiency of around 30% for 22keV photons, dropping to about 1% for 88keV photons. (Note: further discussions suggest that they might be using a slightly different photodiode than the FSX100-7 2.0 in the Type 5, so the efficiency curve in the datasheet is likely somewhat different.) Functionally, this is like having a source that’s 30 times more intense — when every photon counts, if you can count them more efficiently, you can get by with much less exposure, and decrease the acquisition time.

The actual measured count from the Cd-109 source ended up being much less than I was expecting, and after some investigation it looked as though the detector was only measuring the 88keV emissions, which both account for only 4% of the total emissions, and are also detected much less efficiently — reducing the number of counts by a factor of about 2000. After chatting with the Radiation Watch folks, it looks as though the Type 5 is calibrated for Cesium-134 and 137, which emit higher energy photons, and would need to be recalibrated to sense the lower energy emissions of Cd-109.

rsz_20140118_164818

Measuring high energy photons is a constant battle between signal and noise, in part due to the extreme amplification required to sense a single subatomic particle and convert it into a voltage signal, many orders of magnitude larger, that can be detected by a microcontroller. To keep the signal-to-noise ratio favorable, the Type 5 has a detection threshold set somewhere above the noise floor of the amplifier circuit and below the signal produced when a high-energy Cesium photon hits the detector. To recalibrate the detector for the 22keV Cadmium-109 emissions, this threshold has to be lowered — but it can only be lowered so far, and if the peaks produced by the 22keV photons aren’t larger than the noise, then we’re out of luck and either have to choose a more sensitive detector or a source with higher energy emissions. Thankfully the folks who designed the Type 5 detector were good enough to include a test point, after the amplifier and before the threshold comparator, that can be used to see the raw signal and recalibrate if required.

20140118_165304

Above on the oscilloscope, the raw signal is pictured in blue (50mV/div), and the detector output (raw signal after the comparator) that’s fed into a microcontroller is pictured in yellow (2V/div). Here, an 88keV photon from the Cd-109 source has hit the detector, and we’re seeing the yellow line trigger. We can also see that the raw signal for this photon is about an order of magnitude above the noise floor — looking good.

20140118_165130

Here we see a lower energy emission from the Cd-109 source, likely in the 22keV range. These lower energy emissions are just barely above the +/-80mV noise floor, and setting the oscilloscope to trigger just above this, at 90mV, has the scope triggering frequently when the Cd-109 source is near the detector, and very rarely when the source is taken away. So, we’re just barely squeaking in, and appear to be able to measure the ~22keV photons just above the noise floor.

To make a quick ball-park measurement of the signal-to-noise ratio, I measured the number of detections (by hand) above 90mV with the Cd-109 source 10cm away from the detector, as well as the background rate (Cd-109 source in a shielded container), and with a few nearby materials between the source and detector.

  • The background rate is up from about 5 counts per minute (cpm) to around 15. These extra detections are likely a combination of actual lower-energy photons that the detector threshold was missing, as well as electrical noise.
  • The count with the source 10cm from the detector is 80 cpm. This is still less than I was expecting, but it’s very workable as a starting point.
  • With a 1/4 inch acrylic sheet between the source and detector, the detector measures 52 cpm — so it’s absorbing about 40% of the 22keV x-rays.
  • With a 1/8 inch MDF sheet between the source and detector (the same material that the prototype CT scanner is constructed out of), the detector measures 68 cpm — or about 20% absorption.
  • A bottle of water between the source and detector measures about 18 cpm, so nearly 95% absorption.

So, we’re looking very good, and these numbers should give useful contrast and interesting images for many small non-metallic objects.
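
Recomputing those figures with the background rate subtracted (a quick sketch in Python; the counts are the hand-tallied ball-park numbers from the list above, so treat the percentages as rough):

```python
# Background-subtracted transmission/absorption from the measured count rates.
background_cpm = 15
source_cpm = 80

measurements = {
    "1/4 inch acrylic": 52,
    "1/8 inch MDF": 68,
    "bottle of water": 18,
}

signal = source_cpm - background_cpm
for name, cpm in measurements.items():
    transmitted = (cpm - background_cpm) / signal
    print(f"{name}: {transmitted:5.1%} transmitted, "
          f"{1 - transmitted:5.1%} absorbed")
```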

Next Steps
Rather than modify the comparator on the detector circuit itself, I’ll put together a small board that connects to the raw test point and has a comparator with a precision potentiometer for calibration. This will make it a lot easier for folks to replicate, and it also reduces the chances of introducing noise into the detector, since the electrical shielding around the detector won’t have to be removed and replaced.

Because I’m seeing fewer detections than I’d expected, I’ll also make a small mechanical modification and move the source and detector from the outside of the gantry to the inside — reducing their distance from 30cm to somewhere in the 10-15cm range.

Once that’s complete, I should be able to acquire the first images of a few test targets!

Towards an inexpensive open-source desktop CT scanner

rsz_dsc_0803

A bit of a story, and then a lot of pictures — by far the most interesting class I’ve ever taken was Advanced Brain Imaging in grad school. As a hands-on lab class, each week we’d have a bit of a lecture on a new imaging technique, and then head off to the imaging lab, where one of the grad students would often end up in the Magnetic Resonance Imager (MRI) and we’d see the technique we’d just learned about demonstrated. Before the class I was only aware of the structural images that most folks think of when they think of an MRI, as well as the functional MRI (or fMRI) scans that measure blood oxygenation levels correlated with brain activity and are often used in cognitive neuroscience experiments. But after learning about Diffusion Tensor Imaging, spin-labeling, and half a dozen other techniques, I decided that the MRI is probably one of the most amazing machines that humans have ever built. And I really wanted to build one.

MRI is a spatial extension to nuclear magnetic resonance spectroscopy (NMR), and requires an extremely homogeneous high-intensity magnetic field to function — far more uniform than you can achieve with permanent magnets or electromagnets. For MRI, this uniformity is often accomplished using a superconducting magnet that’s cooled to near absolute zero using liquid helium. This, of course, makes it extremely technically difficult to make your own system. While folks have been able to use large electromagnets for NMR (they average out the magnetic field intensity over the sample by spinning the sample very rapidly while it’s inside the magnet), I haven’t seen anyone demonstrate building an imaging system using an electromagnet. There are some experimental systems that try to use the Earth’s magnetic field, but the few systems I’m aware of are very low resolution, and very slow.

Volumetric biological imaging has two commonly used tools — MRI and Computed Tomography (CT), sometimes also called Computed Axial Tomography (“CAT”) scanning — although ultrasound, EEG, and a bunch of other techniques are also available. Fast forward about two years from my brain imaging class (to about three years ago): I had started my first postdoc, and happened to be sitting in on a computational sensing / compressed sensing course.

rsz_1dsc_0692

About the same time I happened to be a little under the weather, and stopped into a clinic. I thought I’d torn a muscle rock climbing, but after examining me the doctor at the clinic thought that I might have a serious stomach issue, and urged me to visit an emergency room right away. As a Canadian living abroad, this was my first real contact with the US health care system, and as exciting as getting a CT was (from the perspective of being a scientist interested in medical imaging), from a social perspective it was a very uncomfortable experience. Without really going into details or belaboring the point, universal health care is very important to me, and (what many consider) a basic human right that most folks in the developed world have access to. My mom was diagnosed with cancer when I was young, and we spent an awful lot of time in hospitals. She and my dad still do, after 15 years and more surgeries than anyone can count. It’s frightening to think of where we’d all be if her medical care wasn’t free. And so when a bill showed up a month or so after my emergency room visit for nearly $5,000 (most of which was covered by a health insurance company), I nearly needed a second trip to the emergency room, and I thought a lot about the many folks I knew, including my girlfriend at the time, who didn’t have any form of health insurance and basically couldn’t go to the doctor when they were ill for fear of massive financial damage.

With all of this in mind, knowing the basics of medical imaging, and having just discussed computed tomography and the Radon transform in the class I was sitting in on, I decided that I wanted to try and build an open source CT scanner, and to do it for a lot less than the cost of me getting scanned, by using rapid prototyping methods like laser cutting and 3D printing.

It’s been a few years since I’ve had access to a laser cutter, and they’re one of my favorite and most productive rapid prototyping tools. In the spirit of efforts like the Reprap project, I enjoy exploring non-traditional approaches to design, and designing machines that can be almost entirely 3D printed or laser cut. Fast-forward almost two and a half years to last month, and the local hackerspace happened to have a beautiful laser cutter generously donated. This is the first cutter I’ve had real access to since grad school, and with the CT scanner project waiting for a laser cutter and a rainy day for nearly two years, I immediately knew what I wanted to have a go at designing. On to the details.

rsz_dsc_0732

From a high-level technical standpoint, a computed tomography or CT scanner takes a bunch of absorption images of an object (for example, x-ray images) from a variety of different angles, and then backs out 3D volumetric data from this collection of 2D images. In practice, this is usually done one 2D “slice” at a time: first by rotating an x-ray scanner around an object, taking a bunch of 1D images at tens or hundreds of angles, and then using the inverse Radon transform to compute a given 2D slice from this collection of 1D images. One can then inspect the 2D slices directly to see what’s inside something, or stack the slices to view the object in 3D.

rsz_dsc_0795

Mechanically, this prototype scanner is very similar to the first generation of CT scanners. An object is placed on a moving table that goes through the center of a rotating ring (or “gantry”). Inside the ring there’s an x-ray source, and on the other side a detector, both mounted on linear stages that can move up and down in unison. To scan an object, the table moves the object to the slice of interest, the gantry rotates to a given angle, then the source and detector scan across the object to produce a 1D x-ray image. The gantry then rotates to another angle, and the process repeats, generating another 1D image from a slightly different angle. After generating tens or hundreds of these 1D images from different angles, one backs out the 2D image of that slice using the inverse Radon transform. The table then moves the object slightly, and the process is repeated for the next slice, and for the hundreds of other slices that are often taken in a medical scan. Modern scanners parallelize this task by using a fan-shaped beam of x-rays and hundreds of simultaneous detectors to scan someone in about a minute, but the first generation of scanners could take several minutes per slice, meaning a scan with even tens of slices could take an hour or more.
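
For anyone who’d like to play with the reconstruction step before building any hardware, scikit-image’s radon and iradon functions make it easy to simulate: project a test image into a set of 1D absorption profiles at different angles, then back out the slice with filtered back projection (the inverse Radon transform). A minimal sketch, using the standard Shepp-Logan phantom in place of a real scan:

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Simulate 1D projections ("sinogram") of a test slice, then reconstruct it.
image = rescale(shepp_logan_phantom(), 0.5)       # smaller image, faster demo
angles = np.arange(0.0, 180.0, 10.0)              # one projection every 10 degrees

sinogram = radon(image, theta=angles)             # forward projections (1D scans)
reconstruction = iradon(sinogram, theta=angles)   # filtered back projection

error = np.sqrt(np.mean((reconstruction - image) ** 2))
print(f"{len(angles)} angles, RMS reconstruction error: {error:.3f}")
```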

rsz_dsc_0703

Designing an almost entirely laser-cuttable CT scanner with four axes of motion, one being a large rotary gantry, was a lot of fun and an interesting design challenge. I decided that a good way to rotate the gantry would be to design it as a giant cog that sits atop a system of drive and idler cogs that can slowly index it to any angle.

rsz_dsc_0762

One of the issues with laser cutting a giant cog is finding something to mate with it that can transfer motion. I’ve press-fit laser cut gears onto motor shafts before (like with the laser cut linear CNC axis), but in my experience they can slip or wear rather quickly, and I like being able to disassemble and reassemble things with ease. I decided to try something new, and designed a laser-cuttable 2.5D timing pulley that mates with the main rotary cog, and securely mounts on a rotary shaft using a captive nut and set screw. On either side of the shaft there’s space for a bushing that connects to the base, and inside the base there’s a NEMA17 stepper from Adafruit that transfers motion to the drive shaft using a belt and timing pulleys.

rsz_dsc_0723

A small lip on the base acts as the other edge of the timing pulley, and helps keep the main rotary axis aligned.

rsz_dsc_0747

Inside the rotary gantry are two linear axes 180 degrees apart — one for the source and the other for the detector. The gantry is about 32cm in diameter, with the bore about 15cm, and the gantry itself is about 8cm thick to contain the linear axes.

rsz_dsc_0751

Each linear axis has a small carriage that contains mounts for either the source or detector, some snap bushings for two aluminum rails, and a compression mount for the timing belt. Each axis also has an inexpensive NEMA14 stepper and an idler pulley. Here, I’m using a very small solid state high-energy particle detector called the Type-5 from Radiation Watch, which can be easily connected to an external microcontroller. This is really very easy to work with, and saves me from having to use a photomultiplier tube and scintillation crystal that I found on eBay from an old decommissioned PET/CT scanner.

rsz_dsc_0752

I’m certain if the symmetry were any more perfect, it would move one to tears. The rotary gantry has to be symmetric to ensure proper balance and smooth rotation. After rotating the gantry 180 degrees, here you can see the other linear axis, intended for the source. It currently just contains a 4-bolt mounting pattern that a source will eventually attach to.

Safety is very important to me. In medical diagnostic imaging it’s often important to have an image as soon as possible, but that’s not the case for scanning non-living objects purely for scientific or educational interest. This chart from xkcd shows the radiation that folks typically absorb from everyday adventures, from banana-eating and sleeping beside someone, to hopping on planes or having a diagnostic x-ray. I’ve designed this scanner to operate at levels slightly above the natural background level, well into the blue (least intense) section of the xkcd graph, and to make use of a “check source”, which is an extremely low intensity source used to verify the functionality of a high-energy particle detector. The trade-off for this safety is acquisition time, and it will likely take a day or more to acquire data for even a small object. This aspect of the design is scalable, such that if the scanner were to be used in a research environment in a shielded room, folks braver than I should be able to acquire an image a good deal faster.

rsz_dsc_0758

The sandwich of four plates on either end of the linear axes contain precision mounts for the aluminum shafts, as well as a setscrew with captive nut to hold the shafts in place.

rsz_dsc_0719

The table itself is about 40cm long, and offers nearly 30cm of travel. It uses a light-weight nylon lead screw to index the table, with a NEMA14 drive motor located in the base.

rsz_dsc_0779

To test out the motion and detector, I put together an Arduino shield with a few Pololu stepper controllers and a connector for the detector. The Seeed Studio prototype board I had on hand only had space for three stepper controllers, but it was more than enough to test the motion. Each axis runs beautifully — I was sure the rotational axis was going to have trouble moving smoothly given that most of the moving parts were laser cut, but it worked wonderfully on the first try, and moves so fast I had to turn down the speed lest the neighbours fear that I was building a miniature Stargate…

When I solidify all the bits that have to be in the controller, I’ll endeavor to lay out a proper board much like this prototype, but with four stepper controllers, and an SD card slot to store the image data for long scans.

rsz_dsc_0772

For size, here you can see the Arduino and shield together on the scanning table. I’m hoping to start by scanning a carrot, move up to a bell pepper (which has more non-symmetric structure), and work up to an apple. Since time on commercial machines is very expensive, I think one of the niche applications for a tiny desktop CT scanner might be in time-lapse scans of slowly moving systems. If the resolution and scan speed end up being up to the task, I think it’d be beautiful to plant a fast-sprouting seed in a tiny pot and continually scan it over a week or two to build a 3D volumetric movie of the plant growing, from the roots spreading through the pot, to the stalk shooting up and unfurling its first leaves. I’m sure the cost of generating that kind of data on a medical system would be astronomical, whereas the material cost of this prototype is in the ballpark of about $200, although I’m expecting that a source will add about $100 to that figure.

1264585_4716534411740_673623925_o

And finally, here’s a quarter-size acrylic prototype that I designed and cut in an afternoon a few weekends ago, which kicked off the build and brainstorm process. My recently adopted rescue cat ironically loves to hang around the “cat” scanner, and has claimed nearly all of the open mini spectrometers I’ve built as toys to bat around…

Laser cutters are really amazing machines, and it’s really incredible to be able to dream up a machine one morning, spend an afternoon designing it, and have a moving functional prototype cut out and assembled later that evening that you can rapidly iterate from. Since laser cutters are still very expensive, this work wouldn’t have been possible without kind folks making very generous donations to my local hackerspace, and I’m extremely thankful for their community-minded spirit of giving.

thanks for reading!