Thursday, March 30, 2006

Quantum Dots – Lighting the Way

Quantum dots are small devices fabricated with semiconductor techniques to confine a precise number of electrons. Because these electrons are confined, they exhibit the “quantum” characteristics found in atomic systems. Quantum dots are useful for studying atomic properties and are used in biological and genomic experiments in particular.

Quantum dots range in size from 2 to 10 nm; at that scale, roughly 4 million of them placed side by side would span just 1 cm, less than the width of a penny. Here’s a short tutorial. Also, Wikipedia provides additional background information, including fabrication and applications.
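A quick back-of-the-envelope check of that scale claim (assuming a mid-range dot diameter of 2.5 nm):

```python
# How many quantum dots, placed side by side, span 1 cm?
# Assumes a mid-range dot diameter of 2.5 nm (the range is 2-10 nm).
dot_diameter_m = 2.5e-9
line_length_m = 1e-2  # 1 cm

dots_per_cm = line_length_m / dot_diameter_m
print(f"{dots_per_cm:.0e} dots span 1 cm")  # -> 4e+06
```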

Quantum dots emit light and can be tuned to emit at a particular wavelength over a broad range of the spectrum, from the infrared through the visible and into the ultraviolet. As a dot becomes smaller, its emission wavelength shortens and the color shifts toward blue. One can make LEDs from quantum dots. Michael Bowers of Vanderbilt made quantum dots so small that their fluorescent properties changed: dots designed to fluoresce blue instead emitted full-spectrum white light when excited by a laser. That white light has applications in replacing standard incandescent bulbs (an electrical/thermal process) with a completely electrical process, which would generate significant energy savings.
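The size-to-color relationship can be estimated with the Brus equation, which adds a quantum-confinement term (growing as 1/R²) to the bulk band gap. This is only an illustrative sketch; the material parameters below are textbook values for CdSe, an assumption on my part since no particular material is specified above:

```python
import math

# Physical constants (SI)
H = 6.626e-34         # Planck's constant, J*s
M0 = 9.109e-31        # electron rest mass, kg
E_CHARGE = 1.602e-19  # elementary charge, C
EPS0 = 8.854e-12      # vacuum permittivity, F/m
C = 2.998e8           # speed of light, m/s

# Textbook CdSe parameters (assumed for illustration)
EG = 1.74 * E_CHARGE            # bulk band gap, J
ME, MH = 0.13 * M0, 0.45 * M0   # electron / hole effective masses
EPS_R = 10.6                    # relative permittivity

def emission_wavelength_nm(radius_m):
    """Brus equation: the effective gap widens as the dot shrinks."""
    confinement = (H**2 / (8 * radius_m**2)) * (1 / ME + 1 / MH)
    coulomb = 1.8 * E_CHARGE**2 / (4 * math.pi * EPS_R * EPS0 * radius_m)
    energy = EG + confinement - coulomb
    return H * C / energy * 1e9

# Smaller dot -> shorter wavelength (bluer), as described above.
print(emission_wavelength_nm(2e-9))  # roughly 490 nm (blue-green)
print(emission_wavelength_nm(4e-9))  # roughly 650 nm (red)
```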

Companies operating with this emerging technology include Evident Technologies, and Quantum Dot Corporation.

As with many emerging technologies, there’s a conference dedicated to the topic.

Virtual instrumentation brings data acquisition, imaging, and control to the field of quantum dots. Johns Hopkins researchers use virtual instrumentation to build a dual-wavelength emission confocal spectroscopic system to measure targets. MIT researchers used quantum dots with a near-infrared approach for biomedical imaging in living tissue. The Centre for Micro-Photonics at Swinburne University built virtual instrumentation-based systems with LabVIEW for nanopositioning and photon counting in quantum-dot applications.

Best regards,
Hall T.

Friday, March 24, 2006

fMRI—Telescope for the Brain

MRI (Magnetic Resonance Imaging) has taken great leaps forward in the past few years. Today it has expanded into fMRI (functional Magnetic Resonance Imaging), which maps brain activity: a technique for determining which parts of the brain are activated by different physical sensations or activities. According to this brief tutorial, a subject is placed inside the MRI machine and a high-resolution scan of the brain is taken. The subject is then given stimulation (e.g., shown images), followed by a low-resolution scan. By comparing low-resolution scans taken both with and without the stimulation, fMRI can detect brain activity. fMRI can also correct for head movements and other distortions of the image. The low-resolution images are then overlaid on the high-resolution image to create a final picture.
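A minimal sketch of that comparison step, using simulated data (the numbers and voxel layout are invented for illustration): each voxel's values under rest and under stimulation are compared with a two-sample t-statistic, and the activated voxel stands out.

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_scans = 50, 20

# Simulated low-resolution scan values: baseline noise everywhere,
# plus an activation signal at one voxel during stimulation.
rest = rng.normal(0.0, 0.5, size=(n_voxels, n_scans))
task = rng.normal(0.0, 0.5, size=(n_voxels, n_scans))
active_voxel = 17
task[active_voxel] += 2.0

# Two-sample t-statistic per voxel (equal-variance form).
diff = task.mean(axis=1) - rest.mean(axis=1)
pooled = np.sqrt((task.var(axis=1, ddof=1) + rest.var(axis=1, ddof=1)) / n_scans)
t_map = diff / pooled

# The voxel with the strongest stimulus response dominates the map.
print(int(np.argmax(t_map)))
```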

fMRI measures the blood-flow changes that occur in the brain in response to stimulated activity. At the atomic level, the nuclei of hydrogen atoms in the blood behave as magnetic dipoles and react in the presence of a magnetic field such as the one used in an MRI machine. fMRI has the advantage of making non-invasive measurements in a relatively short amount of time, a gentler procedure than most electrophysiological data acquisition systems, which require electrodes, wires, and more.

Each part of the brain controls a specific function. Here’s an example of site locations for sensory, motor, language, and vision.

In this Wired article, Columbia University researchers use fMRI to map brain activity in an effort to detect when someone is lying. Activity in the caudate region of the brain, which is involved in emotion, becomes engaged when a subject is lying but remains neutral when a subject is telling the truth.

In another application, University of Michigan and Harvard researchers used fMRI to map how the brain perceives brands and products. This seems like an odd application, but it is well within the technology’s reach.

Researchers are also using it to diagnose and treat bipolar disorder.

The technology of magnetic resonance imaging comes in several forms, and Wikipedia covers the related MRI technologies. Contrast MR uses a contrast agent such as iron oxide, which causes a change in the magnetic field; this technique can be used to measure properties of the blood. Another is magnetic resonance spectroscopic imaging, which uses a nuclear magnetic resonance technique to measure the magnetic properties of atomic nuclei. Different chemicals have different magnetic resonances.

One can build or supplement a magnetic resonance system with virtual instrumentation. Researchers at Battelle Memorial Institute built their own magnetic resonance microscope, combining a confocal microscope with an MRI system to allow the study of living cells from two points of view simultaneously. The confocal microscope provides a 3-D image of the structure of the cell, while the MRI provides the magnetic imaging. Combining both images yields new information about the cell.

Best regards,
Hall T.

Friday, March 17, 2006

SQUID Technology – Measuring Magnetism

SQUID stands for Superconducting Quantum Interference Device. A SQUID is a sensitive detector of magnetic flux; the sensor operates at cryogenic temperatures, where it can measure extremely small magnetic fields. Any low-frequency signal that can be converted into a magnetic flux can be measured with a SQUID.
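The scale of “extremely small” comes from the magnetic flux quantum, Φ₀ = h/2e, the natural unit a SQUID counts in. A quick illustrative calculation (the 1 mm² pickup-loop area and the one-thousandth resolution figure are assumptions for the example):

```python
# Magnetic flux quantum, and the magnetic field it corresponds to
# over an assumed 1 mm^2 pickup loop.
H_PLANCK = 6.62607015e-34   # Planck's constant, J*s
E_CHARGE = 1.602176634e-19  # elementary charge, C

phi0 = H_PLANCK / (2 * E_CHARGE)  # flux quantum, ~2.07e-15 Wb
loop_area = 1e-6                  # 1 mm^2 expressed in m^2

# A SQUID resolves a small fraction of one flux quantum; even a
# thousandth of phi0 over 1 mm^2 is only about 2 picotesla.
field_per_milliphi0 = 1e-3 * phi0 / loop_area
print(f"{field_per_milliphi0:.2e} T")  # ~2.07e-12 T
```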

Biomedical applications dominate the field of SQUIDs, with companies such as Tristan Technologies, Biomagnetics Technology, CTF Systems, and Neuromag leading the way in multichannel SQUID systems for imaging the brain.

SQUID technology’s commercial history dates back to the 1970s, when SHE (Superconducting Helium Electronics), a company founded by John Wheatley and his colleagues, pioneered the work. That effort was followed by a cottage industry of startups in California. Japan then entered the game and, through a MITI-funded consortium, pushed the technology to a new level.

Tristan Technologies is a pioneer in SQUID technology, dating back to the founding days of the field. Its applications for SQUID technology include materials analysis, biomagnetic measurements, and non-destructive testing. Its noninvasive magnetic imaging system maps brain activity in infants who are at risk for cerebral palsy and epilepsy.

SQUID technology can be configured to work as a microscope for such applications as viewing integrated circuits and ultra-thin magnetic films. It can also be used for biomagnetism; here’s an application measuring the condition of the heart.

The Industrial Physicist, one of my favorite publications, has a more detailed article on the subject.

Best regards,
Hall T.

Friday, March 10, 2006

FPGAs – Driving Computing, Military, Medical, Electronics, Wireless and Many More Applications

FPGAs are increasingly used for high-performance computing applications. In this article the author notes that FPGAs change the fundamental programming methodology by tying each programming instruction to its own node on the FPGA, thus shifting the machine from a sequential processing system to a parallel one. Christopher Lazon said,

“Adding FPGAs change the characteristics of a supercomputer, fundamentally. As the FPGA is reconfigurable one is able to design a non von Neumann network topology, specific to the algorithm with one cycle latency. The algorithm is programmed at a very fine-grain parallelism, a single instruction on each node, each node adapted to run its instruction. The architecture transforms sequential instruction scheduling into parallel packet switching.”

Another example is Intel, where researchers are using FPGAs to build a simulator system by placing 1,000 processors in forty FPGA units. The size, power consumption, and cost are far below those of a traditional supercomputer architecture; for this configuration, a traditional architecture would cost $2M while the FPGA version costs $100K.

FPGAs (Field Programmable Gate Arrays) offer tremendous promise to the field of data acquisition, a key technology in the world of virtual instrumentation. Chips with increased A/D sampling rates are matched by the increased capabilities of FPGAs, offering improved real-time performance. In this article the author outlines the advantages of pairing an A/D converter with an FPGA for real-time processing.
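A sketch of the kind of streaming computation that pairing implies: the logic sees one A/D sample per clock and must update its output immediately. The software model below mimics that with a running-average filter (illustrative only; a real design would be fixed-point HDL, not Python):

```python
from collections import deque

def make_streaming_average(window):
    """Return a step() function modeling a moving-average filter
    that processes one A/D sample per 'clock tick', as FPGA logic would."""
    buf = deque([0.0] * window, maxlen=window)
    total = 0.0

    def step(sample):
        nonlocal total
        total += sample - buf[0]  # add newest sample, drop oldest
        buf.append(sample)
        return total / window     # output is ready every tick

    return step

step = make_streaming_average(4)
outputs = [step(x) for x in [1, 2, 3, 4, 5]]
print(outputs)  # [0.25, 0.75, 1.5, 2.5, 3.5]
```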

FPGAs are also used in high-end imaging systems, such as medical imaging. Pixel Velocity offers an FPGA-based imaging system with very low NRE costs. It takes some effort to build up the infrastructure in an FPGA to support all the peripherals required, but once established, it becomes highly leveraged by letting the designer easily add new features. The other challenge is the programming aspect. Again, an infrastructure needs to be built that allows for additional modifications. Once in place, the designer reaps the benefits.

FPGAs are increasingly used in consumer electronics where components must be low-cost. Initially used as glue logic, FPGAs are now used for the core logic.

In the military, FPGAs find use in Software Defined Radios by combining both IF (intermediate frequency) and baseband processing into one processor.
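The IF-plus-baseband idea can be sketched in software: digital downconversion multiplies the sampled IF signal by a complex oscillator at the IF frequency, leaving the baseband content plus a double-frequency term that a low-pass stage removes. The sample rate and frequencies below are arbitrary illustrative choices:

```python
import numpy as np

fs = 1_000_000   # sample rate, Hz (illustrative)
f_if = 100_000   # intermediate frequency, Hz (illustrative)
n = 1000
t = np.arange(n) / fs

# IF signal as captured by the A/D converter.
if_signal = np.cos(2 * np.pi * f_if * t)

# Mix down with a complex local oscillator at the IF frequency.
mixed = if_signal * np.exp(-2j * np.pi * f_if * t)

# Crude low-pass filter: averaging over many periods removes the
# 2*f_if term, leaving the expected baseband amplitude of 0.5.
baseband = np.mean(mixed)
print(round(abs(baseband), 3))  # 0.5
```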

In the communications area, cellular, broadband internet wireless, and data security are driving the use of FPGAs for the complex front end processing.

For those engineers trying to make the jump to FPGAs, here’s an interesting article that highlights the differences between ASIC design and FPGA-based design. In ASIC design, the cost of a design turn drives verification and testing, while an FPGA lets the designer iterate toward a working function without the associated cost of creating new masks.

Best regards,
Hall T.

Friday, March 03, 2006

Digital Signal Processing – Going Embedded, Real-time, and Multi-Core

Digital Signal Processing impacts numerous industries and applications, including audio, video, consumer electronics, multimedia, speech recognition, biological systems, wireless, telephony, broadband communications, and more. Application examples abound from ADI, TI, and Xilinx, to name a few. Xilinx teamed up with Pixel Technology to deliver 3-D facial recognition with 1000x the performance of traditional processors by using a stacked architecture of FPGA and DSP processing units. TI is targeting WiMAX with its DSP platform. Chipwrights uses a vector architecture to process several data streams at once for image processing on hand-held devices.

One of the main trends in DSP is the shift to embedded applications. General purpose DSP is dominated by the cellular industry with China currently driving the demand. Embedded DSP is driven by the consumer electronics industry and has a broader reach with about 2x the volume. The DSP-FPGA.COM site is a good source of information and has a news release on the topic.

Another trend is the shift from single to multi-core processors. This article describes how this shift is happening not only in microprocessors but is a natural fit for DSPs as well.
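The single-to-multi-core shift maps naturally onto DSP workloads because many of them split into independent streams. A toy sketch (the per-channel "work" here is just an energy sum, chosen for brevity), with the parallel result checked against a sequential run:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def channel_energy(samples):
    """Stand-in for per-channel DSP work: total signal energy."""
    return float(np.sum(samples ** 2))

# Four independent channels, e.g. from a multi-channel digitizer.
channels = [np.sin(2 * np.pi * k * np.linspace(0, 1, 1000))
            for k in range(1, 5)]

# Farm each channel out to its own worker, one per core.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(channel_energy, channels))

sequential = [channel_energy(c) for c in channels]
print(parallel == sequential)  # True: same answers, computed concurrently
```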

Other trends include DSPs working with real-time systems. This article describes several strategies for handling real-time data flows, including polling a status bit to see whether new data has arrived, using interrupts, and stacking up the I/O for follow-on processing.
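The polling strategy can be sketched with a toy stand-in for the converter's status and data registers (the `AdcMock` class is hypothetical, purely for illustration):

```python
class AdcMock:
    """Toy model of an A/D converter's status bit and data register."""
    def __init__(self, samples):
        self._pending = list(samples)

    @property
    def data_ready(self):   # the status bit the DSP polls
        return bool(self._pending)

    def read(self):         # each read pops one sample
        return self._pending.pop(0)

def poll_all(adc):
    """Poll the status bit, stacking up I/O for follow-on processing."""
    stacked = []
    while adc.data_ready:
        stacked.append(adc.read())
    return stacked

print(poll_all(AdcMock([3, 1, 4, 1, 5])))  # [3, 1, 4, 1, 5]
```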

If you are unfamiliar with DSPs, you can find a tutorial here.

Two resources for more information are the Global Signal Processing Times and DSP-FPGA.COM.

Best regards,
Hall T.