Friday, November 30, 2007

ITER – “The Way”

ITER (originally an acronym for the International Thermonuclear Experimental Reactor, but now also taken as the Latin word for “the way”) is an international research consortium that seeks to develop fusion as a source of energy. Fusion is the process in which two atomic nuclei ‘fuse’ together and in doing so release enormous amounts of energy.

ITER will be built at Cadarache in the south of France, but joint work will continue in Germany and Japan. The ITER device will follow a tokamak-style design: a torus-shaped device that confines a hot plasma with magnetic fields. A complete technical documentation list can be found here.

The controls and data acquisition system, CODAC, will sample data at around 5 MHz, stream data at 200 MB/s from 4,000 channels, and save up to 60 TB of data per year.

According to Joe Lister, ITER seeks a new level of reliability in an increasingly regulated nuclear power industry. The goal is 98.65% uptime over a 6-year period. They plan to have 3,000 analog data points for feedback, with a 1 to 5 ms sensor-to-actuator response time. IEEE 1588 (the Precision Time Protocol) is under consideration for timing applications.
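To put that uptime target in perspective, a quick back-of-the-envelope calculation (illustrative numbers only, using an average 365.25-day year) shows how much downtime 98.65% availability actually permits over six years:

```python
# Back-of-the-envelope check of a 98.65% availability target
# over a 6-year operating period.

HOURS_PER_YEAR = 365.25 * 24           # 8,766 hours on average
period_hours = 6 * HOURS_PER_YEAR      # ~52,596 hours in 6 years
availability = 0.9865

allowed_downtime_hours = period_hours * (1 - availability)
print(round(allowed_downtime_hours))   # roughly 710 hours of downtime
```

That is about a month of total outage across six years, which gives a feel for why the control system design emphasizes redundancy.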

There are nearly 120 unique plant systems that must be controlled using a “classical” distributed control design model. The CODAC system is currently in the design stage, with operations planned to start in 2016. CODAC is based on three levels of control, including Interlock and Safety subsystems. It defines a set of XML schemas common to all plant systems, which allows the system to be data driven. One of the challenges is coordinating systems from three separate PLC vendors.
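The data-driven idea is that the supervisory layer learns each plant system's channels from an XML description rather than hard-coded tables. A minimal sketch of that pattern, with an invented schema (the real CODAC schemas are defined by the ITER project):

```python
# Sketch of a data-driven plant-system description: the channel table
# is built entirely from the XML document. Element and attribute names
# here are invented for illustration.
import xml.etree.ElementTree as ET

PLANT_XML = """
<plantSystem name="cryoplant" vendor="PLC-A">
  <channel id="CR-TT-001" type="analog" unit="K" rateHz="10"/>
  <channel id="CR-PV-014" type="analog" unit="bar" rateHz="10"/>
  <channel id="CR-ZS-002" type="digital" unit="" rateHz="1"/>
</plantSystem>
"""

def load_channels(xml_text):
    """Build a channel table from a plant-system XML description."""
    root = ET.fromstring(xml_text)
    return [
        {"id": ch.get("id"),
         "type": ch.get("type"),
         "rate_hz": float(ch.get("rateHz"))}
        for ch in root.findall("channel")
    ]

channels = load_channels(PLANT_XML)
print(len(channels))   # 3 channels discovered from the XML alone
```

Because every plant system publishes the same schema, the same loader works for all of them regardless of which PLC vendor is behind the interface.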

Best regards,
Hall T.

Friday, November 23, 2007

Particle Physics Experiments—The Global Design Effort (GDE) of the ILC

As the International Linear Collider continues to define its roadmap, Barry Barish just released the GDE – Global Design Effort – report outlining the stages of design, development, and deployment. The report also assigns a “value” to each component and research stage so contributing scientists know how each part relates to the others and what value it offers. For example, each of the RF cavities used to propel the particles is 1.3 m long, and more than 14,000 of them will be needed. In addition, there will be 560 RF units to generate the RF signals needed by the system. The entire system adds up to 6,500 “value units,” of which instrumentation and controls make up 7%.

The full reference design can be seen here. In addition, there’s a collaboration project with DESY called the Tesla Technology Collaboration whose goal is to facilitate collaboration on accelerator/collider projects through work groups.

The GDE indicates they will follow an SCRF, or Superconducting Radio Frequency, approach, which aims to control the frequency, phase, and amplitude of the RF electric field in the superconducting cavity. The use of FPGAs is a key component of the instrumentation. The Penn Instrumentation Group develops front-end electronics for accelerator detectors and other control systems.
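In such low-level RF systems, the cavity field is typically measured as an in-phase/quadrature (I, Q) pair, from which the controller derives amplitude and phase errors to correct. A minimal sketch of that measurement step (real systems run this inside an FPGA at MHz rates; the numbers below are invented):

```python
# Sketch of the amplitude/phase extraction an RF field controller
# performs on an (I, Q) cavity-field measurement.
import math

def amplitude_phase(i, q):
    """Convert an (I, Q) field measurement to amplitude and phase (rad)."""
    return math.hypot(i, q), math.atan2(q, i)

def field_errors(i, q, amp_set, phase_set):
    """Amplitude and phase errors the feedback loop would drive to zero."""
    amp, phase = amplitude_phase(i, q)
    return amp_set - amp, phase_set - phase

# e.g. a measured field of (I, Q) = (0.6, 0.8) against a setpoint of
# unit amplitude at zero phase:
amp_err, phase_err = field_errors(0.6, 0.8, 1.0, 0.0)
print(round(amp_err, 3), round(phase_err, 3))  # 0.0 -0.927
```

The amplitude is on target here, so the loop would act only on the phase error.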

Best regards,
Hall T.

Friday, November 16, 2007

Particle Physics & Their Control Systems--ILC

Particle physics is the study of the elementary constituents of matter and energy. These elementary particles may be created through collisions produced by particle accelerators. There are many sites dedicated to particle physics, most of which reside in North America or Europe.

The International Linear Collider (ILC) is an electron-positron collider that complements the Large Hadron Collider (LHC) based at CERN. The LHC will perform experiments that will guide the ILC in follow-up experiments, which it can perform with greater precision. The ILC consists of two linear accelerators that face each other, a design that produces greater collision energies for atom smashing. The ILC uses superconducting accelerator cavities to send electrons and positrons at nearly the speed of light. The system is nearly 35 kilometers in length and produces collision energies of 500 giga-electron-volts (GeV).

They recently completed the Reference Design Report phase and are now entering the Engineering Design Report phase. They recently made a call to the ILC engineering community for expressions of interest (EOI) to work on the control systems. You can see the list of projects here.

While the project is admirable, we must keep in mind that the project is not yet funded. Funding is contingent on results of the LHC at CERN and international commitment to the ILC project.

Best regards,
Hall T.

Friday, November 09, 2007

Light Sources—New Sources of Light to Illuminate Sub-Atomic Particles

Light sources are extremely concentrated x-rays and ultraviolet radiation, typically generated by a synchrotron, used to illuminate subatomic structures. There are almost seventy sites working on light-source applications around the world. Synchrotrons are used to generate these specific wavelengths of light.

According to Scott Hannahs of the National High Magnetic Field Lab, a pulse of charged particles, usually electrons, is generated and accelerated with electric fields, and the beam is bent by magnets (which deflect moving electric charges). The bending of the beam causes the emission of light.

A synchrotron uses an electron source to generate a pulse of particles that is accelerated to 99% of the speed of light. The pulse creates x-rays, which shoot through a sample; the resulting scattering forms an image of the atomic structure of the sample. If you want to hear a nine-year-old explain how a synchrotron works, check out this video.

A typical experiment involves spectroscopy, scattering, or imaging. Spectroscopy studies the energies of the particles emitted by samples exposed to the light-source beam. Scattering (a diffraction technique) looks at the patterns of light produced when the x-rays deflect off the atomic structure of the sample. Imaging techniques use the light-source beam to capture pictures of the samples.
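The scattering geometry in those diffraction experiments follows Bragg's law, n·λ = 2·d·sin(θ). A quick sketch of solving it for the diffraction angle; the d-spacing and wavelength below are example values, not tied to any particular beamline:

```python
# Bragg's law: n * wavelength = 2 * d * sin(theta).
# Solve for the diffraction angle theta.
import math

def bragg_angle_deg(wavelength, d_spacing, order=1):
    """Diffraction angle theta (degrees) from Bragg's law."""
    return math.degrees(math.asin(order * wavelength / (2 * d_spacing)))

# e.g. 1.54-angstrom x-rays on a crystal plane with d = 3.1356 angstroms
theta = bragg_angle_deg(1.54, 3.1356)
print(round(theta, 1))   # ~14.2 degrees
```

Knowing the wavelength and measuring the angles of the diffraction pattern lets the experimenter work backward to the atomic spacings in the sample.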

To learn more about the instrumentation used in synchrotrons, I turned to Wikipedia, which breaks the instrumentation into the following categories:

Beamline - the line in a linear accelerator along which a beam of particles travels.

Collimator - a device that filters a stream of rays so that only those traveling parallel to a specified direction are allowed through.

Crystal monochromator - selects a defined wavelength of the radiation.

Insertion device - produces highly brilliant, forward-directed, quasi-monochromatic synchrotron radiation.

Undulator - an insertion device from high-energy physics, usually part of a larger installation such as a synchrotron storage ring. It consists of a periodic structure of dipole magnets with alternating polarity.

Wiggler - a series of magnets designed to deflect ('wiggle') a beam of charged particles (invariably electrons or positrons) inside a storage ring of a synchrotron.

Best regards,
Hall T.

Thursday, November 01, 2007

Control Systems for Telescopes – Challenges and Case Studies

Last week I blogged about Large Telescopes and how they are getting even larger in order to answer the big questions. This week, we’ll look at the control systems that support these telescopes. Large telescope systems face numerous challenges. The first is the large number of mirrors that must be controlled by actuators with multiple degrees of freedom. Even the secondary mirror must be controlled with five degrees of freedom. The reliability of the system must be high and wind disturbance inside the telescope must be managed.

The complex nature of the telescope control system (TCS) requires sophisticated software architecture with strong software engineering principles. Key elements include system integration, computer architecture, interface management, software standards, and user interface design.

In practice, telescope control systems use multiple networks and systems. This paper on the Large Binocular Telescope lists three: control, global memory, and telemetry. The control network ships commands throughout the system. The global memory network maintains a data structure holding the complete state of the system; this state is replicated to each subsystem (for error correction, backup, etc.) to guard against individual failures. The telemetry network sends messages from one subsystem to another.

Software provides a foundation layer for low-level infrastructure activities which includes the following:

1. System Data Dictionary and Network Shared Memory: all information about the system is kept in a single data structure.

2. Event Subsystem: records events, actions, and error conditions.

3. Telemetry Streams: commands and time-stamped data are sent as streams from one system to another.

4. Time Service: a GPS synchronized time server sits on the network providing time-stamping and synchronization.
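The telemetry-stream idea above is easy to sketch: each subsystem appends time-stamped records, and consumers read them back in order. The class and signal names below are invented for illustration, and a real system would take its timestamps from the GPS time service rather than the local clock:

```python
# Minimal sketch of a time-stamped telemetry stream between subsystems.
import time
from collections import deque

class TelemetryStream:
    def __init__(self):
        self._records = deque()

    def publish(self, source, name, value, timestamp=None):
        # In a real system the timestamp would come from the GPS-
        # synchronized time server, not the local clock.
        ts = timestamp if timestamp is not None else time.time()
        self._records.append((ts, source, name, value))

    def read_all(self):
        """Return all records in publication order."""
        return list(self._records)

stream = TelemetryStream()
stream.publish("M2", "focus_position_mm", 12.7, timestamp=1000.0)
stream.publish("M1", "actuator_force_N", 3.2, timestamp=1000.5)
print(len(stream.read_all()))  # 2
```

Because every record carries a timestamp from a common time base, data from different subsystems can be correlated after the fact.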

A GUI is associated with each subsystem as an executable. To maintain an abstraction layer, each GUI does not communicate directly with hardware or other GUIs, but rather sends commands to obtain system status or initiates actions through a “command sequencer.” The command sequencer validates and routes commands to the appropriate destination. It also acts as a “heartbeat” which periodically obtains the “health” status of each subsystem.
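The command-sequencer pattern described above can be sketched in a few lines: commands are validated against a table of known subsystems before being routed, and a heartbeat poll collects each subsystem's health. All subsystem and command names here are invented:

```python
# Minimal command-sequencer sketch: validate, route, and heartbeat.
class CommandSequencer:
    def __init__(self, subsystems):
        # map of subsystem name -> handler callable
        self.subsystems = dict(subsystems)

    def dispatch(self, target, command):
        """Validate the destination, then route the command to it."""
        if target not in self.subsystems:
            raise ValueError(f"unknown subsystem: {target}")
        return self.subsystems[target](command)

    def heartbeat(self):
        """Poll every subsystem for its health status."""
        return {name: handler("status")
                for name, handler in self.subsystems.items()}

def mount_handler(cmd):
    # Hypothetical subsystem accepting a small command vocabulary.
    return "ok" if cmd in ("status", "slew", "track") else "rejected"

seq = CommandSequencer({"mount": mount_handler})
print(seq.dispatch("mount", "slew"))   # ok
print(seq.heartbeat())                 # {'mount': 'ok'}
```

Routing everything through one validated chokepoint is what lets the GUIs stay decoupled from the hardware and from each other.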

Virtual instrumentation also brings tools to telescope control systems. The SOuthern Astrophysical Research (SOAR) telescope is a 4.2 m telescope based in Chile. It uses an M1 control system for the primary mirror, an M2 control system for the secondary mirror, and M3 for image stabilization through a third mirror. LabVIEW was used for the overall control system and the datalogger function. The system interfaces with real-time Linux as well as Windows NT.

Best regards,
Hall T.