Friday, May 30, 2008

Big Physics Funding in the US and Elsewhere

The US Department of Energy (DOE) leads the world in the conception and design of large-scale science projects. In its 2003 report, Facilities for the Future of Science: A 20-Year Outlook, and in last year's interim report, the department highlights its progress on those facilities.

Near-term priorities include ITER, medium-term priorities include the Linear Collider, and long-term priorities include the National Synchrotron Light Source upgrade.

What’s surprising about the report is how low the US funding is compared to the European Union: the Europeans are outspending the US by a factor of three. The US has diverted a substantial amount of funding from basic science into homeland security. Some of that work is being done at the major US labs, but in general the US labs are downsizing their workforces, especially at SLAC and on the ILC.

In scanning the web, it’s clear this problem has been building for a few years now. Reports from Wired magazine and others describe the budget reduction from $60M to $15M. In this post the author describes the impending budget crisis at the ILC and SLAC based on projections from late last year. Labs are laying off employees and finding ways to cut costs. Fermilab even has a web page dedicated to the layoffs, providing details.


Best regards,
Hall T.

Friday, May 23, 2008

Thomson Scattering Technique – Measuring the Plasma Wave

Recently I attended the High Temperature Plasma Diagnostics conference in Albuquerque. The conference focused on measuring plasma waves, primarily in tokamaks. The primary technique is Thomson scattering, named after its discoverer, J.J. Thomson. The technique measures the scattering of electromagnetic radiation by a charged particle: a plasma wave accelerates the particle, which in turn emits radiation and scatters the incident wave.
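To make the measurement concrete, here is a minimal Python sketch of how an incoherent Thomson scattering diagnostic turns the Doppler-broadened scattered spectrum into an electron temperature. The formula is the standard non-relativistic broadening relation; the laser wavelength, scattering angle, and function names are illustrative assumptions, not details from any system shown at the conference.

```python
# Minimal sketch: estimating electron temperature from the Doppler
# broadening of an incoherent Thomson scattering spectrum.
# Assumptions: non-relativistic Maxwellian electrons, 90-degree
# scattering geometry, ruby probe laser -- illustrative numbers only.
import numpy as np

M_E_C2_EV = 510_998.95        # electron rest energy, eV
LAMBDA_0 = 694.3e-9           # probe laser wavelength (ruby), m
THETA = np.deg2rad(90.0)      # scattering angle

def electron_temperature_eV(delta_lambda_1e):
    """Invert the standard broadening relation
    delta_lambda_1e = 2 * lambda_0 * sin(theta/2) * sqrt(2*Te / (m_e c^2))
    to get Te (in eV) from the 1/e half-width of the scattered spectrum."""
    x = delta_lambda_1e / (2.0 * LAMBDA_0 * np.sin(THETA / 2.0))
    return 0.5 * M_E_C2_EV * x**2

def half_width_1e(wavelength, intensity):
    """Crude 1/e half-width estimate from a measured spectrum:
    take the second moment about the centroid (1/e half-width of a
    Gaussian is sqrt(2) * sigma)."""
    w = intensity - intensity.min()           # crude background subtraction
    centroid = np.sum(wavelength * w) / np.sum(w)
    variance = np.sum(w * (wavelength - centroid) ** 2) / np.sum(w)
    return np.sqrt(2.0 * variance)

# Example: a roughly 60 nm width at 90 degrees corresponds to ~1 keV.
print(electron_temperature_eV(61e-9))   # ~1000 eV
```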

At the conference we saw numerous signal types used for the measurement, including microwave, millimeter-wave, acoustic, ultrasound, and others. Virtual instrumentation was used for control of the system as well as for the data acquisition portion of the signal chain. The conference was well attended, with over 250 people in the room.

Best regards,
Hall T.

Friday, May 16, 2008

Mobile Internet Devices—Low-Cost Wireless Devices Enable Virtual Instrumentation Applications

Computers continue to shrink in size and increase in power and performance. The laptop is now considered bulky, even unwieldy. The mobile phone, as you may know from previous posts, is the next step in computation delivery. Every day one can see the shift to handheld devices. As users try to step away from the PC, there’s a middle ground some fall into called Mobile Internet Devices – a term coined by Intel to describe computers smaller than a laptop but not as small as a mobile phone. Intel provides the Atom chip, previously called Silverthorne, to power these devices. Most of the examples listed below use Linux as the OS and provide a wireless connection to the internet. It appears Microsoft will take another hit as the industry moves around its proprietary operating system, which for many applications today is overkill and overpriced.

Elektrobit offers a Mobile Internet Multimedia device with both Wi-Fi and WiMAX broadband connectivity and a Linux OS.

Asus offers the Eee PC, which looks like a slimmed-down laptop and uses Wi-Fi to connect to the internet. The cost is equally slimmed down.

The Aigo MID borrows heavily from the iPhone and has something of a cheesy quality to it.

LG, Lenovo, and BenQ offer their versions as well.

The mobile internet device brings another low-cost, commercial off-the-shelf technology to virtual instrumentation applications. Low-cost devices connected wirelessly to the internet enable application deployment for many users.
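As a rough sketch of what that deployment could look like, here is a minimal Python example of a handheld device pushing acquired readings over its wireless link to a web endpoint. The URL, field names, and read_sensor() stub are hypothetical placeholders, not any vendor's actual API.

```python
# Minimal sketch: a wireless handheld device pushing measurements to a
# hypothetical web endpoint. The URL, field names, and read_sensor()
# stub are illustrative assumptions, not a specific product's API.
import json
import time
import urllib.request

ENDPOINT = "http://example.com/measurements"   # hypothetical collection server

def read_sensor():
    """Stand-in for whatever acquisition hardware the device exposes."""
    return {"timestamp": time.time(), "temperature_c": 23.4}

def post_reading(reading):
    """Send one reading as JSON over the device's Wi-Fi/WiMAX link."""
    data = json.dumps(reading).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status

if __name__ == "__main__":
    print(post_reading(read_sensor()))
```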

Best regards,
Hall T.

Friday, May 09, 2008

High Performance Computing – Increasing Computational Performance with Hardware

Computing performance has slowed from the high gains made in the 1990s. This is due to the shift from higher-performance single cores to multiple cores per CPU. Part of the problem is the software challenge of programming multiple threads, and part of the problem is the hardware itself. Hence the rise of GPUs, or graphics processing units.

The traditional definition of high performance computing focused on achieving supercomputer performance through clustering of computers. With the advent of graphics processing units, most notably from Nvidia and AMD, high performance computing increasingly means using lower-end computers and customizing the code to make use of their graphics processing units.
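To give a flavor of the data-parallel style these GPU toolkits encourage, here is a small Python/NumPy sketch that expresses a computation over whole arrays rather than explicit threads. NumPy runs it on the CPU; the point is only to illustrate the programming model, not any vendor's actual API.

```python
# Minimal sketch of the data-parallel style GPU toolkits encourage:
# express the computation over whole arrays instead of writing explicit
# threads. NumPy evaluates this on the CPU; GPU toolkits map the same
# pattern onto hundreds of parallel cores.
import numpy as np

def saxpy(a, x, y):
    """Single-precision a*x + y over entire arrays -- one logical
    operation, with no per-element loop or thread management in user code."""
    return a * x + y

n = 1_000_000
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
result = saxpy(np.float32(2.0), x, y)
print(result[:5])
```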

IDC reports strong growth in the HPC segment, with a shift to higher performance, including the petaflop range, and a move to fiber for interconnects.

A rich resource of information is the general-purpose computation on GPUs (GPGPU) community. Additional information can be found at GameDev.Net.

Players in this space include Acceleware, which makes turn-key systems with vertical solutions for seismic and imaging applications. Its partners include the usual suspects: Nvidia, AMD, Dell, Sun, and HP.

Rapidmind offers a modular solution with an elegant API that lets developers write single-threaded code targeting multi-core CPUs and GPUs. They steer clear of explicit multi-threading due to the complexity of the code. They offer solutions for the desktop, medical imaging, and financial analysis markets.

ClearSpeed takes the accelerator approach, building custom processors onto its own proprietary hardware and then optimizing its software to work with it.

Best regards,
Hall T.

Friday, May 02, 2008

Cloud Computing—Increasing Capacity without Increasing Costs

As with many emerging technologies, it starts as a distant voice: it has a cool-sounding name and begins to come up in articles and conversations, but only peripherally. Cloud computing fits the model, as I’ve heard it come up several times, but what exactly is it? What is it not? According to Wikipedia, it’s the use of virtualization technologies to simplify the data center. There is some disagreement over what cloud computing actually means. Some take it to mean anything outside their own company’s systems, while others define it more narrowly as utility computing. The benefits are clear – it provides additional capacity without having to invest in additional infrastructure. The upshot is that it will most definitely change the way companies handle IT. In this article the author predicts the transformation of the IT industry by shifting infrastructure from inside the company to outside it. Just as the Software as a Service model has shifted software sales from perpetual licenses to pay-by-the-drip pricing, core IT infrastructure will follow the same path.

Current reviews show that, as an emerging technology, it appeals to small and startup companies but not yet to large enterprise users, which parallels most emerging technologies: they take hold in newer business models before the established ones. The primary advantage is lower cost. While the previous article focuses on IT applications, virtual instrumentation applications could also make use of cloud computing for data storage and test applications.

For virtual instrumentation, IT infrastructure is part of many applications for data storage, reporting, and analysis. As costs come down and access to capacity goes up, this should improve the ROI for virtual instrumentation-based projects.
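As a rough illustration of the data-storage side, here is a minimal Python sketch that archives a batch of test results to a hypothetical cloud storage endpoint. The service URL, bucket path, and token are placeholders I have made up, not any particular provider's API.

```python
# Minimal sketch: archiving a batch of test results to a hypothetical
# cloud storage service. The endpoint, bucket path, and token are
# placeholders -- no real provider's API is being shown here.
import csv
import io
import urllib.request

STORAGE_URL = "https://storage.example.com/buckets/test-results"  # placeholder
API_TOKEN = "replace-with-real-credentials"                       # placeholder

def results_to_csv(results):
    """Serialize a list of {unit_id, measurement, passed} dicts to CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["unit_id", "measurement", "passed"])
    writer.writeheader()
    writer.writerows(results)
    return buf.getvalue()

def upload(name, csv_text):
    """PUT the CSV to the storage service instead of a local file server."""
    req = urllib.request.Request(
        f"{STORAGE_URL}/{name}",
        data=csv_text.encode("utf-8"),
        method="PUT",
        headers={"Content-Type": "text/csv",
                 "Authorization": f"Bearer {API_TOKEN}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status

results = [{"unit_id": "A001", "measurement": 3.30, "passed": True},
           {"unit_id": "A002", "measurement": 3.18, "passed": False}]
# upload("run-2008-05-02.csv", results_to_csv(results))  # would hit the placeholder URL
```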

Best regards,
Hall T.