Friday, October 27, 2006

Jeff Andrews -- UT Researcher in Wireless Sensor Networks

Jeff Andrews at the University of Texas at Austin works on wireless ad hoc networks. I recently spoke with him about his research. He and his team recently won a $6.5M grant with the help of NI. The Austin American-Statesman ran this story.

Jeff spends his time on basic research to understand the communication of data through wireless ad hoc networks, trying to characterize their performance limits. The first step is to build a strong information-theoretic foundation. Information theory is good at describing a set of links transmitting to a central point, but it breaks down for ad hoc networks because the position of each node affects the communication of every other node in the network. Developing a descriptive theory for ad hoc networks could enable the design of more efficient networks in the future. His project is funded by DARPA.

He plans to research physical layer technologies to understand which are best for ad hoc networks. For example, how does a wireless LAN, spread spectrum, or CDMA scheme behave in an ad hoc network environment? He is also looking at MIMO and how it changes the capacity of a network. Other issues include scheduling, i.e., deciding which users should transmit and to whom.
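MIMO's effect on capacity can be illustrated with the standard flat-fading capacity formula, C = log2 det(I + (SNR/Nt)·HHᴴ). Here's a minimal Python sketch (my illustration, not Jeff's model), comparing a 4x4 link against a single-antenna link with equal power split across transmit antennas:

```python
import numpy as np

def mimo_capacity(H, snr):
    """Capacity (bits/s/Hz) of a MIMO channel H at a given linear SNR,
    with transmit power split evenly across the Nt antennas:
    C = log2 det(I + (SNR/Nt) * H H^H)."""
    nt = H.shape[1]
    hh = H @ H.conj().T
    return float(np.real(np.log2(np.linalg.det(np.eye(H.shape[0]) + (snr / nt) * hh))))

rng = np.random.default_rng(0)
# 4x4 Rayleigh-fading channel: i.i.d. unit-variance complex Gaussian entries
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
print(mimo_capacity(H, snr=100))          # 4x4 link at 20 dB SNR
print(mimo_capacity(H[:1, :1], snr=100))  # single-antenna link for comparison
```

The multi-antenna link's capacity grows roughly linearly with the number of antennas at high SNR, which is why MIMO changes the capacity picture so dramatically.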

He is also rethinking how to build the radios by deconstructing the point-to-point link model we've used up until now. In a cellular system, one call affects another; in an ad hoc network the problem is compounded because all the nodes are communicating.

Currently Jeff is using LabVIEW for his research. He's focusing on signal-to-interference ratios and packet routing efficiency. He's taking a mathematical approach in order to develop theories for creating a model.
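As a back-of-envelope illustration of the signal-to-interference quantities involved, here's a minimal Python sketch assuming a simple distance^(-alpha) path-loss model with unit transmit powers (the model and parameter values are my assumptions, not Jeff's):

```python
import numpy as np

def sir_db(rx, tx_of_interest, interferers, alpha=4.0):
    """Signal-to-interference ratio (dB) at a receiver, assuming unit
    transmit power and distance^(-alpha) path loss (alpha is an assumed
    path-loss exponent, typically between 2 and 4)."""
    signal = np.linalg.norm(rx - tx_of_interest) ** -alpha
    interference = sum(np.linalg.norm(rx - t) ** -alpha for t in interferers)
    return 10 * np.log10(signal / interference)

rx = np.array([0.0, 0.0])
tx = np.array([1.0, 0.0])                              # desired transmitter
others = [np.array([5.0, 0.0]), np.array([0.0, 8.0])]  # concurrent transmitters
print(f"SIR = {sir_db(rx, tx, others):.1f} dB")
```

Moving an interferer closer to the receiver, or adding more concurrent transmitters, drops the SIR, which is exactly why node positions dominate the analysis of ad hoc networks.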

People have been working on ad hoc networks for some time. Military and emergency services are interested in them. Mesh networking for deploying wireless broadband is one instance of ad hoc networking that may have broader appeal. Jeff said, “If we can figure out the right transmission technology it would be useful to deploy broadband to the third world.” Sensor networks are a more nascent field. “In twenty years it’ll be more broadly distributed although you may not know it’s there,” he commented.

I’ve always thought RFID would be a useful technology to marry to ad hoc networks for controlling the communication. An RFID pulse could be used to synchronize the nodes for follow-on communication.

It’ll be interesting to see what Jeff and team come up with.

Best regards,
Hall T.

Friday, October 20, 2006

Outside Innovation—Patricia Seybold’s Take on Lead Users

Patricia Seybold just launched her new book, Outside Innovation, which proposes using customers to help define new products and imbue innovation into those products. While Eric von Hippel coined the term “Lead Users,” Patricia Seybold coined the term “Lead Customers.” In talking with her about the concept while she was compiling the book, it appears she followed the same path as National Instruments in implementing its Lead User program. A careful study of von Hippel’s Lead User program highlights a process that is very narrow. In order to apply the concept we had to expand it to include not only customers outside the target industry, but also those within it. Lead customers, according to Patricia’s definition, are “that small percentage of your current customers who are truly innovative.” She goes on to highlight that they are not necessarily the most vocal, the most profitable, or even the largest customers. Nor do they represent the current majority of a company’s customers. The key difference is that they care deeply about the product and are innovative in its use.

She highlights the Lego Mindstorms project as an example of a successful Lead User process. In October 2004, five key lead users were chosen to help design the next version of the product.

Patricia goes on to highlight National Instruments’ role in the Lego Mindstorms project and then discusses NI’s own Lead User program. Patricia interviewed me for the book and included many quotes from our discussion. I believe Lead Users can be applied to both the definition of the product and the definition of the target market. Drawing from “Crossing the Chasm,” we labeled the former Technologists and the latter Early Adopters, if for no other reason than to keep track of what a lead user team expects of a project.

In the book she hits on one of the key reasons LabVIEW has been successful: it’s an open environment for customers to create their own solutions. It accommodates users’ domain knowledge and amplifies their creativity. LabVIEW fosters innovation among customers by letting them create their own applications in their own way. This brings the customer into the LabVIEW design process as well. Feedback from customers then drives the development of the LabVIEW environment.

Patricia’s book goes on to highlight the use of online communities for fostering communication with customers and the use of blogs in developing a voice behind the corporate web page.

Overall the book is a fast read with some key insights that many companies will find useful. It was impressive how many pages are devoted to National Instruments and in particular NI’s lead user program.

Best regards,
Hall T.

Friday, October 13, 2006

Argie Rumann – Developing Telemetry Systems with Reconfigurable I/O

Argie Rumann works at Swales Aerospace and is currently focused on projects at NASA Goddard related to the Hubble Space Telescope. I spoke to him on the phone the other day. He is coding a new ground station application for telemetry and commanding of a Space Shuttle payload associated with the Hubble telescope for an upcoming flight. He specified a single PXI-7833 with an 8196 controller in a 4-slot chassis for this NASA Goddard job. The system has six different synchronous and asynchronous RS-422 signals that will connect to user interfaces at the NASA MCC (Mission Control Center) in Houston, so they are using the twisted DIO pairs on the RIOs to carry the RS-422 signals and receive modulated IRIG-B through an analog input. Also working with Argie is Kareny Dominguez, who is developing and integrating the Windows-level user interface with the RIO. She is compiling in LabVIEW 8.2 and debugging with Argie. Her company is ViGYAN Inc. Here's some of her early work.

One of the biggest challenges in using RIO is that he’s not able to control the clock at a high resolution. Even when he compiles at 120 MHz, the jitter the logic introduces when he’s trying to hit a specific timing forces him to resort to external timing sources. The problem is that the shuttle telemetry application has 6-8 clocks derived from a master clock, and they want rates at powers of two up to 2^23 Hz. Bringing signals out of the RIO with the right clocking capability is a challenge. There’s a need for timing capability in RIO that makes the timing characteristics of a signal explicit.
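Deriving power-of-two rates from a master clock is conceptually just tapping the bits of a free-running binary counter. Here's a small Python simulation of that idea (a sketch of the generic divide-by-2^k technique, not Argie's FPGA code):

```python
def derived_clock_levels(master_ticks, divide_log2):
    """Simulate deriving a divide-by-2^k clock from a master clock with a
    binary counter: the output is bit (k-1) of the counter, which toggles
    every 2^(k-1) master edges, giving a period of 2^k master ticks."""
    k = divide_log2
    return [(t >> (k - 1)) & 1 for t in range(master_ticks)]

# A divide-by-4 clock over 8 master ticks: two ticks low, two high, ...
print(derived_clock_levels(8, 2))  # [0, 0, 1, 1, 0, 0, 1, 1]
```

With a 2^23 Hz base rate and a fast enough master, each slower clock in the 6-8 clock family is just a higher counter bit; the hard part Argie describes is getting those derived edges out of the RIO with low jitter, which a software model like this glosses over.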

Argie finds RIO provides some basic architectural rules for building the command frames for telemetry, which is frame-based communication. He has found the RIO memory useful with a DMA FIFO for managing the bandwidth between MS Windows and the RIO. Before, there was a bottleneck in getting the telemetry from the RIO to the front panel. Now their 600-byte frames, called NASCOM blocks (the standard for Shuttle payload telemetry), update seamlessly when run at 1 megabit per second.
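The numbers work out to a comfortable update rate; a quick back-of-envelope calculation in Python (my arithmetic, not from the interview):

```python
def frame_rate(frame_bytes, link_bits_per_s):
    """Frames per second a link can carry, ignoring gaps and overhead."""
    return link_bits_per_s / (frame_bytes * 8)

# 600-byte NASCOM-style frames on a 1 Mbit/s link
print(frame_rate(600, 1_000_000))  # ~208.3 frames/s
```

At roughly 208 frames per second, the DMA FIFO only has to sustain a modest, steady transfer to keep the front panel updating without the earlier bottleneck.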

One remaining challenge is that he is mimicking RS-422 channels. Normally a dedicated chip buffers the signal, with a comparator inside the chip working at the electrical level. Instead, he has taken the contiguous digital signals in the RIO and written the differential code inside the RIO rather than implementing it in a circuit. Impedance matching is a problem: the buffers in the RIO need to be configured to handle it safely given the current levels. Because there is a direct connection to the channels, there’s no way to actually put buffers on the board, and most of the applications seen in communications take differential signal inputs. If he could configure resistors inside the RIO to tie the RS-422 differential lines to ground for impedance matching, that would be really helpful. Right now, he has to make some intricate connections within the wire bundle, which can cause reflections.
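The differential scheme he moved into the RIO amounts to driving one line of each pair with the bit and the other with its complement, with the receiver decoding the difference. A toy Python sketch of that idea (an illustration of generic differential signaling, not his actual RS-422 implementation):

```python
def diff_pair_encode(bits):
    """Encode a bit stream onto a differential pair: line A carries the
    bit, line B its complement."""
    a = list(bits)
    b = [1 - bit for bit in bits]
    return a, b

def diff_pair_decode(a, b):
    """Recover bits by comparing the two lines; because only the
    difference matters, common-mode noise on both lines cancels out."""
    return [1 if va > vb else 0 for va, vb in zip(a, b)]

a, b = diff_pair_encode([1, 0, 1, 1])
print(diff_pair_decode(a, b))  # [1, 0, 1, 1]
```

The logic is trivial; the hard part Argie describes is entirely electrical, i.e. driving and terminating the pairs so the lines don't reflect, which no amount of code inside the FPGA can fix.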

Best regards,
Hall T.

Friday, October 06, 2006

Mark Yoder on DSP and FPGA Technologies

Mark Yoder is a Professor of Electrical and Computer Engineering at the Rose-Hulman Institute of Technology and a Lead User for National Instruments. He recently returned from the 4th Signal Processing Education Workshop held in Jackson Hole, Wyoming. I asked him how it went. Mark indicated the workshop on the whole was very good and that it’s his favorite event. It only meets every other year, but this year it had about 125 attendees.

He digressed for a few moments to comment on the Grand Tetons in the background and the two moose that were next to the cabin when he returned one morning. He came back to the line of discussion and spoke about the speakers. Dr. James Truchard’s speech on DSP was LabVIEW-centric. One speaker talked about signal processing in the 4-5 kHz range used for avalanche detection. There was also a history lesson from John Treichler on different adaptive filters, for which he had an unusual application.

I asked Mark where LabVIEW should go in the future. He likes the current trajectories, such as the ability to lay out a problem on the PC and then target an FPGA. It would be neat if no changes were required when switching target platforms. Mark would like to teach a class that focuses on the tradeoffs between FPGA, host computer, and DSP, and then lets the user decide which platform to use based on performance, memory size, and reconfigurability.

At Rose-Hulman, they are working on a class on reconfigurable technologies that is now looking at fixed-point usage. In this class, Mark is implementing a single-sideband transmit/receive program. He’s not interested in the sideband itself, but rather in how to implement fixed-point arithmetic and, given a fixed number of bits to work with, get the best signal-to-noise ratio out of it.
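The fixed-point tradeoff Mark describes follows a familiar rule of thumb: each extra bit buys about 6 dB of quantization SNR (roughly 6.02N + 1.76 dB for a near-full-scale sine). A small Python sketch that measures this (my illustration, not the class code):

```python
import numpy as np

def quantize(x, bits):
    """Round x (in [-1, 1)) onto a signed fixed-point grid with `bits` total bits."""
    scale = 2 ** (bits - 1)
    return np.clip(np.round(x * scale), -scale, scale - 1) / scale

def measured_sqnr_db(bits, n=100_000):
    """Measured quantization SNR for a near-full-scale sine; the rule of
    thumb predicts roughly 6.02*bits + 1.76 dB."""
    t = np.arange(n)
    x = 0.999 * np.sin(2 * np.pi * 0.01234 * t)  # just under full scale
    err = quantize(x, bits) - x
    return 10 * np.log10(np.mean(x**2) / np.mean(err**2))

for b in (8, 12, 16):
    print(b, round(measured_sqnr_db(b), 1), "vs theory", round(6.02 * b + 1.76, 1))
```

The catch in a real single-sideband chain is that intermediate results grow with each multiply and add, so keeping the signal scaled near full range at every stage, without overflowing, is where the design effort goes.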

Mark is also working on software-defined radio (SDR). He used a reconfigurable I/O board from NI in a couple of senior design projects and thought it an intriguing piece of hardware. “The ability to hang a wire out and sample a swath of spectrum is a powerful tool; it’s one thing to talk about equations, but it’s another to actually do it,” he said. The fact that it’s all in one place (generate the signal, sample it, demodulate it, and then process it) is a great advantage.

There’s still work to do to get it compiling on an FPGA, though. Compiles could be faster, he thought; we’re currently in the “punch card era” with FPGAs due to the relatively long compile times. Some students use remote compiles to get around the problem: with FPGAs you can tell the tools to compile on another machine, which can make things run faster if that machine has more memory. The students simulated an RFID reader and tag. One board ran as a reader and the other as a tag, and they were able to get the handshake between the two.

I asked Mark where his research is taking him in the future. If he had time, he would write the ultimate general-purpose receiver that would go across the entire spectrum. That would make a neat demo.

Best regards,
Hall T.