Thursday, July 31, 2008

Semantic Web—Another run at Artificial Intelligence?

Another trend that has been underway for some time now is the Semantic Web. In a nutshell, it is the next generation of web technology, one that will use semantics, or "knowledge" about the person making a request, to generate a response. To date, web searches are based on keyword matches; in the future they will take much more into account, such as the background of the requester, the information requested, and the context in which the request is made.
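
To make the contrast concrete, here is a minimal sketch, in Python, of a keyword match versus a context-aware lookup over semantic tags. The data and function names are invented for illustration; real semantic-web systems express these triples in standards such as RDF.

    # Toy knowledge base of (subject, predicate, object) triples --
    # the basic building block of semantic-web data models such as RDF.
    TRIPLES = [
        ("jaguar_car",    "is_a",        "automobile"),
        ("jaguar_animal", "is_a",        "big_cat"),
        ("jaguar_car",    "interest_of", "car_enthusiast"),
        ("jaguar_animal", "interest_of", "zoologist"),
    ]

    def keyword_search(query):
        # Today's approach: match the literal keyword, ignore the requester.
        return sorted({s for s, p, o in TRIPLES if query in s})

    def semantic_search(query, requester_role):
        # Sketch of a context-aware search: keep only the matches that the
        # requester's background says they are likely to mean.
        matches = {s for s, p, o in TRIPLES if query in s}
        relevant = {s for s, p, o in TRIPLES
                    if p == "interest_of" and o == requester_role}
        return sorted(matches & relevant)

    print(keyword_search("jaguar"))                # both senses of "jaguar"
    print(semantic_search("jaguar", "zoologist"))  # only the animal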

In many posts on the semantic web I find a connection to "artificial intelligence." Having worked on artificial intelligence in the 1980s, I know something about the challenges of AI. First, AI suffered from a PR problem: once a problem was solved, it was no longer considered AI. Second, AI was very particular about what kinds of problems it would tackle, so particular that it had limited usefulness coupled with unlimited expectations from users. In this post the proponents of the semantic web try to distance themselves from AI. Some argue that the Semantic Web pursues the current goal of AI, which is to create machines that exhibit intelligent behavior, rather than the original goal, which was to create human-level intelligence.

Social networking currently drives the development of the internet through social linking. The semantic web, in its academic form, drives knowledge by tagging information and linking the tagged items together. In this paper, the authors argue that Metcalfe's Law (the value of a network grows with the square of the number of nodes, since that is how the number of possible connections grows) will apply only if social networking and semantic tagging come together.
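
To see the arithmetic behind that argument, consider a toy calculation (my own illustrative numbers, not the paper's): merging two communities of tagged nodes roughly doubles the number of possible links compared with keeping them separate.

    def metcalfe_value(n):
        # Metcalfe's Law: a network's value tracks the number of possible
        # pairwise links, n * (n - 1) / 2, which grows with the square of n.
        return n * (n - 1) // 2

    # Two networks of 1,000 nodes each: one social, one semantically tagged.
    separate = metcalfe_value(1000) + metcalfe_value(1000)  # 999,000 links
    combined = metcalfe_value(2000)                         # 1,999,000 links
    print(separate, combined)  # joining the networks roughly doubles the value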

Best regards,
Hall T.

Friday, July 25, 2008

Virtualization Performance – Real-time Environments vs. Server Environments

Most users today think of virtualization as it applies to the server space, where VMware holds the dominant position. But virtualization for real-time systems requires a distinctly different set of characteristics. The environment is very different and thus demands a different approach. The VirtualLogix value proposition focuses on virtualization for real-time and network systems. Real-time systems require support for both a general-purpose OS (GPOS) and a real-time OS (RTOS), and there are trade-offs between performance and the isolation of processes. In the server environment it's all about I/O throughput; in the real-time world, latency, timing, and determinism are also important. In the server world, device drivers mean mostly disks and networks, while in the real-time world there are many types of device drivers, both virtual and physical.
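
A quick way to feel that difference is to measure wake-up jitter, which is what a real-time workload cares about. The sketch below is illustrative only, not a VirtualLogix benchmark: it times a periodic 1 ms sleep and reports both the average and the worst-case lateness, since a single late wake-up can miss a real-time deadline even when the average looks fine.

    import time

    PERIOD_NS = 1_000_000  # request a 1 ms sleep each iteration
    lateness = []
    for _ in range(1000):
        start = time.monotonic_ns()
        time.sleep(PERIOD_NS / 1e9)
        # How much later than requested did we actually wake up?
        lateness.append(time.monotonic_ns() - start - PERIOD_NS)

    # A server environment watches the average; a real-time system
    # lives or dies by the maximum.
    print(f"avg lateness: {sum(lateness) / len(lateness) / 1e3:.1f} us")
    print(f"max lateness: {max(lateness) / 1e3:.1f} us")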

Another difference is the heavy set of Windows-based drivers dating from the 1990s. For companies with legacy software built in the Windows era, virtualization reuses those drivers without having to rewrite them for Linux, a real-time OS, Mac OS, or any other OS. Performance can also be enhanced by applying a distinct set of security, redundancy, and management tools on each core: for sensitive information, high security could be applied to one core, while for performance a different set of management criteria could be applied to another. Server environments often focus on security, while real-time systems more often focus on performance.

Since performance is a key issue in deploying virtualization in an application, it's interesting to look at the benchmarks. Fundamentally, adding more layers of software suggests more overhead, and to a certain extent that's the case with virtualization technology. The overhead for an arithmetic/floating-point calculation is about 1% for a Windows-based system. For multi-core efficiency there's a 7% hit, and for physical disk reads there's no overhead. In some ways that's not too much of a hit.
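
Figures like these come from microbenchmarks that are easy to approximate yourself. A rough sketch: time the same floating-point loop natively and again inside a virtual machine, then compare. (The script is my own illustration; the 1% figure above is the vendor's, not this code's.)

    import time

    def fp_benchmark(iterations=5_000_000):
        # Tight floating-point loop; run it once on the native host and
        # once inside a guest, then compare the two timings.
        x = 1.0001
        start = time.perf_counter()
        for _ in range(iterations):
            x = x * 1.0000001 + 1e-9
        return time.perf_counter() - start

    t = fp_benchmark()
    print(f"elapsed: {t:.3f} s")
    # overhead (%) = (t_guest - t_native) / t_native * 100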

Best regards,
Hall T.

Friday, July 18, 2008

Using Virtualization to Build the Next Mobile Phone

VirtualLogix brings its virtualization software to the mobile phone. In this example a mobile phone designer used one chip for both the baseband access and the operating system, thus reducing the number of chips in the phone. This starts to look a little like software-defined radio, where more and more of the phone's functionality moves from hardware into software.

The mobile phone is rich with features that have varying requirements: some features require security, others ease of use, and so on. Virtualization solves the problem by giving each feature its own virtual machine, customized to its needs. Chips are reused, software is customized, and the entire unit moves to the next level of efficiency. Check out this blog post on the subject.
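
As a sketch of what "each feature in its own virtual machine" might look like, here is a hypothetical partition table for such a phone. The fields and names are invented for illustration and are not VirtualLogix's actual configuration format.

    # Hypothetical layout of a virtualized phone: one chip, several VMs,
    # each tuned to the requirements of the feature it hosts.
    PHONE_PARTITIONS = [
        {"vm": "baseband", "os": "RTOS", "scheduling": "real-time",
         "security": "high"},    # modem stack; must meet deadlines
        {"vm": "ui",       "os": "GPOS", "scheduling": "interactive",
         "security": "medium"},  # user applications; ease of use
        {"vm": "payments", "os": "GPOS", "scheduling": "normal",
         "security": "high"},    # isolated to protect sensitive data
    ]

    for p in PHONE_PARTITIONS:
        print(f'{p["vm"]:9} {p["os"]:5} {p["scheduling"]:12} security={p["security"]}')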

Another benefit of virtualization for the mobile phone is the ability to gather up applications, regardless of the OS they were written for, and bring them together on a single phone. In this article the author shows how one can take BlackBerry applications and run them on the iPhone using virtualization techniques.

If virtualization can optimize and speed the build-out of a mobile phone, then it could certainly help optimize and build out a measurement and automation system. Given the rise of Linux and the need to combine it with legacy Windows-based device drivers, virtualization becomes rather attractive.

Best regards,
Hall T.

Friday, July 11, 2008

Virtualization from AMD – Solving it at the Hardware Level

Virtualization continues to drive technology development. In the past few weeks we've looked at the basics of virtualization and at Intel's offering. Key issues for implementing virtualization at the chip level are memory management, virtual-machine switching times, and transfers from the CPU to target areas such as memory and I/O.

AMD offers hardware support for virtualization under the name "AMD-V" in its Opteron line of chips. This technology offloads the hypervisor by handling many of the memory-management requirements in hardware, through larger caches, memory-space protections, and paging algorithms for faster data transfer.
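
On a Linux system you can check whether a processor advertises AMD-V, which the kernel exposes as the "svm" CPU flag. A quick sketch (Linux-specific; Intel's equivalent shows up as "vmx"):

    def has_amd_v():
        # Linux lists CPU capabilities in /proc/cpuinfo; the "svm" flag
        # (Secure Virtual Machine) indicates AMD-V hardware support.
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return "svm" in line.split()
        return False

    print("AMD-V supported:", has_amd_v())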

Microsoft and AMD co-wrote a whitepaper describing virtualization. In addition to the hardware capabilities of AMD's processor line, which shift functions from software into the hardware itself, there is a software tool from Microsoft called Virtual Server 2005. It works with MS Windows Server 2003 and offers VHD mounting, support for hardware-assisted virtualization, and new performance advantages, including the Volume Shadow Copy Service for backups and disaster recovery. There's also support for running Linux operating systems.

A great resource for virtualization news is the Sys-Con site, along with Insight24's slate of papers.

Best regards,
Hall T.

Friday, July 04, 2008

Virtualization for Today’s Servers and Other Applications

As devices such as set-top boxes, mobile phones, and servers become more complex, virtualization steps in to make numerous OSes and applications work together in a single environment. Beyond consolidating multiple OSes and applications, virtualization also offers improved reliability and security. Better reliability comes from fault-tolerant architectures in which one core backs up another; better security comes from using different security settings on each core to customize the application for the target user.

One of the challenges in running multiple OSes in a "virtual" environment is that most OSes were designed to run at the highest privilege level; on x86, for example, every kernel expects to own ring 0. In most cases, the primary OS runs at the highest level and guest OSes run at lower levels, with the hypervisor mediating the conflict.

Intel offers hardware features to improve virtualization. These include a set of queues for buffering I/O requests from the hypervisor software. Intel provides an even higher privilege level for the virtualization software and lets the guest OSes run in their normal mode. The hardware also supports memory mapping and helps manage transitions from one OS to another through registers and queues.
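
As with AMD-V in the earlier post, Linux reports these Intel assists as CPU flags: "vmx" for the base support, plus "ept" and "vpid" on newer parts for the memory-mapping and cheaper-transition features. A quick sketch (the flag names are Linux's; mapping them to the features above is my reading):

    def intel_vt_flags():
        # vmx  - base VT-x support (the extra privilege level for the VMM)
        # ept  - extended page tables (hardware-assisted memory mapping)
        # vpid - tagged TLBs (cheaper transitions between guest and host)
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    present = set(line.split())
                    return {flag: flag in present
                            for flag in ("vmx", "ept", "vpid")}
        return {}

    print(intel_vt_flags())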

Best regards,
Hall T.