Friday, September 30, 2005

Voice over WiFi – Coming to a Phone Near You

With the predominance of WiFi as the communication medium of choice in enterprise settings, it’s not surprising that some are considering using it for voice communications. Companies are shaving their phone bills by switching to Voice over WiFi for internal office communications. Those already using VoIP can adopt Voice over WiFi without stepping on the toes of telco accounts payable departments.

Vendors of network and communications equipment are taking notice and are rolling out phones to take advantage of the technology. WiFi phones include Cisco’s WiFi-based 7920, the Softphone from Xten, and PocketPresence. SpectraLink makes handsets that provide voice over WiFi. (Full disclosure: I own no shares of stock in this company but am considering buying some, given their 40% annual growth rate and their quarterly cash dividend of $0.10/share.)

The standard to watch is 802.11e. It provides Quality of Service (QoS) support for LAN applications, in particular Voice over Wireless IP (VoWIP). The standard defines managed levels of QoS for data, voice, and video applications. Security and range are key issues to resolve in current implementations, which also need jitter control and bandwidth partitioning to work on existing networks.
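One common way to quantify that jitter at the endpoints is the running estimator from RFC 3550 (the RTP specification), which many VoIP stacks use. Below is a minimal sketch, assuming send and receive timestamps in milliseconds; the function name and inputs are illustrative, not taken from any particular stack:

```python
# Interarrival jitter estimate in the style of RFC 3550 (RTP).

def interarrival_jitter(send_ts, recv_ts):
    """Running jitter estimate: J += (|D| - J) / 16, per RFC 3550."""
    jitter = 0.0
    transit_prev = None
    for s, r in zip(send_ts, recv_ts):
        transit = r - s                      # one-way transit time
        if transit_prev is not None:
            d = abs(transit - transit_prev)  # transit-time variation
            jitter += (d - jitter) / 16.0    # smoothed estimator
        transit_prev = transit
    return jitter

# A perfectly paced stream (constant 5 ms transit) has zero jitter:
sends = [0, 20, 40, 60]
recvs = [5, 25, 45, 65]
print(interarrival_jitter(sends, recvs))  # 0.0
```

A voice handset would track this estimate continuously and size its playout buffer accordingly; when the estimate climbs, the call starts to sound choppy.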

Deployments are starting up. City-wide voice over WiFi coverage is coming first to Rio Rancho, New Mexico, a city of 65,000 where Intel has a major facility.

If you are working with WiFi, I would like to hear from you. You can reach me at

Best regards,
Hall T. Martin

Friday, September 23, 2005

Patents – Getting a Read on Emerging Technologies

One way to get a read on the direction emerging technologies are taking is to look at filed patents. The information is public, and taken as a whole it can point out several trends.

Forrester Research analyzed 50,000 patents published between 2002 and 2004 and found that telecom and software inventions, while still numerous, are on the decline. Patents for nanotechnology are on the increase.

The National Physical Laboratory in the UK set up a measurement system based on filed patents. They focus on three key themes: medical, advanced materials, and manufacturing. Their “patent map” of the manufacturing sector shows a shift to nanosystems, with applications clustering in near-field optical microscopes, microactuators, tunable optical filters, microfluidic devices, and gas sensing, to name just a few. Nanotechnology is a fruitful area of emerging technologies that goes beyond the arcane; the world of the small is going to be found nearly everywhere and in every application.

Technology Review offers a patent scorecard in Excel format providing a list of worldwide companies with the number of patents they have secured along with a rating for scientific linkage, current impact, and technology cycle time. They break it down into sectors covering Aerospace, Automotive, Biotechnology/Pharma, Chemicals, Computers, Electronics, Semiconductors, and Telecommunications. The absolute number of patents in the Computers, Electronics, and Semiconductors sectors vastly outweighs those in the Aerospace and Automotive fields.

Patents have become such a commodity that universities and government institutions are setting up shop to “market” their patents. NASA has over 1,000 patents it is seeking to commercialize through private industry.

Not everyone views patents as a good thing. Open source advocates have long complained about the patent process, especially regarding software. Critics say current US patent law provides too much protection to the owner of a patent and too little freedom to the user of software that may infringe on it. In this article the author outlines the challenges software developers and users face in confronting onerous patent filers.

Aside from driving innovation, patents provide another benefit – an ego boost. Imagine you develop a cool new interface for an MP3 player. Let your imagination run wild: now you’re selling it in millions of units per year. It’s a dream come true. Then imagine your arch competitor obtains the patent for the interface – a nightmare come true. Microsoft won the patent on the Apple iPod menu interface. Check out this article for more details.

If you have an opinion about patents, I would like to hear from you. You can reach me at

Best regards,
Hall T. Martin

Friday, September 16, 2005

R&D 2004 Spending Report – Biotechnology Leads the Way

R&D spending fuels innovation, especially in emerging technologies. The R&D 2004 Spending Report focuses on corporate spending and highlights some interesting trends. The first is that biotech is now leading the way: biotech companies’ spending increased 69% over the previous year. IT companies’ spending was flat compared to the previous year, while telecom and computer hardware companies spent less in 2004 than in 2003.

Technology Review calculates an Innovation Index to measure the impact of R&D. It is based on several factors including R&D spending levels, R&D spending increases, and R&D as a proportion of sales.
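Technology Review does not publish its exact formula, but an index built from those three factors might look something like the sketch below. The weights and the 0-to-1 normalization of each factor are assumptions for illustration only:

```python
# Illustrative only: the weights and normalization are assumptions,
# not Technology Review's actual Innovation Index formula.

def innovation_index(spend, spend_growth, rd_to_sales,
                     weights=(0.4, 0.3, 0.3)):
    """Toy weighted score from the three factors named in the report:
    R&D spending level, R&D spending increase, and R&D as a share of
    sales, each normalized to a 0-1 scale before being passed in."""
    w_spend, w_growth, w_ratio = weights
    return (w_spend * spend            # absolute spending level
            + w_growth * spend_growth  # year-over-year increase
            + w_ratio * rd_to_sales)   # R&D intensity

# A biotech-style profile: modest absolute spend, big growth (69%),
# high R&D intensity scores well on such an index:
print(innovation_index(spend=0.3, spend_growth=0.69, rd_to_sales=0.8))
```

The point of a composite like this is that a fast-growing, research-intensive sector can outrank one that simply spends the most dollars.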

Based on the Innovation Index, the top five are:
1. Biotechnology
2. Semiconductors
3. Pharmaceutical/medical devices
4. Computer hardware
5. Computer software

Individually, some companies are standouts in their investment in R&D. Again, biotech leads the way. The top ten companies ranked by the Innovation Index include:
1. Sanofi/Aventis
2. Microsoft
3. Biogen
4. General Motors
5. Merck
6. Pfizer
7. Johnson & Johnson
8. Intel
9. DaimlerChrysler

Research dollars continue to pour into the life science space. Six of the top ten companies are in life sciences, two are in automotive, one is in computer software, and one is in computer hardware. According to Drug Researcher, pharma appears to be slowing down a little in spending, although the dollar volume is still quite high. A pharma company spent $1.2B for each NME (new molecular entity) approved in 2004. This is down from $1.82B in 2002, but is nearly four times what was spent in 1995 ($317M).

Spending in computer software, pharma, and heavy industry is up 10%, telecom is down by 10%, and most others are flat. I can remember when Lucent’s Bell Labs held a vaunted position in the research world. It’s no longer in such an enviable place; its R&D spending fell 36% in 2004. You can see more details at the scorecard.

The federal government provides R&D funding through its agencies. According to the American Institute of Physics, the federal budget proposals for 2006 show an increase in R&D funding of 1% overall, with an increase in development funding and a decrease in basic research. NIST, NSF, and NIH are up for increases in their budgets, but the rest of the agencies are facing flat or declining levels.

If you have an opinion about R&D spending, I would like to hear from you. You can reach me at

Best regards,
Hall T. Martin

Friday, September 09, 2005

Location-Aware Devices – Is the Geospatial Web the next World Wide Web?

Location awareness is still in its early phase (hence its being discussed in this blog). Currently, location applications focus on what can be called a “thing-finder” – finding a restaurant, an address, etc. In the next phase, location-aware devices will take their context into account and act accordingly. These devices could use their location for security – working only when in their proper location, or sending out a call signal when stolen. A device could also use its position to configure itself with the proper frequency and communications protocol for the area. For more good ideas check out this article.

GPS is a technology with which almost everyone is familiar. But GPS works only outdoors. What about indoor positioning? With the increased number of wireless devices and signals (WiFi, RFID, cellular, etc.), it is now possible to use signal presence and signal strength to pinpoint a device’s position even indoors.
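As a rough illustration of the idea, a device that can hear several access points with known positions could estimate its location as a signal-strength-weighted centroid. The AP coordinates and the RSSI-to-weight mapping below are assumptions for the sketch, not any vendor’s actual positioning algorithm:

```python
# Indoor positioning sketch: weighted centroid of access points with
# known (x, y) positions, weighted by received signal strength (RSSI).

def weighted_centroid(aps):
    """aps: list of (x, y, rssi_dbm) tuples. Stronger (less negative)
    RSSI pulls the estimate toward that access point; converting dBm
    to a linear scale makes nearby APs dominate."""
    weights = [(x, y, 10 ** (rssi / 10.0)) for x, y, rssi in aps]
    total = sum(w for _, _, w in weights)
    x_est = sum(x * w for x, _, w in weights) / total
    y_est = sum(y * w for _, y, w in weights) / total
    return x_est, y_est

# The device hears the AP at (0, 0) far more strongly, so the
# estimate lands close to the origin:
aps = [(0, 0, -40), (10, 0, -70), (0, 10, -70)]
print(weighted_centroid(aps))
```

Real systems refine this with radio fingerprint maps surveyed in advance, since walls and furniture make raw signal strength a noisy proxy for distance.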

Location awareness is considered the next killer app for cellphones. Check out this article, which discusses location-aware applications such as hailing the taxi nearest to your location or finding friends who are currently in the same geographical area as you.

If every cellphone were location-aware and began transmitting that information, the network would crash from the overload of location data coming through. In this blog the author calculates that 200 million cellphones generating location data would require bandwidth of 200,000,000 kb/sec. Obviously, a cellphone should transmit location data only when it needs to.
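The arithmetic behind that figure is easy to check; the 1 kb/sec per-phone rate below is an assumption that matches the blog’s total:

```python
# Back-of-envelope check of the continuous-reporting load.

phones = 200_000_000        # handsets reporting continuously
per_phone_kbps = 1          # assumed location payload per phone
total_kbps = phones * per_phone_kbps
print(f"{total_kbps:,} kb/sec")   # 200,000,000 kb/sec, matching the post

# Reporting only on demand -- say 1% of phones active at once --
# cuts the aggregate load a hundredfold:
print(f"{total_kbps // 100:,} kb/sec")
```

Even a crude duty-cycling policy like this shows why event-driven reporting, rather than a constant stream, is the only workable design.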

The recent Where 2.0 conference highlighted the many applications and issues surrounding location-aware devices. The conference, sponsored by O’Reilly, drew over 500 attendees. The main theme of the show was that the traditional GIS (Geographical Information System) market is converging with the Web to create a whole new market. The usual suspects, Google, Microsoft, and Yahoo, aired plans for providing tools and support for this new market.

One way to think of it: location-based services focus on where you are, while location-aware services operate in the background, interacting with the local context. In this article the author outlines a vision of the cellphone as the primary device interacting with the local environment to create ad hoc networks.

The geospatial web is reminiscent of the World Wide Web in the early 90s – plenty of promise but plenty of work to be done. In this article, Mike Liebhold lays out the challenges in making the geospatial web a productive reality. Infrastructure, investment, an ecosystem of companies developing tools, privacy, security, and a host of other issues are discussed. For a rich set of links on geospatial topics and tools, check out this web site.

Of course location awareness has a dark side. If a device can tell where you are, it could tell others where you are even when you don’t want it to. In this article the author highlights the challenges in protecting one’s privacy.

If you are working with location aware technology, I would like to hear from you. You can reach me at

Best regards,
Hall T. Martin

Friday, September 02, 2005

Cell Processor – Final Specs Released

IBM, Sony, and Toshiba formally announced the specs of their Cell processor. For those not familiar with it, the Cell is a new processor based on a 64-bit architecture with features directed toward distributed processing and media-rich applications. It is a single chip with multiple cores, each running its own application program, with a rich set of DMA functions for fast communication among the Cell’s processing elements. You can find a concise view of the architecture here. If you want to see the specs for the first implementation of the Cell processor on a gaming platform, here’s what Sony will do with it on their upcoming PlayStation 3.

All the cores share the same storage space. They come in two types: the PPE (Power Processing Element), designed for running 32- and 64-bit applications, and the SPE (Synergistic Processor Element), designed for running compute-intensive applications.

The Cell processor explicitly parallelizes computation and the transfer of data and instructions. Today’s processors lose clock cycles waiting for data to come out of memory (unless it’s already in cache). The Cell’s DMA model allows each SPE to have several memory accesses running concurrently.
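The latency-hiding pattern this enables is classic double buffering: kick off the transfer for the next block of data while computing on the current one. Here is a rough Python analogue, using a background thread as a stand-in for the SPE’s DMA engine; the fetch and compute functions are illustrative placeholders, not Cell APIs:

```python
# Double-buffering sketch: overlap "DMA" fetches with computation.
from concurrent.futures import ThreadPoolExecutor

def fetch(block_id):
    """Stand-in for a DMA transfer from main memory to local store."""
    return list(range(block_id * 4, block_id * 4 + 4))

def compute(buf):
    """Stand-in for SPE work on a buffer already in local store."""
    return sum(buf)

def process_blocks(n_blocks):
    results = []
    with ThreadPoolExecutor(max_workers=1) as dma:
        pending = dma.submit(fetch, 0)              # kick off first transfer
        for i in range(n_blocks):
            buf = pending.result()                  # wait for current buffer
            if i + 1 < n_blocks:
                pending = dma.submit(fetch, i + 1)  # overlap next transfer
            results.append(compute(buf))            # compute while DMA runs
    return results

print(process_blocks(3))  # [6, 22, 38]
```

On the actual hardware the payoff comes from each SPE keeping multiple DMA requests in flight against its small local store, so the compute pipeline rarely stalls on main memory.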

While current gaming systems use a graphics accelerator to speed up rendering, the Cell processor replaces this chip with a more generalized computational resource that can be used for anything; software will take over the rendering task. Check out this site for more details.

One question on the table is how current bus architectures will evolve to provide enough bandwidth to the Cell architecture.

Although originally targeted at the gaming community, the Cell will have applications beyond that group. Its head architect, Dr. Hofstee, sees the gaming platform as the first step on the road to making it a general-purpose processor for other segments such as workstations, supercomputers, and engineering applications, though he doesn’t envision it as a general-purpose processor for the PC. The Cell will run Linux, which will be the key to moving it into new areas.

The video gaming industry is watching this closely. For example, the Austin Gaming Conference is dedicating its keynote to it at this year’s event.

If you are working with next generation processors, I would like to hear from you. You can reach me at

Best regards,
Hall T. Martin