The Need for a New Approach to Networking Developer Tools

By Rick Denker, Packet Plus, Inc.
Why have sophisticated development tools seemingly passed over the networking equipment development engineer? The answer lies in the evolution of networking test, the gaps that remain for the network development engineer, and the way companies cope today. This article explores the root causes of the situation and outlines the need for a new, innovative approach.
The Nature of the Problem
The networking development engineer designs and builds some of the most complex systems engineered today. These systems must interoperate with other, similar systems and are constantly buffeted by change.
Many people now know the terminology of routers, switches and other networking gear, but few appreciate the difficulty inside those systems. They combine real-time hardware and software that must be seamlessly integrated. They use pipelined architectures to keep up with the speed of the networks they attach to, and they have both an input path receiving data from the network and an output path sending data out over it.
Any single networking product usually implements multiple networking standards. For example, a wireless access point communicates with clients using Wi-Fi (IEEE 802.11) and talks to the rest of the network using Ethernet (IEEE 802.3). The equipment must also fit into the user's larger network infrastructure. It is the exception for a network to be built from a single vendor's equipment; most networks contain equipment from multiple vendors, which drives the need for interoperability between vendors and requires the development engineer to be knowledgeable about other vendors' equipment.
On top of being complex and needing to interoperate, networking products are continually changing, with new standards released on an ongoing basis. This has been especially true in areas such as wireless networking, which has seen a stream of new standards (Wi-Fi, WiMAX, UWB, ZigBee, CDMA, GSM, etc.). Layered on top of these standards are security protocols with their encryption and key management. Along with new standards come the inevitable faster-speed versions. Even with a relatively stable standard such as Ethernet, the new speed grades and new features take considerable effort to stay on top of.
Stated simply, the job of the networking development engineer is difficult and getting more so.
Network Testing Tools
As networking equipment has become more complex, test tools have evolved to keep up, progressing through three distinct phases: ad hoc loading, controlled loading and real-world traffic. The first phase, ad hoc loading, was characterized by having multiple users of the network, such as PCs, generate as much traffic as possible. This caught the most egregious issues, but equipment tested this way still tended to lock up fairly regularly and often performed well below line rate.
The second phase, controlled loading, revolutionized the performance of networks. A product such as SmartBits® from Netcomm Systems (now part of Spirent®) provided precise, repeatable traffic on every port of the device under test. For example, a SmartBits could produce a wire-rate load of minimum-sized packets separated by the minimum inter-packet gap. With the adoption of this class of test equipment, the reliability of networking equipment shot up dramatically.
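To make "wire rate with minimum-sized packets" concrete, here is a back-of-the-envelope calculation using the standard Ethernet framing constants (64-byte minimum frame, 8-byte preamble plus start-of-frame delimiter, 12-byte inter-frame gap); the function name is illustrative, not from any particular product:

```python
# Theoretical worst-case frame rate for Ethernet at wire speed.
# Each minimum-sized frame occupies 64 + 8 + 12 = 84 bytes
# (672 bit times) on the wire.

MIN_FRAME = 64      # bytes, minimum Ethernet frame
PREAMBLE_SFD = 8    # bytes, preamble + start-of-frame delimiter
IFG = 12            # bytes, minimum inter-frame gap

def max_frames_per_second(link_bps: float) -> float:
    """Frames per second at wire rate with minimum-sized frames."""
    bits_per_frame = (MIN_FRAME + PREAMBLE_SFD + IFG) * 8
    return link_bps / bits_per_frame

print(int(max_frames_per_second(1e9)))   # gigabit Ethernet -> 1488095
```

A device that sustains roughly 1.49 million frames per second per gigabit port without dropping packets is what "wire rate" testing demands; the early ad hoc approach came nowhere near this load.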
The third phase was the introduction of real-world traffic. Controlled loading did not catch every case that occurred in the field; it was too regular, unlike the randomness of actual traffic. The ability to inject an arbitrary stream of traffic into the test environment has therefore become important, and test architectures have been modified to let an independent software stack running on an internal processor inject traffic. This phase continues to unfold as more cases that current testing methods miss are discovered.
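The difference between the second and third phases can be sketched in a few lines. Below, a minimal (hypothetical) model contrasts a fixed-gap "controlled loading" schedule with an exponentially distributed one that crudely mimics bursty real-world arrivals at the same average load; the function names are illustrative:

```python
import random

def controlled_schedule(n, gap_us):
    """Fixed inter-packet gaps: the 'controlled loading' style."""
    return [i * gap_us for i in range(n)]

def random_schedule(n, mean_gap_us, seed=42):
    """Exponentially distributed gaps: a crude model of bursty,
    real-world traffic carrying the same average load."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n):
        times.append(t)
        t += rng.expovariate(1.0 / mean_gap_us)
    return times
```

Both schedules offer the same mean load, but the random one produces the bursts and lulls that stress queues and buffers in ways a perfectly regular stream never will, which is exactly the class of field failures controlled loading missed.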
Gaps in Tools
The current test tools are great for the quality assurance (QA) engineer, but leave several gaps for the development engineer. The gaps are in the areas of timeliness, breadth, abstraction level and portability.
The biggest issue for the development engineer is that the test tools simply arrive too late to be useful. Engineers need tools when first silicon comes back to the lab, or when the first prototype is fabricated; this is when the design contains the most unknowns. An accurate tool that provides more data in this early environment is a critical productivity enhancement, yet for a new standard most commercial tools are not available at this crucial time.
The second gap is the lack of breadth in the available tool suites. Compared with software development, the tools lack breadth, sophistication and robustness. For example, Wind River, a wholly owned subsidiary of Intel, offers an extensive array of software tools, including compilers, trace, unit test, multiple hardware debuggers, scan-chain access and Eclipse-based workbenches. The networking equipment developer typically has a protocol analyzer that provides a symbolic decode of the packet log file, and perhaps a software simulator for some traffic situations. Arming a software engineer with only a trace tool would be considered ridiculous today, yet that is effectively what we routinely do to the networking equipment development engineer.
Another gap is that many tools work at the wrong abstraction level. The network equipment engineer can use all the same tools the software engineer uses, but the results must always be translated to the level the engineer thinks and works in. Software tools work at the instruction level; they can stop at an instruction boundary, but the engineer must then infer what is happening at the packet level. Similarly, a spectrum analyzer gives tremendous detail at the analog level, but again requires translation to be useful to the network engineer.
There are some tools at the packet level, notably protocol analyzers; there are just not enough of them. Tools that work at the wrong level generally demand extra effort both to integrate them into the design environment and to translate each measurement to the desired level.
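As a concrete illustration of the packet-level view, here is a minimal sketch of the kind of symbolic decode a protocol analyzer performs; the helper name and sample frame contents are illustrative, not taken from any specific product:

```python
import struct

def decode_ethernet(frame: bytes) -> dict:
    """Symbolic decode of an Ethernet II header: the packet-level
    view, as opposed to the instruction-level view of a debugger."""
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    mac = lambda b: ":".join(f"{x:02x}" for x in b)
    return {
        "dst": mac(dst),                    # destination MAC address
        "src": mac(src),                    # source MAC address
        "ethertype": hex(ethertype),        # e.g. 0x800 = IPv4
        "payload_len": len(frame) - 14,
    }

# A broadcast IPv4 frame with a 20-byte dummy payload:
frame = bytes.fromhex("ffffffffffff" "001122334455" "0800") + b"\x45" + b"\x00" * 19
print(decode_ethernet(frame))
```

This is the abstraction level the engineer actually reasons in; an instruction-level breakpoint or an analog spectrum trace must be manually mapped back onto structures like this one.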
Lastly, a gap exists in the portability of tools. Many tools are large and effectively bench-bound. A tool should be as portable as the product being developed, so it can easily be moved to wherever the equipment needs to be tested. For wireless equipment, for example, the common test environments are open air, wired connections, test boxes and Faraday cages; each is important for measuring different aspects of the product, and the measurement tools should move easily between them. A lab-bench-bound tool may also be useless for an issue that only seems to occur at the customer site. And a portable tool supports the convenience of working from home.
The predominant method that is used today for early-stage development is an internally developed custom tool. The most common way the tool is built is to take the equipment that is under development and alter the firmware for testing purposes. Effectively the equipment is looped back to itself with a different personality. Such a solution is often required because there is no other equipment that can communicate in the latest protocol.
This solution, even though predominant, has several drawbacks. First, it is expensive, taking on average six to nine months of a developer's time; often it is assigned as a learning project to a junior member of the technical staff. Second, because it is viewed as a short-term solution, it may have a cumbersome user interface and be poorly maintained. Finally, since it is based on the hardware under development, there can be bootstrapping issues when it is not yet certain which parts of the system are working correctly.
In talking with project managers, the two biggest concerns about custom tools are the risk of missing something because it is incestuous testing and the consumption of a scarce resource to develop them.
The future for this approach does not look bright. One, it will continue to become more and more expensive to build the custom tool. Two, it will continue to be risky. And three, the lost productivity from using a hard-to-use tool will continue to grow.
Root Causes of Issues
The current state of tools for the networking development engineer did not occur overnight. Several industry dynamics have combined to create it.
The primary cause has been the speed of technology evolution. If the technology had been stagnant, the investment would have been made to improve the breadth and quality of the tools. However, it has been an ever-changing target.
Such a treadmill effect exists in several tool industries. In the electronic design automation industry, the same set of tools needs to be re-built for each new chip layout geometry. The complexity of this challenge with ever-smaller geometries often consumes the development team. Because of this treadmill, new features are rarely added, and sometimes even dropped in subsequent product generations.
Another factor is that test tools use the same basic architecture as the equipment that they are testing with some modifications. (See Figure 1 - Networking Test Architectures.) The modifications are made in software, hardware or the network physical layer (PHY). For the software, a more controlled version with a limited number of options is implemented. For the hardware a higher performance or more programmable version is implemented. For the PHY, most commonly a standard interface chipset is used, but a custom interface solution may be used in some cases. The more modifications, the more the tool development will cost, so each modification is implemented, or not, based on a cost-benefit analysis. Also, the more modifications that are made, the earlier the development needs to start to meet the availability need.
Using the same architecture can also make the test tools suffer from chicken-or-egg dependencies. For example, a tool that uses a commercial PHY chipset for the physical layer will be limited by the availability of the chipset. Some of the disadvantages of this approach can be mitigated by an accelerated integration program once the chipset arrives. Ideally a tool would be available before the equipment under test, so that the developer could become comfortable with how it will perform before the critical time when first silicon or first prototype is delivered.
Effectively, the test tool developers are on the same treadmill as their customers, except that their market is significantly smaller. This causes many tool companies to take a wait-and-see attitude toward new standards. Because of these effects, companies often focus on the needs of quality assurance (QA) testing, a generally larger market that does not have the same critical need for early availability.
Another factor that has limited the advancement of sophisticated tools has been the lack of a way for the custom tool developers to share their efforts. Each vendor develops its custom tools independently. There have been a few cases in the past where a consortium has funded tool development, but it hasn’t been early enough to replace the internal custom tools.
New Approach Needed
Currently the networking industry tolerates the custom tool development that has gotten it this far, even though it is expensive. But as complexity continues to grow, the costs will continue to escalate, both in the custom development itself and in lost productivity for the development engineers.
A dramatically new approach to the problem is required. A tool that is merely ten percent better may buy only a few months against the coming issues. It recalls the Einstein quote, "We can't solve problems by using the same kind of thinking we used to create them." Several innovative solutions are being developed for QA testing, such as those from VeriWave for wireless LAN products. The same kind of innovation needs to be applied to tools for the development engineer.
The solution should have the following characteristics. Some are obvious from the discussion in this article; others are implied by the need to preserve the tool's primary benefits.
Characteristics for a new development tool solution:
- It should be available at first silicon, or first prototype. All of the duplication that causes delay needs to be eliminated.
- It should be flexible and configurable. This allows the exploration of new versions of protocols, or potential new changes to protocols without an entirely new development.
- It should handle sophisticated security protocols. Security is becoming a requirement in so many markets and can be painful to work with at the development stage. For some applications this dominates the development time.
- It should integrate easily with other tools that are used. The gains that were made in timeliness should not be given back because of the need to spend additional time integrating the tool into the development environment.
- It should support the easy generation of test cases. If a tool requires significant effort to develop test cases, or learn a new language, it limits when it can be used. Cumbersome test case generation can cause product delays.
- It should allow the efforts between companies or even standards to be combined. A solution that works well for just one company will not be an industry solution. The solution should be able to use the size of the networking industry to its advantage.
This set of requirements may set the bar too high for some approaches. However, it reinforces the need to come at the problem with a distinctly new approach to meet these requirements.
Rick Denker was the co-founder and vice president of marketing for VeriWave, Inc., a maker of innovative test systems for wireless networks. He has a long history of launching new product innovations for leading companies including WeSync, Synopsys, PMC-Sierra, Intel and Hewlett-Packard. He holds a computer science degree from MIT and an MBA from Dartmouth College.