Network Algorithmics:
An Interdisciplinary Approach to Designing Fast Networked Devices

George Varghese

Morgan Kaufmann 2005
A book review by Danny Yee © 2006 https://dannyreviews.com/
Modern network devices have to handle traffic in huge volumes at low latencies; achieving this requires ideas and approaches from all of computer science — hardware, algorithms, protocols, software engineering — and their integration in a discipline which Varghese calls "network algorithmics".

After an introduction in chapter one to bottlenecks and techniques for avoiding them, in chapter two Varghese provides a brief overview of protocols, hardware, network architectures, and operating systems, with examples. This is best suited as a refresher: Network Algorithmics assumes a general familiarity with networking protocols, operating systems and computer architecture, and if used as a text it would be suitable for upper-level undergraduates. (The exercises at the end of each chapter are often quite demanding.)

Chapter three introduces the fifteen implementation principles that are the core of Network Algorithmics: P1 Avoid obvious waste; P2 Shift computation in time (precompute, evaluate lazily, share expenses or batch); P3 Relax system requirements (trade certainty or accuracy for time, shift computation in space); P4 Leverage off system components (exploit locality, trade memory for speed, exploit existing hardware); P5 Add hardware (use memory interleaving and pipelining, use wide word parallelism, combine DRAM and SRAM effectively); P6 Create efficient specialized routines; P7 Avoid unnecessary generality; P8 Don't be tied to reference implementations; P9 Pass hints in layer interfaces; P10 Pass hints in protocol headers; P11 Optimize the expected case (use caches); P12 Add state for speed (compute incrementally); P13 Optimize degrees of freedom; P14 Use bucket sorting and bitmaps; and P15 Create efficient data structures. Chapter four presents fifteen problems that illustrate these principles in action, with hints to solutions.
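
To give a flavour of how these combine in practice, here is a toy illustration of my own (not an example taken from the book): counting the set bits in a 32-bit word using a table built once up front, which applies P2's "precompute" and P4's "trade memory for speed".

    #include <stdint.h>
    #include <stdio.h>

    /* Table giving the number of set bits in each possible byte value.
     * It is filled in once up front (precompute) and then kept around,
     * trading 256 bytes of memory for speed on every later query. */
    static uint8_t popcount8[256];

    static void init_popcount8(void)
    {
        for (int i = 1; i < 256; i++)
            popcount8[i] = (uint8_t)((i & 1) + popcount8[i >> 1]);
    }

    /* Counting the bits in a word is then four table lookups rather
     * than a 32-iteration loop over individual bits. */
    static int popcount32(uint32_t x)
    {
        return popcount8[x & 0xff] + popcount8[(x >> 8) & 0xff]
             + popcount8[(x >> 16) & 0xff] + popcount8[(x >> 24) & 0xff];
    }

    int main(void)
    {
        init_popcount8();
        printf("%d\n", popcount32(0xF0F0F0F0u));   /* prints 16 */
        return 0;
    }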

Part II of Network Algorithmics is devoted to end-nodes. A chapter "Copying Data" takes a web server, delivering files from disk to network, as the prototype, and explores different approaches to reduce pressure on the memory and I/O bus by reducing the number of copies required: copy-on-write, fbufs, RDMA, IO-Lite, and more. It also touches on making cache use more effective and the tantalizing possibilities of "integrated layer processing".
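
None of these schemes is something most application programmers invoke directly, but a widely available mechanism in the same spirit is the Linux sendfile() call, which hands the whole transfer to the kernel so the file contents never pass through a user-space buffer. A minimal sketch of my own (not code from the book):

    #include <sys/sendfile.h>
    #include <sys/stat.h>
    #include <fcntl.h>
    #include <unistd.h>

    /* Send a file over an already-connected socket without copying its
     * contents through user space: the kernel moves data from the page
     * cache to the socket directly. */
    static int send_file(int sockfd, const char *path)
    {
        int fd = open(path, O_RDONLY);
        if (fd < 0)
            return -1;

        struct stat st;
        if (fstat(fd, &st) < 0) {
            close(fd);
            return -1;
        }

        off_t offset = 0;
        while (offset < st.st_size) {
            ssize_t n = sendfile(sockfd, fd, &offset, st.st_size - offset);
            if (n <= 0)
                break;   /* error or short transfer; real code would check errno */
        }

        close(fd);
        return offset == st.st_size ? 0 : -1;
    }

Compared with a read()/write() loop this avoids the copy into and out of a user-space buffer and cuts the number of system calls; fbufs, IO-Lite and RDMA attack the same costs from other directions.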

"Transferring Control" looks at minimising scheduling overhead and maximising concurrency: at context-switches, processes, threads, and event-driven servers. It evaluates the existing Unix select() call and considers ways of speeding it up, with and without changing the API. And it touches on ways of avoiding system calls and reducing interrupt overhead.

There are three shorter chapters. "Maintaining Timers" explores hashed and hierarchical timing wheels, the BSD implementation, and obtaining fine-granularity timers. "Demultiplexing" looks at the development of packet filters: CMU/Stanford, Berkeley, Pathfinder, hardware, and dynamic packet filters (which generate filter code in real time). And "Protocol Processing" looks at some miscellaneous tasks that can become bottlenecks: buffer management, CRC checks and checksums, TCP and UDP protocol processing, and packet reassembly.
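
The timing-wheel idea, at least, is simple enough to sketch; here is a toy hashed wheel of my own (not the book's code), in which timers are hashed into a circular array of buckets by expiry tick so that each clock tick only has to examine one bucket.

    #define WHEEL_SLOTS 256              /* a power of two, so the mask below works */

    struct timer {
        struct timer *next;
        unsigned long expires;           /* absolute tick at which to fire */
        void (*callback)(void *arg);
        void *arg;
    };

    struct wheel {
        struct timer *slot[WHEEL_SLOTS];
        unsigned long now;               /* current tick */
    };

    /* Insertion hashes the expiry tick into a slot: O(1). */
    void wheel_add(struct wheel *w, struct timer *t)
    {
        unsigned idx = t->expires & (WHEEL_SLOTS - 1);
        t->next = w->slot[idx];
        w->slot[idx] = t;
    }

    /* Per-tick work walks only the one slot the new tick hashes to,
     * firing timers that are due and leaving the rest for a later lap. */
    void wheel_tick(struct wheel *w)
    {
        unsigned idx = ++w->now & (WHEEL_SLOTS - 1);
        struct timer **pp = &w->slot[idx];

        while (*pp) {
            struct timer *t = *pp;
            if (t->expires <= w->now) {
                *pp = t->next;           /* unlink, then fire */
                t->callback(t->arg);
            } else {
                pp = &t->next;           /* not due this lap; leave in place */
            }
        }
    }

A hierarchical wheel stacks wheels of coarser granularity on top, much as a clock's minute and hour hands do, so that distant timers are not revisited on every lap.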

So far I've only skimmed Part III, "Playing with Routers". This has chapters on "Exact-Match Lookups", "Prefix-Match Lookups", "Packet Classification", "Switching", "Scheduling Packets", and "Routers as Distributed Systems". And Part IV offers chapters on "Measuring Network Traffic" and "Network Security", along with a summary and overview.

Varghese gets right down into the details in some places, but he sets the material he covers in its broader context. He often takes a historical approach, looking at how implementations have been driven by changing requirements. And he stresses that only with a broad view can the overall costs and possible optimizations be seen: with web server performance, for example, it is necessary to consider "the whole system, from HTTP and its headers, to the file system, and down to the instruction caches". Where many computing disciplines emphasize the isolation of components, network algorithmics stresses links across layers and boundaries. (It's not overdone, but approaches and principles are often illustrated with analogies to ordinary life — serving tables in a restaurant, for example — or parallels with other areas of computing.)

Purists might find the practical approach of network algorithmics distressing — Varghese admits that it "may seem drab and shallow" compared to "the beauty of theoretical techniques" — but it has an attraction all of its own. I've often felt that "computer science" as commonly constructed lacks any coherence, spanning everything from nearly pure mathematics to hardware and engineering, but Network Algorithmics has made me rethink that.

Little in Network Algorithmics is relevant to my job as a system and network administrator — I just plug switches in and configure them, and don't have to worry about their internals — but I found it fascinating. Apart from curious general readers with a computer science background, or hackers who enjoy stretching their minds, the obvious audience is anyone building high-performance network devices or looking at optimisation of networking code, say in the Linux kernel.

August 2006

%T Network Algorithmics
%S An Interdisciplinary Approach to Designing Fast Networked Devices
%A Varghese, George
%I Morgan Kaufmann
%D 2005
%O hardcover, index
%G ISBN 0120884771
%P 465pp