It happens in the blink of an eye. A press of the 'Enter' key returns Google search results almost instantaneously. Video chats easily collapse the distance between conversing friends. For those born before the early nineties, the memory of those scratchy beep-beep-boop-boop sounds of a dial-up Internet connection now seems rather quaint.

Almost no one has participated in more of these advances in data communications than LIDS's Dave Forney. From his graduate school days at MIT to his career in industry and parallel career in academia, he has been a key player in many of these dramatic advancements.

Dave acknowledges that few imagined the current speeds of communication technology. "When Google first came out, it seemed like a miracle even to technically sophisticated people. Ten years earlier, who would have predicted it?" For that to happen, Dave says, people would have had to imagine what would become possible if everything could be done a million times faster. The difficulty of predicting the power of an idea has been illustrated time and again throughout his career.

Dave recalls that one such example occurred right under his nose. Back in 1960, Dave's future professor and advisor, MIT's Bob Gallager, had introduced "low-density parity-check" (LDPC) codes in his PhD thesis. Gallager was aiming for the Holy Grail of coding theorists: to find error-correcting codes that could approach the Shannon limit, with feasible complexity. In other words, he wanted to send messages over a given noisy channel efficiently, reliably, and as fast as possible. Gallager's work was appreciated as a theoretical contribution, and was published in an MIT Press monograph, but it didn't make any practical impact at the time. Indeed, Codex Corporation (for which Bob consulted, and Dave later worked) turned down the opportunity to exploit LDPC codes; they were simply too complicated for the available technology.

Thirty years later, in 1993, the coding world was rocked by the invention of "turbo codes," which approached the Shannon limit closely with very reasonable complexity. Turbo codes use an iterative decoding method that successively refines a set of estimates of the likelihoods of the encoded bits. This looked a lot like Gallager's iterative decoding methods for LDPC codes. Soon it was realized that turbo codes and LDPC codes were closely related, and in practice it turned out that LDPC codes worked even better. "And I kicked myself," Dave says. "I should have thought of this, and so should many of my colleagues, but somehow Bob's codes had been tagged as impractical. Maybe this was true in the Sixties and Seventies, but we never re-examined them when technology had advanced in the Eighties and Nineties." Today, Gallager's codes achieve the closest approaches to the Shannon limit, and are used in most new data communications standards.
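To give a flavor of the iterative idea behind Gallager's decoders, here is a minimal sketch of "bit-flipping" decoding, the simplest of his hard-decision algorithms. The toy 4×6 parity-check matrix `H` is purely illustrative (real LDPC matrices are large and sparse); production decoders instead pass soft likelihood messages, as the article describes.

```python
import numpy as np

# Toy parity-check matrix H for a length-6 code (illustrative only).
# Each row is one parity check over a few of the six code bits.
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
    [0, 0, 1, 1, 0, 1],
], dtype=int)

def bit_flip_decode(received, H, max_iters=20):
    """Gallager-style bit-flipping: repeatedly flip the bit that
    participates in the largest number of unsatisfied parity checks."""
    word = received.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2          # 1 marks each failing check
        if not syndrome.any():
            return word                  # all parity checks satisfied
        votes = syndrome @ H             # failing checks touching each bit
        word[np.argmax(votes)] ^= 1      # flip the worst offender
    return word

# A valid codeword (H @ c mod 2 == 0) with one bit corrupted in transit:
codeword = np.array([1, 1, 0, 0, 1, 0])
received = codeword.copy()
received[0] ^= 1                         # channel flips bit 0
print(bit_flip_decode(received, H))      # recovers [1 1 0 0 1 0]
```

Each iteration uses only local parity information, which is what makes the method cheap; the soft-decision versions used in modern standards refine probabilities rather than flipping bits outright.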

Dave experienced a surprise like this in his own career, as well. As an MIT doctoral student studying under Gallager and Jack Wozencraft, his aim also was to design coding systems with good performance and reasonable complexity. He invented a scheme called "concatenated codes," for which error probability decreases exponentially while complexity increases only as a small power of the code length, for all rates less than the Shannon limit. His 1965 thesis, too, was published as an MIT Press monograph. However, these codes were also viewed as impractical. Dave recalls an employment interview at Bell Labs in which he presented some of these long and complicated codes; he had the clear impression that he was perceived as "not a very practical guy." But within a decade, concatenated codes became the standard in space communications. "I had no idea they would have such a big practical impact," he says. In 1998, they were awarded an IEEE Information Theory (IT) Society Golden Jubilee Award for Technological Innovation.

Another instance of a surprise success came after Dave wrote a system theory paper while a visiting scientist at Stanford, published in the SIAM Journal on Control in 1975. After it was accepted and Dave had become VP-R&D at Codex, he says that he virtually forgot about its existence: "I shot an arrow into the air; it fell to earth, I knew not where." Ten years later, he received a phone call from a system theorist wanting to ask him a question about his "famous paper," and Dave discovered that his paper had become a "citation classic."

Dave's primary career was in industry, first at Codex (1965-77), and then with Motorola (1977-99), after Motorola acquired Codex. Bob Gallager was one of the founding consultants for Codex, and suggested that Dave interview there after his graduation in 1965. At Codex, Bob and Dave became close colleagues. Codex's business success in the 1970s was based on a series of high-speed modems that Dave designed, building on Bob's basic work on quadrature amplitude modulation. These modems became international standards, and are still deeply embedded in all personal computers, in a tiny corner of the Intel chip that allows you to connect to the Internet through a phone line.

In those days, Dave remembers that professors were encouraged to consult with outside companies one day a week, and most of them did. Dave says that practice is rare now, perhaps because professors have gotten busier: "I'm sorry to see that's changed." Dave says that his own experience has been that some of the most interesting research problems come out of trying to understand practically successful systems at a deep level.

Dave has been an Adjunct Professor in LIDS since 1996. He has received many awards and honors, including the IT Society Shannon Award, the IEEE Edison Medal, and membership in the NAE and NAS. He recently served as President of the IT Society, for the second time, and won the IEEE Donald Fink Prize Paper Award, for the second time. He continues to write research papers, and taught his graduate course on coding in Fall 2010. However, he says that his objective now is to be "more retired." He remarried five years ago, and now works mainly from his home office, although he still maintains an office at MIT. He says that his current research is a purely intellectual investigation of the theory of codes on graphs and its connections to system theory, with no pretensions to practicality. But, if the past is any guide, could Dave be surprised once again by an unexpected payoff?