Towards a New Metric: "Quality of Information"

It's certainly not news that information is the new currency of business. On-line information is rapidly becoming a centerpiece of near-real-time information distribution. Electronic publishing is mushrooming, and the colonization of cyberspace is well underway. Time magazine is now available on-line, as is the New York Times, and many managers in forward-thinking companies like Lotus no longer read the Wall Street Journal in paper form; they have it delivered daily in electronic form via Lotus Notes. (With respect to the Internet, I don't need to wax enthusiastic over the phenomenal growth that this medium is experiencing. You are, no doubt, reading about it in your daily newspaper.)

Anyone who has tried to surf the tsunami of this information wave for any length of time has probably wondered: what are we going to do with all this information? What indeed? When the editorial staff here at the magazine first started trolling on-line resources several years ago, it was an exciting process of discovery, and held out the prospect of being able to mine cyberspace for late-breaking stories that we could deliver to our readers. It even allowed us to scoop the weeklies on a number of occasions, and we were the first publication of any kind to break the story of Internet commercialization.

As always, however, ongoing experience yields additional perspectives. For example, anyone who has delved into Internet mailing lists can tell you how easily this excellent resource can get out of hand, becoming a time and energy sink of no small proportions. What we soon discovered in our on-line adventuring is that having all of these phenomenal resources continually pouring in didn't always help the situation - in many cases it simply doubled or tripled the amount of data needing to be processed. Thus, in addition to the reams of press releases we already contended with on a daily basis, we increasingly had to deal with reams of faxes, megabytes of E-mail, and data from the Internet, CompuServe, the Well, and various other information sources.

Publishing is a highly information-intensive business; other businesses may be less so. Still, even in the general business environment, the mounting problem of information overload shouldn't be underestimated. There are two vectors in this dynamic: the amount of information being generated is increasing at an astonishing rate, and the ability to move it rapidly is also increasing, which, in turn, feeds the amount of information - a vicious cycle if there ever was one.

Since these processes are self-synergizing, the end result is that the pace of business (and, for that matter, of life in general) seems to move ever faster. A recent study by an executive search firm indicated that executive meetings that used to be allotted an hour now take place in 15 minutes. At the same time, our own human bandwidth is playing a serious game of catch-up. Paul Nicholson, former executive editor of Telecommunications®, used to posit that human information reception clocked in at around 1 Gbps. I don't know how he arrived at that figure, but there's a solid point to be made: sooner or later, people are going to feel overwhelmed by this endless proliferation of data.

As John Naisbitt once observed, we are "drowning in information but starved for knowledge." Too much information can be as bad as too little. Imagine the information revolution occurring in something like three phases. If Phase 1 is information euphoria, as I described it earlier, then Phase 2 is information ennui - the realization that you are thrashing about in the middle of a whitewater stream with no paddle and no canoe. That brings us to Phase 3, as yet undefined. I predict that in this next phase, some new metrics are going to be needed. One of them is something I call "quality of information," or QOI. This concept - something I'll have more to say about in subsequent commentaries - is intended to develop better granularity in measuring the utility of information. Moving towards quality of information means more than filtering, which only ensures specificity of interest, not quality. Such concepts will be essential in dealing with the problem of managing complexity as we proceed headlong into the interesting challenges of the nanosecond nineties.

Tom Valovic, Editor-in-Chief. Internet: valovic@world.std.com; MCI Mail: 311-1693; phone: (800) 225-9977; fax: (617) 762-9071. (Note: this commentary is also available electronically on CompuServe.)

Copyright 1995, TELECOMMUNICATIONS magazine, reproduced with permission

Quality of Information - Part II
