An investigation into the political and financial factors which inhibited the ready application of computers to individual academic libraries from 1967-1971 is summarized. Speculations on the future of libraries in a computer-dominant society are also presented. Technical aspects of system design were specifically excluded from the investigation. Some 24 institutions were visited and approximately 100 persons were interviewed. Substantial change is envisaged in both the structure and function of the library if the emerging trend of coalescing libraries and computerized information processing centers continues. Only the development of regional and national bibliographic networks, with the assistance of substantial federal funding, can really "save" the library. A failure to respond to the challenge of the computer could be fatal.
This paper (1) summarizes an investigation into the political and financial factors which inhibited the ready application of computers to individual academic libraries during the period 1967-71, and (2) presents the author's speculations on the future of libraries in a computer-dominant society. Technical aspects of system design were specifically excluded from the investigation. Twenty-four institutions were visited and approximately one hundred persons interviewed.
Substantial future change is envisaged in both the structure and function of the library, if the emerging trend of coalescing libraries and computerized "information processing centers" continues.
SUMMARY OF MAJOR FACTORS WHICH INHIBITED THE APPLICATION OF COMPUTERS TO LIBRARY PROBLEMS, 1967-71
Major factors which inhibited the application of the computer to the library during the period 1967-71 can be categorized under three broad headings: (1) Governance, organization, and management of the computer facility; (2) personnel in the computer facility; and (3) deficiencies in the library environment.
GOVERNANCE, ORGANIZATION, AND MANAGEMENT OF THE COMPUTER FACILITY
1. Uncertainty over who was in charge of the computer facility. This problem was partly attributable to the fact that the goals and objectives of the facility were imprecisely stated or not stated at all. Often there was no charter, no systematic procedures for establishing priorities, and excessive autonomy by the computer facility. These factors often permitted the facility to operate as a self-directing, self-sustaining entity, responsible to no informed, upper-level manager.
2. Effect of high-level administrative changes. In a few instances, the library automation effort was instigated by the president of the institution. He could, in effect, personally direct the allocation of resources. However, whenever a high administrative official leaves, the resulting vacuum is quickly filled by other interests, the atmosphere changes, and his personal program goals dissolve.
3. Management inadequacies. The effects of domination by a technician or special interest group are described below in more detail. Although more and more organizations are putting together influential user groups to point the way toward better management, decision-making responsibility and authority continued to be misplaced in a few institutions that vested authority for technical decisions in a committee of deans who were somewhat remote from current trends in computing because of their administrative responsibilities. (In one institution, it was half-jokingly stated that a dean in any hard science could be characterized as suffering from a minimum technological time-lag of two years.)
4. Lack of long-range planning inclusive of attention to community priorities. Few facilities visited had any written long-range plans, either for the acquisition of hardware, the conversion of older programs, or the involvement of users in systems design. Ad hoc arrangements were prevalent.
5. System instability. This was more the rule than the exception, especially in software, operating systems, hardware configuration, and pricing. Wherever an academic computing facility was used for library development, the same broken record always seemed to be playing: the facility was always being taken apart and put together again. Of course library development was not the only user affected; complaints arose from all users.
6. Biased pricing algorithms. In the academic facility, student and research use were competitive. Hence systems were typically geared to distribute computing resources around the clock in some equitable and rational way. For instance, short student jobs were sometimes given a high priority for rapid turnaround, while long, grinding calculation work was pushed off to the evening or night shift by means of variable pricing schedules or algorithms. A pricing algorithm is basically a load leveling device to smooth out the peaks of overdemand and the valleys of underutilization which would have occurred in the absence of such controls. Devising pricing algorithms is by no means a simple task, since many factors must be taken into account: the kinds of machine resources available, their respective costs, the data rates at which they can function, market demand, hardware and software available, and system overhead, to name but a few. (A simplified sketch of such a pricing scheme follows this list.)
Library jobs tended to suffer in both batch and online processing. In the former case, because batch jobs on large databases took so much time, library work generally could not be done during the prime shift; in the latter case, an online library system made substantial demands upon a facility's storage equipment and telecommunications support, and competed with all other online users.
7. Sense of competition with the library for hard dollars. This problem, which is related to pricing bias, is detailed later.
8. Scheduling problems. Many of the institutions visited had systems or charts for scheduling production, development, and maintenance. But conversations with system users often verified that schedules were either not met or had been unrealistically established. This was especially the case with development work.
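To make the load leveling idea concrete, the following sketch shows how a variable pricing schedule can steer long, resource-hungry jobs toward off-peak shifts while keeping short student jobs cheap enough for rapid turnaround. It is a hypothetical, modern illustration in Python; the resource categories, rates, and shift multipliers are invented for the example and are not drawn from any facility described in this report.

# Hypothetical sketch of a variable pricing schedule used as a load leveling
# device. All rates and multipliers are invented for illustration only.

from dataclasses import dataclass

BASE_RATES = {
    "cpu_second": 0.05,        # dollars per CPU second
    "core_kb_minute": 0.001,   # dollars per kilobyte-minute of core
    "io_operation": 0.0001,    # dollars per input/output operation
    "disk_track_day": 0.02,    # dollars per disk track held per day
}
SHIFT_MULTIPLIER = {"prime": 1.5, "evening": 1.0, "night": 0.6}


@dataclass
class Job:
    cpu_seconds: float
    core_kb_minutes: float
    io_operations: int
    disk_track_days: float
    shift: str                 # "prime", "evening", or "night"
    short_student_job: bool = False


def price(job: Job) -> float:
    """Charge for one job: raw resource cost weighted by the shift multiplier."""
    raw = (job.cpu_seconds * BASE_RATES["cpu_second"]
           + job.core_kb_minutes * BASE_RATES["core_kb_minute"]
           + job.io_operations * BASE_RATES["io_operation"]
           + job.disk_track_days * BASE_RATES["disk_track_day"])
    charge = raw * SHIFT_MULTIPLIER[job.shift]
    if job.short_student_job:
        charge *= 0.5          # encourage short jobs even at prime time
    return round(charge, 2)


if __name__ == "__main__":
    # A long file-maintenance run of the kind library batch work required:
    long_run = Job(cpu_seconds=3600, core_kb_minutes=200000,
                   io_operations=500000, disk_track_days=40, shift="prime")
    night_run = Job(cpu_seconds=3600, core_kb_minutes=200000,
                    io_operations=500000, disk_track_days=40, shift="night")
    print(price(long_run), price(night_run))

Under such a schedule the same long run costs substantially less at night than at prime time, which is precisely the pressure that pushed large library batch jobs off the prime shift.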
PERSONNEL IN THE COMPUTER FACILITY
1. Selection and evaluation. Inasmuch as the library often did not have the competence to judge personnel or the ability to generate meaningful specifications, there was generally very little protection from incompetence in this area.
2. Elitism: the notion that the masters of the computer are inherently superior to and have better judgment than computer customers. Elitism is a paradox: it can be positive or negative--positive when the best brains produce software designs of true genius with respect to function, performance, economy, and reliability; negative in a manifestation reminiscent of the girl with the curl in the middle of her forehead: "When she was good, she was very very good; when she was bad, she was horrid."
During the boom years when computer facilities were expanding faster than the supply of competent staff, elitism seemed fairly common in the computer center. The excitement of rapid development, the seemingly unlimited intellectual challenge presented by the powerful apparatus, and high-strung dispositions sometimes caused tempers to flare or immaturity to sustain itself beyond a reasonable time. Strange hours, strange habits, bizarre behavior, all seemed to conspire against ordered and rational development. Fortunately, as the field matures, the negative aspects of elitism are dying; managers now can concentrate on staff development work to turn top intellectual talents toward productive achievement.
3. Disinterest. This factor may be allied to elitism. In some instances, the computer center's staff gave considerable attention to the library during the period immediately following machine installation, when utilization was low. Later, the staff's keen interest became "dulled" at the thought of operating a production system. "More interesting jobs" were challenging the programmers and beginning to fill up the machine.
4. Fear of the unknown big user. It was recognized early that the library could be among the computer facility's largest potential customers, perhaps the largest. In some facilities, this recognition may have induced fear of being taken over or overwhelmed by the user, who would then be in a position to dominate and dictate the direction of further development and operations.
5. Fears of an unknown production environment. Simply expressed, a production environment removes much of the stimulus for creative approaches to problem solving unless continuous development is maintained for new systems and new applications. Many of the best programmers did not wish to lose their freedom to innovate and actively resisted participation in establishment of a production environment, with its concomitant requirement of "dull" maintenance support work.
DEFICIENCIES WITHIN THE LIBRARY ENVIRONMENT
1. Failure to understand in full detail the current manual system. Even where the manual system was understood, there was often an inability to describe it in the clear, unambiguous style essential to system design work. These deficiencies were further compounded by the unwillingness of some librarians to learn how to communicate adequately with computer personnel.
2. Inability to communicate design specifications. Many did not understand how to put together a specification document; particularly, they did not know how to account exhaustively for all possible cases or alternatives. Librarians were unaccustomed to defining their data processing requirements quantitatively or with precision--both absolutely indispensable to the computer environment. Also, just as the computer facility kept changing its software environment, many library development efforts were constantly changing their system requirements--a condition which made it all but impossible to program efficiently.
3. Failure to understand the development process. Development is a new phenomenon in libraries. Most librarians were not educated to comprehend development as an iterative process, characterized by experimentation, error, feedback, and corrective measures. Accustomed to the relative stability of long-established procedures--some of which had stood for generations, even centuries--some librarians were baffled by the rapidly changing new technology; others showed impatience and a low tolerance for frustration. Many expected development projects to resemble turnkey operations, and the failure of the process to accommodate these expectations produced disappointment and an inability to cope with the computer environment.
4. Failure to recognize the computer as a finite resource. Both librarians and early facility managers seemed to look upon the computer as an inexhaustible resource, the former through lack of sophistication and the latter apparently through myopia or possibly ambition. Some managers must have told their users that there was "no way" their equipment could be saturated in the foreseeable future. Apparently some library users were naive enough to believe that.
5. Excessive or unrealistic performance expectations. Few library users understood the relationship between the system specifications and functional results, and fewer still understood the significance of performance specifications. The situation was not assisted by notions of "instantaneous" retrieval pushed by salesmen or the popular press. (The writer recalls vividly how one salesman told him the library could have a CRT device for $1 a day! And indeed, the device itself was $1 per day if one cared to do without the keyboard, without cables, installation, control units, teleprocessing overhead, a computer, software, etc.)
6. Lack of an established tradition of research and development (R&D) and the lack of venture capital in the library community. The challenge of the computer may have been largely responsible for activating research and development as a serious and continuous effort in librarianship. Inexperience in raising and managing funds for R&D, as well as a general lack of knowledge of computer cost factors inhibited progress or tended to make the development effort inefficient and full of surprises.
7. Human problems. Some libraries having prior experience with small batch systems underestimated the scale of effort for contributing to the design of the large system, selling it to the users, installing it, and training the users.
8. Insufficient support from top management. In some instances, library management did not accord the automation effort the kind and degree of support essential to success. In particular, some librarians seemed to feel that automation was a temporary affair, definitely of less importance and significance than current manual operations. Some did not recognize the sacrifices in regular production that would be necessary and some did not appreciate the continuing nature of development work.
BACKGROUND
Two important prerequisites to progress in library automation were money and technical readiness. The government supplied the first, industry the second. The announcement by IBM in 1964 of its System 360 occurred at a fortunate time for the American library community. President Johnson's administration had launched enormous programs in support of education. The Library Services and Construction Act was soon to channel millions of dollars into library plant expansion and, perhaps more significantly, the Higher Education Act of 1965 was to sponsor research, which until then had only the support of limited funds from the Council on Library Resources, Inc., and the National Science Foundation (NSF). (Support from the National Science Foundation was largely, although not exclusively, directed toward discipline-oriented information services; one of the largest NSF grants went to the University of Chicago Library.)
It was the right time to invest in library automation. Important milestones were already behind the library community: the National Library of Medicine's MEDLARS program was well underway, the Airlie Conference on library automation had been held and its report published ("the White Book"), and the Library of Congress automation feasibility study ("the Red Book") had appeared.(1,2) The first MARC format was being tested in the field.
In computer technology, third generation equipment represented major increases in computing power, processing speed, reliability, and capacity to store data in machine-readable form. IBM's sales force was successful beyond imagination in getting System 360s installed in large universities, as well as in business and government. IBM promised a new kind of software--time-sharing--which would virtually eliminate the tremendous mismatch of data-processing speed between the human being and the machine. The new methods of spreading computer power through teleprocessing and time-sharing promised to make the computer at least competitive with and possibly an improvement over "antiquated" manual systems of providing rapid access to large and complex data files.
Within this relatively unknown environment, universities and libraries entered the software development process, which, if successful, could enable them to catch up where they had been hopelessly falling behind. Circulation, book purchasing, and technical processing loads in many libraries seemed to double and triple overnight as the country's schools and their programs grew to accommodate expanding enrollments. Manual systems that had been reasonably workable and responsive in environments characterized by slow growth demonstrated significant and disturbing defects--the inability to deal with peak loads, or rapidly changing loads. The same effects were felt in administrative and academic computing: a bigger and more complex payroll, more students to register, construction contracts to monitor, more research grants which demanded bigger computers, and so on. These were truly boom years.
But in the academic community there was still another force developing which was ultimately to be of even greater significance for libraries than the inconveniences of being unable to handle the housekeeping load: a dramatic rise in the expectations of patrons, especially in the academic community, where computers already abounded. Libraries had come to be seen by some as strongholds of conservatism and expensive luxuries; librarians were faulted for not "putting the card catalog onto magnetic tape," for not implementing automated circulation systems, or otherwise failing to take advantage of new and powerful data-processing techniques. The libraries were caught amidst a variety of sometimes conflicting, sometimes complementary factors: the visionary ignorance of the computer salesman, the senior academic officer possessed by the computer dybbuk, a lack of sympathy or understanding among some computer center managers, a lack of appreciation by students and faculty of the complexity of identifying, procuring, and cataloging unique copies of what must be the least standardized product known to man, and their own lukewarm commitment to undertake the hard work required to learn how to use the computer resource. Anxieties about job displacement caused some library staff to look upon computers with trepidation, thus further placing the librarian in a defensive position. While these forces were taking shape, the library's bibliographic activities continued to be seriously hampered by inadequate international bibliographic control.(a) Some essential computer hardware, especially the programmable CRT terminal with an adequate character set, was either nonexistent or totally unsuitable to library applications. In this institutional context librarians entered the world of computers and data processing.(b)
PURPOSE
It is the purpose of this report to examine in some detail how internal institutional factors affected the development of computerized bibliographic systems, and especially to consider nontechnical, negative factors: What slowed down or inhibited the applications of computers in librarianship? This report is not concerned with the merits or demerits of specific systems or their features; indeed, the investigator did not inquire about system specifications. Major questions centered around the factors that fostered or hindered the development process, regardless of the merit of a project or system.
SCOPE
Investigation was limited almost solely to those institutions considered likely to have large-scale, in-house development projects using third generation computer equipment. The majority of places visited were large academic libraries. The time span included in the survey begins approximately in 1967 and ends in 1971. A total of twenty-four institutions was visited and some one hundred persons interviewed; a list of the institutions visited is in appendix A.
METHODOLOGY
SITE VISITS AND INTERVIEWS
Arrangements were made to visit four types of individuals: the director of libraries, the head of the library's system development department, the director of the computation center, and whatever principal institutional officer was managerially and/or financially responsible for campus computing. Considerable variation was found in the type of person assigned this last responsibility--it could be the provost, the vice-president for academic affairs, or the vice-president for business/financial affairs. Choice of the major institutional official to be interviewed was often determined by the pattern of computing in a particular institution, or the facility that supported the development effort.
At first the investigator attempted to utilize a structured questionnaire for interviewing. This very quickly broke down, as the interviewees were generally voluble and ranged widely over many related topics or items which they would have been asked about later. Accordingly, after the first few interviews, the formal questionnaire approach was dropped and a simple checklist of major questions was kept on a few cards to make sure that each major issue had been addressed. Every interviewee received the investigator graciously and none was unwilling to talk; indeed, if anything the opposite was the case--most persons seemed to be eagerly waiting for an opportunity to air their views.
Visits and interviews occurred during the period January-April 1972.
LITERATURE SEARCHES
Searching the literature on this topic has been extremely frustrating. In the literature of computer science and management, there are many articles on pricing algorithms, machine resource allocation schemes, and issues of managing the computer facility, but none specific to the topic of this report. Besides scanning the professional literature, the author has for the past year regularly conducted monthly computer searches via the UCLA Center for Information Service's SDI Service. Abstracts and citations were searched in Research in Education (RIE) and Current Index to Journals in Education (CIJE). With respect to problems faced by the library in acquiring computer services, the results have been nil in both cases. The author reluctantly concludes that no major recent studies have yet been published in this sensitive area, although two papers by Canadian librarians are very helpful.(4,5) The National Academy of Sciences/Computer Science and Engineering Board's Information Systems Panel appears to have come closest to identifying the issues in its report, Libraries and Information Technology: A National Systems Challenge. Still, the comments in that report are highly generalized and do not grapple with specifics.(6)
STRUCTURE OF EDUCATIONAL COMPUTING
Most of the visited institutions maintained separate facilities for administrative and academic computing, while a few ran combined facilities or were in the throes of consolidating their facilities. The differences between administrative and academic computing have historical roots deeply embedded in institutional soil. Administrative computing is usually an outgrowth of punched card installations first set up for payroll and financial reporting. Academic computing, on the other hand, has its origins within the institution's instructional and research programs. Typically it has been supported by external grants and contracts and has been oriented toward the "hard" sciences. Until the recent dropoff in federal support of higher education, academic computing was a money-maker (through the overhead on grants and contracts) while administrative computing was a money spender.
ADMINISTRATIVE COMPUTING
Typically very little computational work is done in administrative applications; most of the computer work is associated with input, update, reading records, writing records, and printing reports. Except for the payroll application, the consumer group has tended to be somewhat smaller and less transient than the academic group. But to university administrators the computer could do much more than write checks and pay bills. Many significant administrative applications had already been installed on second-generation equipment: faculty-staff directories, inventories of space, supplies, and equipment, records of grades, course consumption reports, etc. All these tended to expand the user group, increasing competition for the resource. The advent of third-generation equipment made it attractive for administrators to think about applications centered around the so-called "integrated database." This led to a demand for further new services for the registrar, fundraising and gift solicitation, student services, purchasing, etc.
Conventional administrative computing--particularly that part of it which generated regular reports--lent itself naturally to batch processing and indeed many of the early computer installations actually continued established punched card operations, merely using the computer as a faster calculator and printer. The administrative computing shop is typically characterized by (or hopes to be characterized by) great systems stability and dependability, a cautious and measured rate of innovation, and in the opinion of some academic computing types, not much imagination. File integrity, backup and recovery, and timely delivery of its products are prime goals in an administrative computing system. The administrative computing facility very much resembles the library in two important aspects: (1) it is a production system; and (2) it is almost entirely an overhead function, i.e., there is little or no attempt at cost recovery from system users for its services.
ACADEMIC COMPUTING
Academic computing is a much different world. It serves a large, vociferous, influential, and mostly technological user community, many of whom are not only competent in programming, but more importantly, possess ready cash. But this is changing: as academic computing expands to service users in the humanities and social sciences rather than mainly those in the "hard" sciences, the user group is growing and it will probably not be long before it embraces the total academic community.
In hard science applications, the academic facility typically performs an enormous amount of computing ("number crunching") with a relatively small amount of output. System backup and recovery is important to the academic computing facility, but file integrity responsibility may often be assigned to the user since such a center sometimes does not maintain the database but merely provides a service for manipulating it. The main components of academic use are department- or discipline-oriented research and student instruction, the latter being particularly strong if there is a well-established computer science department.
Software development has customarily played a major role in academic computing and the usual practice was to actively seek out imaginative systems programmers for whom change and system improvement are food and drink. Consequently, instability, both in hardware and software, has been more the rule than the exception in the recent past, although as the management of computer facilities matures, this too is changing.
CURRENT TRENDS AND STATUS
It is obvious from the above that administrative and academic computing have been characterized by diametrically opposed machine and managerial requirements. Where they have been combined in the same facility, tensions have prevailed and neither user group was happy. In a few instances known to the writer, such combinations have been abortive and a reversion made to divided facilities. But as computing matures it is becoming evident that operational stability is needed for all types of computing, not just administrative computing. Additionally, the financial crises now prevalent in institutions of higher education have brought more realistic attitudes to the fore in understanding just what kinds of facilities can be afforded, and how they should be managed. Moreover, the economies of scale, the increasing flexibility of hardware, and the growing sophistication of software are now combining to form an environment that can better satisfy all potential users of computers. There are clear indications that a unified, well-managed shop with competent staff might now economically and efficiently serve a variety of applications--including administrative and academic--on the same facility. However, this is a developing trend and does not correspond with what the writer actually observed during his visits. In situ he saw much evidence that Anthony Oettinger's observations of some years ago were still valid:
...routine scheduled administrative work and unpredictable experimental work coexist only very uneasily at best, and quite often to the serious detriment of both. Where the demands of administrative data processing and education require the same facilities at precisely the same time, the argument is invariably won by whoever pays the bills. Finances permitting, the loser sets up an independent installation.(7)
Indeed, it would not be unreasonable to conclude from the interviews that in most places visited, computing during the period 1967-71 was in a state of disarray. There is abundant and disagreeable evidence of technical incompetence, lack of management ability, ill-spent money, communication failures, and naive and disillusioned users.
But it would be a mistake to conclude that the failures in library automation are attributable primarily to computer-oriented personnel or hardware problems--librarians in their own way displayed many of these same failures.
It would be another mistake to dwell excessively on the high failure rates observed. In any complex technological endeavor, the rate of failure is dramatically high at the beginning; there is ample evidence here from the aircraft and space industries. Indeed, the likelihood of a first success in anything complex--library automation is complex, as we have learned the hard way--is practically nil.
ORGANIZATION AND MANAGEMENT PROBLEMS: THE ACADEMIC COMPUTING ENVIRONMENT
Early academic computing facilities were typically run by faculty members in engineering, applied mathematics, computer science, or related fields. This arrangement was satisfactory when computers were small, relatively primitive, and the user community was confined to those few people who could program in machine language or assembly language. As equipment became bigger and more powerful, and as higher-level programming languages developed, more and more people learned programming. Correspondingly, the task of managing the computer facility grew rapidly in size and scope. The budget of a large computer center in a modern university can easily run to several millions of dollars annually. The manager must balance seemingly innumerable, complex forces: personnel, management, government and vendor relationships, demands from vocal users, establishing priorities, the challenge of hardware advances, marketing, pricing services, balancing the budget, etc. It soon became clear that few faculty members possessed either the multifaceted talents or the experience required for effective management.
As the center's budget grew, and particularly as the shift was made from second to third generation equipment, the faculty member tended to be replaced by the technician as manager. Unfortunately for many of the facility users, the technician tended to promote his own technical interests in software development or hardware utilization. In some instances, the user community felt that the facility was being run more for the benefit of the staff than for the users. The technician-manager often looked at the computer as his personal machine, much as some faculty members had earlier felt the computer to be their own private preserve. The vice-president of one university expressed the view that the technician-manager doesn't really have an institutional loyalty tied to the goals and objectives of the academic programs; he is more loyal to the machine or the software. In a school with a long history of computer utilization, there had been no technician in charge of the computer facility for a decade. Yet in a school not too far away, an officer indicated that his institution had "made the same mistake twice in a row by hiring a technician to manage the computer facility."
The technician-manager represents a highly personalized management style, one in which goodwill, friendship, or personal interest is the key to effective service. It can hardly represent an arrangement for the successful development and implementation of computerized, bibliographic systems.
In the third and current organization and management phase of academic computer facilities, the professional manager is in charge. Schools are now beginning to see the need to develop formal charters for their computing centers, quasi-legal instruments which will lay out their specific responsibilities as service agencies. A professionally managed service agency eliminates one of the most irritating elements in the allocation of computer resources: personal judgment by the faculty or technician-manager as to the worth of a project, which was so prevalent during earlier management stages. At the time of the interviews, very few institutions actually had such charters, but their need was being recognized. It is now universally accepted that the computer center can no longer be the plaything of the faculty nor the expensive toy of the technician.
ORGANIZATION AND MANAGEMENT: THE ADMINISTRATIVE ENVIRONMENT
Because of its historical development, the administrative computing facility was usually first run by someone with an accounting or financial background. (Academic computing persons occasionally put disparaging labels on such people as "EDP-types" or characterized them as having a "punched card mentality.") The nature of the workload virtually meant that the administrative shop would be set up mainly for batch processing and any database services provided for other users would involve printed lists. Such facilities were found satisfactory by a number of libraries even for applications such as circulation, which produced gigantic lists--probably because it represented a vast improvement over an antiquated, poorly designed, or overloaded manual system.
However, there was at least one major technical consideration which had direct political and financial implications for the library that turned to the administrative computing facility for its computer support. This was the library's need to support and manipulate a database with nearly every data element of variable length--a requirement that was practically non-existent in administrative computing. Some facilities were unable or unwilling to meet this requirement.
The move from tape-oriented systems to mixed disc and tape systems on third-generation equipment necessitated an upgrading of programming staff, and brought into the administrative shop the same clearcut distinction between system programmers and application programmers that had emerged earlier in the academic shop. This change in turn demanded appointment of more knowledgeable facility managers, many of whom were drawn from business and industry rather than the ranks of in-house accounting staff.
This transitional period was characterized by two enormously challenging parallel efforts: the conversion of existing programs to run on third-generation equipment and the development of new applications. To an extent these responsibilities were competitive, and from this viewpoint it was certainly not a propitious time to embark upon anything as complex as bibliographic data processing. Yet numerous workable systems emerged for circulation, book catalogs, ordering and accounting systems, and serials lists.
These were not accomplished without anguish as the library did not control the machine resources and often did not control the human resources--the facility manager tended to make his priority decisions to please his boss who was certainly not the librarian. Besides, no application could really take precedence over payroll or accounting in the administrative shop. To the librarian it was more like borrowing another person's car than renting or owning a car: when the resource was urgently needed someone else had first call.
ORGANIZATION AND MANAGEMENT: THE LIBRARY AUTOMATION ENDEAVOR
A detailed study of this subject is not within the scope of this investigation. However, it will be useful to note that the organization and management of library automation activities demonstrate development phases that closely parallel those in the computing environment:
1. A stage in which the user himself (cf. accountant or faculty member) undertakes to perform the activity. In this stage individual librarians learned programming, did their own design work, wrote, debugged, and ran programs themselves. (This was possible in the "open shop" environment prevalent in many early computer facilities.)
2. A stage in which the technician--in this case a librarian with appropriate public service expertise (for circulation applications) or technical processing knowledge (for acquisitions, cataloging, or serials)--took charge of an organized development effort, hired his own programmers and systems analysts, and negotiated directly with the computer facility.(c)
3. A stage in which the professional system development manager is hired to oversee the total effort. Such a person is sometimes drawn from business or industry, is a seasoned project manager, and has broad knowledge of computers, especially in the area of costs. Such an appointment is more common in the large library, the consortium, or network.
HUMAN PROBLEMS ASSOCIATED WITH RAPID CHANGE IN INSTITUTIONS
Some institutions, particularly in their administrative functions, became embroiled in a seemingly endless round of internal psychosocial problems that did not make the environment conducive to problem solving. The move to computerizing manually oriented functions, whether in the library or other parts of an institution, was found to be extremely threatening to established departmental structures. It was consistently reported that the political and emotional aspects of system conversion, both in the library and elsewhere, were much more aggravating than the technical aspects. The problem simply showed up first outside the library because applications of computers occurred there earlier. Departments were sometimes unwilling to give up data for computer manipulation for fear that computerization would take jobs away. This phenomenon is not unknown in librarianship, where some professionals take an extremely proprietary attitude toward bibliographic data. Now pressures from governments, legislatures, and the academic community at large are gradually establishing the concept that some categories of data are corporate, and do not belong to a specific individual or department, or even to an institution, but should be shared through networking or other mechanisms. But the rapidity of microsocial change and its upsetting emotional consequences caught some library leaders unawares. A considerable reeducational process for both management and labor is required to smooth the transition to the new view.
MOTIVATION PROBLEMS
It is difficult to elicit sound comment concerning motivation (or lack thereof) as a deterrent to progress in library automation. It is an emotional subject and neither the librarians nor the programmers come out "clean."
The prima donna computer programmer, much in evidence in the early days of computer center development, is very much on the wane these days. Like the spoiled child, the prima donna programmer could only exist where personal interests were permitted to take precedence over social goals--or perhaps where institutional goals for the computer facility had not been clearly articulated or had not yet come into focus. Some prima donnas, partly out of ignorance, partly through a stereotyped image of library activities, were inclined to disdainfully dismiss library applications as "trivial," and demand "really challenging" assignments.
But the librarians had their prima donnas, too. Some had learned enough programming to be a little dangerous and they then felt like peers who could tell the computer center not only what to do but how to do it. At first, few members of the library staff were willing to learn how to articulate their specifications and requirements to the management of a computer facility. Most librarians expected some kind of miraculous magic, akin to a wave of the hand, to bring a computer system to reality. Very few understood the heuristic nature of development.
So there were barriers of status, depth of knowledge, and language--any one of which would have sufficed to kill the development of the good motivation essential to breaking new ground. In the wrong combination they could present an overwhelming conspiracy, for their mutual interaction could only produce polarization and intransigence.
THE LIBRARY AND THE COMPUTER FACILITY
THE ROLE OF SIMILARITIES AND DIFFERENCES
For a long time the library has been the "heart of the university." Until the advent of the computer, little could challenge the supremacy of the library as the principal resource of an educational institution. Even the faculty could be put into second place, since it was difficult to attract high quality faculty without good library resources, and the faculty were to a greater degree transient, for the library was considered "permanent," an investment for all time. The computer represents a new and challenging force in the arena where shrinking resources are allocated among competing academic users. Both the library and the computer facility have experienced exceedingly rapid growth in the recent past, concurrent with an expanded demand for services that can easily outstrip available resources. At some of the larger academic institutions, the staff of the computer center may be half the size of the library's staff, or larger.
Important differences between the two services have recently come into focus. First, most of the services and benefits of the library are intangible. Because of this it has always been difficult to measure the cost benefit of the library as an institution, and it is well known that counts of the number of people entering the door or the number of circulations are far from true measures of the library's functional success. The computer, on the other hand, is a relentless accounting engine; computer facilities can produce endless statistics on the number of jobs run, lines printed, terminal hours provided to users, turnaround time, cards punched, etc. The computer's output is extremely tangible and can be more directly and easily related to academic achievement than can library use.
A second major difference lies in apparently different financial roles within the institution. In most organizations, the library is run as an overhead expense, without any attempt to charge back to users or departments proportional costs of utilization. Like air, the library resource is there for anyone to use as much or as little as he pleases; the library gets a "free ride," but the computer center is expected to pay its own way. This dichotomy is often explicitly designated as the "library--bookstore" duo model. Furthermore, since the library does not generate much in the way of research grants and contracts, it is looked upon as a consumer rather than a producer of financial resources. In fact, those who support computing in preference to books point to the fact that overhead income generated by computer-related research grants and contracts is shared with the library which may have done little to contribute toward the acquisition of such income! In some institutions the situation has become critical indeed because of the recent substantial reductions in federal support. Much political infighting has been necessary to maintain current levels of computer activity, and not all such efforts have been successful. Some institutions have been forced to cut back on computing power, merge facilities, or combine resources with other institutions.
Several years ago when the National Science Foundation imposed an expenditure ceiling on grants, associated overhead income was correspondingly reduced. One computer center director was reported to have suggested that the effect of this overhead cut could be nullified by a simple, internal reallocation of funds, say by taking the needed amount from the budget of another agency on campus of less significance to researchers and scientists, such as the library. This attitude is clear evidence that the library has lost its sacred cow status as a "good thing" on the campus. It too must justify itself.
Close examination of the library and the computer facility gives clear evidence that both deal with the same commodity: information. Within the recent past several computer facilities have changed their designations to "information processing" facilities or centers. Several institutions, notably the University of Pittsburgh and Columbia University have coalesced the library and the computer center organizationally or have both units reporting to a vice-president for information services. The recognition and furtherance of this natural linkage may do much to reduce the potentially destructive competition that can characterize the relationship between the two units.
There are remarkable growth parallels between the two facilities--the library acquiring and processing more and more books in response to expanded publication patterns, more users, and the growth of new disciplines and interdisciplinary research, while the computation facility moves rapidly from one generation of software and hardware to the next. The expansion of both organizations produces seemingly equal capital-intensive and labor-intensive pressures: library processing staff doubles and triples, while the newly acquired books demand more in the way of housing, whether of the traditional library type or warehouse space; the computer center moves toward more sophisticated hardware, especially terminals and communications, which need to be supported by greater numbers of still more highly qualified systems programmers, communication experts, and user services staff. Both services have a marketing problem, but the computation facility being relatively more dynamic and more interactive (because of terminal services), can be more sensitive and responsive, financially and technically, to its clientele than can the library. Only now with the emphasis upon computerized bibliographic networking has the library as an institution begun to approach the marketing strategies and the effective user feedback already well developed in computation facilities.
SERVICE CAPACITY, RESOURCE UTILIZATION, AND SHARING
Differences both in service capacity and resource utilization represent a key political issue affecting the future of both libraries and computer facilities. In major universities, the budget for the computer facility is now not far from the library budget in size, and in a few institutions it exceeds the library budget. With the diminution of external grants and contracts, the two organizations compete for the same hard dollars. This economic competition can either drive the two facilities apart, dividing the campus, or cause them to coalesce--as has been the case at Columbia and Pittsburgh.
Despite its high operating costs, from the viewpoint of resource utilization, the well-managed computer facility can almost always point to an excellent record.(d) No matter how well managed, the research library can never make this claim in the context of its current materials and processing expenditures, much of which by definition is aimed at filling future needs. The library and its patrons cannot "use" all the resources at their command; the library could not even service all the patrons should they demand the use of "all" the resources. In contrast, the computer facility (particularly large online systems with interactive capabilities) can be very efficiently utilized even when demand is heavy. Thus, to the "objective" eye, it would appear that in the computer facility both the institution and the individual patron get more value for their dollar than they do in the library, which in comparison resembles a bottomless financial pit. One may counter that apples and oranges are being compared, but the institution that pays the bills nevertheless makes the comparison.
FLEXIBILITY, INFLEXIBILITY, AND THE FUTURE
Besides better resource utilization, the computer facility offers the patron far greater flexibility of resource use than can the library. There is no way a large collection of books on the Celtic language or the military history of the Austro-Hungarian Empire can help a professor of structural engineering, a student of marine biology, or a researcher in modern urban problems. Even the books these people actually need and use cannot easily assist others, as relevant data in them is not indexed or readily available for computer manipulation.
The point is that, unlike the library, the computer is a highly elastic universal tool, one that each user can temporarily shape to his own need, replicate the shape later, or if he wishes change the shape at will. The traditional library has no such flexibility; its main bibliographic retrieval device--the card catalog--is especially noted for its high maintenance cost, its limited ability to respond to complex queries, and a general fixity of organization and structure that is ever at variance with changing patron expectations and interests. (If computers can be flexible, why can't the library?)
There is much in the library that is not used because it is inaccessible--locked up in an inflexible retrieval tool or unavailable because the state-of-the-art (both in bibliography and computer science) or staffing does not yet permit far deeper access via "librarian-negotiators" and patrons at terminals interacting with large and deeply indexed databases. As long as major portions of the library budget and staff are devoted to housekeeping and internal technical processing, the library will look less good, less "cost-beneficial" to the academic community than does the computer facility. But there is growing recognition that both institutions deal with information processing which covers a wide spectrum of time. True, the storage formats differ, but this may be a temporary phenomenon. As progress is made on improved, less expensive conversion of data from analog to digital form and vice versa, the day may arrive when the library and the computer facility are indistinguishable.
WILL THE LIBRARY BECOME AN INFORMATION UTILITY?
Computer utilities are an important developing trend and it is sometimes suggested that library services could be delivered within the utility model. Utilities and libraries as they exist today have very different characteristics.
A utility can be defined as a system providing a relatively undifferentiated but tangible service to a mass consumer group, with use charges set in accordance with a pricing structure designed for load leveling (i.e., optimization of resource utilization). Typically, a utility both wholesales and retails its services. Within this definition, a conventional library cannot be construed as a utility: its services are generally intangible and very highly differentiated--indeed, chiefly unique, for rarely is one book "just as good as another"; its clientele is not the general public but a highly select group which itself contains highly unequal concentrations of users; almost no libraries impose user charges in the interest of cost recovery; and, practically speaking, there is only one United States wholesaler (of bibliographic data)--the Library of Congress.
This situation is changing in several respects. First, the establishment of practical, computerized bibliographic networks has introduced among participating institutions cost sharing schemes closely resembling the load leveling or rate averaging algorithms prevalent among utilities.(e) These new ideas have been readily accepted by libraries and could even become the basis for balancing more equitably the costs of interlibrary loan traffic.
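Footnote (e) gives a concrete instance of such a scheme. The brief sketch below is a hypothetical illustration, with figures invented for the example, of the arithmetic of rate averaging: the network's total telecommunication cost is pooled and prorated equally into each member's fee, producing a distance-independent tariff.

# Hypothetical illustration of rate averaging; all figures are invented.
# The network pools its total telecommunication cost and prorates it equally
# into every member's fee, so a distant member pays the same as a nearby one.

def averaged_fee(base_fee: float, total_telecom_cost: float, member_count: int) -> float:
    """One member's fee: the base membership fee plus an equal share of pooled line charges."""
    return base_fee + total_telecom_cost / member_count

# Example: $120,000 of pooled line charges spread over 60 members adds $2,000
# to each fee, regardless of any member's distance from the computing center.
print(averaged_fee(base_fee=5000.0, total_telecom_cost=120000.0, member_count=60))

The same pooling logic could, as suggested above, be extended to balance more equitably the costs of interlibrary loan traffic among network members.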
Second, specialized "information centers" have evolved in certain fields, partially as a consequence of lack of responsiveness (or slow turnaround) by conventional library services, and "for profit" commercial services have been set up. Examples of the latter include the European S'il Vous Plait and its American counterpart, F.I.N.D. (Often such commercial services do not hire librarians as they are considered too tradition bound.)
A third force which is rather inchoate at the moment may soon take on a recognizable shape: facilities management. Under such a scheme, the complete management responsibility for all or part of a function is contracted to an outside vendor. For instance, it is conceivable that some libraries in the near future may have no in-house staff for technical processing. Services would be purchased totally from a vendor or obtained from his resident staff, much as computer centers buy specialized expertise through the "resident s.e." (systems engineer). The gradual buildup of computerized bibliographic services offers an excellent opportunity for commercial ventures into turnkey bibliographic operations for libraries. This would bring the libraries one step closer to the utility concept, as they buy a complete package from a wholesaler who probably services many customers.
The traditional library service concepts we know today may undergo drastic changes in financing and in methods of delivery. Beyond the commercialized or contractual arrangement for technical processing, which is only one component of the total information flow, lie unknown territory and little explored concepts: use charges for library services (the bookstore model), the "for profit" library, the complete information delivery system integrated with computers, communication satellites, and cable TV.
If the computer-based library is to become an information utility, a major accommodation will be needed in the financing arrangements, perhaps in the form of user charges--for no utility can survive without regulated demand. An unlimited, uncontrolled demand for any product or service is untenable, for without regulation (i.e., pricing), demand rapidly outruns supply. In the traditional library, where theoretically every user has the "right" to unlimited demand, this never happens for several reasons: (1) not all potential patrons elect to use the resource; (2) the users must usually go to the library to access the bibliographic apparatus and obtain the materials held by the library; (3) every item in a library collection does not have an equal probability of use; and (4) there is a finite rate at which human beings can "use the resource," i.e., people can read just so fast. None of these self-limiting factors applies to say, electric power, radio and TV broadcasting, telecommunication services, or similar utilities.
The library picture could become quite different if these limitations were removed or mitigated. Suppose the patron could access the bibliographic apparatus through his home computer terminal attached to his TV in the "wired city." Further suppose that he could receive selected, short items (where time of delivery is important to him) directly at his TV set, or longer items having less time value as microforms or hard copy delivered by mail or private delivery systems. Given such possibilities, the collecting policies of individual "libraries" (if they continue to be called by that name) might well change drastically so that nationally, collections might become much more standardized or "homogenized" increasing the likelihood that individual holdings will have more nearly equal use probabilities. This would imply the need for one or more national and/or regional centers for servicing the less used materials, along with appropriate delivery systems and pricing schedules.
CONCLUSION
Work on library automation has proceeded during a highly developmental period in the history of computing. In this sense, librarianship has suffered no worse than any other computer application, nearly all of which have gone through traumas of design, installation, redesign, reprogramming, etc. The main distinction is that in many of these other applications --government, military, industrial, or commercial--there have been far greater resources available to the task and vastly greater experience with the development process. Despite the obstacles, progress in computerized bibliographic work has been far more significant and has achieved far more than many librarians--especially those unaccustomed to the development cycle--can appreciate. The snowballing growth of practical consortia and networks along with the successful installation and operation of several online bibliographic systems has already changed the face of librarianship in a very short time. Like the breaking of the sonic barrier, once the initial difficulty is overcome, further progress is easier.
The computer has successfully achieved what librarians have until recently only paid lip service to: cooperation and wide sharing of an expensive and large resource. Though the linear growth model in libraries has been dead for some time, the recognition of this fact has not yet penetrated the entire profession. If libraries are to survive as viable institutions throughout this century and into the next, their leaders must solve the financial, space, and human communication problems inherent in growth. Local autonomy, local self-sufficiency, and the "freedom" to avoid, evade, and even undermine national standards now show up as expensive and dangerous luxuries--potentially self-destructive. Only through the computer will true library cooperation be possible. Only the development of regional and national bibliographic networks, with the assistance of substantial federal funding, can really "save" the library. The computer is actually the library's life insurance and blood plasma. A failure to respond to the challenge of the computer could be fatal, for it is increasingly apparent that patrons growing up in the computer era will not patiently interact with library systems geared to nineteenth-century methods. Nothing in the educational system exists to force people to use a given resource; people use the resources that are effective, responsive, and economical. If the computer is a better performer than the library, patrons will go to the computer. This will particularly be the case as computer services become broader in coverage, simpler to use, and unit prices continue to decline. Despite the serious and irritating problems associated with learning to use the computer, librarians must continue aggressively to support computer applications; indeed, library leaders can impart no more important message than this to their community leaders.
ACKNOWLEDGMENTS
I wish to thank the following persons for their support: Dr. E. Howard Brooks, vice-provost for academic affairs in 1971, and David C. Weber, director of libraries, both of Stanford University, for granting the leave of absence which enabled me to undertake this project.
I acknowledge with thanks the contributions of the following persons who reviewed early drafts of the paper, in many cases making valuable suggestions and in other instances helping me ward off errors: Mrs. Henriette D. Avram, head, MARC Development Office, Library of Congress; Hank Epstein, director of Project BALLOTS and associate director for library and administrative computing, Stanford Center for Information Processing; Frederick G. Kilgour, executive director, Ohio College Library Center; Peter Simmons, professor of library science, University of British Columbia; Carl M. Spaulding, program officer, Council on Library Resources, Inc.; David C. Weber, director of libraries, Stanford University.
FOOTNOTES
a. Implementation of the Library of Congress' Shared Cataloging Program under Title II of the Higher Education Act of 1965 was soon to alter this situation dramatically.
b. The painful trauma libraries and librarians experienced in getting into computers is too well documented to summarize here. Perhaps the best summary has been done by Stuart-Stubbs.(3)
c. The technical person need not be a librarian. Northwestern University represents a significant instance where a faculty member in Computer Sciences and Electrical Engineering undertook the development effort.
d. In fact, if a computer resource is not much used and isn't "carrying its weight," it can be disposed of, by sale if purchased, or by cancellation if leased.
e. An example of rate averaging is the practice of the Ohio College Library Center to lump total telecommunication cost and prorate it into the membership fee, in effect creating a distance-independent tariff. (This arrangement does not hold outside of Ohio.)
REFERENCES
1. Barbara Evans Markuson, ed., Libraries and Automation; Conference on Libraries and Automation, Warrenton, Va., 1963 (Washington, D.C.: Library of Congress, 1964).
2. U.S. Library of Congress, Automation and the Library of Congress; a survey sponsored by the Council on Library Resources, Inc. (Washington, D.C.: Library of Congress, 1963).
3. Basil Stuart-Stubbs, "Trial by Computer: A Punched Card Parable for Library Administrators," Library Journal 92 (Dec. 15, 1967): 4471-74.
4. Ibid.
5. Dan Mather, "Data Processing in an Academic Library: Some Conclusions and Observations," PNLA Quarterly 32 (July 1968): 4-21.
6. Libraries and Information Technology: A National Systems Challenge; a Report to the Council on Library Resources, Inc., by the Information Systems Panel, Computer Science and Engineering Board (Washington, D.C.: National Academy of Sciences, 1972).
7. Anthony Oettinger, Run, Computer, Run (Cambridge, Mass.: Harvard University Press, 1969), 196. These same comments were cited in Allen B. Veaner's earlier article, "Major Decision Points in Library Automation," College & Research Libraries 31: 299-312.
Copyright American Library Association Mar 1993