Taking remote computing to the next level
The face of the traditional business environment is changing. Across the spectrum of organizational entities, decision makers are recognizing the need for a new way of functioning. Loosely organized workgroups are replacing the once rigid office/management paradigm. Decision-making power is distributed, with local, focused teams gaining autonomy in order to react more efficiently to changes in their specific markets.
It's an exciting time in the evolution of the workplace, but this evolution is placing a strain on businesses' existing organizational and communications infrastructures, and in many instances the architectural wisdom of the late '80s and early '90s is no longer applicable. Today's distributed, global workplace requires a reexamination of the underlying application architecture and application deployment tools that can best satisfy the needs of IS organizations seeking to distribute enterprise applications to remote users on a regional, national and global scale.
What IS organizations need is a new approach: one that leverages the best aspects of existing remote computing technologies while offering new capabilities to address the performance, management and cost obstacles encountered in deploying business-critical applications on an enterprise-wide scale.
Challenges Of Remote Application Deployment
MIS managers have three primary concerns when deploying remote applications throughout an enterprise: cost, management and performance. Their first concern, the bottom-line cost of deploying an application, includes the expense associated with hardware, software and personnel involved in the application's initial rollout as well as its ongoing maintenance. Consider an application's initial deployment to remote users. A server that will run the application must be installed at each remote site. In addition to the expense of the required servers, client workstations and notebook computers must often be upgraded with larger hard disk drives or more memory to run the program. In the event that the application requires a processor or operating system upgrade, rollout costs can grow to include completely new workstations. Finally, the MIS team must either reside at, or travel to, each site to perform any needed hardware installation and upgrades as well as to load and configure the application's server and client components.
Their second concern, application and user management, involves client-side maintenance, rollout support, version control, scalability, security and flexibility issues on both the server and the desktop. While hardware and software costs can most often be readily identified, those associated with ongoing MIS management of the application deployment are less clear. On the client side alone, Gartner Group, an industry analyst firm, estimates that the total cost of ongoing PC management can surpass $6,000 per year per PC. This figure includes the cost of hardware upgrades required to support increasingly complex applications as well as the requisite MIS staffing expenses associated with desktop and application maintenance.
Finally, their third concern, performance, centers on the speed at which a user can access an application. If an application is mission critical, it must have acceptable performance; otherwise users will not take full advantage of its features. Application performance is usually not a problem on the local area network (LAN). Today, connections between clients and servers average 10Mbps, and 100Mbps connections are becoming increasingly common. In contrast to local speeds, wide-area connections between networks average 56Kbps to 1.5Mbps, while the dial-up connections of most mobile users average 28.8Kbps or less.
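To put these link speeds in perspective, the back-of-the-envelope sketch below computes transfer times for a hypothetical 10 MB payload; the payload size is an assumption for illustration, not a figure from the article.

```python
# Back-of-the-envelope transfer times over the link speeds cited above.
# The 10 MB payload size is an assumed figure for illustration only.

PAYLOAD_BITS = 10 * 1_000_000 * 8  # hypothetical 10 MB payload

links_bps = {
    "LAN, 10 Mbps":       10_000_000,
    "Fast LAN, 100 Mbps": 100_000_000,
    "WAN, 1.5 Mbps":      1_500_000,
    "WAN, 56 Kbps":       56_000,
    "Dial-up, 28.8 Kbps": 28_800,
}

lan_bps = links_bps["LAN, 10 Mbps"]
for name, bps in links_bps.items():
    seconds = PAYLOAD_BITS / bps
    ratio = lan_bps / bps  # how many times slower than the 10 Mbps LAN
    print(f"{name:20s} {seconds:8.1f} s  ({ratio:6.1f}x the LAN transfer time)")
```

At the dial-up rate, that assumed 10 MB payload takes roughly 46 minutes, which foreshadows the remote node performance problem discussed below.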
The First Phase Of Remote Computing: Laying The Pipes
In the first half of this decade, organizations tackled the challenge of remote computing by implementing a communications infrastructure that essentially provided the "plumbing" for remote users to connect to the network, either by remote node hardware/software or by branch office routers. In other instances, IS organizations implemented remote control software, allowing an individual to remotely connect to a dedicated desktop machine back at the office to access applications and information.
Remote control is based on a 1:1 connection between two dedicated PCs, whereby the remote PC connects to the host and then mirrors its interface and input devices. Mouse clicks, keystrokes and other input are transmitted from the remote PC to the host. The remote/host communication is typically accomplished over traditional analog telephone lines, using high-speed modems. When deployed in a limited fashion, such as in a small workgroup, a remote control solution can be cost-effective. However, the cost and management issues associated with the 1:1 relationship between the remote and host PC rule out remote control as a serious enterprise-wide application deployment solution.
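That 1:1 constraint is easy to quantify. A minimal sketch, assuming invented placeholder hardware prices rather than actual 1997 figures, shows how remote control's hardware bill grows linearly with the user population:

```python
# Rough cost model for remote control's 1:1 architecture: each remote
# user requires a dedicated host PC (and modem) back at the office.
# Unit costs are invented placeholders, not actual vendor pricing.

HOST_PC_COST = 2_000  # assumed price of one dedicated host PC
MODEM_COST = 150      # assumed price of one high-speed modem

def remote_control_hardware_cost(remote_users: int) -> int:
    """1:1 model: the hardware bill scales linearly with user count."""
    return remote_users * (HOST_PC_COST + MODEM_COST)

for users in (10, 100, 1_000):
    print(f"{users:5,d} remote users -> ${remote_control_hardware_cost(users):,}")
```

At a thousand users, the dedicated-host requirement alone makes the model untenable, independent of the management burden.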
Designed to provide seamless LAN access to geographically disparate clients, remote node solutions extend the LAN to the remote PC. Network packets are passed to and from the remote PC over a WAN connection (typically a dial-up modem link), and the remote PC functions as if it were directly attached to the local LAN. Remote node's biggest advantage is its seamless operation. To the remote PC, the WAN connection looks just like a local LAN connection: applications interact with the remote link just like they do with a traditional LAN link, and the remote users can browse and connect to network resources as if they were physically attached to the network.
Unfortunately, this seamless operation is also the source of remote node's major disadvantage: sluggish application performance. Most traditional applications, like databases, are monolithic: they exist as one or more large files (executable, data or both) and must be completely downloaded to the remote PC before executing. This, in turn, overwhelms the limited bandwidth of the remote node connection, which is 100-700 times slower than a typical LAN connection. Even client/server applications tend to generate too much network traffic to perform well over a remote node link. Moreover, loading standalone versions of the necessary executable programs onto each remote PC introduces further management and version control problems.
The Effect Of Application Architecture On Remote Computing
Now that the communications pipes have been laid for the enterprise remote computing infrastructure, IS organizations are realizing that the challenges of remote computing extend far beyond the mere connection of users to the network. These challenges are partly architectural: many of the existing remote control and remote node solutions are inherently PC-centric and lack the kind of scalability and manageability required for enterprise-wide application deployment. Other problems can be linked to the applications being targeted for these solutions. Most traditional applications are poorly suited to the limited bandwidth and disconnected nature of a remote computing environment.
The problems are also inherent in the two-tier client/server architecture itself, which emphasizes client-side computational power. Two-tier client/server divides applications into two parts. The presentation services and the business logic functions execute at the client PC, while data access functions are handled by a database server on the network. In today's widely distributed enterprises, the client/server model breaks down as the client moves farther away from the server yet is required to perform the same tasks as a local, LAN-based machine.
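To make the two-tier split concrete, the sketch below uses an in-memory SQLite table as a stand-in for the database server; the schema, data and query are invented for illustration. Note where the business logic runs:

```python
import sqlite3

# Toy illustration of the two-tier split: the database server (stood in
# for here by SQLite) only serves rows; presentation and business logic
# run on the client. Schema and data are invented for illustration.

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (region TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("east", 120.0), ("west", 75.5), ("east", 310.25)])

# Two-tier: every raw row crosses the client/server link...
rows = db.execute("SELECT region, amount FROM orders").fetchall()

# ...and the business logic (here, totaling sales per region) executes
# on the client PC. Over a 28.8 Kbps dial-up link, shipping raw rows
# like this is exactly where the model breaks down at a distance.
totals = {}
for region, amount in rows:
    totals[region] = totals.get(region, 0.0) + amount

print(totals)  # {'east': 430.25, 'west': 75.5}
```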
The New Face Of Remote Computing: Thin-Client/Server Computing
The fundamental problem with traditional application architectures and deployment tools is that they retain the basic model established by two-tier client/server computing. In the case of remote control technology, a workstation "surrogate" runs business logic for the remote user. Consequently, these traditional approaches hamper remote deployment of enterprise applications on cost, management and performance grounds. Two-tier client/server architectures provide unacceptable performance over dial-up or WAN connections because they are designed and optimized to run over multi-Mbps local links, not remote links. Additionally, they are difficult to manage, since application upgrades require software, and potentially hardware, upgrades to all client PCs, inviting version control problems.
To establish a more effective remote application deployment model, emerging thin-client/server technologies challenge the assumption made by traditional client/server models that client workstations must execute application business logic. Thin-client/server computing is an evolutionary, not a revolutionary, solution for remote computing that reduces costs, eases management and improves application performance for organizations seeking to implement enterprise-wide remote computing. Rather than requiring costly replacement of a prior architecture, thin-client/server technology takes an existing infrastructure (hardware, operating systems, software, networks, pipes) to a higher level of performance and efficiency.
With a thin-client/server architecture, the user interface is processed independently of the application's business logic. As a result, only the user interface executes on the client, while all application business logic and data reside on the server. Because only keystrokes, mouse clicks and screen updates travel across the network, users require a small fraction of the normal bandwidth. The end result is that any client, fat or thin, is transformed into an ultra-thin machine, and users receive access to virtually any business-critical application across any type of network. This capability provides true location independence and increases the flexibility and cost-effectiveness of remote computing.
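A toy event loop can make this division of labor concrete. In the sketch below, the message shapes and function names are invented for illustration; they do not describe any real wire protocol such as Citrix's own.

```python
# Toy sketch of the thin-client division of labor: the client forwards
# input events and paints screen updates; the application logic lives
# entirely on the server. Message names are invented for illustration.

def server_handle_event(app_state: dict, event: dict) -> dict:
    """All business logic runs here, on the server."""
    if event["type"] == "keystroke":
        app_state["buffer"] += event["key"]
    # The reply is just a screen update, not application data.
    return {"type": "screen_update", "text": app_state["buffer"]}

def thin_client_session(events):
    app_state = {"buffer": ""}      # lives on the server only
    for event in events:            # keystrokes/mouse clicks go up...
        update = server_handle_event(app_state, event)
        print("client paints:", update["text"])  # ...screen updates come down

thin_client_session([
    {"type": "keystroke", "key": "h"},
    {"type": "keystroke", "key": "i"},
])
```

The point of the sketch is what never crosses the link: the application code and its data stay on the server, while only input events and the resulting screen updates travel the wire.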
It's important to note that a thin-client/server architecture and a network computing architecture are not the same thing. In the network computing architecture, as defined by Sun, Oracle, Netscape, IBM and Apple, an application's components are dynamically downloaded from the network into the client device for execution by the client. Under the thin-client/server computing model, 100 percent of the application executes on the server.
A Complete Offering Of Enterprise-Wide Remote Benefits
In addition to the market need for a superior remote computing solution, a key force driving adoption of the thin-client/server architecture is its ability to squarely address the three critical needs of enterprise-wide application deployment. Thin-client/server technology reduces total cost of ownership and eases management through centralized application deployment. By enabling applications to reside entirely on the server, a thin-client/server architecture gives IS managers centralized control over application deployment, maintenance, upgrades and support. This is a significant advantage over traditional architectures and deployment technologies, such as remote node, that require physical distribution of software and upgrades to every client, including remote clients.
Version control likewise becomes straightforward, encouraging frequent, minor application updates. Additionally, IS managers gain more control over who is using applications, and software licenses are easier to define and enforce. All these attributes translate into dramatic increases in productivity for IS personnel and correspondingly sharp reductions in computing costs. Added to these advantages is the ability to maximize the investment in a company's current information technology infrastructure.
Thin-client/server computing offers other important benefits, such as improved application performance and security. Although remote computing will never be quite as fast as a corporate LAN, the thin-client/server scenario reduces latency because far fewer packets are transferred between client and server. By enabling applications to run at near-LAN speeds over phone lines and WAN connections, this solution solves the speed and performance problems of remote node technology. This type of architecture also strengthens security because all of the data and applications reside on the server, which can be protected by a firewall.
As companies grow increasingly decentralized through global expansion, online commerce and telecommuting, remote access to the full range of computing capability has become an imperative. Thin-client/server architecture provides a new way to leverage the current infrastructure of existing remote computing technologies, such as remote node and branch office routers, and application architectures, such as two-tier client/server, while offering a host of additional benefits. Thin-client/server computing not only answers the demand for fast, high-performance access to mission-critical applications but also speaks to the perennial business issues of cost control and management. It is a practical, evolutionary approach that is nonetheless creating a quiet revolution in the way people connect across the enterprise.
Vicky Harris is the corporate communications manager at Citrix Systems (Fort Lauderdale, FL).
www.citrix.com
Copyright West World Publications, Inc. Summer 1997