responses to client requests, a NOS server requires a powerful CPU to execute its tasks or programs. Single-processor systems with one CPU can meet the needs of most servers if the CPU has the necessary speed. To achieve higher execution speeds, some systems are equipped with more than one processor. Such systems are called multiprocessor systems. Multiprocessor systems can execute multiple tasks in parallel by assigning each task to a different processor, which greatly increases the aggregate amount of work the server can perform in a given time.

Because servers function as central repositories of resources that are vital to the operation of client systems, they must be efficient and robust. The term robust indicates that a server system can function effectively under heavy loads and can survive the failure of one or more processes or components without experiencing a general system failure. This objective is met by building redundancy into server systems. Redundancy is the inclusion of additional hardware components that can take over if other components fail. Redundancy is a feature of fault-tolerant systems, which are designed to survive failures and can be repaired without interruption while the systems are up and running. Because a NOS depends on the continuous operation of its server, the extra hardware components justify the additional expense.

Server applications and functions include web services using Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), and Domain Name System (DNS). Standard e-mail protocols supported by network servers include Simple Mail Transfer Protocol (SMTP), Post Office Protocol 3 (POP3), and Internet Message Access Protocol (IMAP). File sharing protocols include Sun Microsystems Network File System (NFS) and Microsoft Server Message Block (SMB). Network servers frequently provide print services as well. A server may also provide Dynamic Host Configuration Protocol (DHCP), which automatically allocates IP addresses to client workstations. In addition to running services for the clients on the network, a server can be configured to act as a basic firewall for the network. This is accomplished using a proxy service or Network Address Translation (NAT), both of which hide internal private network addresses from the Internet.

One server running a NOS may work well when serving only a handful of clients, but most organizations must deploy several servers to achieve acceptable performance. A typical design separates services so that one server is responsible for e-mail, another for file sharing, and another for FTP. Concentrating network resources such as files, printers, and applications on servers also makes the data they generate easier to back up and maintain. Rather than distributing these resources across individual machines, they can be located on specialized, dedicated servers for easy access and backup.
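The listen-and-respond role described above can be illustrated with a short sketch. The example below is not part of the curriculum; it uses Python's standard library to start a minimal HTTP file service, and the port number shown is an arbitrary assumption chosen for illustration.

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

PORT = 8080  # assumption: any free port would do; production web servers use 80/443

if __name__ == "__main__":
    # Bind to all interfaces and hand each incoming connection to the
    # request handler, which answers GET requests with files from the
    # current directory (or a 404 error if the file does not exist).
    server = HTTPServer(("", PORT), SimpleHTTPRequestHandler)
    print(f"Serving HTTP requests on port {PORT} ...")
    server.serve_forever()
```

A client on the network could then fetch a file by pointing a browser at the server's address and port. The FTP, DNS, and mail services listed above follow the same pattern of listening on a well-known port and responding to client requests.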
6.1.3 Client-server relationship

The client-server computing model distributes processing over multiple computers. Distributed processing enables access to remote systems for the purpose of sharing information and network resources. In a client-server environment, the client and server share or distribute processing responsibilities. Most network operating systems are designed around the client-server model to provide network services to users.

A computer on a network can be referred to as a host, workstation, client, or server. A computer running TCP/IP, whether it is a workstation or a server, is considered a host computer.

An example of a client-server relationship is a File Transfer Protocol (FTP) session. FTP is a universal method of transferring a file from one computer to another. For the client to transfer a file to or from the server, the server must be running the FTP daemon or service. In this case, the client requests that the file be transferred, and the server provides the services necessary to send or receive the file.

The Internet is also a good example of a distributed-processing client-server relationship. The client, or front end, typically handles user presentation functions such as screen formatting, input forms, and data editing. This is done with a browser, such as Netscape or Internet Explorer. Web browsers send requests to web servers. When the browser requests data from the server, the server responds, the browser receives the reply, and it then displays the data that was returned. The server, or back end, handles the client's requests for web pages and provides HTTP or WWW services.

Another example of a client-server relationship is a database server and a data entry or query client on a LAN. The client, or front end, might be running an application written in C or Java, and the server, or back end, could be running Oracle or other database management software. In this case, the client handles formatting and presentation tasks for the user, and the server provides database storage and data retrieval services. In a typical file server environment, the client might have to retrieve large portions of the database files to process them locally, which can cause excess network traffic. With the client-server model, the client presents a request to the server, and the server database engine might process 100,000 records and pass only a few back to the client to satisfy the request. Servers are typically much more powerful than client computers and are better suited to processing large amounts of data. With client-server computing, the large database is stored on the server, and that is where the processing takes place. The client only has to formulate the query, and a relatively small amount of data or results is passed across the network. This satisfies the client query and uses less network bandwidth.

The graphic shows an example of client-server computing. Note that the workstation and server would normally be connected to the LAN by a hub or switch.

The distribution of functions in client-server networks brings substantial advantages, but it also incurs some costs. Although the aggregation of resources on server systems brings greater security, simpler access, and coordinated control, the server introduces a single point of failure into the network.
Without an operational server, the network cannot function at all. Additionally, servers require trained, expert staff to administer and maintain them, which increases the expense of running the network. Server systems also require additional hardware and specialized software that add substantially to the cost.
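To make the FTP example above concrete, the following is a minimal sketch of the client side of the exchange, written with Python's standard ftplib module. The host name, account, and file name are hypothetical placeholders, and the server must be running the FTP daemon or service for the transfer to succeed.

```python
from ftplib import FTP

def download_file(host: str = "ftp.example.com",   # hypothetical server address
                  user: str = "student",            # hypothetical account
                  password: str = "secret",
                  filename: str = "report.txt") -> None:
    """Connect to an FTP server, request one file, and save it locally."""
    with FTP(host) as ftp:                 # client opens the control connection
        ftp.login(user=user, passwd=password)
        with open(filename, "wb") as f:
            # The client issues the request; the server's FTP service does the
            # work of reading the file and sending it back over the data connection.
            ftp.retrbinary(f"RETR {filename}", f.write)

if __name__ == "__main__":
    download_file()
```

The same division of labor applies to the web and database examples: the client formulates the request, while the server performs the processing and returns only the result across the network.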
6.1.4 Introduction to NOS

A computer operating system (OS) is the software foundation on which computer applications and services run on a workstation. Similarly, a network operating system (NOS) enables communication between multiple devices and the sharing of resources across a network. A NOS operates on UNIX, Microsoft Windows NT, or Windows 2000 network servers.

Common functions of an OS on a workstation include controlling the computer hardware, executing programs, and providing a user interface. The OS performs these functions for a single user. Multiple users can share the machine, but they cannot log on at the same time. In contrast, a NOS distributes functions over a number of networked computers. A NOS depends on the services of