Similar user manuals
- Server: Dell PowerEdge T105 Systems, 190 pages, 14.81 MB
- Server: Dell PowerEdge MP148, 58 pages, 3.91 MB
- Server: Dell POWEREDGE DL385, 28 pages, 0.6 MB
- Server: Dell 17154738.2-4, 173 pages
- Server: Dell PowerEdge 4220, 190 pages, 6.07 MB
- Server: Dell PowerEdge 2420, 148 pages, 4.48 MB
- Server: Dell PowerEdge C1100, 136 pages, 11.22 MB
- Server: Dell PowerEdge E07S Series, 176 pages, 9.3 MB
A good user manual
The rules oblige the seller to provide the purchaser with an operating manual for the Dell CX4 along with the item. The lack of a manual, or false information given to the customer, constitutes grounds for a complaint of non-conformity of the goods with the contract. Under the law, a customer may receive the manual in non-paper form; lately, graphic and electronic versions of manuals, as well as instructional videos, are the most common. A necessary precondition is that the manual be unambiguous and legible.
What is an instruction?
The term originates from the Latin word "instructio", which means organizing. A manual for the Dell CX4 therefore describes a process. Its purpose is to teach, to ease start-up and use of the item, and to guide the performance of certain activities. A manual is a compilation of information about an item or service; it is a set of clues.
Unfortunately, few customers take the time to read the manual for the Dell CX4. A good user manual introduces us to a number of additional functions of the purchased item and helps us avoid most defects.
What should a perfect user manual contain?
First and foremost, a user manual for the Dell CX4 should contain:
- information concerning the technical data of the Dell CX4
- the name of the manufacturer and the year of construction of the Dell CX4 item
- rules of operation, control and maintenance of the Dell CX4 item
- safety signs and mark certificates confirming compliance with the appropriate standards
Why don't we read the manuals?
Usually it results from a lack of time and overconfidence about the functions of purchased items. Unfortunately, merely connecting and starting up the Dell CX4 is not enough. The manual contains a number of hints concerning specific functions, safety rules, maintenance methods (and what supplies to use), possible defects of the Dell CX4, and methods of resolving problems. Finally, when one still cannot find the answer to a problem, the manual directs the reader to the Dell service. Lately, animated manuals and instructional videos have become popular among customers. These kinds of user manuals are effective; they help ensure that a customer works through the whole material and does not skip the complicated technical information about the Dell CX4.
Why should one read the manuals?
It is mostly in the manuals that we find details about the construction and the capabilities of the Dell CX4, its use with the respective accessories, and information about all of its functions and facilities.
After a successful purchase, one should find a moment to become familiar with every part of the manual. Manuals today are carefully prepared and translated so that they can be fully understood by their users. The manuals serve as an informational aid.
Table of contents for the manual
- Page 1
Dell/EMC CX4-Series Fibre Channel Storage Arrays With Microsoft® Windows Server® Failover Clusters Hardware Installation and Troubleshooting Guide [...]
- Page 2
Notes, Cautions, and Warnings. NOTE: A NOTE indicates important information that helps you make better use of your computer. CAUTION: A CAUTION indicates either potential damage to hardware or loss of data and tells you how to avoid the problem. WARNING: A WARNING indicates a potential for property damage, personal injury, or death. [...]
- Page 3
Contents: 1 Introduction 7; Cluster Solution 8; Cluster Hardware Requirements 8; Cluster Nodes 9; Cluster Storage 10; Supported Cluster Configurat[...]
- Page 4
Contents (continued): 3 Preparing Your Systems for Clustering 39; Cluster Configuration Overview 39; Installation Overview 41; Installing the Fibre Channel HBAs 42; Installing the Fibre Channel HBA Drivers 42; Implemen[...]
- Page 5
Contents (continued): A Troubleshooting 55; B Zoning Configuration Form 61; C Cluster Data Form 63 [...]
- Page 6
Contents [...]
- Page 7
Introduction. A Dell™ Failover Cluster combines specific hardware and software components to provide enhanced availability for applications and services that are run on the cluster. A Failover Cluster is designed to reduce the possibility of any single point of failure within the system that can cause the clustered applicati[...]
- Page 8
Cluster Solution. Your cluster implements a minimum of two nodes to a maximum of either eight nodes (for Windows Server 2003) or sixteen nodes (for Windows Server 2008) and provides the following features: • 8-Gbps and 4-Gbps Fibre Channel technology • High availability of resources to network clients • Redundant paths to [...]
- Page 9
Cluster Nodes. Table 1-1 lists the hardware requirements for the cluster nodes. NOTE: For more information about supported systems, HBAs, and operating system variants, see the Dell Cluster Configuration Support Matrix on the Dell High Availability website at www.dell.com/ha. Table 1-1. Cluster Node Requirements: Component [...]
- Page 10
Cluster Storage. Table 1-2 lists supported storage systems and the configuration requirements for the cluster nodes and stand-alone systems connected to the storage systems. Table 1-3 lists hardware requirements for the storage processor enclosures (SPE), disk array enclosures (DAE), and standby power supplies (SPS). NOTE: [...]
- Page 11
Each storage system in the cluster is centrally managed by one host system (also called a management station) running EMC Navisphere® Manager, a centralized storage management application used to configure Dell/EMC storage systems. Using a graphical user interface (GUI), you can select a specific view of your storage arrays, a[...]
- Page 12
Supported Cluster Configurations. The following sections describe the supported cluster configurations. Direct-Attached Cluster: In a direct-attached cluster, all the nodes of the cluster are directly attached to a single storage system. In this configuration, the RAID controllers (or storage processors) on the storage system are [...]
- Page 13
SAN-Attached Cluster: In a SAN-attached cluster, all nodes are attached to a single storage system or to multiple storage systems through a SAN using redundant switch fabrics. SAN-attached clusters are superior to direct-attached clusters in configuration flexibility, expandability, and performance. Figure 1-2 shows a SAN-atta[...]
- Page 14
• The Getting Started Guide provides an overview of initially setting up your system. • For more information on deploying your cluster with Windows Server 2003 operating systems, see the Dell Failover Clusters with Microsoft Windows Server 2003 Installation and Troubleshooting Guide. • For more information on deploying [...]
- Page 15
Cabling Your Cluster Hardware. NOTE: To configure Dell blade server modules in a Dell PowerEdge cluster, see the Using Dell Blade Servers in a Dell PowerEdge High Availability Cluster document located on the Dell Support website at support.dell.com. Cabling the Mouse, Keyboard, and Monitor: When installing a c[...]
- Page 16
Figure 2-1. Power Cabling Example With One Power Supply in the PowerEdge Systems. (Figure labels: redundant power supplies on one AC power strip, or on one AC PDU [not shown].) NOTE: This illustration is intended only to demonstrate the power distribution of the components. (Figure label, truncated: primary power supplies on one AC power st[...])
- Page 17
Figure 2-2. Power Cabling Example With Two Power Supplies in the PowerEdge Systems. Cabling Your Cluster for Public and Private Networks: The network adapters in the cluster nodes provide at least two network connections for each node, as described in Table 2-1. NOTE: To configure Dell blade server modules in[...]
- Page 18
Figure 2-3 shows an example of cabling in which dedicated network adapters in each node are connected to each other (for the private network) and the remaining network adapters are connected to the public network. Figure 2-3. Example of Network Cabling Connection. Cabling the Public Network: Any network adapter[...]
- Page 19
Cabling the Private Network: The private network connection to the nodes is provided by a different network adapter in each node. This network is used for intra-cluster communications. Table 2-2 describes three possible private network configurations. NOTE: Throughout this document, Gigabit Ethernet is used to re[...]
- Page 20
Cabling Storage for Your Direct-Attached Cluster: A direct-attached cluster configuration consists of redundant Fibre Channel host bus adapter (HBA) ports cabled directly to a Dell/EMC storage system. Figure 2-4 shows an example of a direct-attached, single cluster configuration with redundant HBA ports ins[...]
- Page 21
Cabling a Cluster to a Dell/EMC Storage System: Each cluster node attaches to the storage system using two fibre optic cables with duplex local connector (LC) multimode connectors that attach to the HBA ports in the cluster nodes and the storage processor (SP) ports in the Dell/EMC storage system. These connect[...]
- Page 22
Figure 2-5. Cabling a Two-Node Cluster to a CX4-120 or CX4-240 Storage System. Figure 2-6. Cabling a Two-Node Cluster to a CX4-480 Storage System. (Figure labels: cluster node 1, cluster node 2, HBA ports (2), CX4-120 or CX4-240 storage system, SP-A, SP-B, cluster no[...])
- Page 23
Figure 2-7. Cabling a Two-Node Cluster to a CX4-960 Storage System. Cabling a Multi-Node Cluster to a Dell/EMC Storage System: You can configure a cluster with more than two nodes in a direct-attached configuration using a Dell/EMC storage system, depending on the availability of front-end fibre channel ports[...]
- Page 24
2 Connect cluster node 2 to the storage system: a. Install a cable from cluster node 2 HBA port 0 to the second front-end fibre channel port on SP-A. b. Install a cable from cluster node 2 HBA port 1 to the second front-end fibre channel port on SP-B. 3 Connect cluster node 3 to the storage system: a. Install a cabl[...]
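The repeating pattern in these steps (cluster node n HBA port 0 to the nth front-end fibre channel port on SP-A, and HBA port 1 to the nth front-end port on SP-B) can be tabulated programmatically. The sketch below is not from the guide; the node count is a placeholder, and it simply restates the cabling rule described above.

```python
# Minimal sketch (not from the guide): enumerate the direct-attached cabling
# pattern described above. Node n uses HBA port 0 -> nth front-end fibre
# channel port on SP-A, and HBA port 1 -> nth front-end port on SP-B.

def direct_attach_plan(node_count):
    """Return (node, hba_port, sp_port) cabling tuples for a direct-attached cluster."""
    plan = []
    for node in range(1, node_count + 1):
        plan.append((f"cluster node {node}", "HBA port 0", f"SP-A front-end port {node}"))
        plan.append((f"cluster node {node}", "HBA port 1", f"SP-B front-end port {node}"))
    return plan

if __name__ == "__main__":
    for node, hba, sp_port in direct_attach_plan(node_count=4):  # 4 nodes is an example
        print(f"{node}: {hba} -> {sp_port}")
```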
- Page 25
Cabling Two Two-Node Clusters to a Dell/EMC Storage System: The following steps are an example of how to cable two two-node clusters. The Dell/EMC storage system needs to have at least 4 front-end fibre channel ports available on each storage processor. 1 In the first cluster, connect cluster node 1 to th[...]
- Page 26
Figure 2-8 shows an example of a two-node SAN-attached cluster. Figure 2-9 shows an example of an eight-node SAN-attached cluster. Similar cabling concepts can be applied to clusters that contain a different number of nodes. NOTE: The connections listed in this section are representative of one proven meth[...]
- Page 27
Figure 2-9. Eight-Node SAN-Attached Cluster. (Figure labels: public network, private network, cluster nodes (2-8), Fibre Channel switches, storage system.) [...]
- Page 28
Cabling a SAN-Attached Cluster to a Dell/EMC Storage System: The cluster nodes attach to the storage system using a redundant switch fabric and fibre optic cables with duplex LC multimode connectors. The switches, the HBA ports in the cluster nodes, and the SP ports in the storage system use duplex LC multimode [...]
- Page 29
Cabling a SAN-Attached Cluster to a Dell/EMC CX4-120 or CX4-240 Storage System: 1 Connect cluster node 1 to the SAN: a. Connect a cable from HBA port 0 to Fibre Channel switch 0 (sw0). b. Connect a cable from HBA port 1 to Fibre Channel switch 1 (sw1). 2 Repeat step 1 for each additional cluster node. 3 Connec[...]
- Page 30
Figure 2-10. Cabling a SAN-Attached Cluster to the Dell/EMC CX4-120 or CX4-240. Cabling a SAN-Attached Cluster to the Dell/EMC CX4-480 or CX4-960 Storage System: 1 Connect cluster node 1 to the SAN: a. Connect a cable from HBA port 0 to Fibre Channel switch 0 (sw0). b. Connect a cable from HBA port 1 to Fibre Chann[...]
- Page 31
d. Connect a cable from Fibre Channel switch 0 (sw0) to the second front-end fibre channel port on SP-B. e. Connect a cable from Fibre Channel switch 1 (sw1) to the third front-end fibre channel port on SP-A. f. Connect a cable from Fibre Channel switch 1 (sw1) to the third front-end fibre channel port [...]
- Page 32
Figure 2-12. Cabling a SAN-Attached Cluster to the Dell/EMC CX4-960. Cabling Multiple SAN-Attached Clusters to a Dell/EMC Storage System: To cable multiple clusters to the storage system, connect the cluster nodes to the appropriate Fibre Channel switches and then connect the Fibre Channel switches to the app[...]
- Page 33
Cabling Multiple SAN-Attached Clusters to the CX4-120 or CX4-240 Storage System: 1 In the first cluster, connect cluster node 1 to the SAN: a. Connect a cable from HBA port 0 to Fibre Channel switch 0 (sw0). b. Connect a cable from HBA port 1 to Fibre Channel switch 1 (sw1). 2 In the first cluster, repeat st[...]
- Page 34
c. Connect a cable from Fibre Channel switch 0 (sw0) to the second front-end fibre channel port on SP-A. d. Connect a cable from Fibre Channel switch 0 (sw0) to the second front-end fibre channel port on SP-B. e. Connect a cable from Fibre Channel switch 1 (sw1) to the third front-end fibre channel port o[...]
- Page 35
• MSCS is limited to 22 drive letters. Because drive letters A through D are reserved for local disks, a maximum of 22 drive letters (E to Z) can be used for your storage system disks. • Windows Server 2003 and 2008 support mount points, allowing greater than 22 drives per cluster. Figure 2-13 provides an[...]
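The 22-drive-letter figure quoted above follows directly from the alphabet: with A through D reserved for local disks, letters E through Z remain. A one-line check (not part of the guide) confirms the count:

```python
import string

# Drive letters A-D are reserved for local disks; E-Z remain for shared storage disks.
available = string.ascii_uppercase[4:]     # "EFGHIJKLMNOPQRSTUVWXYZ"
print(available, len(available))           # 22 letters, matching the MSCS limit above
```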
- Page 36
NOTE: While tape libraries can be connected to multiple fabrics, they do not provide path failover. Figure 2-14. Cabling a Storage System and a Tape Library. Obtaining More Information: See the storage and tape backup documentation for more information on configuring these components. Configuring Your Cluster Wi[...]
- Page 37
Figure 2-15. Cluster Configuration Using SAN-Based Backup. (Figure labels: cluster 1, cluster 2, Fibre Channel switches, storage systems, tape library.) [...]
- Page 38
Cabling Your Cluster Hardware [...]
- Page 39
Preparing Your Systems for Clustering. WARNING: Only trained service technicians are authorized to remove and access any of the components inside the system. See your safety information for complete information about safety precautions, working inside the computer, and protecting against electrostatic [...]
- Page 40
5 Configure each cluster node as a member in the same Windows Active Directory Domain. NOTE: You can configure the cluster nodes as Domain Controllers. For more information, see the “Selecting a Domain Model” section of Dell Failover Clusters with Microsoft Windows Server 2003 Installation and Trou[...]
- Page 41
12 Configure highly available applications and services on your Failover Cluster. Depending on your configuration, this may also require providing additional LUNs to the cluster or creating new cluster resource groups. Test the failover capabilities of the new resources. 13 Configure client syste[...]
- Page 42
Installing the Fibre Channel HBAs: For dual-HBA configurations, it is recommended that you install the Fibre Channel HBAs on separate peripheral component interconnect (PCI) buses. Placing the adapters on separate buses improves availability and performance. For more information about your syste[...]
- Page 43
Zoning automatically and transparently enforces access of information to the zone devices. More than one PowerEdge cluster configuration can share Dell/EMC storage system(s) in a switched fabric using Fibre Channel switch zoning and with Access Control enabled. By using Fibre Channel switches to im[...]
- Page 44
CAUTION: When you replace a Fibre Channel HBA in a PowerEdge server, reconfigure your zones to provide continuous client data access. Additionally, when you replace a switch module, reconfigure your zones to prevent data loss or corruption. CAUTION: You must configure your zones before you configure [...]
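Single-initiator zoning, which this guide references (see the index entry for page 44) and records in the Zoning Configuration Form of Appendix B, places exactly one HBA port in each zone together with the storage ports it may reach. The sketch below only illustrates that structure; the WWPN aliases, zone names, and zone set name are hypothetical placeholders, not values from the guide.

```python
# Hypothetical single-initiator zoning plan: one zone per HBA port (the initiator),
# each zone also containing the storage SP front-end ports that HBA may reach.
# All aliases and names are made-up examples for illustration only.

storage_ports = ["spa_port0", "spb_port0"]          # SP front-end port aliases

zones = {
    f"node{n}_hba{h}_zone": [f"node{n}_hba{h}"] + storage_ports
    for n in (1, 2)                                  # two cluster nodes
    for h in (0, 1)                                  # two HBA ports per node
}

zone_set = {"name": "cluster1_zoneset", "zones": sorted(zones)}

for name, members in sorted(zones.items()):
    print(f"{name}: {', '.join(members)}")
```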
- Page 45
Installing and Configuring the Shared Storage System: See "Cluster Hardware Requirements" on page 8 for a list of supported Dell/EMC storage systems. To install and configure the Dell/EMC storage system in your cluster: 1 Update the core software on your storage system and enable Access Cont[...]
- Page 46
Access Control is enabled using Navisphere Manager. After you enable Access Control and connect to the storage system from a management station, Access Control appears in the Storage System Properties window of Navisphere Manager. After you enable Access Control, the host system can only read from an[...]
- Page 47
Table 3-2. Storage Group Properties. Unique ID: a unique identifier that is automatically assigned to the storage group and cannot be changed. Storage group name: the name of the storage group. The default storage group name is formatted as Storage Group n, where n equals the existing [...]
- Page 48
Navisphere Manager: Navisphere Manager provides centralized storage management and configuration from a single management console. Using a graphical user interface (GUI), Navisphere Manager allows you to configure and manage the disks and components in one or more shared storage systems. You can access[...]
- Page 49
2 Add the following two separate lines to the agentID.txt file, with no special formatting: • First line: fully qualified hostname. For example, enter node1.domain1.com if the host name is node1 and the domain name is domain1. • Second line: IP address that you want the agent to register and use[...]
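The agentID.txt format described in step 2 is just two unformatted lines: the fully qualified hostname and the IP address the agent should register. A minimal sketch that writes such a file, using the example hostname from the guide and a placeholder IP address:

```python
# Minimal sketch: write agentID.txt in the two-line format described in step 2.
# Line 1 = fully qualified hostname, line 2 = IP address for the Navisphere Agent.
# The IP address below is a placeholder, not a value from the guide.

def write_agent_id(path, fqdn, ip_address):
    with open(path, "w", newline="\n") as f:
        f.write(f"{fqdn}\n{ip_address}\n")

write_agent_id("agentID.txt", "node1.domain1.com", "192.168.1.10")
```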
- Page 50
3 Enter the IP address of the storage management server on your storage system and then press <Enter>. NOTE: The storage management server is usually one of the SPs on your storage system. 4 In the Enterprise Storage window, click the Storage tab. 5 Right-click the icon of your storage system. 6 [...]
- Page 51
d. Repeat step b and step c to add additional hosts. e. Click Apply. 16 Click OK to exit the Storage Group Properties dialog box. Configuring the Hard Drives on the Shared Storage System(s): This section provides information for configuring the hard drives on the shared storage systems. The shared storag[...]
- Page 52
Assigning LUNs to Hosts: If you have Access Control enabled in Navisphere Manager, you must create storage groups and assign LUNs to the proper host systems. Optional Storage Features: Your Dell/EMC CX4-series storage array may be configured to provide optional features that can be used in conjunction w[...]
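With Access Control enabled, LUN masking is expressed through storage groups: a named group ties the hosts that may access the storage to the LUNs assigned to them (the actual configuration is done in Navisphere Manager). The structure below is only an illustrative sketch; the group name follows the default "Storage Group n" format from Table 3-2, while the host names and LUN numbers are placeholders.

```python
# Illustrative sketch of a storage group as described in this chapter:
# the hosts listed may access the LUNs assigned to the group.
# Host names and LUN numbers are example values only.

storage_group = {
    "name": "Storage Group 1",      # default naming format is "Storage Group n"
    "hosts": ["node1", "node2"],    # cluster nodes connected to this group
    "luns": [20, 21, 22],           # LUNs assigned to the group
}

for lun in storage_group["luns"]:
    print(f"LUN {lun} is accessible to: {', '.join(storage_group['hosts'])}")
```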
- Page 53
Updating a Dell/EMC Storage System for Clustering: If you are updating an existing Dell/EMC storage system to meet the cluster requirements for the shared storage subsystem, you may need to install additional Fibre Channel disk drives in the shared storage system. The size and number of drives you ad[...]
- Page 54
Preparing Your Systems for Clustering [...]
- Page 55
Troubleshooting. This appendix provides troubleshooting information for your cluster configuration. Table A-1 describes general cluster problems you may encounter and the probable causes and solutions for each problem. Table A-1. General Cluster Troubleshooting (columns: Problem, Probable Cause, Corrective Action). Problem: The nodes cannot access[...]
- Page 56
Problem: One of the nodes takes a long time to join the cluster, or one of the nodes fails to join the cluster. Probable cause: The node-to-node network has failed due to a cabling or hardware failure. Corrective action: Check the network cabling. Ensure that the node-to-node interconnection and the public network are connected to the correct NICs. One or more nodes m[...]
- Page 57
Problem: Attempts to connect to a cluster using Cluster Administrator fail. Probable cause: The Cluster Service has not been started; a cluster has not been formed on the system; the system has just been booted and services are still starting. Corrective action: Verify that the Cluster Service is running and that a cluster has been formed. Use the Event Viewer and look f[...]
- Page 58
Problem: You are prompted to configure one network instead of two during MSCS installation. Probable cause: The TCP/IP configuration is incorrect. Corrective action: The node-to-node network and public network must be assigned static IP addresses on different subnets. For more information about assigning the network IPs, see "Assigning Static IP Addresses to Clu[...]
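The corrective action above requires the node-to-node (private) network and the public network to use static IP addresses on different subnets, and that condition can be checked mechanically. A small sketch using Python's standard ipaddress module follows; the addresses shown are placeholders, not recommendations from the guide.

```python
import ipaddress

# Placeholder static addresses for one cluster node; the guide only requires
# that the public and node-to-node interfaces sit on *different* subnets.
public_if = ipaddress.ip_interface("192.168.10.11/24")    # public network NIC
private_if = ipaddress.ip_interface("10.0.0.1/24")        # node-to-node NIC

if public_if.network == private_if.network:
    raise ValueError("Public and private networks must be on different subnets")

print("public subnet: ", public_if.network)
print("private subnet:", private_if.network)
```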
- Page 59
Problem: Unable to add a node to the cluster. Probable cause: The new node cannot access the shared disks; the shared disks are enumerated by the operating system differently on the cluster nodes. Corrective action: Ensure that the new cluster node can enumerate the cluster disks using Windows Disk Administration. If the disks do not appear in Disk Administration, check[...]
- Page 60
Problem: Cluster Services does not operate correctly on a cluster running Windows Server 2003 with the Internet Firewall enabled. Probable cause: The Windows Internet Connection Firewall is enabled, which may conflict with Cluster Services. Corrective action: Perform the following steps: 1 On the Windows desktop, right-click My Computer and click Manage. 2 In the C[...]
- Page 61
Zoning Configuration Form. Columns: Node; HBA WWPNs or Alias Names; Storage WWPNs or Alias Names; Zone Name; Zone Set for Configuration Name [...]
- Page 62
Zoning Configuration Form [...]
- Page 63
Cluster Data Form. You can attach the following form in a convenient location near each cluster node or rack to record information about the cluster. Use the form when you call for technical support. Table C-1. Cluster Information (columns: Cluster Information, Cluster Solution). Rows: Cluster name and IP address; Server type; Installer; Date in[...]
- Page 64
Additional Networks. Table C-3. Storage Array Information (columns: Array; Array xPE Type; Array Service Tag Number or World Wide Name Seed; Number of Attached DAEs). Rows: 1, 2, 3, 4 [...]
- Page 65
Index. A: Access Control, about, 45. C: cable configurations: cluster interconnect, 19; for client networks, 18; for mouse, keyboard, and monitor, 15; for power supplies, 15. cluster: optional configurations, 12. cluster configurations: connecting to multiple shared storage systems, 34; connecting to one shared storage system, 12; direct [...]
- Page 66
H: HBA drivers, installing and configuring, 42; host bus adapter, configuring the Fibre Channel HBA, 42. K: keyboard, cabling, 15. L: LUNs, assigning to hosts, 52; configuring and managing, 51. M: MirrorView, about, 11; monitor, cabling, 15; mouse, cabling, 15; MSCS, installing and configuring, 53. N: Navisphere Agent, about, 48; Navisphere Ma[...]
- Page 67
S: SAN, configuring SAN backup in your cluster, 36; SAN-Attached Cluster, 13; SAN-attached cluster, about, 25; configurations, 12; shared storage, assigning LUNs to hosts, 52; single initiator zoning, about, 44; SnapView, about, 11; storage groups, about, 46; storage management software: Access Control, 45; Navisphere Agent, 48; Navisphere [...]
- Page 68
Index [...]