Archive for February, 2012

10 Step Data Center Relocation Methodology

Monday, February 13th, 2012

Following are the ten most critical tasks that any data center relocation must address. Look for a provider who can guarantee that all ten of these tasks are meticulously addressed. Otherwise, you are taking a big risk with your most valuable and expensive IT investment.

  1. Take inventory of every component that will be relocated or consolidated.
  2. Address data security and data protection to ensure the business remains uncompromised.
  3. Perform detailed planning to maximize the efficiency and budget of the relocation.
  4. Assess budget to adequately address construction, renovation, site closure, equipment, and staff.
  5. Communicate precisely in RFPs, SOWs, and contracts; vague RFPs create poor SOWs. Partner with a data center relocation specialist right from the start to ensure you have detailed, relevant documentation.
  6. Partner with data center relocation specialists according to your continuum of needs.
  7. Plan the move and move on plan: prepare and engage all components, including staff.
  8. Prepare the new facility, and close the old one to ensure that all data center services are ready, tested and approved.
  9. Back up your data and have a disaster recovery plan and data protection strategy — just in case.
  10. Migrate — the moment when careful planning and competent project management result in a flawless move to a new data center hosting site.
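To make step 1 concrete, here is a minimal sketch (not any vendor's actual tooling) of an inventory record and a helper that groups assets with their dependencies so related components move together. The asset tags, models, and fields are hypothetical examples.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One physical component in the relocation inventory (step 1)."""
    asset_tag: str          # hypothetical internal tracking ID
    make_model: str
    rack_location: str      # current rack/slot, e.g. "A12/U24"
    owner: str              # responsible team or staff contact
    dependencies: list = field(default_factory=list)  # assets that must move together

# A tiny example inventory: the database server cannot move without its SAN.
inventory = [
    Asset("SRV-001", "Dell R710", "A12/U24", "DBA team", dependencies=["SAN-001"]),
    Asset("SAN-001", "EMC CX4", "A12/U10", "Storage team"),
]

def move_groups(assets):
    """Group assets so each asset travels with its dependency chain."""
    by_tag = {a.asset_tag: a for a in assets}
    groups, seen = [], set()
    for a in assets:
        if a.asset_tag in seen:
            continue
        group = {a.asset_tag}
        stack = list(a.dependencies)
        while stack:
            tag = stack.pop()
            if tag not in group and tag in by_tag:
                group.add(tag)
                stack.extend(by_tag[tag].dependencies)
        seen |= group
        groups.append(sorted(group))
    return groups

print(move_groups(inventory))  # [['SAN-001', 'SRV-001']]
```

Even a simple structure like this makes the planning in steps 3 and 7 easier to verify, because every move wave can be checked against recorded dependencies.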

For more information about all that is involved in a data center relocation, what to expect, and how to make it a success, read our white paper: Keys to a Successful Data Center Relocation.

Considerations for Choosing a DCIM Solution

Thursday, February 9th, 2012

I recently gave a presentation at Data Center World about the importance of using Data Center Infrastructure Management (DCIM) tools in your data center. At Consonus, we use nlyte Software for data center infrastructure management and capacity planning. But regardless of the vendor you choose, you should consider the following:

1) Data Collection Process – Should the collection process be automated? Agentless? The process of discovery should uncover ALL physical network assets and should provide accurate data, speed implementation, and reduce audit time.

2) Presentation Process – How is the information presented? Visually? The information shown should include: hotspot identification, optimal asset placement, connections, and power.

3) Modeling Capabilities – What type of modeling capabilities does the software have? Does it offer “What if” scenarios? It should address changes in power, space, heating, and cooling.

4) Control Process – How much automation is provided? What kind of scheduling capabilities are there? The software should improve service delivery, reduce server deployment time, and enforce ITIL best practices.

5) Management Capabilities – What management-level dashboards are provided – standard and user-defined? The tool should include a daily measurement of operations, robust asset management, and reporting capabilities.

6) Analysis Functions – The software should enable you to take a proactive stance to capacity planning, providing trend analysis to predict the lifespan of your data center. It should also include operational metrics for the entire data center, including heating and cooling, space, and network connections.
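The trend analysis in point 6 can be as simple as fitting a line to historical readings and projecting forward. The following sketch, with made-up power figures and a hypothetical 400 kW capacity (not output from any DCIM product), estimates how many months of headroom remain:

```python
# Linear trend on monthly peak power draw (kW) to estimate when the
# facility's power capacity will be exhausted. All figures are made up.
monthly_peak_kw = [310, 318, 324, 333, 339, 348]  # last six months
capacity_kw = 400

# Least-squares slope: growth in kW per month.
n = len(monthly_peak_kw)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(monthly_peak_kw) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, monthly_peak_kw)) \
        / sum((x - mean_x) ** 2 for x in xs)

# Months from the latest reading until the trend line crosses capacity.
months_left = (capacity_kw - monthly_peak_kw[-1]) / slope
print(f"Growth: {slope:.1f} kW/month, ~{months_left:.0f} months of headroom")
```

A real DCIM tool applies the same idea across power, space, and cooling simultaneously, which is what makes "what if" capacity scenarios possible.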

Bottom line…using a DCIM solution can help you maximize your most expensive asset, the data center. Use it.

Bruce W. Cardos, PMP
Director, PMO
ITIL Foundation Certified

What is a SAS70 Type II Data Center?

Sunday, February 5th, 2012

If you are in the data center business or have your information stored at a data center, you should know how SAS70 compliance relates to your business and how vital it is for a data center to demonstrate SAS70 compliance. For those who don’t know…here’s your primer.

In a nutshell, when a data center says they are SAS70 compliant, they are stating that adequate internal controls and safeguards have been implemented to secure customer information in the data center.

SAS70 is an internationally recognized auditing standard for service organizations. Specific areas for analysis and evaluation include:

  • Organization controls
  • Application development
  • Maintenance controls
  • Logical security
  • Access controls
  • Application controls
  • System maintenance controls
  • Data processing controls
  • Business continuity controls

Having a SAS70 audit report is invaluable for building trust between you and your data center partner. With this credential, you can rest assured that data center control policies follow effective best practices and that physical access, IT infrastructure, data, and the network are protected against threats.

SAS70 audits are performed every year, not only to verify that rigorous controls are in place, but to ensure that they are maintained.

There are two types of SAS audits:

Type I: This is usually done when a company first begins the auditing process. The auditor evaluates whether the data center fairly describes its services and internal controls as of a point in time.

Type II: In addition to the Type I evaluation, a Type II report includes the auditor’s opinion on how effectively those internal controls operated over a specified period.

It’s important to note that a SAS70 audit reflects the expert opinion of an auditor. This means that when you are looking for a SAS70 data center, you will also need to determine whether the audit was performed by a reputable firm. You should also review the individual data center’s audit report to see which internal control claims were evaluated.

From 2007 to 2010, Consonus received favorable SAS70 Type I and Type II audits. Results of these audits are offered freely to all current and prospective Consonus clients.

So now when you are comparing data centers or hosting partners and they say, “we’re SAS70 audited,” you can actually know what it means.