Spreading your Data across the Hybrid Cloud – What data goes where?

The term hybrid cloud refers to a combination of public and private clouds tailored to an organisation's specific business needs. As a minimum, a single private cloud and a single public cloud are all that is required to create a hybrid cloud computing platform; however, organisations can combine multiple private and public clouds to suit the business's requirements.

A public cloud enables organisations to adopt enterprise-class technologies for their environment at an affordable price point; however, security, availability, compliance, performance, portability and the cloud provider's market longevity can often be of concern. A private cloud can answer these concerns but is more expensive to deploy and operate. A hybrid cloud offers the benefits of both by placing an organisation's data and processing in whichever cloud is the correct fit.

This raises the question: with regard to the hybrid cloud, what data goes where?

Well, it's really a case of data classification and risk, writes Del Lunn, Principal Consultant at GlassHouse Technologies (UK). When a company's applications and data are moved from on-premise platforms to a public cloud offering, the organisation will essentially be 'renting' services alongside other customers whilst entrusting the provider and its staff entirely with data security, uptime of services, confidentiality, compliance and transition, all of which can have a catastrophic effect on some businesses if not met. Before considering public cloud offerings, companies need to thoroughly understand the business impact and revenue loss that could occur from hosting data off-premise in the public cloud. Even if the aforementioned are of little relevance, companies looking to move to public cloud offerings must still proceed with caution: what if the cloud provider goes bankrupt? What if the relationship with the provider becomes toxic? What if they decide they no longer want to provide cloud services?

At present, the above normally dictates that an organisation's 'Crown Jewels', i.e. enterprise, business-critical, secure or regulatory data and applications, remain on-premise or within a secure private cloud environment, whilst more commodity-based or tactical services such as data archiving, backup, e-mail, collaboration and workspace recovery are moved to a public cloud. However, the continued maturation of public cloud service offerings is starting to challenge this principle, and more progressive organisations are embracing a 'cloud first' approach to application deployment.
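As a rough illustration of this kind of data classification, the sketch below maps workloads to a private or public cloud based on data class, business criticality and service type. It is a minimal, hypothetical Python example; the labels and rules are assumptions made for illustration, not a GlassHouse methodology.

```python
# Minimal sketch of rules-based workload placement. The labels and rules below
# are illustrative assumptions, not a formal classification scheme.

REGULATED_DATA = {"pii", "financial", "health"}   # assumed regulatory data classes
COMMODITY_SERVICES = {"archive", "backup", "email", "collaboration", "workspace-recovery"}

def suggest_placement(workload: dict) -> str:
    """Suggest 'private cloud' or 'public cloud' for a workload description."""
    if workload["data_class"] in REGULATED_DATA or workload["business_critical"]:
        return "private cloud"   # 'Crown Jewels' stay on-premise or in a private cloud
    if workload["service_type"] in COMMODITY_SERVICES:
        return "public cloud"    # commodity or tactical services
    return "private cloud"       # default to the cautious option

if __name__ == "__main__":
    workloads = [
        {"name": "ERP",     "data_class": "financial", "business_critical": True,  "service_type": "core"},
        {"name": "Email",   "data_class": "internal",  "business_critical": False, "service_type": "email"},
        {"name": "Archive", "data_class": "internal",  "business_critical": False, "service_type": "archive"},
    ]
    for w in workloads:
        print(f"{w['name']}: {suggest_placement(w)}")
```

In practice the rule set would be driven by the organisation's own data classification and risk assessment rather than hard-coded labels.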

 GlassHouse strongly believes that although public cloud offerings can often be cost effective, they must also be the correct fit for the organisation, striking the right balance between scalability, agility, compliance, performance, security and operational flexibility.

If you would like more information, please visit our website at www.GlassHouse.com.

 Posted by: Del Lunn, Principal Consultant at GlassHouse Technologies (UK)


Legal IT is turning to Managed Infrastructure Services to Drive Innovation

The legal industry has not been exempt from the rapid technological and social change we've seen across other industries recently. Legal sector IT organizations face the same challenges other IT departments are experiencing: improving user experiences, enabling increased collaboration, supporting new initiatives and mobility, and increasing productivity while maintaining security, all while reducing operational expenses.

While these challenges in and of themselves aren't unique, the pressure of increasingly rigorous legal security and compliance requirements is, and new considerations such as Cloud Ethics raise new questions. Legal IT departments struggle to be a consistent, innovative force, building the firm's practice and supporting and protecting its assets, legal team and data. These IT departments have to find innovative ways to reach operational maturity, efficiently manage their technology infrastructure, ensure proper governance and improve the end-user experience.

In an effort to keep up in this fast-paced, rapidly changing industry, many law firms are turning to managed infrastructure services to maintain control and help drive innovation. GlassHouse has helped many organizations in the legal sector make this move, enabling them to focus their existing resources on new and innovative applications and services and to enhance their responsiveness. In one use case, GlassHouse enabled a global legal firm to realize the following benefits:

  • Improved end-user legal team productivity and experience
  • Increased SLA adherence to 97%
  • Identification and process remediation of failed data backups
  • Creation of a cloud strategy that addresses the growing demands of Cloud Ethics

To learn how legal IT leaders are taking back control of their technology resources and driving business innovation, download our eBook here.


A Look Back at 2013

It seems like it was just yesterday that we were making predictions for how technology would accelerate change in 2013. Now, it's hard to believe we're already preparing to make predictions for the New Year. At GlassHouse, we're excited about what lies ahead in 2014. We're proud of our continued dedication to IT leaders; over the past year, we've enhanced our services in order to better assist organizations in the planning, implementation and management of their IT infrastructure and services, and to help them deal with the continuous introduction of new technologies.

Before we ring in the New Year, we’d like to take time to recognize a few of the key 2013 initiatives that have made a significant impact in the way CIOs have been able to innovate, grow and meet their business goals over the course of the year.

Taking a Tiered Approach to Data Center Management

It didn't take long for data centers to evolve from the rooms that simply housed storage and other infrastructure into the facilities that now run highly demanding, business-critical applications. Recent forecasts predict that total data center space will grow to more than 700 million square feet by 2016, forcing CIOs to rethink their data center strategies. With a whole slew of data center management options available, including in-house, cloud and colocation facilities, it can be difficult to determine the model that best fits an organization's structure. But with our approach, we're helping IT leaders understand that they don't need to choose just one option to meet their business needs.

In the GlassHouse whitepaper “Cloud, Colo or In-house”, we explain that the solution lies in a tiered data center management strategy, which maximizes the benefits of all three options. This multi-tiered approach provides flexibility and enables IT to prioritize certain application components to meet their unique infrastructure requirements and boost the overall business output.

Helping CIOs Move the Business Needle

If you ask CIOs about their top IT priority, the majority would likely say it is aligning IT initiatives with business goals. In fact, that is exactly what 63 percent of respondents stated earlier this year in CIO magazine's 2013 State of the CIO survey. The problem is that IT has long been synonymous with in-house tech support, and 40 percent of its time is actually spent fighting fires instead of supporting growth or thinking about long-term strategy.

Over the course of the year, we enhanced our managed infrastructure services to provide an option for CIOs to successfully move the business needle. The GlassHouse approach gives companies the traditional outsourcing benefits of a managed services provider without the risks of losing control of their IT infrastructure. This enables CIOs to select which assets they want our consultants to manage, in turn freeing up the IT staff to focus on growing the business with new technology solutions.

Establishing a New World for Colocation

The sands have been shifting around colocation providers for quite some time, so much so that they could be at risk of losing market share to other hosting-type services if they don’t change course or realign their service models. Today, we’re seeing organizations crippled by colocation providers’ fixed service contracts that don’t provide any flexibility or scalability. What’s more, most organizations aren’t able to project what they’ll need in terms of power and space years down the road, yet colocation providers still make it a requisite part of the agreement.

But the colocation services market doesn't need to suffer in light of the rise in established hosting services. In our recent whitepaper, we outline the many ways colocation providers can reinvent their go-to-market strategy and position themselves as relevant and attractive to current and prospective customers. For instance, if colocation service providers can showcase attributes similar to those of fully funded, on-premise data centers, such as offering tailored services on a case-by-case basis and delivering a better review model, then they stand a chance of growing their customer base and maintaining a strong business.


GlassHouse's Managed Infrastructure Services Enable Innovation in Biotech and Life Sciences

The biotechnology industry is undergoing rapid disruption. Firms are under increasing pressure to accelerate time-to-market for new products in order to improve clinical outcomes. Unfortunately, GxP compliance and risk mitigation processes often delay the TTM cycle. Additionally, while a boon for the overall business, this extreme growth and the unprecedented capacity needs that come with it take their toll on IT, which is tasked with managing compliance alongside all other IT assets.


Without the ability to efficiently manage IT assets, organizations in the biotechnology industry are at a serious competitive disadvantage. They not only run the serious risk of falling behind on innovation, but also risk being negligent in adhering to the compliance mandates that govern this highly regulated market.

Because of this, more biotech firms are turning to managed infrastructure services to focus their IT resources on the new technologies and methodologies that support their rapid growth. For our customers, GlassHouse's managed services teams handle the mundane task of managing these firms' data centers, freeing up the time and resources to help them keep pace and innovate. One particular use case illustrates the benefits of such a partnership. Some tangible business results include:

  • Improving end-user productivity and experience
  • Increasing SLA adherence to 98%
  • Reducing net operating expense by more than 28%
  • Reducing time-to-production after risk management review for system compliance

If you’re interested in learning how GlassHouse’s managed infrastructure services are helping Biotech IT leaders drive innovation while maintaining control in a fast-paced, rapidly changing industry, you can download our eBook here.


Weekly Highlights – December 2, 2013

Yes, the Cloud Is Replacing Enterprise Hardware and Software
InfoWorld, David Linthicum, December 3, 2013

In its 2014 global technology outlook, Barclays suggests that businesses will shift away from their traditional data centers, reinforcing the trend that as more public cloud services are sold, the more traditional enterprise hardware and software vendors shrink. While AWS and other IaaS providers are expected to see revenues reaching nearly $10.2 billion by 2016, “big iron” hardware and server companies like EMC and IBM could face serious declines in revenue. Additionally, systems integrators will likely struggle with this changing paradigm, which will bring a lesser need for on-premise integration.

Our take: As David points out at the end of his article, “The fundamentals of architecture, design, operation, development, security, assurance, and management remain true whether we consume technology resources from the data center or from the cloud.” While traditional enterprise hardware and software will remain viable for some organizations, we believe that now is the time to embrace the cloud. There are three main factors contributing to increased cloud adoption: new technological advancements, more credible cloud solutions and a whole new, open mindset. Additionally, the evolution of service offerings and technologies makes it more likely that there is a cloud solution that will meet the increasing demands of an organization's users.

Hybrid Cloud Is About More than Savings
Datacenter Dynamics, Yevgeniy Sverdlik, December 3, 2013

 While virtualization and private cloud bring cost reductions, organizations look to the hybrid cloud for other benefits, including infrastructure flexibility and reliability. Recently, Gartner predicted that nearly half of large enterprises will have deployed hybrid clouds by the end of 2017, with a primary goal of optimized business agility.

Our take: It wasn't too long ago that everyone thought the public cloud was the be-all and end-all for enterprise computing. That idea came to a halt when concerns around security, compliance and data sovereignty emerged around the 100-percent public cloud-based model. While the scalability and affordability of the public cloud are still in high demand for many enterprises, a hybrid cloud deployment grants the best of both worlds: it offers organizations a secure environment for sensitive applications, while providing the opportunity to leverage the benefits of a public cloud for applications best suited to public infrastructure.

Looking Ahead to Cloud Services in 2014 and Beyond
Talkin’ Cloud, Tech Data, December 2, 2013

Cloud computing is yielding approximately $111 billion annually and, according to Gartner, that annual revenue number could reach $200 billion by 2016. As the evolution and maturation of cloud computing continues and models shift, it will be increasingly important for organizations to focus on the development of integrated solutions. Ultimately, cloud computing's potential is significant and multi-layered, so the more education and understanding we have, the sooner we can begin to capitalize on everything the cloud has to offer.

Our take: Organizations will be able to achieve equal or greater cloud service levels by tiering application components based on IT criticality, and it is this multi-tiered cloud approach that will help organizations achieve levels of scalability and cost efficiency that aren't achievable with a one-size-fits-all approach to cloud. This attitude enables IT departments to prioritize certain application components to meet their unique infrastructure requirements and boost the overall business output.

Interested in learning more about the future of cloud? Download GlassHouse's recent eBook, “Is Now the Ideal Time to Embrace the Cloud?”, available on our website.


Weekly Highlights – November 11, 2013

What’s Your Cloud Strategy?
Virtualization Review, Elias Khnaser, November 13, 2013

It wasn't long ago that Elias Khnaser predicted that enterprise IT would get out of owning and operating physical infrastructure, and out of the data center business, and move into the cloud. His predictions came under scrutiny at the time, but now, with 15 to 25 percent of workloads in the cloud, it looks like he was right. As cloud adoption rates continue to increase, it will become important for IT leadership to identify and deploy the right strategy for their business. It is also critical that they understand how the role of IT and organizational priorities may change as a result of cloud.

Our take: Now is the right time to adopt cloud services, but enterprises can't adopt them without the right cloud strategy for their businesses. This entails an in-depth discovery phase to understand their applications and the supporting infrastructure underneath them. As that picture becomes clear, it will be easier to match applications with a suitable cloud model. Once the discovery and analysis phases are complete, organizations need to justify the change and develop a roadmap outlining how the deployment will be executed. By developing a proper cloud strategy, and completing all of these necessary steps, organizations will be equipped and prepared for a successful deployment.

Private Cloud Adoption Expected to Reach $69B in Revenue by 2018
The WHIR, Chris Burt, November 13, 2013

According to a report released by Technology Business Research, private cloud adoption will generate $69 billion in revenue by 2018. The results of the report came from a survey of 650 cloud users worldwide, and covered customers’ perceptions of vendors, buyer behaviors and drivers and barriers to adoption over the next five years.

Our take: Whether it’s private, public or a multi-tiered strategy, the benefits to cloud remain the same: reduced costs, increased productivity and collaboration, and innovation. While barriers still remain, the technological advancements that have been made, the availability of credible solutions and a brand new mindset around cloud computing are contributing to a golden age in cloud adoption.

3 Challenges in Moving to the Cloud (A CIO’s Story)
Forbes, Cynthia Stoddard, November 12, 2013

Troubled by slowed productivity and headaches from her IT department, NetApp CIO Cynthia Stoddard needed to make a change. End users were going outside the boundaries of IT to secure their own resources, so to get away from the rogue, “shadow” solutions users had adopted, she decided to implement a hybrid cloud strategy to meet their needs. While the benefits included an estimated $1.5M in annual savings, the deployment also came with its challenges. Cynthia and her IT department were faced with a change in culture, uncertainty around whom to call when applications fail and support is needed, and the need to understand the importance of the relationship with their vendor.

Our take: The move to any cloud platform is not an easy one, and it comes with its own challenges and difficulties. A hybrid cloud implementation is no different: organizations need to execute a strategy that touches on three basic elements. First, organizations need to design and implement a private cloud if they don't already have one deployed. Similarly, a public cloud needs to be in place; this can come either from a single vendor or from a cloud broker. Once a public cloud is selected, organizations need to agree contract terms with the provider, laying out all of the SLA requirements to protect and compensate the organization in the event of an outage.

GlassHouse recently released an eBook entitled “Is Now the Ideal Time to Embrace the Cloud?” Additionally, GlassHouse has made a new webinar available on designing the ideal hybrid cloud on its website.


Where is my Data and who has Access to it?

There are so many as-a-Service offerings that data heads off into the cloud and lands who knows where, writes Kingsley Eley, Principal Consultant at GlassHouse Technologies (UK). Your service provider may not state where your data will be held, let alone guarantee that it stays in the same country or even on the same continent. They may move data around for load balancing, or may fail over to another data center if things go wrong. You are unlikely to know at any one time where your data is. Not so long ago data stayed safely in the data center behind a firewall, and rarely ventured much further than branch offices or the tape storage warehouse. Life was easy then.

If you are sending personal data outside of the European Economic Area (EEA), then you are required to comply with Principle 8 of the UK Data Protection Act, which states that:

Personal data shall not be transferred to a country or territory outside the EEA unless that country or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data.

Furthermore, you will be subject to the local laws of the country in which your service provider is based and of the country in which your data is actually stored. You can be prosecuted under these laws even if you just make use of a data centre in a European country.

And of course, thanks to the 2001 USA PATRIOT Act, data stored in the US or UK by any company headquartered in the US is subject to access by federal authorities. This includes financial information and emails. There is widespread concern that this allows access to a UK customer's data despite strong UK and EU data protection laws.
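As a simple illustration of the kind of check involved, the sketch below verifies that every region a provider might use for a data set sits inside an approved list, here an assumed set of EEA-hosted regions. The region names and the approved set are invented for the example; a real assessment would also have to consider the provider's legal jurisdiction, as noted above.

```python
# Minimal sketch: verify a provider's possible hosting regions against an
# approved territory list before personal data is moved. Region names and the
# approved set are illustrative assumptions only.

EEA_APPROVED_REGIONS = {"eu-west", "eu-central", "eu-north"}

def residency_compliant(provider_regions, approved=EEA_APPROVED_REGIONS):
    """True only if every region the provider may use is in the approved set."""
    return all(region in approved for region in provider_regions)

# A provider that may fail over to a US region fails the check
print(residency_compliant({"eu-west", "eu-central"}))   # True
print(residency_compliant({"eu-west", "us-east"}))      # False
```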

GlassHouse Technologies helps organizations navigate their way through the legal minefield to ensure they stay compliant and legal as they transition to the cloud.

If you would like to delve a bit deeper into cloud computing, why not have a read of our compelling eBook – ‘Is Now the Time to Embrace the Cloud?’ Download the entire eBook here.

 

GlassHouse provides vendor-independent data center consulting and managed services, guiding customers through the complexities of cloud, virtualization, storage, security and workspace.  We enable clients to evolve to a services-enabled data center model providing on-demand, elastic services and agility, and enabling IT to focus on innovation. We consider the people, processes, policies and technology in place and create a customized plan that mitigates risk, reduces cost and improves efficiency, driving business value rather than technology outcomes. For more information, visit www.GlassHouse.com, visit the GlassHouse blog for expert commentary on key data center issues, and follow us on Twitter @GlassHouse_Tech.


A New World of Colocation

Can colocation service providers evolve their service offerings to satisfy today's commercial needs, demands and expectations, while differentiating themselves from the competition?

A colocation data center can provide IT facilities equipment, rack space, connectivity and environmental support facilities, and can be procured by both wholesale and retail customers; it is typically priced on power presentation and selected on a number of key attributes.

With the increasing complexity and cost of building and maintaining on-premise data centers, there is an opportunity for service providers to change the way they provide services and to take advantage of slow cloud adoption and of customers who cannot (or will not) consume cloud-only services.

QUESTION Will the current economy change the way data center services are procured?

QUESTION Could colocation service providers adopt utility-based contract and billing models, rather than just the typical property leasing model?

QUESTION Could additional use cases for colocation be used to drive efficiency and business agility?

QUESTION Could the evolution of colocation produce more business and community benefits, through financial modeling, development and innovation?

QUESTION Will we see a change in service provider and consumer behaviors; transitioning towards quality, green and clean data centers?

NEXT

Read the full whitepaper, in which Peter White, Enterprise Architect at GlassHouse Technologies, discusses and tries to answer these questions.

 



Cloud Computing: Where is my Data?

In cloud environments it doesn't really matter where your compute power is; as long as network latency doesn't impact the user experience, it can be anywhere in the world. Data, however, is an entirely different matter, writes David Boyd, Enterprise Architect at GlassHouse Technologies. It really does matter where your data is.

Data security and privacy are high on the agenda at the moment and have recently gained increased attention from the EU's Directorate General and the British Terrorism Act. In the wake of the PRISM revelations, the impact of the US Patriot Act and the US Foreign Intelligence Surveillance Amendment Act on data in the cloud has been brought home.

Organizations should know where their data is being hosted, and organizations with regulatory compliance obligations must know and restrict where their data is being hosted. It is not a responsibility easily passed on to a cloud provider. Attention also has to be paid to backup, archive and disaster recovery copies of data: while your primary data copy could be in a compliant data center, copies of that data may exist elsewhere without your knowledge.
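A hypothetical sketch of that audit is shown below: every copy of a data set, whether primary, backup, archive or DR replica, is checked against the countries the organisation permits. The copy inventory and country list are invented for illustration; in reality this information has to be obtained from, and contractually guaranteed by, the cloud provider.

```python
# Minimal sketch: audit every copy of a data set, not just the primary, against
# a permitted-country list. Copy types and locations are illustrative assumptions.

PERMITTED_COUNTRIES = {"UK", "Ireland", "Germany"}

data_copies = {
    "primary": "UK",
    "backup":  "Ireland",
    "archive": "Germany",
    "dr":      "USA",        # a DR replica quietly held outside the boundary
}

violations = {copy: country for copy, country in data_copies.items()
              if country not in PERMITTED_COUNTRIES}

if violations:
    for copy, country in violations.items():
        print(f"Non-compliant copy: '{copy}' is hosted in {country}")
else:
    print("All copies are within permitted countries")
```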

Geographical data location and compliance assessments are core components of GlassHouse’s Cloud Advisor services portfolio. If you would like to delve a bit deeper into cloud and the reasons behind it, why not have a read of our compelling eBook – ‘Is Now the Time to Embrace the Cloud?’ Download the entire eBook here.

If you want to check your compliance position or evaluate the impact of moving data to the cloud, contact GlassHouse via email at sales@glasshouse.com or visit our website, www.GlassHouse.com, for more information or for the contact details of a GlassHouse office near you.



 


Am I better off using array-based replication or LAN-based replication for SRM?


VMware Site Recovery Manager (SRM) is a disaster recovery orchestration product which protects virtual machines by replicating them to a secondary site. This is achieved using either storage array-based or network-based replication, reports Paul Grimwood, Technical Consultant at GlassHouse Technologies.

The decision on which replication technology to use should be determined by business requirements, specifically the Recovery Point Objectives (RPOs) defined in the disaster recovery service level agreement. These requirements should be matched to the capabilities and scalability of the replication technology and balanced against costs and other considerations.

Array-based replication (ABR) uses an SRM storage replication adapter to leverage the replication and snapshot capabilities of the array. This allows for high-performance, synchronous or asynchronous replication of large amounts of data. Where an SLA demands a low RPO, that is, minimal loss of data, ABR remains the preferred option due to its synchronous replication capability. However, this performance comes at a cost: storage infrastructure from the same vendor is required at both sites, and features like array replication and snapshots generally incur additional licensing costs.

Alternatively, SRM can leverage the vSphere Replication (VR) feature, incorporated into vSphere 5.1, which utilises the hypervisor to replicate over the network on a per-VM basis. This approach offers more flexibility as it allows replication between disparate storage and is storage-protocol independent, so low-end (even direct-attached) storage or cloud infrastructure can be used at the failover site to reduce costs. SRM with VR supports features such as failback and re-protect, which were previously only available with ABR. Additionally, Microsoft VSS can be used to quiesce application data during replication passes to ensure data consistency, and multiple point-in-time recovery allows a roll-back to a known consistent state.

A disadvantage of using VR is its comparatively lower performance: at best a 15-minute RPO, compared to the synchronous replication possible with ABR. Due to this limitation VR is not suitable in situations requiring minimal data loss, for example a database tier; instead, VR is suited to more static systems, such as a web application server tier. There is also a performance impact on the host whilst running replication, as well as limits on the total number of replica VMs that can be supported (500, as opposed to 1,000 with ABR). Certain features such as linked clones, physical-mode RDMs and Fault Tolerance are not supported, but this may be addressed in the future.

It is possible to combine the two technologies, within supported limits and considering the resulting RPO, should it be desirable to do so. For example, small branch sites could use VR replication to a main site which is then protected using ABR replication, or certain VMs could be replicated using VR to a cloud provider as well as to an ABR-linked site.

To summarise, VR is simpler, more flexible and cheaper than ABR but at the expense of reduced performance, scalability and feature support. The decision on which technology to use, or whether to combine the two, will be determined by business requirements as well as any existing investments.
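To make the trade-off concrete, here is a minimal decision sketch in Python that encodes the figures mentioned above: a roughly 15-minute best-case RPO and an approximate 500-VM limit for VR. It is an illustrative aid only, not a VMware sizing tool, and the function name and inputs are assumptions for the example.

```python
# Minimal decision sketch based on the trade-offs described in this post.
# Thresholds (15-minute best-case VR RPO, ~500 protected VMs for VR) come from
# the article; everything else is an illustrative assumption.

def choose_replication(rpo_minutes: float, protected_vms: int,
                       same_array_vendor_both_sites: bool) -> str:
    """Suggest ABR or VR for SRM protection from RPO and scale requirements."""
    if rpo_minutes < 15:
        # Only array-based replication offers synchronous / near-zero RPO
        return ("ABR" if same_array_vendor_both_sites
                else "ABR (matching array vendor required at both sites)")
    if protected_vms > 500:
        return "ABR"   # VR tops out at roughly 500 replica VMs
    return "VR"        # simpler, cheaper, storage-agnostic, adequate RPO

print(choose_replication(rpo_minutes=0,  protected_vms=200, same_array_vendor_both_sites=True))   # ABR
print(choose_replication(rpo_minutes=60, protected_vms=300, same_array_vendor_both_sites=False))  # VR
```

In practice the choice would also weigh licensing costs, existing array investments and the unsupported VR features noted above, rather than RPO and scale alone.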

