Friday, 26 March 2010

New DataCore and Hyper-V White Paper: Uninterrupted Access to Cluster Shared Volumes (CSVs) Synchronously Mirrored Across Metropolitan Hot Sites

Failover Cluster support in Windows Server 2008 R2 with Hyper-V provides a powerful mechanism to minimize the effects of planned and unplanned server downtime. It coordinates live migrations and failover of workloads between servers through a Cluster Shared Volume (CSV). The health of the cluster depends on maintaining continuous access to the CSV and the shared disk on which it resides.

In this paper, Uninterrupted Access to Cluster Shared Volumes (CSVs) Synchronously Mirrored Across Metropolitan Hot Sites, you will learn how DataCore Software removes a longstanding stumbling block for clustered systems spread across metropolitan sites by providing uninterrupted access to the CSV despite the many technical and environmental conditions that conspire to disrupt it.

Tuesday, 16 March 2010

DataCore Software's SANmelody Storage Virtualization Solution Wins Storage Product of the Year at the 2010 Network Computing Awards


Keith Joseph of DataCore accepted the Storage Product of the Year award.

DataCore Software has once again scooped one of the UK's top networking awards, winning Storage Product of the Year at the prestigious 2010 Network Computing Awards dinner held last week at the Tower Hotel in London. DataCore's SANmelody™ beat stiff category competition from finalists including EMC, Compellent, LSI and CA.

Keith Joseph, Regional Manager, Northern Europe, DataCore Software, commented, "It's tremendous recognition from the UK's network managers and administrators who nominated and voted for SANmelody in force. In total, over 7000 votes were cast for the Awards, so it is a real honour in light of the stiff competition we were facing."

The Storage Product of the Year Award signifies the readers' appreciation of SANmelody and of how easily and cost-effectively it converts physical Intel/AMD servers or virtual machines (VMs) into fully capable virtual storage servers that optimise, protect and manage storage over existing networks to fulfil the needs of application servers.

In addition, SANsymphony™, the enterprise version of SANmelody, was runner-up for Data Centre Solution of the Year.

The Network Computing Awards were established in 2004 to recognise best-in-breed, easy to use solutions that make the working lives of network managers easier and more effective.

In SAN We Trust: City Government Deploys DataCore Storage Virtualization To Empower VMware; Sets A New Level Of Uptime For Critical Applications

Source: Computer Technology Review.

DataCore Software has been deployed along with VMware vSphere for the government of Dinwiddie County in Virginia. "We were convinced that we needed a SAN to support our objective of server consolidation with VMware as well as our desire to achieve uptime," explained Norman Cohen, IT Director, Dinwiddie County Government. "We are so impressed with the new virtualization environment and DataCore's role as the virtual storage dimension to everything that we are actively encouraging other county governments in our state to do what we have done in our VMware-DataCore deployment."

"DataCore's storage virtualization software solution has allowed us to springboard into the world of virtualization – particularly enabling us to leverage the benefits of VMware," noted Cohen. "The most important component of our virtualization deployment has to do with virtualizing our disks with DataCore. We can now utilize disk space – no matter where those disks reside on campus. But storage virtualization does so much more than delivering shared storage by way of pooling disks. With DataCore, our IT environment has attained high-availability and fault-tolerance, by way of the remote replication that is made possible by the two DataCore-powered SANs."

Tuesday, 9 March 2010

DataCore Storage Virtualization Software Delivers 100% Uptime to iomart hosting
iomart hosting, one of the UK's fastest growing managed services providers, is using DataCore's SANsymphony™ solution to provide the high-availability backbone for iomart's hosted customers.

Richard McMahon is the company's Infrastructure Manager, responsible for all hardware and software deployments and services within the group. Twenty-five people operate from iomart's five state-of-the-art data centers in London, Maidenhead, Glasgow, Nottingham and Leicester, all linked via dark fibre or GigE connections.
"Our 100% uptime guarantee is not a marketing ploy - it's iomart's motto and mantra. We live and breathe it to facilitate our customers' business models. Therefore total high-availability is top of our priority list," he stated.

Recently, to help clients facilitate cloud computing, iomart launched a dedicated, virtualized server service, offering VMware dedicated servers capable of running multiple operating systems and applications from one physical server.

With these services in mind, the challenge iomart faced was managing 30 terabytes (TB) of storage across a heterogeneous environment in its City of London and Maidenhead data centers. To meet this objective, iomart approached DataCore, EMC, NetApp and 3PAR for proposed solutions. DataCore's suggested environment used standard x86 hardware, highly specified to suit hosting workloads: each server had 128GB of RAM for caching, four quad-core processors and 8Gb HBAs for connectivity, with low-end commodity disk sitting behind it. On top of this hardware, DataCore's SANsymphony solution was used in a mirrored SAN-to-SAN configuration to virtualise and manage the environment across a fibre link. In total, the solution cost less than one-third of the equivalent software and hardware alternatives proposed.

Keith Joseph, Regional Manager, DataCore Software, picks up the story: "Cost savings were not the most important factor in iomart's search. What they really wanted was to achieve total control of the environment, with the ability to migrate customer data between storage systems and disk types and to easily replicate to remote sites. This they found with DataCore's SANsymphony."

Mirrored high-availability (HA) between sites more than 20 miles apart also gives iomart enhanced disaster recovery (DR) capabilities. In this configuration, the environment is distributed across several locations, giving iomart a fully distributed HA model that also spans DR geographies.

DataCore's synchronous replication functionality operates on a forced cache coherency model, based on a grid architecture that replicates the I/O block between the cache on each DataCore server before sending the acknowledgement to the application server and committing the data to disk. This overcomes the problems associated with clustered storage, while allowing iomart a greater degree of performance and flexibility.
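The write ordering described above can be illustrated with a minimal sketch. This is not DataCore code; the class and function names (StorageNode, write_mirrored) are illustrative assumptions, showing only the sequence the paragraph describes: the I/O block lands in both nodes' caches before the application receives its acknowledgement, and destaging to disk happens afterwards.

```python
# Illustrative sketch of a synchronous mirrored write path (hypothetical
# names, not a DataCore API): replicate the block between both caches,
# acknowledge the writer, then commit to disk.

class StorageNode:
    def __init__(self, name):
        self.name = name
        self.cache = {}   # block address -> data held in RAM cache
        self.disk = {}    # block address -> data committed to disk

    def cache_write(self, lba, data):
        self.cache[lba] = data  # block now resides in this node's cache

    def destage(self):
        # Lazily commit cached blocks to the backing disk.
        self.disk.update(self.cache)

def write_mirrored(primary, partner, lba, data):
    # 1. Stage the block in the local cache.
    primary.cache_write(lba, data)
    # 2. Replicate the block to the partner's cache over the mirror link.
    partner.cache_write(lba, data)
    # 3. Only now acknowledge the write to the application server.
    ack = primary.cache[lba] == data and partner.cache[lba] == data
    # 4. Destaging to disk happens after the acknowledgement.
    primary.destage()
    partner.destage()
    return ack

node_a = StorageNode("site-A")
node_b = StorageNode("site-B")
assert write_mirrored(node_a, node_b, lba=42, data=b"payload")
# After the ack, the block exists in both caches and on both disks,
# so either site can serve it if the other fails.
assert node_a.disk[42] == node_b.disk[42] == b"payload"
```

Because step 3 waits for both caches, a site failure after the acknowledgement can never lose an acknowledged write, which is the property that distinguishes synchronous from asynchronous mirroring.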

McMahon concludes, "What we have achieved with SANsymphony is totally flexible, highly performing SAN-to-SAN mirroring, but straight out-of-the-box and backed by a decent price point. We can now move data 'at will' between vendors and seamlessly achieve LUN virtualisation across RAID arrays. No other solution offers us this."

Friday, 5 March 2010

Video: Storage Virtualization and DataCore (Jon Toigo Interviews Ziya Aral)


Ziya Aral, CTO and co-founder of Ft. Lauderdale, FL-based storage virtualization software vendor DataCore Software, offers an analysis of server and storage virtualization that blows the socks off the marketspeak one usually hears or reads from industry analysts and vendors. That's because he has been part of the evolution of the technology over the past three decades.

Says Aral, in this interview recorded for the C-4 Summit Cyberspace Edition, virtualization was -- and is -- inevitable. He notes that embedding software in hardware became a costly and inefficient design choice years ago, especially as hardware architecture became more standardized. Software's growing complexity required its migration onto general-purpose computers, where it could be maintained and improved without the costly replacement of the underlying hardware platform.

He argues that DataCore's key innovation was to treat the "storage application" like any other computer application, separating it from the storage hardware layer -- the costly and complex value-add array controller. The benefit of abstracting storage services into a virtualization layer is a rich set of services that can be applied across hardware without concern for the vendor badging of the hardware itself. Software and hardware can scale independently.

Performance and management also improve with virtualization, Aral notes, citing the success of DataCore customers as proof. To hear the rest of Aral's fascinating analysis, check out the C-4 Project, where Aral is part of an ongoing "summit in cyberspace."

Tuesday, 2 March 2010

Virtualization should be 3D: DataCore


DataCore Software is evangelizing a virtualization strategy that addresses server, desktop and storage concurrently from the outset to avoid performance bottlenecks. IDC's Dave Pearson calls storage virtualization a "support" technology.

Storage technology vendor DataCore Software Corp. is touting a three-dimensional approach to virtualization that encourages organizations to address server, desktop and storage concurrently from the outset when building a virtualized environment.

Augie Gonzalez, director of product marketing with the Fort Lauderdale, Fla.-based company, said focusing on just server and desktop at the beginning while neglecting storage will lead to negative consequences in the long run.

“Leaving out any one of those dimensions creates shortcomings and obstacles for that customer downstream,” said Gonzalez.

The dependencies on the storage infrastructure are magnified by server and desktop virtualization, said Gonzalez. Specifically, performance bottlenecks occur when desktops and servers are consolidated and an increased number of inputs and outputs are driven against the storage pool, he explained.

Gonzalez said problems encountered down the road due to bad planning are one reason IT departments hesitate to move Tier 1 workloads to a virtual environment. "As soon as they scale out to actually meet the workload demand, that's when they hit upon it … and that's when people start to rethink whether they were doing the right thing by virtualizing," said Gonzalez.

According to Dave Pearson, senior analyst for storage at Toronto-based IDC Canada Ltd., storage virtualization is a relatively new, albeit key, component of virtualization. But he said the storage component is still a “support” technology led in most cases by server virtualization.

Pearson said storage virtualization forms part of what he calls "virtualization 3.0," where it’s not just about getting rid of physical machines and improving processes anymore. “It’s now part of optimizing the entire solution, making sure you’re really getting your money’s worth out of the hardware that you’re committing to these virtualization projects,” he said.

However, while Pearson agrees storage is a vital component of an organization’s virtualization strategy, he said such a 3D approach is hardly a novel idea to IT pros involved in large virtualization projects.

The 3D approach is “just really another tool that’s being added to the box of tools for virtualization,” said Pearson. Besides DataCore, Fremont, Calif.-based 3PAR Inc., among others, is in the market.

DataCore is relying on its community of resellers to evangelize this 3D virtualization approach because, according to Gonzalez, customers still require a lot of education on the topic.

Gonzalez said the "second phase of adoption," which extends virtualization to areas like business continuity, risk mitigation and disaster recovery, is beginning to shine a spotlight on storage. "All of those many machines and many users and many virtual workloads now become so centred on how the storage environment behaves and how well that virtual view of storage remains intact as you move through generations of hardware," he said.

Pearson said storage virtualization is growing more quickly in importance than server virtualization, but not at the same level overall.
