Monday, 31 January 2011

It’s time to solve the “Big Problem” stalling today’s server and desktop virtualization projects

A successful virtualization strategy depends on several core components: the right people, the right solution, a firm grip on infrastructure investment, and a clear plan.

Yet whether the project is a straightforward data consolidation or a larger virtualization-based migration and integration exercise, most organizations must overcome weaknesses in these core components that, left unaddressed, will hinder success. Research firms such as Gartner have attested that this is a big challenge for companies, but it is not an insurmountable one.

First and foremost, it is critical that the IT department keep on top of managing and controlling storage assets. Poor use of available capacity, guerrilla purchasing and deployment of storage at the workgroup level, and disconnected storage silos are among the biggest problems, and all of them strain IT spending, add significant storage management overhead, and build in inefficiency.

However, solving this does not have to mean a widespread rip-and-replace of existing hardware. Far from it. Intelligent use of software lets you consolidate the storage estate virtually rather than just physically, so the business can draw on unused capacity across the entire resource instead of leaving it trapped in local silos and workgroups. That saves money and improves utilization in one stroke.
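To make the idea concrete, here is a minimal, purely illustrative sketch in Python of virtual pooling: free space scattered across separate silos is presented as a single allocatable resource. The silo names, sizes and classes are made-up examples for illustration, not a DataCore interface.

```python
# Illustrative sketch only: a toy model of pooling capacity across storage
# silos so that unused space anywhere can satisfy a new allocation.
# The silo names and sizes below are hypothetical.

class Silo:
    def __init__(self, name, capacity_gb, used_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.used_gb = used_gb

    @property
    def free_gb(self):
        return self.capacity_gb - self.used_gb


class VirtualPool:
    """Presents many silos as one logical pool of free capacity."""

    def __init__(self, silos):
        self.silos = silos

    @property
    def free_gb(self):
        return sum(s.free_gb for s in self.silos)

    def allocate(self, size_gb):
        """Carve a logical volume out of whichever silos have free space."""
        if size_gb > self.free_gb:
            raise ValueError("not enough capacity in the pool")
        placement, remaining = [], size_gb
        for silo in self.silos:
            take = min(silo.free_gb, remaining)
            if take > 0:
                silo.used_gb += take
                placement.append((silo.name, take))
                remaining -= take
            if remaining == 0:
                break
        return placement


pool = VirtualPool([
    Silo("workgroup-a", capacity_gb=2000, used_gb=1800),  # nearly full silo
    Silo("workgroup-b", capacity_gb=2000, used_gb=600),   # mostly idle silo
])
print(pool.free_gb)        # 1600 GB usable across both silos
print(pool.allocate(500))  # the volume is carved from wherever space exists
```

Viewed this way, a request that would fail against one silo alone is easily satisfied from the pool, which is exactly the stranded-capacity problem the software approach removes.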

This is the Software Advantage we focus on. A software-based storage virtualization infrastructure delivers transparency and flexibility, enabling businesses to tackle current and future needs while bypassing the physical constraints of a hardware-centric storage solution that can hinder workflow.

Consolidating storage is not just a space utilization issue; it is also a cost-saving one. A consolidated environment serving both virtual and physical machine storage requirements can, done properly, deliver a range of operational efficiencies, from better data and backup management to maximized availability, all of which reduce the need for costly physical intervention to solve storage resource issues.

Done well, and without a massive investment in new hardware to fit the strategy, storage virtualization software delivers significant cost savings in both the short and the long term. It lets you actually achieve the projected return on investment of the overall virtualization initiative, while unlocking the asset value of data and reducing the risk of data loss and data management errors.

This Software Advantage in storage virtualization is at the heart of what we do at DataCore. With that in mind, today marks the launch of DataCore's SANsymphony-V, the newest version of our storage virtualization software. SANsymphony-V delivers the Software Advantage by freeing customers from the high costs, inadequate performance, inflexibility, and vendor lock-in inherent in a hardware-centric approach to storage. It is an open software platform that future-proofs your business against the changing storage requirements created by server and desktop virtualization initiatives, and it enables customers to repurpose existing resources more efficiently and to choose lower-cost alternatives when adding new ones. In short, SANsymphony-V solves today's Big Problem stalling desktop and server virtualization projects: the storage problem.

Thursday, 27 January 2011

Brian Madden Blogs: DataCore Software releases a "nirvana" VDI storage solution & Benchmark Paper

DataCore releases a "nirvana" VDI storage solution. Full local virtual storage that's really cheap! by Brian Madden

Back in September, I wrote an article describing a product that I wanted that didn't exist: a local "virtual" storage option for VDI. Basically I described why I didn't like SANs for VDI, and I thought it would be cool if there were some sort of software that could virtualize access to the local hard drives in a VDI host server. I was thinking a solution like that could create the best of both worlds: fast, flexible storage without the overhead costs of a SAN.

In a new white paper from DataCore (direct PDF link), they claim that their SANmelody software running on a VDI host does fulfill my fantasy storage requirements. And they claim they can do it with full multi-server redundancy at a cost of less than $70 per user. (That's $70 for everything: the VM host, the SANmelody software, the disks you need for storage... everything!)...

Frequent readers know that I'm not one to republish vendor papers. But in this case, the DataCore paper (by Ziya Aral & Jonathan Ely) is actually really, really good. They take a no-BS look at VDI storage, and they validate their architecture with standard tools like Login Consultants' VSI benchmark.

From the paper: Previous publications have reported on configurations which use thousands of virtual desktops to defray the cost of these controllers. Reading between the lines, it becomes immediately apparent that per-virtual desktop hardware costs rise very sharply as such configurations are scaled downward. Yet, it is precisely these smaller VDI configurations which are the more important from most practical standpoints. On the other hand, this configuration may also be scaled upwards, in a linear fashion, to thousands of virtual desktops, thus eliminating distended configurations created by the search for artificial "sweet spots" at which costs are optimized...
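To see the scaling effect the paper describes, a back-of-the-envelope sketch helps. The dollar figures below are hypothetical assumptions, not numbers from the DataCore paper; the point is simply that a large fixed array cost dominates the per-desktop price until the configuration reaches thousands of desktops.

```python
# Illustrative arithmetic only: why per-desktop storage cost falls with scale
# when the architecture carries a large fixed controller/SAN cost.
# Both figures below are assumptions for illustration.

FIXED_SAN_COST = 300_000    # assumed cost of a dual-controller array plus software
PER_DESKTOP_VARIABLE = 25   # assumed incremental disk cost per desktop image

def cost_per_desktop(n_desktops):
    return FIXED_SAN_COST / n_desktops + PER_DESKTOP_VARIABLE

for n in (50, 100, 500, 1000, 5000):
    print(f"{n:>5} desktops -> ${cost_per_desktop(n):,.0f} per desktop")

# Output:
#    50 desktops -> $6,025 per desktop
#   100 desktops -> $3,025 per desktop
#   500 desktops -> $625 per desktop
#  1000 desktops -> $325 per desktop
#  5000 desktops -> $85 per desktop
```

Read in reverse, the same arithmetic is the paper's point: scale the configuration downward and the per-desktop hardware cost rises very sharply.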

Read the full Blog post at: http://www.brianmadden.com/blogs/brianmadden/archive/2011/01/25/datacore-releases-a-quot-nirvana-quot-vdi-storage-solution-full-local-virtual-storage-that-s-really-cheap.aspx

Wednesday, 26 January 2011

DataCore Software Publishes Breakthrough VDI Benchmark Results and Virtualization Paper: 'Benchmarking a Scalable and Highly Available Architecture for Virtual Desktops'

The benchmark results represent a greater than ten-fold (10x) decrease in the per-desktop costs of SAN-based, high-availability virtual desktop infrastructures (VDIs).

http://finance.yahoo.com/news/DataCore-Software-Publishes-prnews-2900175007.html?x=0&.v=1

"I loved this paper! Every other ROI discussion you see is based on the economic benefit a company derives by deploying thousands of virtual desktops - but in the real world, we start with ten, then a hundred, etc.," states Steve Duplessie, founder and senior analyst, Enterprise Strategy Group. "They will never show you that cost because it's huge! I appreciate the folks at DataCore shining the light on this and showing that you don't have to be able to spend a zillion dollars before you can see a return on a VDI initiative."

Read the full paper here: Benchmarking a Scalable and Highly Available Architecture for Virtual Desktops – http://tinyurl.com/4hlrh46

See supporting videos here: The Future of Desktop Virtualization – http://tinyurl.com/276yaot

Tuesday, 18 January 2011

SANsymphony 7.0 Selected as a 2010 Products of the Year Finalist for Storage Management

http://searchstorage.techtarget.com/generic/0,295582,sid5_gci1525938,00.html

Many will look back on 2010 as the year of big storage acquisitions, but the wheeling and dealing was only part of the story. Data storage vendors turned out a slew of ingenious enterprise data storage products to address some of the key issues users have been grappling with in their storage shops.

From nearly 200 entries, the judges of Storage magazine's and SearchStorage.com's 2010 Products of the Year awards have selected and announced their finalists.

Storage management tools: 2010 Products of the Year finalists
Jan 2011 | Storage magazine and SearchStorage.com Contributors

Here are the eight finalists in the storage management tools category in Storage magazine's and SearchStorage.com's 2010 Products of the Year competition. The category covers storage resource management (SRM) and SAN management software, performance monitoring, file systems, volume management, virtualization software and security software. Finalists are listed below. 
DataCore Software Corp. SANsymphony 7.0
DataCore SANsymphony 7.0 storage virtualization software adds non-stop high availability (HA) to let users perform maintenance, upgrades and expansion, and address system failures, without disrupting applications. Also new are support for Fibre Channel over Ethernet (FCoE), "advanced site recovery" (ASR) to enable IT shops to fail over physical and virtual servers to multiple remote and branch offices, and efficient space reclamation for thinly provisioned storage.

Balesio AG FILEminimizer Server

Gluster Inc. GlusterFS 3.1

Nasuni Corp. Nasuni Filer 2.0

Nexenta Systems Inc. NexentaStor 3.0

Quantum Corp. StorNext 4.0

Quest Software Inc. vFoglight Storage 1.0

Rackspace Hosting Inc. Cloud Files

Saturday, 8 January 2011

The Next Big Industry Push is VDI

Check out the latest post from Jon Toigo at: http://www.drunkendata.com/?p=3264

One of the problems I have been detecting in the literature is the same issue we saw in server virtualization:  the gotchas of hidden cost.  In addition to software and operating system licensing fee structures, the industry will need to sort out the enabling software for virtualizing desktops themselves.  But the really big cost is — you guessed it — storage.

I have read with amusement and disdain the papers by leading storage hardware vendors stating that, from a storage cost perspective, you can stand up 5000 desktop images on their array for a fraction of what a physical desktop costs.  They usually tout $100 desktops versus $400 physical PCs.  Big cost savings there, right?

Not really.  The reason why they use 5000 virtual PCs to achieve their groundbreaking cost reductions is that it is the only way they can amortize their box of disk drives and value-add software.  NOBODY IS GOING TO VIRTUALIZE 5000 DESKTOPS!!!

Sorry for shouting.  I guess there are some companies out there with that many PCs.  However, the truth is that VDI will likely advance incrementally — tens or maybe a couple of hundred PCs at a time.  This is especially true at the outset of the trend, with early adopters and experimenters. 

So, the truth of most of these claims that 5000 desktop images can be stored for a quarter of the price of an equal number of physical PCs is quite self-serving — and, more importantly, falls apart in the real world.

The question shouldn't be how many VDIs can I stack up to amortize my storage investment; it should be how much cost can I take out of distributed desktops if I virtualize a few dozen.  The benchmarks of the EMCs, NetApps, et al. never go there.

But DataCore Software does.  In case you missed their announcement today, here’s the news — at least the preliminary bits that were released today.

My friends at DataCore have just completed what I regard as the first meaningful benchmark on the cost of desktop virtualization from a storage hosting perspective.  I like what Ziya, George, Bettye and the gang have done — a lot!  In fact, I thought it was important enough to fly down there with a camera and shoot a video interview on the subject.  They have posted the videos to their site but I will be placing them at the C-4 Summit in Cyberspace by end of week.

Here is an extract of what I just wrote about it for Storage Magazine in the Netherlands…

Late 2010 saw a succession of “proof of concept” papers coming from the storage brand name vendors touting cost models for VDI that were a fraction of the acquisition price of physical PCs.  One three-letter vendor boasted that a 5000 PC environment could be effectively hosted on its storage array for roughly $50 US per box. 

Once you get past the wow factor, however, you quickly realize that the storage vendor’s benchmark is rather self-serving.  The benchmark ignores the cost of desktop OS and application software licensing, hypervisor licensing, server hardware, network enhancements, and storage cabling requirements to focus narrowly on the cost per VDI using the vendor’s storage gear.  Moreover, a business will need to stack up a full 5000 virtual PCs in order to amortize the cost of the storage rig and achieve the cost-per-virtual-desktop advanced by the vendor.

Truth be told, virtually no one is going to virtualize 5000 desktops all at once – not even the large insurance companies or government research labs that actually have desktops in those numbers.  It is more likely that companies will tread cautiously when pursuing a desktop virtualization strategy, virtualizing only a handful of machines at a time so that the real cost and efficacy of the strategy can be clarified.  Until that happens, buying a huge EMC, IBM, HDS, etc. storage rig to support 5000 virtual PCs at sub-$100 each will not return its investment.  In fact, companies pursuing this course will likely find out the hard way that the easiest path to doubling or tripling desktop computing costs is to virtualize their desktops on expensive infrastructure.

I wish everyone would read DataCore Software CTO Ziya Aral's wonderful benchmark on VDI and storage.  His goal was not to amortize a specific storage rig (DataCore sells storage virtualization that works just as well with Joe's JBODs as it does with VMAX or USP).  He wanted simply to understand the capacity and performance requirements for desktop virtualization – especially in the 100 to 500 virtual machine range that will be much more common in the real world.

DataCore Software demonstrated pretty persuasively, and without trying terribly hard, that you could stand up that range of VDIs on a virtualized storage platform for about $35 per desktop.  They also discovered that the low cost could be maintained as you grew the infrastructure using storage and servers configured as part of a star topology – stars later serving as a building block for scaling.  That price included all of the redundancy and failover capabilities touted as "enterprise class" hosting by the brand-name storage rig vendors – leveraging only the secret sauce of DataCore read/writable snapshots created from virtualizing the underlying storage.
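To illustrate why that building-block approach scales linearly, here is a toy calculation; the unit size and unit cost are assumptions for illustration only, not DataCore's published figures.

```python
# Illustrative sketch only: scaling by replicating a fixed "star" building
# block keeps per-desktop cost roughly flat, instead of needing thousands of
# desktops to amortize one big array. Unit size and cost are assumed figures.

import math

DESKTOPS_PER_STAR = 250   # assumed capacity of one mirrored server/storage star
COST_PER_STAR = 8_750     # assumed cost of that unit (about $35/desktop when full)

def star_cost_per_desktop(n_desktops):
    stars_needed = math.ceil(n_desktops / DESKTOPS_PER_STAR)
    return stars_needed * COST_PER_STAR / n_desktops

for n in (100, 250, 500, 1000, 5000):
    print(f"{n:>5} desktops -> ${star_cost_per_desktop(n):,.2f} per desktop")

# Per-desktop cost stays in the same ballpark at every scale (it only rises
# when the last star is partially filled), which is the linear scaling the
# benchmark describes.
```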

There were many findings in DataCore's research that are not intuitively obvious, and I will only summarize them here.  For one, a Microsoft Windows 7 machine will boot in less than 198K of memory – so there is no real advantage to throwing a ton of RAM at the system.  For another, most desktops require very little physical storage: so, when you begin doing VDI, you need to stop thinking as though you are deploying a physical PC with parameters dictated by "boundary conditions" (i.e., configuring according to what the most data-intensive application or user might require in terms of storage capacity).
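The "boundary conditions" point is easiest to see with a quick hypothetical sizing comparison; all of the figures below are assumptions for illustration, not results from DataCore's research.

```python
# Illustrative arithmetic only: sizing a VDI datastore for the worst-case
# ("boundary condition") desktop versus for what desktops actually consume.
# All figures are assumptions, not benchmark results.

N_DESKTOPS = 200
WORST_CASE_GB = 80     # assumed allocation if every desktop were sized for the heaviest user
TYPICAL_USED_GB = 6    # assumed space a typical (largely shared-image) desktop really consumes

boundary_sizing = N_DESKTOPS * WORST_CASE_GB   # 16,000 GB
usage_sizing = N_DESKTOPS * TYPICAL_USED_GB    # 1,200 GB

print(f"Boundary-condition sizing: {boundary_sizing:,} GB")
print(f"Usage-based (thin) sizing: {usage_sizing:,} GB")
print(f"Over-provisioning factor:  {boundary_sizing / usage_sizing:.0f}x")
```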

Bottom line:  DataCore’s benchmark proves that it doesn’t require an overpriced “enterprise class” storage rig to virtualize desktops in a cost-effective way.  All that it really requires is common sense and storage virtualization software, preferably from DataCore Software, which [will shortly release a new and revamped version of its flagship storage virtualization wares].

We are off to a great start in 2011.  Maybe DataCore’s research will cajole the hardware guys to get real about their cost estimation around VDI.  For now, a sub-$35 price tag is pretty compelling.

Thursday, 6 January 2011

Interesting Technologies that could come of age in 2011: Storage Virtualization Software is #1

http://www.storagetopics.com/2011/01/interesting-technologies-that-could.html

In my last blog I noted that the promise of the New Year encourages a flurry of predictions, each trying to foretell next year's technology winners and losers. Just as with New Year's resolutions, I have tried to avoid engaging in this end-of-year ritual, despite my last blog, which may have created a contrary impression...

To me, interesting predictions are those that reach beyond the safety of conservatism. Being bullish is risky, but what is the point of predicting the obvious? So, in the spirit of sticking myself out on the proverbial limb, my selections of technologies to watch in 2011 are: Storage Virtualization Software, Cloud Storage Gateways and Autonomic Software Architectures.

#1. Storage Virtualization Software: Not what is traditionally thought of as storage virtualization, for example HP's EVA or a 3Par array. Storage virtualization software is a hardware-independent software product, designed to deliver storage virtualization that replicates the many advantages VMware, Microsoft's Hyper-V, Citrix and others have delivered to the server world.

How? By treating what are traditionally considered to be platform-based controller functions and physical storage virtualization as a portable software program that is abstracted from the physical layer and hence hardware independent. It is a very interesting concept, and one that I anticipate will generate some interesting responses from the traditional storage vendors.

Companies to watch: DataCore, a 10+ year-old company that could have a break-out year in 2011, and, for those interested in content-addressable storage, Caringo.

Wednesday, 5 January 2011

The Software Advantage for Storage Virtualization: DataCore Software Sets the Stage for Virtualization Leadership in 2011

DataCore Software, a pioneer and now a leading provider of software-based storage virtualization solutions, today highlighted what it considers the points of emphasis for 2011 in furthering the promise of virtualization.

The promise of virtualization will be further realized this year. Software-based storage virtualization will take its rightful place by making storage anonymous within virtual infrastructures. Equally important, a number of trends, strategic drivers and software architectures will emerge to make a new level of flexibility and scale both practical and cost-effective to achieve. The most significant driver will be the simple realization that only a software-based infrastructure can truly deliver the advantages needed to make virtualization and clouds a reality: software can abstract itself from device-specific limitations, provide portability across platforms, and endure beyond the life of the underlying hardware platforms that come and go over time.

Compelling Benefits: The "Software Advantage" and Anonymous Storage
According to DataCore Chairman and Co-founder Ziya Aral, "The entire storage virtualization angle was real simple - abstract yourself from the hardware. DataCore was founded on the belief that storage controllers in general and storage virtualization in particular were essentially ‘software programs.' For us, virtualization was driven by a very simple need to make a portable software program to do disk storage and run it on any platform."

This sentiment is both echoed and further emphasized by DataCore's President, CEO and Co-founder George Teixeira, "From a business standpoint, the total cost of ownership and the payback on investment of a pure ‘software infrastructure,' where you pay once for intelligent software to manage, protect and get more from your storage assets - as they come and go from generation to generation and brand to brand - is a value proposition that is as compelling as it is inevitable."

Bottom line: a software-based storage virtualization infrastructure can live at the same abstraction level as virtual servers and virtual desktops and bring the same kind of benefits we have seen from those movements, making storage hardware brands largely irrelevant – or anonymous – to users and applications. That, in turn, simplifies management, removes storage task complexity, speeds up response and provisioning times, and increases overall utilization and flexibility.

The "software advantage" becomes more obvious when you consider the many scenarios it works in and the many shapes it takes and will take in the future. Because DataCore software is portable, it can run on a virtual machine (VM) or on a multitude of physical servers. It not only virtualizes and manages storage, but can coexist along with the server hypervisor in the virtualization layer, providing many possibilities to solve real world and real budget challenges. Hardware simply can't do that.

Software-based storage virtualization has been lauded as a "game changer" in a number of reports, including InformationWeek's "Storage Anonymous" cover story.

DataCore's Future Directions for Storage Virtualization
"The whole idea of DataCore and our vision was that if you could move the software program from the constraints and boundaries of the physical platforms and take it to a modern programming environment, then it would be possible to do with software what had been done manually to that point - that's storage virtualization," explains Aral.

Virtual desktops (vDesktops) are an excellent example that highlights the need for a new model for storage. The major challenge for vDesktops is that SANs are often implemented with large and costly storage controllers and complex external storage networks. While these have the advantage of achieving reasonable scalability, they introduce a very large threshold cost to virtual desktop implementations. To overcome this high capital cost burden, hardware vendors typically tout the economics of deploying several thousand vDesktops. The real problem in this case is not in scaling up to "thousands" of vDesktops but in scaling them down to practical configurations. It barely needs mentioning that this must occur without radically spiking costs at the low end and also without forgoing the SAN feature set which assures portability, availability, and data redundancy. Otherwise, the very benefits of vDesktops are compromised.

DataCore has done extensive benchmarking to understand the economics of vDesktops and has been able to build high-availability configurations supporting a few hundred instead of "thousands" of vDesktops, at a cost per desktop of less than one-tenth of what was previously reported. In addition, new topologies allow such configurations to be scaled to "thousands" if that is what is required. Based on these initial findings, DataCore will make a significant impact on removing storage costs as a primary barrier to virtual desktop deployments. DataCore will be posting a series of Virtual Desktops Benchmark Reports in 2011.

Virtual desktops have one element in common with the other major computing movement of our day: cloud computing. Both technologies promise to deploy very large numbers of "machines" of the same class. This creates the opportunity to introduce one additional level of virtualization and to "divide and conquer" a problem that is otherwise daunting in its scale. What if, instead of attempting to manage hundreds or thousands of virtual machines discretely, one could divide them into arbitrary groups or sub-units and then manage a far smaller number of sub-units? This will also be the future direction of our work, impacting not only vDesktops and cloud computing but also the organization of storage itself.
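The "divide and conquer" idea can be pictured with a minimal conceptual sketch; the classes and grouping below are hypothetical illustrations, not a DataCore API.

```python
# Conceptual sketch only: rather than managing hundreds or thousands of
# virtual machines one by one, group them into sub-units and drive
# operations at the group level.

class VirtualMachine:
    def __init__(self, name):
        self.name = name
        self.running = False


class MachineGroup:
    """A sub-unit of machines that is administered as a single object."""

    def __init__(self, name, machines):
        self.name = name
        self.machines = machines

    def start(self):
        for vm in self.machines:
            vm.running = True

    def snapshot(self):
        # One group-level action stands in for N per-machine actions.
        return [f"snapshot:{vm.name}" for vm in self.machines]


# 1,000 desktops become four manageable objects instead of 1,000.
groups = [
    MachineGroup(f"pod-{i}", [VirtualMachine(f"vd-{i}-{j}") for j in range(250)])
    for i in range(4)
]
groups[0].start()
print(len(groups), "groups,", sum(len(g.machines) for g in groups), "desktops")
```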

Stay tuned to DataCore in 2011 for more details.

Learn More - Supporting Materials

Ziya Aral discusses: The DataCore Product Vision, Virtual Desktops and Benchmark Findings.

George Teixeira shares his "Perspectives on the Shifting Economies of Storage Virtualization Software, Private Clouds and Virtual Desktops" - DataCore CEO Perspectives: 2011.