Saturday, 19 November 2011

Solve HDD Shortages with DataCore SANsymphony-V Storage Hypervisor Software

Amid the catastrophic flooding in Thailand sit hard drive manufacturing plants that supply about 40% of the world’s hard drives, and analysts expect the impact on supply to last well into 2012. Prices are rising, supply is falling, and you are stuck buying more storage in the midst of the crisis because you can’t afford to run out of space.

But what if there was a different way?


Save Money by Saving Space

The DataCore storage hypervisor, SANsymphony-V, enables you to maximize utilization and minimize capacity consumption. It thinly provisions capacity from the virtual storage pool to hosts only as needed, so you no longer strand capacity by pre-allocating space to applications that may never use it.
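To picture how this works, here is a minimal Python sketch of page-granular thin provisioning. The class and parameter names are hypothetical; it illustrates the general technique, not DataCore’s actual implementation.

# Minimal sketch of thin provisioning (hypothetical names, not DataCore's code).
# A volume advertises a large virtual size, but physical pages are drawn from the
# shared pool only when a block is first written.

PAGE_SIZE_MB = 128

class ThinPool:
    def __init__(self, physical_capacity_mb):
        self.free_pages = physical_capacity_mb // PAGE_SIZE_MB

    def allocate_page(self):
        if self.free_pages == 0:
            raise RuntimeError("Pool exhausted: add disks to the pool")
        self.free_pages -= 1

class ThinVolume:
    def __init__(self, pool, virtual_size_mb):
        self.pool = pool
        self.virtual_size_mb = virtual_size_mb   # the size the host sees
        self.mapped_pages = set()                # regions actually backed by physical pages

    def write(self, offset_mb, data):
        page = offset_mb // PAGE_SIZE_MB
        if page not in self.mapped_pages:        # first write to this region?
            self.pool.allocate_page()            # back it with real capacity only now
            self.mapped_pages.add(page)
        # ... write data to the backing page ...

# A 2 TB physical pool can present a much larger virtual volume, because
# capacity is consumed only as the application actually writes data.
pool = ThinPool(physical_capacity_mb=2 * 1024 * 1024)
vol = ThinVolume(pool, virtual_size_mb=10 * 1024 * 1024)
vol.write(offset_mb=0, data=b"hello")
print(pool.free_pages)  # only one 128 MB page consumed so far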

New disk shelves and disk arrays can be added to the virtual storage pool without disrupting applications or users, even in the middle of peak workloads. You won’t need to worry about backwards compatibility with your other storage devices. DataCore overcomes those differences, allowing you to mix and match different models and even different brands within the same virtual pool.

This unique capability allows you to shop around for the best value and defer purchases until you really need the capacity.

Interested in reducing costs by getting the most utilization out of your storage? Contact us now.

Tuesday, 15 November 2011

DataCore Storage Virtualization Software Lowers Cost of Ownership and Accelerates Performance at HARTING Technology Group

http://www.dabcc.com/channel.aspx?id=208

DataCore Software, the industry’s premier provider of storage virtualization software, announced today that HARTING Technology Group has deployed its SANsymphony storage virtualization software to realize a more cost-efficient, higher-performance and highly available enterprise storage environment. A complete case study on the storage challenges that HARTING overcame with DataCore Software is available here: http://www.datacore.com/Testimonials/Harting-Technology-Group.aspx.

HARTING runs on DataCore’s SANsymphony and hardware from Hitachi. The storage solution supports the delivery of business-critical applications such as SAP and MS Exchange, as well as CAD and product lifecycle management software.

HARTING Technology Group is a large global manufacturer and services company specializing in electrical, electronic and optical connection, transmission and networking. It produces technology products and solutions for industries including high-speed rail, automotive and renewable energy such as wind power. With over 3,300 employees in 36 countries relying on access to the company’s data at all hours of the day and night, a high-performance storage environment is paramount.

The company chose a joint solution proposed by solution provider ISO Datentechnik, which included Hitachi storage hardware and DataCore’s SANsymphony storage virtualization software. By moving from a less flexible legacy hardware infrastructure to a cost-effective midrange hardware storage system and managing their entire storage environment with a DataCore-powered virtualized SAN, HARTING was able to overcome three key challenges:

  • Reduce the overall costs of storage and provide greater options and flexibility for adding storage systems in the future.
  • Improve reliability through the addition of DataCore’s high availability for critical business systems.
  • Substantially increase enterprise application performance.

A Compelling Combination: High Performance and Low Cost
"Our expectations of the combination of HDS hardware and DataCore software have been exceeded,” said Rudolf Laxa, operations and data center team leader at HARTING Technology Group. “The new HDS midrange systems and the DataCore virtual storage layer have allowed us to lower costs and achieve a significant increase in fail-safety and performance. The excellent interaction between DataCore software and VMware is another reason why we are more than satisfied with the current solution."

According to Laxa, there was initial hesitation to move business-critical SAP applications to the virtualized storage environment, as it represented a significant break from HARTING’s past practices. However, examples of success with similar moves by other DataCore customers and the opportunity to significantly enhance current capabilities ultimately prevailed. Laxa continued, “The benefits of central administration finally provided the impetus for implementation, a decision we have not had a reason to regret so far.”

In particular, HARTING credits DataCore’s SANsymphony software for an unprecedented level of performance and business agility, especially when combined with the company’s existing VMware-based server virtualization deployment throughout its data center.

"The technical capabilities of DataCore virtualized storage appealed to us almost immediately; it creates high availability, gives us independence from the hardware and makes flexible migration scenarios possible,” said Laxa “The software has proven to be a meaningful extension of our VMware environment and guarantees the highest levels of availability we require from our storage solution."

Saturday, 12 November 2011

Arizona State University Selects DataCore to Manage Data Storage Growth

Ensures higher performance and availability

http://www.datamation.com/storage/managing-data-storage-growth-buyers-guide-1.html

Vincent Boragina, Manager of System Administration, W. P. Carey School of Business IT, Arizona State University, aimed to reach 100% server virtualization. Performance from IT assets was imperative. Advances in server virtualization over the years, alongside desktop virtualization, led the school to take on high-end storage I/O needs with SQL databases and file servers (initially kept off the server virtualization layer while the products matured). But when they started to virtualize these platforms, they faced a higher degree of latency; the need for I/O had grown.

Boragina explains, “The issue with virtualization rests not so much with the storage capacity as with how fast, and with how little latency, you can get the data on and off the disk. What is key are the controllers, the fiber connectivity and so on that run the disk, which impact the IOPS (Input/Output Operations Per Second) and the latency of that disk. This is where complexity rises, as latency is harder to measure. Performance was my key criterion.”
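As a rough illustration of the relationship Boragina describes, a simple timing loop like the Python sketch below shows how average latency translates into IOPS for small random reads. The file path and block size are placeholders, and operating-system caching will flatter the numbers; this is a back-of-the-envelope measurement, not a benchmark of the school’s environment.

# Rough sketch: measure average read latency and derive approximate IOPS.
# PATH is a placeholder; point it at a large test file, never at production data.
import os, random, time

PATH = "testfile.bin"   # hypothetical test file, ideally several GB in size
BLOCK = 4096            # 4 KiB reads, typical of database-style I/O
SAMPLES = 1000

size = os.path.getsize(PATH)
latencies = []
with open(PATH, "rb", buffering=0) as f:
    for _ in range(SAMPLES):
        offset = random.randrange(0, size - BLOCK)
        start = time.perf_counter()
        f.seek(offset)
        f.read(BLOCK)
        latencies.append(time.perf_counter() - start)

avg = sum(latencies) / len(latencies)
print(f"average latency: {avg * 1000:.2f} ms, approximate IOPS: {1 / avg:.0f}")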

The school implemented DataCore’s SANsymphony-V with XIO storage, where XIO provided the disk subsystem and DataCore acted as the hypervisor for the storage and the storage I/O controllers. As a result, the school achieved a 50% reduction in latency and a 25-30% increase in overall I/O. With the redundancy and I/O requirements met, the school was able to virtualize any platform.

Importantly, to address issues like high performance, one need not overhaul the existing storage stack, added George Teixeira, CEO at DataCore. DataCore’s SANsymphony-V storage hypervisor, for instance, uses existing storage assets to boost performance with adaptive caching. Its auto-tiering enables optimal use of SSDs/flash, and its high availability supports business continuity. “This precludes the investment of purchasing additional IT assets and premature hardware obsolescence,” says Teixeira.
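The adaptive caching Teixeira mentions can be pictured as keeping recently read blocks in server RAM so repeat requests never touch the disk. The Python sketch below is a generic least-recently-used read cache under that assumption, not DataCore’s caching engine.

# Generic LRU read cache for disk blocks (illustrative only, not DataCore's algorithm).
from collections import OrderedDict

class BlockReadCache:
    def __init__(self, capacity_blocks, backend_read):
        self.capacity = capacity_blocks
        self.backend_read = backend_read     # function that fetches a block from disk
        self.cache = OrderedDict()           # block_id -> data, kept in LRU order

    def read(self, block_id):
        if block_id in self.cache:
            self.cache.move_to_end(block_id) # hit: served from RAM, no disk I/O
            return self.cache[block_id]
        data = self.backend_read(block_id)   # miss: go down to the disk tier
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict the least recently used block
        return data

cache = BlockReadCache(capacity_blocks=100000,
                       backend_read=lambda b: b"data read from disk")
cache.read(42)   # first request goes to disk
cache.read(42)   # repeat request is answered from memory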

Business continuity was an added benefit for the school, as it came built into the DataCore solution. A further effect of the implementation: speedier backups due to faster I/O.

Thursday, 10 November 2011

Storage Hypervisor Delivers Just-in-Time Capacity Management

Just-in-time (JIT) production practices, which view inventory not as an asset but a cost, have accelerated the delivery and reduced the cost of products in a wide range of industries. But perhaps the biggest benefit to the companies that adopted them was the exposure of widespread manufacturing inefficiencies that were holding them back. Without the cushion of a large inventory, every little mechanical or personnel hiccup in the assembly line had an immediate effect on output.

Virtualization technology is playing a similar role for IT, and nowhere is this more visible than in storage. Server virtualization has been incredibly successful in reducing the processor “inventory” needed to provide agile response to business demands for more and better application performance. Average processor utilization often zooms from the 10% range to 60-70% in successful implementations. But this success exposed serious storage capacity management inefficiencies.

As Jon Toigo of the Data Management Institute points out in the first of his Storage Virtualization for Rock Stars white papers, Hitting the Perfect Chord in Storage Efficiency, between 33 and 70 cents of every IT dollar expended goes for storage, and the TCO of storage is estimated to be as much as 5 to 8 times the cost of acquisition on an annualized basis. However, as illustrated in that paper, on average only 30% of that expenditure is actually used for working data. This isn’t due to carelessness on the part of IT managers. They are doing the same sort of thing manufacturers did before JIT: in this case using large storage inventories to compensate for inefficiencies in storage capacity management that make it impossible to provision storage as fast as they can provision virtual servers.

This is a major factor driving the adoption of storage virtualization, which can abstract storage resources into a single virtual pool to make capacity management far more efficient. (It can do the same for performance management and data protection management, as well—I’ll look at them in future posts.) I say “can” because, given the diverse storage infrastructures that are the reality for most organizations, full exploitation of the benefits of storage virtualization requires the use of a storage hypervisor. This is a portable software program, running on interchangeable servers, that virtualizes all your disk storage resources—SAN, server direct-attached storage (DAS) and even those disks orphaned by the transition to server virtualization—not just the disks controlled by the firmware within a proprietary SAN or disk storage system.
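Conceptually, the hypervisor presents one pool regardless of what sits underneath it. The short Python sketch below (hypothetical names, not the SANsymphony-V API) shows the idea: dissimilar devices contribute capacity to a single virtual pool from which volumes are carved.

# Conceptual sketch of pooling heterogeneous storage (hypothetical, not DataCore's API).
class Device:
    def __init__(self, name, kind, capacity_gb):
        self.name, self.kind, self.capacity_gb = name, kind, capacity_gb

class VirtualPool:
    def __init__(self):
        self.devices = []

    def add(self, device):
        # Any brand or interconnect contributes capacity to the same pool;
        # in this model devices can be added later without disrupting hosts.
        self.devices.append(device)

    def total_capacity_gb(self):
        return sum(d.capacity_gb for d in self.devices)

pool = VirtualPool()
pool.add(Device("array-A", "FC SAN", 8000))
pool.add(Device("server-3", "DAS", 2000))
pool.add(Device("legacy-B", "iSCSI", 4000))
print(pool.total_capacity_gb())   # 14000 GB presented as one resource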

With a storage hypervisor such as DataCore’s SANsymphony-V, the storage capacity management inefficiencies exposed by server virtualization are truly a thing of the past. Rather than laboriously matching up individual storage sources with applications, and likely over-provisioning them just to be sure of having enough, you can draw on a single virtual pool of storage for just the right amount. Thin provisioning permits allocating an amount of storage to an application or end user that is far larger than the actual physical storage behind it, and then provisioning real capacity only as needed based on actual usage patterns. Auto-tiering largely automates the task of matching the right storage resource to applications based on performance level needs.
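Auto-tiering can be thought of as periodically promoting the busiest blocks to the fastest tier and demoting the rest. The sketch below is a deliberately simple policy under that assumption; the names and thresholds are hypothetical, not DataCore’s actual tiering logic.

# Simplified auto-tiering policy (illustrative; not DataCore's implementation).
# Blocks with the highest recent access counts are promoted to the fast tier
# (for example SSD/flash); everything else stays on the capacity tier.

def retier(access_counts, fast_tier_slots):
    """access_counts maps block_id -> recent I/O count."""
    hottest = sorted(access_counts, key=access_counts.get, reverse=True)
    fast_tier = set(hottest[:fast_tier_slots])
    slow_tier = set(hottest[fast_tier_slots:])
    return fast_tier, slow_tier

counts = {"blk1": 900, "blk2": 12, "blk3": 450, "blk4": 3}
fast, slow = retier(counts, fast_tier_slots=2)
print(fast)   # {'blk1', 'blk3'} would be placed on SSD/flash
print(slow)   # {'blk2', 'blk4'} remain on lower-cost disk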

The result is true just-in-time storage: capacity when, where, and how it’s needed. And, because the capacity management capabilities reside in the storage hypervisor, not the underlying devices, they’re available for existing and future storage purchases, regardless of vendor. You can choose the storage brands and models needed to match your specific price/performance requirements, and when new features and capabilities are added to the hypervisor, they’re available to every storage device.

For more information on how a storage hypervisor can enable effective capacity management from an architectural, operational, and financial standpoint, check out Jon’s second Storage Virtualization for Rock Stars white paper: Capacity Management’s Sweet Notes – Dynamic Storage Pooling, Thin-Provisioning & More.

Next time I’ll look at how a storage hypervisor boosts storage performance management.

Friday, 4 November 2011

ESG Reports 59% Have Not Deployed Virtualization on Tier-1 Windows Apps

Respondents to a recent ESG survey indicated that increasing the use of virtualization was their number one IT priority over the last two years and will continue to be the top priority for the next 12-18 months. While server virtualization penetration continues to gain momentum, IT organizations still have numerous hurdles to overcome in order to deploy it more widely and move closer to a 100% virtualized data center. ESG found that 59% have yet to employ virtualization where it would provide the most benefit: their mission-critical tier-1 applications. These tier-1 application workloads include Microsoft Exchange 2010, Microsoft SQL Server 2008 R2, and SharePoint 2010. For IT organizations supporting large numbers of users, hesitation to implement virtualization stems from the perception that it adds performance overhead and makes scalability and availability unpredictable for the tier-1, multi-user, business-critical applications relied upon by the majority of their users.

http://www.enterpriseittools.com/sites/default/files/ESG%20-%20Hyper-V%20R2%20SP1%20Application%20Workload%20Performance%20-%20March%202011.pdf


DataCore STAR HA Solution Adds Resiliency and Performance to Microsoft Hyper-V Environments
http://www.it-analysis.com/technology/storage/news_release.php?rel=28058

The DataCore STAR HA (high availability) solution is primarily aimed at the large installed base of Microsoft servers running line-of-business applications as well as Exchange, SQL Server and SharePoint, whose administrators are eager for better data protection and performance.

Many of these IT organisations realise they must move their data from internal server disks to a shared storage area network (SAN) to meet growth needs, improve uptime, and enhance productivity. However, some have concluded that this migration could add more risk, disruption, and cost than they can currently afford. Thus, they seek a solution that minimises these obstacles. They need a simple way to enhance the performance and resiliency of their application servers, while providing easy access and a transition path to the compelling advantages of a shared SAN.

The DataCore STAR HA solution is tailored for these IT organisations seeking the benefits of a SAN, while allowing their data to remain safely stored on their servers.

The new business continuity solution enhances the resiliency and performance of Windows server farms, enabling those systems to stay up longer, their applications to run faster, and recovery from unexpected situations to take less time. It is most appealing for customers wishing to keep their data distributed across their server disk drives, while at the same time gaining many of the centralised and advanced services made possible by a SAN.

STAR Topology Leverages Central Server for Faster Recovery from Failures
Rather than migrate server data to an external SAN, the DataCore STAR HA software automatically mirrors the data drives on each Windows server to a central DataCore STAR server for safekeeping. In addition, the Windows host-resident software speeds up applications by caching repeated disk requests locally in high-speed server memory.
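The behaviour described here, local writes mirrored to the central STAR server while repeat reads are answered from server memory, can be sketched as follows. The classes and names are hypothetical; this is a conceptual model, not DataCore’s software.

# Conceptual sketch of the STAR pattern described above (hypothetical names).
# Every write lands on the local drive and is mirrored to a central copy;
# repeated reads are answered from an in-memory cache on the application server.

class MirroredDrive:
    def __init__(self, local_store, central_store):
        self.local = local_store       # dict-like: block_id -> data on the local disk
        self.central = central_store   # dict-like: the copy held on the central STAR server
        self.read_cache = {}           # high-speed server memory

    def write(self, block_id, data):
        self.local[block_id] = data    # write locally...
        self.central[block_id] = data  # ...and mirror to the central server for safekeeping
        self.read_cache[block_id] = data

    def read(self, block_id):
        if block_id in self.read_cache:    # repeated request: served from RAM
            return self.read_cache[block_id]
        data = self.local[block_id]        # otherwise read the local drive
        self.read_cache[block_id] = data
        return data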

If an application server in the farm goes down, another server can resume its operations by retrieving a current copy of its data drive from the central DataCore STAR server. The DataCore STAR server also offloads replication requests across the farm. It can take centralised snapshots of the data drives and remotely replicate all critical data to a remote disaster recovery site. The solution has the added benefit of automatically redirecting requests to the central DataCore STAR server when a local server data drive is inaccessible.
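Recovery and redirection, as described, amount to falling back on the central copy whenever the local drive, or the whole server, is unavailable. A minimal sketch, continuing the hypothetical MirroredDrive example above:

# Minimal failover sketch (continuing the hypothetical example; not DataCore's code).

def read_with_fallback(drive, block_id):
    try:
        return drive.read(block_id)        # normal path: local data drive
    except (KeyError, OSError):
        return drive.central[block_id]     # local drive inaccessible: redirect to the STAR copy

def resume_on_standby(central_store):
    # Another server in the farm rebuilds the failed server's data drive
    # from the current copy held on the central DataCore STAR server.
    return dict(central_store)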

Thursday, 3 November 2011

DataCore Software Offers SAN Alternative for Windows Server Farms; Targets Microsoft Servers Running Exchange, SQL Server and SharePoint

http://www.thewhir.com/web-hosting-news/110111_DataCore_Software_Offers_SAN_Alternative_for_Windows_Server_Farms

DataCore Software introduces a new business continuity solution for Microsoft Windows servers running Exchange, SQL Server and SharePoint that need better data protection and performance.


The DataCore STAR high availability solution is primarily aimed at the large installed base of Microsoft servers running line-of-business applications as well as Exchange, SQL Server and SharePoint, where better data protection and performance are needed.

The DataCore STAR HA solution is designed for IT organizations that are seeking the benefits of a SAN, while allowing their data to remain safely stored on their servers.

DataCore also offers alternative high-availability, high-performance packages for architecting solutions using redundant two-node configurations and scale-out grid designs.

The new business continuity solution improves the resiliency and performance of Windows server farms, enabling those systems to stay up longer, their applications to run faster, and recovery from unexpected situations to take less time.

It is ideal for customers that want to keep their data distributed across their server disk drives, while accessing many of the centralized and advanced services made possible by a SAN.

Instead of migrating server data to an external SAN, the DataCore STAR HA software automatically mirrors the data drives on each Windows server to a central DataCore STAR server for safekeeping.

In addition, the Windows host-resident software speeds up applications by caching repeated disk requests locally in high-speed server memory.

If an application server in the farm goes down, another server can resume its operations by retrieving a current copy of its data drive from the central DataCore STAR server.

The DataCore STAR server also offloads replication requests across the farm. It can take centralized snapshots of the data drives and remotely replicate all critical data to a remote disaster recovery site, as well as automatically redirect requests to the central DataCore STAR server when a local server data drive is inaccessible.
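Centralized snapshots and remote replication, as described here, can be pictured as the central server keeping point-in-time copies of the mirrored drives and shipping changed blocks to a second site. The Python sketch below uses hypothetical names and is only a conceptual model, not DataCore’s implementation.

# Simplified sketch of centralized snapshots and remote replication (hypothetical names).
import copy, time

class StarServer:
    def __init__(self):
        self.drives = {}      # server_name -> {block_id: data}, the mirrored copies
        self.snapshots = []   # list of (timestamp, point-in-time copy of all drives)

    def snapshot(self):
        # Taken centrally, without touching the application servers.
        self.snapshots.append((time.time(), copy.deepcopy(self.drives)))

    def replicate(self, remote_site, changed_blocks):
        # Ship only the changed blocks to the disaster recovery site.
        for (server, block_id), data in changed_blocks.items():
            remote_site.drives.setdefault(server, {})[block_id] = data

primary = StarServer()
dr_site = StarServer()
primary.drives["sql-01"] = {0: b"ledger", 1: b"index"}
primary.snapshot()
primary.replicate(dr_site, {("sql-01", 1): b"index-v2"})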

Physical machines hosting standalone applications under Windows Server 2008 R2 as well as systems hosting multiple virtual machines under Microsoft Hyper-V can take advantage of the data protection and performance enhancement benefits.

The DataCore STAR HA software is compatible with Windows server applications including Exchange, SQL Server, SharePoint, and line of business applications.

SAN-Averse Customers Can Take Advantage of Business & Financial Benefits

http://vmblog.com/archive/2011/11/01/datacore-software-introduces-a-new-business-continuity-solution-for-windows-server-farms.aspx
“While many IT managers are eager to exploit the benefits of centralized and shared data that SANs deliver, they also often worry that migrating to a central storage system could prove to be risky and disruptive,” said Mark Peters, senior analyst, Enterprise Strategy Group. “The STAR HA software from DataCore provides a solution to this conundrum by combining the comfort and familiarity of distributed data with the attractive attributes of a central SAN in an efficient manner that is both unobtrusive and cost-effective.”