Information, commentary and updates from Australia / New Zealand on virtualization, business continuity solutions, FC SAN, iSCSI, high-availability, remote replication, disaster recovery and storage virtualization and SAN management solutions.
Monday, 23 December 2013
DataCore deployed at over 10,000 customer sites and selected as Software-defined Storage Vendor in Advantage Phase of Gartner “IT Market Clock”
DataCore Surpasses 10,000 Customer Sites Globally as Companies Embrace Software-Defined Storage
Customers Realize Software, Not Hardware, Key to Increasing Performance, Reducing Cost and Simplifying Management
DataCore has experienced significant customer adoption of its ninth-generation SANsymphony-V platform in 2013. As the company surpassed 10,000 customer sites globally, new trends have materialized around the need for businesses to rethink their storage infrastructures with software architecture becoming the real blueprint for the next wave of data centers.
“The remarkable increase in infrastructure-wide deployments that DataCore experienced throughout 2013 reflects an irreversible market shift from tactical, device-centric acquisitions to strategic software-defined storage decisions. Its significance is clear when even EMC concedes the rapid commoditization of hardware is underway. Their ViPR announcement acknowledges the ‘sea change’ in customer attitudes and the fact that the traditional storage model is broken,” said George Teixeira, president and CEO at DataCore. “We are clearly in the age of software defined data centers, where virtualization, automation and across-the-board efficiencies must be driven through software. Businesses can no longer afford yearly ‘rip-and-replace’ cycles, and require a cost-effective approach to managing storage growth that allows them to innovate while getting the most out of existing investments.”
In addition to this mass customer adoption, DataCore’s software was recently selected as a software-defined storage vendor in Gartner’s “IT Market Clock for Storage, 2013,” published September 6, 2013. The report, by analysts Valdis Filks, Dave Russell, Arun Chandrasekaran et al., identifies software-defined storage vendors in the Advantage Phase, and recognizes two main benefits of software-defined storage:
“First, in the storage domain, the notion of optimizing, perhaps often lowering, storage expenses via the broad deployment of commodity components under the direction of robust, policy-managed software has great potential value. Second, in the data center as a whole, enabling multitenant data and workload mobility among servers, data centers and cloud providers without disrupting application and data services would be transformational.”
Three major themes in 2013 shaped the software-defined storage market and defined the use cases of DataCore’s new customers:
Adoption and Appropriate Use of Flash Storage in the Data Center
As more companies rely on flash to achieve greater performance, a new challenge arises in redesigning storage architectures. While the rule of thumb is that only about five percent of workloads require top-tier performance, flash vendors are doing their best to convince customers to go all-flash despite the low ROI. Instead, businesses have turned to auto-tiering software to ensure applications share flash and spinning disk based on the need to optimize both performance and investment. Going beyond other implementations, DataCore has redefined the automation and mobility of data storage with a policy-managed paradigm that makes auto-tiering a true ‘enterprise-wide’ capability, working across multiple vendor offerings and the many levels and varied mix of flash devices and spinning disks.
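The idea behind frequency-based auto-tiering can be illustrated with a minimal Python sketch. This is not DataCore's implementation – the class and method names here are purely hypothetical – but it shows the basic policy: track access counts per block and keep only the hottest blocks on the limited flash tier.

```python
from dataclasses import dataclass

@dataclass
class Block:
    """A logical block with a running access count."""
    block_id: int
    accesses: int = 0

class AutoTierer:
    """Illustrative sketch: promote the most frequently accessed
    blocks to a limited flash tier; everything else stays on disk."""

    def __init__(self, flash_capacity: int):
        self.flash_capacity = flash_capacity  # blocks that fit on flash
        self.blocks: dict[int, Block] = {}

    def record_io(self, block_id: int) -> None:
        # Every I/O bumps the block's access counter.
        self.blocks.setdefault(block_id, Block(block_id)).accesses += 1

    def tier_of(self, block_id: int) -> str:
        # Rank blocks by access frequency; the top N fit on flash.
        ranked = sorted(self.blocks.values(),
                        key=lambda b: b.accesses, reverse=True)
        hot = {b.block_id for b in ranked[:self.flash_capacity]}
        return "flash" if block_id in hot else "disk"
```

For example, after a workload hammers blocks 1 and 2 while touching block 3 once, `tier_of(1)` and `tier_of(2)` report "flash" and `tier_of(3)` reports "disk" – the roughly five-percent hot set earns the premium tier.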
Host.net is a multinational provider of managed infrastructure services focusing on cloud computing and storage, colocation, connectivity and business continuity for enterprise organizations.
“Flash gives us the greatest levels of performance for our mission critical applications,” said Jeffrey Slapp, CTO of Host.net. “While integral, flash is only a small piece of our storage architecture. In order to help ensure our applications are using the right type of storage for peak performance, we use DataCore's SANsymphony-V platform. The software's intelligence makes sure the more demanding applications use flash and less demanding applications use hard disk. We’ve been able to reduce operational expenses by 35% because of the software’s intelligence capabilities, which allows us to tackle other key business initiatives by leveraging the time we never had previously.”
Virtualizing Storage while Accelerating Performance for Tier-One Applications
Demanding business applications like databases, ERP and mail systems create bottlenecks in any storage architecture due to their rapid activity and intensive I/O and transactional requirements. To offset this, many companies buy high-end storage systems while leaving terabytes of storage unused. Now, though, businesses are able to combine all of their available storage and virtualize it, independent of vendor – creating a single storage pool. Beyond virtualization and pooling, DataCore customers report faster application response times and significant performance increases – accelerating I/O speeds up to five times.
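The pooling described above – combining available storage from different vendors into one virtual pool and carving volumes from it – can be sketched in a few lines of Python. The names and the simplified capacity model are assumptions for illustration, not DataCore's API.

```python
class StoragePool:
    """Illustrative sketch: aggregate capacity from heterogeneous
    devices into one pool and carve virtual volumes from it,
    independent of which vendor supplied each device."""

    def __init__(self):
        self.devices = {}   # device name -> capacity in GB
        self.allocated = 0  # GB already handed out as volumes

    def add_device(self, name: str, capacity_gb: int) -> None:
        self.devices[name] = capacity_gb

    @property
    def total_gb(self) -> int:
        return sum(self.devices.values())

    def create_volume(self, size_gb: int) -> dict:
        # A volume may span devices; only aggregate capacity matters.
        if self.allocated + size_gb > self.total_gb:
            raise ValueError("pool exhausted")
        self.allocated += size_gb
        return {"size_gb": size_gb}
```

A 600 GB volume can then be created from, say, a 500 GB array plus a 300 GB array from another vendor – capacity that would otherwise sit stranded on separate boxes.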
Pee Dee Electric Cooperative is a non-profit, electric cooperative located in Darlington, South Carolina that supplies electricity and other services to more than 30,000 consumers.
“Tier-one applications demand high performance and in the past this translated directly into expensive and overprovisioned storage,” said Robbie Howle, IT Manager at Pee Dee Electric Cooperative. “To help allocate the necessary storage that meets the performance demands of the application, without buying new storage, we leverage the SANsymphony-V platform as it accelerates and virtualizes all of the available storage within the organization. In fact, DataCore’s software-based approach to storage virtualization has drastically reduced costs by enabling us to virtualize storage devices we already had – eliminating the need to pay upwards of $500,000 for a traditional, hardware-based SAN. Moreover, the benefits of the DataCore virtualized storage infrastructure continue to manifest themselves perpetually and we have been able to get twice as much storage, better performance and achieve high availability in going with DataCore.”
Software Management of Incompatible Storage Devices and Models
Many data centers feature a wide variety of storage arrays, devices and product models from a number of different vendors – including EMC, NetApp, IBM, Dell and HP – none of which are directly compatible. Interestingly, DataCore customers report that the issue of incompatibility generally surfaces more when dealing with different hardware models from the same vendor than between different vendors, and thus have turned to management tools that treat all hardware the same.
Maimonides Medical Center, based in Brooklyn, N.Y., is the third-largest independent teaching hospital in the U.S. The hospital has more than 800 physicians relying on its information systems to care for patients around-the-clock.
“Over the past 12 years, our data center has featured eight different storage arrays and various other storage devices from three different vendors,” said Gabriel Sandu, chief technology officer at Maimonides Medical Center. “By using DataCore’s SANsymphony software and currently with its latest iteration of SANsymphony-V R9, we have been able to seamlessly go from one storage array to the next with no downtime to our users. We are able to manage our SAN infrastructure without having to worry or be committed to any particular storage vendor. DataCore’s technology has also allowed us to use midrange storage arrays to get great performance – thereby not needing to go with the more expensive enterprise-class arrays from our preferred manufacturers. DataCore’s thin provisioning has also allowed us to save on storage costs as it allows us to be very efficient with our storage allocation and makes sure no storage goes unused.”
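The thin provisioning Mr. Sandu credits above works by advertising a large virtual volume while consuming physical capacity only for blocks actually written. A minimal sketch of the concept, with hypothetical names (not DataCore's implementation):

```python
class ThinVolume:
    """Illustrative sketch of thin provisioning: the volume advertises
    a large virtual size, but physical capacity is allocated only on
    first write to each block."""

    BLOCK_GB = 1  # simplified: one block == 1 GB

    def __init__(self, virtual_gb: int):
        self.virtual_gb = virtual_gb
        self.backing = {}  # block index -> data, allocated lazily

    def write(self, block: int, data: bytes) -> None:
        if block >= self.virtual_gb:
            raise IndexError("write beyond virtual size")
        self.backing[block] = data  # allocate on first write

    def physical_gb(self) -> int:
        # Only written blocks consume real capacity.
        return len(self.backing) * self.BLOCK_GB
```

A 1,000 GB virtual volume that has received writes to three blocks consumes only 3 GB of physical storage – which is how thin provisioning keeps allocated-but-unused capacity from going to waste.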
QLogic certifies adapters for software-defined storage; announces Fibre Channel Adapters and FabricCache are DataCore Ready
QLogic FlexSuite Gen 5 Fibre Channel and FabricCache Adapters certified DataCore Ready
The emerging software-defined storage space hit a new milestone when QLogic announced that it had added support for DataCore’s SANsymphony-V virtualization offering.
http://siliconangle.com/blog/2013/12/18/qlogic-certifies-adapters-for-software-defined-storage/
QLogic® FlexSuite™ 2600 Series 16Gb Gen 5 Fibre Channel adapters and FabricCache™ 10000 Series server-based caching adapters are now certified as DataCore Ready, providing full interoperability with SANsymphony-V storage virtualisation solutions from DataCore Software.
DataCore SANsymphony-V is a comprehensive software-defined storage platform that solves many of the difficult storage-related challenges raised by server and desktop virtualisation in data centres and cloud environments. The software significantly improves application performance and response times, enhances data availability and protection to provide superior business continuity and maximises the utilisation of existing storage investments. QLogic FabricCache adapters and FlexSuite Gen 5 Fibre Channel adapters, combined with SANsymphony-V, allow data centres to maximise their network infrastructure for a competitive advantage.
“QLogic channel partners and end-users can now confidently deploy award-winning QLogic adapters with SANsymphony-V to optimise network performance and make the most of their IT investments,” said Joe Kimpler, director of technical alliances, QLogic Corp. “Customers can choose the best QLogic solution—FabricCache adapters for high-performance clustered caching or FlexSuite Gen 5 adapters for ultra-high performance—to best handle their data requirements.”
“DataCore has a long history of collaborating with QLogic to help solve the storage management challenges of our mutual customers,” said Carlos M. Carreras, vice president of alliances and business development at DataCore Software. “QLogic high-performance Gen 5 Fibre Channel adapters and innovative, server-based caching adapters combine with SANsymphony-V to cost-effectively deliver uninterrupted data access, improve application performance and extend the life of storage investments, while providing organisations with greater peace of mind.”
Wednesday, 11 December 2013
DataCore Software Defined Storage and Fusion-io Reduce Costs and Accelerate ERP, SQL, Exchange, SharePoint Applications
BUHLMANN GRUPPE, a leader in steel piping and fittings headquartered in Bremen, Germany, has implemented a storage management and virtual SAN infrastructure based on DataCore’s SANsymphony-V software. SANsymphony-V manages and optimizes the use of both the conventional spinning disks (i.e. “SAS” drives) and the newly integrated flash-memory-based Fusion-io ioDrives through DataCore’s automatic tiering and caching technology. With the new DataCore solution in place, the physical servers, the VMware virtual servers and, most importantly, the critical applications needed to run the business – including Navision ERP software, Microsoft Exchange, SQL and SharePoint – are now fail-safe and run faster.
DataCore and Fusion-io = Turbo Acceleration for Tier 1 applications
After successfully testing the implementation, the migration of the physical and virtual servers onto the DataCore-powered SAN was carried out. A number of physical servers, the Microsoft SQL and Exchange systems and other file servers now access the high-performance DataCore storage environment. In addition, DataCore now manages, protects and boosts the performance of storage serving 70 virtual machines under VMware vSphere that host business-critical applications – including the ERP system from Navision, Easy Archive, Easy xBase, Microsoft SharePoint and BA software.
"The response times of our mission-critical Tier 1 applications have improved significantly; performance has been doubled by the use of DataCore and Fusion-io," says Mr. Niebur. "The hardware vendor independence provides storage purchasing flexibility. Other benefits include the higher utilization of disk space, the performance of flash based hardware, as well as faster response times to meet business needs that we experience today – and in the future – combined they save us time and money. Even with these new purchases involved, we have realized saving of 50 percent in costs – compared to a traditional SAN solution."
Read the full Case Study:
http://www.datacore.com/Libraries/Case_Study_PDFs/Case_Study_-_Buhlmann_Gruppe_US.sflb.ashx
Wednesday, 4 December 2013
A Defining Moment for the Software-Defined Data Center
Article from: http://elettronica-plus.it/a-defining-moment-for-the-software-defined-data-center/
For some time, enterprise IT heads heard the phrase “get virtualized or get left behind”, and after kicking the tires, the benefits couldn’t be denied and the rush was on. Now there is a push to create software-defined data centers. However, there is some trepidation about whether these ground-breaking, more flexible environments can adequately handle the performance and availability requirements of business-critical applications, especially when it comes to the storage part of the equation.
While decision-makers had good reason for concern, they now have an even better reason to celebrate as new storage virtualization platforms have proven to overcome these I/O obstacles.
Just as server hypervisors provided a virtual operating platform, a parallel approach to storage is quickly transforming the economics of virtualization for organizations of all sizes by offering the speed, scalability and continuous availability needed for realizing the full benefits of software-defined data centers. Particular additional benefits being widely reported include:
- Elimination of storage-related I/O bottlenecks in virtualized data centers
- Harnessing flash storage resources effectively for even greater application performance
- Ensuring fast and always available applications without a major storage investment
Survey findings show that 42 percent of respondents cited performance degradation, or an inability to meet performance expectations, as an obstacle preventing them from virtualizing more of their workloads. Yet effective storage virtualization platforms are now overcoming these issues by using device-independent adaptive caching and performance-boosting techniques to absorb wildly variable workloads, enabling applications to run faster once virtualized.
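The adaptive caching mentioned above can be illustrated with a simple least-recently-used (LRU) read cache held in server RAM, which absorbs repeated I/O before it reaches the slower disk backend. This is a conceptual sketch with assumed names, not the actual caching algorithm any vendor ships.

```python
from collections import OrderedDict

class AdaptiveReadCache:
    """Illustrative sketch: a device-independent LRU read cache in
    server memory that absorbs hot I/O before it hits the disk."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # block -> data, in LRU order
        self.hits = 0
        self.misses = 0

    def read(self, block: int, backend) -> bytes:
        if block in self.cache:
            self.hits += 1
            self.cache.move_to_end(block)  # mark as recently used
            return self.cache[block]
        self.misses += 1
        data = backend(block)  # slow path: fetch from disk
        self.cache[block] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return data
```

With a variable workload, repeated reads of hot blocks are served from memory (hits) while cold blocks fall through to the backend – which is why caching lets virtualized applications outrun the raw disk underneath them.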
To further increase tier-1 application responsiveness, companies often spend excessively on flash-memory-based solid state disks (SSDs). The survey also reveals that 44 percent of respondents found disproportionate storage-related costs to be an obstacle to virtualization. Again, effective storage virtualization platforms now provide a solution with features such as auto-tiering, which optimizes the use of these premium-priced resources alongside more modestly priced, higher-capacity disk drives.
Such an intelligent software platform constantly monitors I/O behavior and can automatically select between server memory caches, flash storage and traditional disk resources in real time, ensuring the most suitable class or tier of storage device is assigned to each workload based on priority and urgency. As a result, a software-defined data center can now deliver unmatched tier-1 application performance with optimum cost efficiency and maximum return on existing storage investments.
Once I/O intensive tier-1 applications are virtualized, the storage virtualization platform ensures high availability. It eliminates single points of failure and disruption through application-transparent physical separation, stretched across rooms or off-site with full auto-recovery capabilities for the highest levels of business continuity. The right platform can effectively virtualize whatever storage is on a user’s floor, whether direct-attached or SAN-connected, to achieve a robust and responsive shared storage environment necessary to support highly dynamic virtual IT environments.
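The high-availability scheme described above – eliminating single points of failure through physically separated copies with auto-recovery – is essentially synchronous mirroring: every write is committed to two separate nodes before it is acknowledged. A minimal sketch of the idea, with purely illustrative names:

```python
class MirroredVolume:
    """Illustrative sketch of synchronous mirroring: every write is
    committed to two physically separate nodes before being
    acknowledged, so either node can serve reads if the other fails."""

    def __init__(self):
        self.node_a = {}  # block -> data on the first node
        self.node_b = {}  # block -> data on the second node

    def write(self, block: int, data: bytes) -> str:
        # Both copies must succeed before the write is acknowledged.
        self.node_a[block] = data
        self.node_b[block] = data
        return "ack"

    def read(self, block: int) -> bytes:
        # Auto-recovery: fall back to the surviving mirror.
        if block in self.node_a:
            return self.node_a[block]
        return self.node_b[block]

    def fail_node_a(self) -> None:
        # Simulate the loss of an entire room or site.
        self.node_a = {}
```

If node A is lost, reads transparently fall back to node B – the application never sees the outage, which is the business-continuity property the article is describing.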
Yes, the storage virtualization platform is a defining moment for the software-defined data center. The performance and high availability needed for mission-critical databases and applications in a virtualized environment have been realized. Barriers have been removed, and there is a clear and supported path to greater cost efficiency.
Still, selecting the right platform is critical to a data center. Technology that is full-featured and has been proven “in the field” is essential. Also, it’s important to go with an independent, pure software virtualization solution in order to avoid hardware lock-in, and to take advantage of the future storage developments that undoubtedly will come.