Wednesday 31 July 2013

Quorn Foods Announces “5 Nines” Availability and Improved SAP ERP Productivity with DataCore SANsymphony-V Storage Virtualization

Read: Quorn Foods Case Study

Leading food brand overhauls virtual infrastructure and optimizes Tier 1 applications: takes business-critical SAP ERP into the virtual world, uses DataCore to reduce data mining times from 20 minutes to 20 seconds, and enables Information Lifecycle Management via auto-tiering.

"No one could fail to notice the dramatic leaps in performance that were now afforded by DataCore.”

Quorn Foods (http://www.quorn.com) has recently adopted DataCore SANsymphony-V software to achieve high availability, to turbocharge application performance and to implement an intelligent Information Lifecycle Management (ILM) data flow with structured auto-tiering.

Marlow Foods, better known as the owner of the Quorn brand, offers quality low-fat, meat-free food products to the discerning, health-conscious customer. It employs 600 people across three UK sites; Quorn’s Head of IT is Fred Holmes. Back in 2011, when the company was sold by a large parent company, Quorn had the opportunity to remap its entire physical server infrastructure, which was rapidly falling out of warranty. Fred notes:

"This was a three phrase project and had to be classified as a major systems overhaul that we were embarking on. In Phase 1, DataCore’s SANsymphony-V enabled smooth migration within a two-week period and dramatically increased IOPS, even with the high burden that virtual servers place when they are delivering thin client capabilities."

Phase 1: Server-side Virtualization Progresses into a Greenfield Site with DataCore providing the centralized storage and 99.999% reliability:

They consulted their trusted IT partner and DataCore gold partner, Waterstons, to assist with the major infrastructure overhaul. With a greenfield site for virtualization, Fred and the assigned Waterstons project team produced a compelling financial analysis showing dramatic consolidation and resource savings. A working proof of concept was deployed to substantiate the findings, to test that a Microsoft Remote Desktop Services (RDS) farm could support all applications for a test user group, and to prove the benefits of server virtualization.

Two successful months later, the project team implemented full server-side virtualization with three additional R710 hosts, all Brocade Fibre Channel attached to a storage area network (SAN) to support the full VMware vSphere Enterprise feature set. In total, 30 workloads were virtualized into the new environment, allowing the older physical servers to be retired. On the desktop side, a new RDS farm replaced 400 traditional desktops with thin client capabilities. DataCore’s SANsymphony-V solution provided the essential cost-effective centralized storage, running across two Dell T710 commodity servers. DataCore’s storage hypervisor provided one general-purpose synchronously mirrored SAN pool of 7TB usable (across a total of 48 10k SAS spindles in MD1220 SAS-attached storage shelves) to deliver 99.999% reliability. The project team knew that the success of any robust, responsive VMware environment hinges on the capabilities and performance of the storage infrastructure beneath it. This was especially true in Quorn’s highly virtualized infrastructure, with users interacting directly with virtual RDS Session Hosts. From a business user’s perspective, the virtualized estate provided a turbocharged world.
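The synchronously mirrored pool at the heart of this design means a write is acknowledged to the host only after it has landed on both storage nodes, so either node can serve the data if the other fails. A minimal sketch of that idea, using made-up class names rather than anything from DataCore's actual software:

```python
class StorageNode:
    """One side of a synchronously mirrored pair (e.g. one Dell T710)."""
    def __init__(self, name):
        self.name = name
        self.blocks = {}

    def write(self, lba, data):
        self.blocks[lba] = data
        return True  # data is persisted on this node


class MirroredPool:
    """Acknowledge a write only after BOTH nodes have persisted it."""
    def __init__(self, node_a, node_b):
        self.nodes = (node_a, node_b)

    def write(self, lba, data):
        # Synchronous mirroring: the host's ack depends on both copies.
        return all(node.write(lba, data) for node in self.nodes)

    def read(self, lba):
        # Any surviving node can satisfy the read.
        for node in self.nodes:
            if lba in node.blocks:
                return node.blocks[lba]
        return None
```

In practice the mirror runs over Fibre Channel and handles failure and resynchronization, but the acknowledgement rule above is what underpins the availability claim.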

Phase II – taking Business Critical ERP into the Virtual World and using DataCore to reduce data mining times from 20 minutes to 20 seconds:

Phase II covered virtualization of SAP Enterprise Resource Planning for the financial, HR, accounts and sales platforms. With around 8,500 outlets stocking the Quorn brand across the UK alone, Marlow Foods has an extremely high dependency on its SAP ERP servers to drive critical business advantages across all departments. The challenge was to integrate the existing physical SAP servers into the virtualized environment whilst maintaining 99.999% reliability and without affecting the existing virtual machines reliant on the SAN. To address this challenge, the project team added another R710 host to the cluster and a further 4TB of usable synchronously mirrored storage in a new pool dedicated entirely to SAP (across a further 48 10k SAS spindles), and began rebuilding the SAP servers into the virtual infrastructure. This meant transitioning huge databases from the old physical environment. Proof would come at the end of the month, when database query volumes were traditionally at their highest and when, in the old environment, performance expectations had gone unmet amid erratic response times.

In fact, the data mining queries were returned within 20 seconds, compared to 20 minutes in the previous physical environment. This is in no small part down to the way that DataCore’s SANsymphony-V leverages disk resources, assigning I/O tasks to very fast server RAM and CPU to accelerate throughput and to speed up response when reading and writing to disk. And with the wholly mirrored configuration, continuous availability is afforded.

"Like all things in IT, dramatic improvements to the infrastructure remain invisible to the user who only notices when things go wrong. But in this instance, no one could fail to notice the dramatic leaps in performance that was now afforded," Fred notes.

Phase III: Enhancing the Virtualized Estate with Auto-Tiering:

With everything virtualized, Fred and the team gave themselves six months to reflect on and monitor the new infrastructure before suggesting additional enhancements. Fred suspected that he could also extract greater intelligence from the SAN itself. Simon Birbeck of Waterstons, one of the U.K.’s DataCore Master Certified Installation Engineers, designed a performance-enhancing model to automatically migrate data blocks to the most appropriate class of storage within the estate. Thinly provisioned SAN capacity was at around 80% utilization, but in planning for 2013 Fred and the Waterstons team had forecast 20% year-on-year growth, potentially stretching utilization to the maximum by the end of the year. Simon recommended switching to a three-tier SAN design to facilitate the best cascading practices of Information Lifecycle Management (ILM).

A red top tier comprised a new layer of SSD flash storage, designed to be always full and utilized by the most frequently read blocks for extremely fast response. A pre-existing amber mid-tier caters for the average use data blocks served by commodity 10k SAS drives. Sitting beneath is a blue tier as the ‘catch all’ layer for the least frequently accessed data, maintained on low cost, high capacity 7.2k SAS spindles.
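The red/amber/blue cascade reduces to a simple rule: rank blocks by how often they are read and place the hottest on the fastest tier. A simplified sketch of such a placement decision, with illustrative thresholds that are not DataCore's actual algorithm:

```python
# Tiers ordered fastest to slowest, mirroring the red/amber/blue design.
TIERS = ["ssd-flash", "10k-sas", "7.2k-sas"]

def assign_tier(reads_per_hour, hot=100, warm=10):
    """Map a block's read frequency to a storage tier.
    The 'hot' and 'warm' thresholds are made-up illustrative values."""
    if reads_per_hour >= hot:
        return "ssd-flash"   # red tier: most frequently read blocks
    if reads_per_hour >= warm:
        return "10k-sas"     # amber tier: average-use blocks
    return "7.2k-sas"        # blue tier: catch-all for cold data

def rebalance(block_stats):
    """Given {block_id: reads_per_hour}, return a migration plan."""
    return {blk: assign_tier(rate) for blk, rate in block_stats.items()}
```

Running such a rebalance periodically is what keeps the SSD layer "always full" of the hottest blocks while cold data drains down to the cheap 7.2k spindles.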

Fred summarizes, "What Waterstons recommended was an intelligent usable form of ILM with DataCore’s SANsymphony-V at the front-end making the intelligent decision as to which blocks of data should be allocated where."

Indeed, SANsymphony-V has provided both strong reporting and accurate planning for data growth. Built-in diagnostics help to proactively identify when a problem is manifesting, changing the management role from reactive to proactive. Looking ahead, Marlow Foods will expand on the high availability/business continuity environment afforded by SANsymphony-V by adding an asynchronous replica at another site to further protect the SAP ERP environment. The scalability of SANsymphony-V brings a level of comfort not possible with other forms of storage.

Fred takes the final words: "DataCore’s SANsymphony-V now reliably underpins our entire virtual estate. From a transformation perspective we have new levels of availability and enhanced decision making for both IT and the users."

To learn more about how DataCore can improve your critical business applications, please click here: http://pages.datacore.com/WP_VirtualizingBusinessCriticalApps.html

About Quorn Foods
Today, Quorn Foods is an independent company focused on creating the world’s leading meat-alternative business. Quorn Foods is headquartered in Stokesley, North Yorkshire, and employs around 600 people across three UK sites. Launched nationally in 1995, the Quorn brand offers a wide range of meat-alternative products, made using the proprietary technology of Mycoprotein, which uniquely delivers the taste and texture of meat to the increasing number of people who have chosen to reduce, replace or cut out their meat consumption and who still want to eat a normal, healthy diet.

Monday 29 July 2013

The Need for SSD Speed, Tempered by Virtualization Budgets

What IT department doesn't lust for the hottest new SSDs? Cost usually tempers that lust.

Article from Virtualization Review By Augie Gonzalez: The Need for SSD Speed, Tempered by Virtualization Budgets

What IT department doesn't lust for the hottest new SSDs? You can just savor the look of amazement on users' faces when you amp up their systems with these solid state memory devices. Once-sluggish virtualized apps now peg the speed dial.

Then you wake up. The blissful dream is interrupted by the quarterly budget meeting. Your well-substantiated request to buy several terabytes of server-side flash is “postponed” -- that's finance's code-word for REJECTED. Reading between the lines, they're saying, “No way we're spending that kind of money on more hardware any time soon.”

Ironically, the same financial reviewers recommend fast-tracking additional server virtualization initiatives to further reduce the number of physical machines in the data center. It seems they didn't hear the first part of your SSD argument. Server consolidation slows down mission-critical apps like SQL Server, Oracle, Exchange and SAP to the point where response times are unacceptable. Flash memory can buy back that quickness.

This not-so-fictional scenario plays out more frequently than you might guess. According to a recent survey of 477 IT professionals conducted by DataCore Software, it boils down to one key concern: storage-related cost. Here are some other findings:
  • Cost considerations are preventing organizations from adopting flash memory and SSDs in their virtualization roll-outs. More than half of respondents (50.2 percent) said they are not planning to use flash/SSD for their virtualization projects due to cost.
  • Storage-related costs and performance issues are the two most significant barriers preventing respondents from virtualizing more of their workloads. 43 percent said that increasing storage-related costs were a “serious obstacle” or “somewhat of an obstacle.” 42 percent of respondents said the same about performance degradation or inability to meet performance expectations.
  • When asked about what classes of storage they are using across their environments, nearly six in ten respondents (59 percent) said they aren't using flash/SSD at all, and another two in ten (21 percent) said they rely on flash/SSD for just 5 percent of their total storage capacity. 
Yet, there is a solution – one far more likely to meet with approval from the budget oversight committee, and still please the user community.

Rather than indiscriminately stuff servers full of flash, I'd suggest using software to share fewer flash cards across multiple servers in blended pools of storage. By blended I mean a small percentage of flash/SSD alongside your current mix of high-performance disks and bulk storage. An effective example of this uses hardware- and manufacturer-agnostic storage virtualization techniques packaged in portable software to dynamically direct workloads to the proper class (or tier) of storage. The auto-tiering intelligence constantly optimizes the price/performance yield from the balanced storage pool. It also thin provisions capacity so valuable flash space doesn't get gobbled up by hoarder apps.
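Thin provisioning, mentioned above, is the piece that stops "hoarder apps" swallowing flash: a volume advertises its full logical size up front, but physical capacity is consumed only as blocks are actually written. A toy illustration of the distinction, with hypothetical names:

```python
class ThinVolume:
    """Advertise a large logical size; allocate physical space on first write."""
    def __init__(self, logical_gb):
        self.logical_gb = logical_gb     # what the application sees
        self.allocated = set()           # chunk indices actually backed by disk

    def write_chunk(self, index):
        # Physical capacity is consumed only at this moment, not at creation.
        self.allocated.add(index)

    def physical_gb(self, chunk_gb=1):
        """Real capacity consumed from the pool (assuming 1GB chunks)."""
        return len(self.allocated) * chunk_gb


# An app asks for a 100GB volume but has only written three chunks so far.
vol = ThinVolume(logical_gb=100)
for i in (0, 1, 2):
    vol.write_chunk(i)
```

The gap between `logical_gb` and `physical_gb()` is the capacity the pool can lend to other workloads, which is why thin provisioning and a small shared flash layer work well together.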


The dynamic data management scheme gets one additional turbo boost. The speed-up comes from advanced caching algorithms in the software, for both storing and retrieving disk/flash blocks. In addition to cutting input/output latencies in half (or better), you'll get back tons of space previously wasted on short-stroking hard disk drives (HDDs). In essence, you no longer need to overprovision disk spindles trying to accelerate database performance, nor do you need to overspend on SSDs.
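The advanced caching described here amounts to keeping recently used blocks in fast server RAM so that repeat reads never touch the spindles at all. A bare-bones LRU read cache conveys the principle (this is an illustration, not DataCore's algorithm):

```python
from collections import OrderedDict

class ReadCache:
    """Serve repeat reads from RAM; fall back to the backing store on a miss."""
    def __init__(self, backing_store, capacity=1024):
        self.backing = backing_store     # e.g. a dict standing in for disk
        self.capacity = capacity         # how many blocks fit in RAM
        self.cache = OrderedDict()
        self.hits = self.misses = 0

    def read(self, lba):
        if lba in self.cache:
            self.cache.move_to_end(lba)  # mark as most recently used
            self.hits += 1
            return self.cache[lba]       # fast path: RAM, no disk I/O
        self.misses += 1
        data = self.backing[lba]         # slow path: go to disk
        self.cache[lba] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used block
        return data
```

The hit/miss counters are the numbers worth watching: every hit is a disk (or flash) I/O that never happened, which is where the latency reduction comes from.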

Of course, there are other ways for servers to share solid state memories. Hybrid arrays, for example, combine some flash with HDDs to achieve a smaller price tag. There are several important differences between buying these specialized arrays and virtualizing your existing storage infrastructure to take advantage of flash technologies, not the least of which is how much of your current assets can be leveraged. You could propose to rip out and replace your current disk farm with a bank of hybrid arrays, but how likely is that to get the OK?

Instead, device-independent storage virtualization software uniformly brings auto-tiering, thin provisioning, pooling, advanced read/write caching and several replication/snapshot services to any flash/SSD/HDD device already on your floor. For that matter, it covers the latest gear you'll be craving next year including any hybrid systems you may be considering. The software gets the best use of available hardware resources at the least cost, regardless of who manufactured it or what model you've chosen.

Virtualizing your storage also complies with the greater corporate mandate to further virtualize your data center. It's one of the unusual times when being extra frugal pays extra big dividends.

No doubt you've been bombarded by the term software-defined data center, and more recently, software-defined storage. It has become a synonym for storage virtualization, with a broader, infrastructure-wide scope; one spanning the many disk and flash-based technologies, as well as the many nuances which differentiate various models and brands. Without getting lost in semantics, it boils down to using smart software to get your arms around, and derive greater value from, all the storage hardware assets at your disposal.

Sounds like a very reasonable approach given how quickly disk technologies and packaging turn over in the storage industry.

One word of caution: Any software-defined storage inextricably tied to a piece of hardware may not be what the label advertises.

Yes, the need for speed is very real, but so is the reality of tight funding. Others have successfully surmounted this same predicament by following the guidance above. It will surely ease the injection of flash/SSD technologies into your current environment, even under the toughest of scrutiny. At the same time, you'll strike the difficult but necessary balance between your business requirements for fast virtualized apps and your very real budget constraints; without question, a most desirable outcome.

About the Author
Augie Gonzalez is director of product marketing for storage virtualization software developer DataCore Software.

Saturday 20 July 2013

Best of the Tour de France 2013 Features DataCore's Storage Virtualization Mascot 'Corey'

Check out the Reuters photo coverage of the cycling event!
Cycling fan Dave Brown from Park City of the U.S. poses while waiting for the riders climbing the Alpe d'Huez mountain during the 172.5km eighteenth stage of the centenary Tour de France cycling race from Gap to l'Alpe d'Huez, in the French Alps, July 18, 2013.

REUTERS/Jacky Naegelen
http://sports.yahoo.com/photos/best-of-the-tour-de-france-2013-slideshow/cycling-fan-dave-brown-park-city-u-poses-photo-135555539.html

Additional Pictures
Corey

Thursday 11 July 2013

One of Europe's largest construction projects linking Scandinavia to mainland Europe picks DataCore storage virtualization to tier data and speed critical applications

Femern A/S, the managing company behind one of the most ambitious engineering projects connecting Scandinavia to mainland Europe, the Fehmarnbelt Tunnel, has adopted DataCore's SANsymphony-V to speed auto-tiering of data and critical applications.


Tim Olsson, the IT manager gearing up to support the forthcoming €5.5 billion construction of the 18km immersed underwater tunnel, due for completion in 2021, said: "We are in the process of commencing 4 major construction contracts this summer to facilitate the start of the build of the Fehmarnbelt tunnel. The number of employees will start to escalate dramatically as we progress through the build; as well as the volume of CAD intensive documents and building designs and engineering specifications that will need to be accessed on the network."

Femern began the process of examining the present infrastructure and anticipating how it would scale as the construction progressed.

Within their Copenhagen data centre, Femern operated a Dell EqualLogic SAN that was reaching end of life: spare parts were becoming increasingly hard to obtain, and when replacement drives did arrive there were frequent incidences of premature failure. The organisation had already adopted virtualisation, with its critical apps running virtualised on VMware, but they appeared slow due to disk I/O bottlenecks as workloads contended for the struggling EqualLogic storage. This was therefore not the time to allocate more disks, but an opportunity for a complete overhaul to an alternative software-defined SAN and infrastructure.

"Not only was the EqualLogic box falling out of warranty and therefore coming with an increased high price tag, it seemed to have a lot of features that we never, or rarely, used. We felt certain that even if we considered an upgraded, latest EqualLogic box, that we would still be constrained by the hardware rigidity and lacking the impending scalability and flexibility that would be required to accommodate the various construction phases."

Femern consulted their day-to-day supplier of all IT solutions, COMM2IG, to make an official technical proposal on alternative solutions to run in Copenhagen and at the construction sites and exhibition centre at the mouth of the tunnel in Rødbyhavn.

COMM2IG examined the existing environment in detail. Femern were running VMware's ESX for server virtualisation, but availability, ease of management and automatic allocation of storage were lagging behind. COMM2IG grasped the opportunity to recommend installing the SANsymphony-V storage hypervisor to support the latest version of VMware's vSphere, delivering Femern's tier 1 applications virtually. To help overcome performance issues, COMM2IG recommended that Femern incorporate Fusion-io's flash technology for its most important applications, achieving far faster speeds than spinning disks can deliver and overcoming performance bottlenecks.

The software-based solution - offering performance, flexibility and investment protection: Two SANsymphony-V nodes were implemented on two HP DL380 G7 servers. Each server was provisioned with two CPUs, 96GB of RAM for a super-caching performance boost, and 320GB Fusion-io PCIe ioDrive2 cards for flash integration in the storage pool as part of a hybrid, automatically tiered storage configuration.

VMware's server virtualisation was upgraded to vSphere 5.1. Overall, the environment offered 50TB of mirrored storage based on HP direct-attached SAS and SATA drives, augmented by a flash storage layer. Use of the Fusion acceleration cards would be optimised through DataCore's auto-tiering capabilities: reserving the more expensive flash layer for the most demanding applications, and ensuring that other, less accessed data is relocated automatically to Femern's existing SAS- and SATA-based storage, enabling a 1/2/3 auto-tiered environment.

Results: Installation commenced with the help of COMM2IG and, twelve months down the line, Femern are well placed to comment on the success of their software-defined data centre. Most noticeable is the increased performance of their SQL applications, which previously struggled with latency issues.

"From a user perspective, they used to experience slow response times and a performance lag from their SQL & Exchange applications running virtually on VMware. Today, and in the future, with DataCore in the background, the applications appear robust, instantaneous and seamless. We have achieved this without the cost prohibitive price tag of pure flash. That's what any IT department strives to achieve."

Olsson summarises: "As the data volumes increase up to ten fold as construction starts in 2015, intelligent allocations to flash will increase the lifespan of the Fusion-ioDrive and offset the overall cost of ownership, delivering less critical data to the most cost effective tier that can deliver acceptable performance. With this multi-faceted approach to storage allocation using DataCore's SANsymphony-V solution, Femern is able to manage all devices under one management interface; regardless of brand and type.  Given that DataCore has been robustly supporting VMware environments for many years, SANsymphony-V is viewed as the perfect complement, offering enhanced storage management and control intelligence. For Femern, this has manifested in reduction of downtime; adding new VMs; taking backups; migrating data and expanding capacity can all now be done without outages.

"What we have achieved here with DataCore storage virtualisation software sets us on the road to affordable, flexible growth to eliminate storage related downtime. Add the blistering speed of Fusion-io acceleration and we have created a super performing, auto tiered storage network, that does as the tunnel itself will do; connects others reliably, super fast and without stoppages," concludes Olsson.

From StorageNewsletter: https://www.storagenewsletter.com/news/customer/femern-fehmarnbelt-tunnel-datacore

A Defining Moment for the Software-Defined Data Center

Original post:  http://www.storagereview.com/a_defining_moment_for_the_softwaredefined_data_center

For some time, enterprise IT heads heard the phrase, “get virtualized or get left behind,” and after kicking the tires, the benefits couldn’t be denied and the rush was on. Now, there’s a push to create software-defined data centers. However, there is some trepidation as to whether these ground-breaking, more flexible environments can adequately handle the performance and availability requirements of business-critical applications, especially when it comes to the storage part of the equation. While decision-makers have had good reason for concern, they now have an even better reason to celebrate, as new storage virtualization platforms have proven able to overcome these I/O obstacles.
Software-defined Storage

Just as server hypervisors provided a virtual operating platform, a parallel approach to storage is quickly transforming the economics of virtualization for organizations of all sizes by offering the speed, scalability and continuous availability necessary to achieve the full benefits of software-defined data centers. Specifically, these advantages are widely reported:
  • Elimination of storage-related I/O bottlenecks in virtualized data centers
  • Harnessing flash storage resources efficiently for even greater application performance
  • Ensuring fast and always available applications without a substantial storage investment

Performance slowdowns caused by I/O bottlenecks, and downtime attributed to storage-related outages, are two of the foremost reasons why enterprises have refrained from virtualizing their Tier-1 applications such as SQL Server, Oracle, SAP and Exchange. This comes across clearly in the recent Third Annual State of Virtualization Survey conducted by my company, which showed that 42% of respondents cited performance degradation or an inability to meet performance expectations as an obstacle preventing them from virtualizing more of their workloads. Yet effective storage virtualization platforms are now successfully overcoming these issues by using device-independent adaptive caching and performance-boosting techniques to absorb wildly variable workloads, enabling applications to run faster virtualized.

To further increase Tier-1 application responsiveness, companies often spend excessively on flash memory-based SSDs. The Third Annual State of Virtualization Survey also reveals that 44% of respondents found disproportionate storage-related costs an obstacle to virtualization. Again, effective storage virtualization platforms now provide a solution with features such as auto-tiering. These enhancements optimize the use of premium-priced flash resources alongside more modestly priced, higher-capacity disk drives.

Such an intelligent software platform constantly monitors I/O behavior and can auto-select between server memory caches, flash storage and traditional disk resources in real time. This ensures that the most suitable class, or tier, of storage device is assigned to each workload based on priorities and urgency. As a result, a software-defined data center can now deliver unmatched Tier-1 application performance with optimum cost efficiency and maximum ROI for existing storage.

Once I/O-intensive Tier-1 applications are virtualized, the storage virtualization platform ensures high availability. It eliminates single points of failure and disruption through application-transparent physical separation, stretched across rooms or even off-site, with full auto-recovery capabilities designed for the highest levels of business continuity. The right platform can effectively virtualize whatever storage is present, whether direct-attached or SAN-connected, to achieve the robust and responsive shared storage environment necessary to support highly dynamic, virtual IT environments.

Yes, the storage virtualization platform is a defining moment for the software-defined data center. The performance, speed and high availability required for mission-critical databases and applications in a virtualized environment have been realized. Barriers have been removed, and there is a clear, supported path to greater cost efficiency. Still, selecting the right platform is critical to a data center. Technology that is full-featured and has been proven in the field is essential. It is also important to go with an independent, pure software virtualization solution, in order to avoid hardware lock-in and to take advantage of the future storage developments that will undoubtedly occur.
George Teixeira - Chief Executive Officer & President DataCore Software
George Teixeira is CEO and president of DataCore Software, the premier provider of storage virtualization software. The company’s software-as-infrastructure platform solves the big problem stalling virtualization initiatives by eliminating the storage-related barriers that make virtualization too difficult and too expensive.

Paul Murphy Joins DataCore as Vice President of Worldwide Marketing to Build on Company’s Software-Defined Storage Momentum

Former VMware and NetApp Marketing and Sales Director to Spearhead Strategic Marketing and Demand Generation to Drive Company’s Growth and Market Leadership in Software-Defined Storage

“The timing is perfect. DataCore has just updated its SANsymphony-V storage virtualization platform and it is well positioned to take advantage of the paradigm shift and acceptance of software-defined storage infrastructures,” said Murphy. “After doing the market research and getting feedback from numerous customers, it is clear to me that there is a large degree of pent-up customer demand. Needless to say, I’m eager to spread the word on DataCore’s value proposition and make a difference in this exciting and critical role.”

Paul Murphy joins DataCore Software as the vice president of worldwide marketing. Murphy will oversee DataCore’s demand generation, inside sales and strategic marketing efforts needed to expand and accelerate the company’s growth and presence in the storage and virtualization sectors. He brings to DataCore a proven track record and a deep understanding of virtualization, storage technologies and the pivotal forces impacting customers in today’s ‘software-defined’ world. Murphy will drive the company’s marketing organization and programs to fuel sales for DataCore’s acclaimed storage virtualization software solution, SANsymphony-V.

“Our software solutions have been successfully deployed at thousands of sites around the world and now our priority is to reach out to a broader range of organizations that don’t yet realize the economic and productivity benefits they can achieve through the adoption of storage virtualization and SANsymphony-V,” said DataCore Software’s Chief Operating Officer, Steve Houck. “Murphy brings to the company a fresh strategic marketing perspective, the ability to simplify our messaging, new ways to energize our outbound marketing activities and the drive to expand our visibility and brand recognition around the world.”

With nearly 15 years of experience in the technology industry, Murphy possesses a diverse range of skills in areas including engineering, services, sales and marketing, which will be instrumental in overseeing DataCore’s marketing activities around the globe. He was previously Director of Americas SMB Sales and Worldwide Channel Development Manager at VMware, where he developed go-to-market strategies and oversaw direct and inside channel sales teams in both domestic and international markets.

Prior to that, Murphy was senior product marketing manager at NetApp, focusing on backup and recovery solutions and their Virtual Tape Library product line. In this role, Murphy led business development activities, sales training, compensation programs and joint-marketing campaigns. An excellent communicator, he has been a keynote speaker at numerous industry events, trade shows, end-user seminars, sales training events, partner/reseller events and webcasts. Before moving into sales and marketing, Murphy had a successful career in engineering.

Tuesday 9 July 2013

Massive engineering project linking Scandinavia to mainland Europe picks DataCore storage virtualization to tier data and speed critical applications

Femern A/S, the managing company behind one of the most ambitious engineering projects connecting Scandinavia to mainland Europe, the Fehmarnbelt Tunnel, has adopted its SANsymphony-V to speed auto tiering of data and critical applications.

fehmarnbelt_tunnel_540

Tim Olsson, the IT manager gearing up to support the forthcoming €5.5 billion construction of the 18km immersed underwater tunnel, due for completion in 2021, said: "We are in the process of commencing 4 major construction contracts this summer to facilitate the start of the build of the Fehmarnbelt tunnel. The number of employees will start to escalate dramatically as we progress through the build; as well as the volume of CAD intensive documents and building designs and engineering specifications that will need to be accessed on the network."

Femern began the process of examining the present infrastructure and anticipating how it would scale as the construction progressed.

Within their Copenhagen data centre, Femern operated a Dell EqualLogic SAN that was reaching end of life with spare parts becoming increasingly hard to obtain and when new replacement drives did arrive, there were frequent incidences of premature failure. The organisation had already adopted virtualisation, with their critical apps running virtualised on VMware but they appeared slow due to disk I/O bottlenecks as workloads contended for the challenged EqualLogic storage. This was therefore not a time to consider allocating more disks, but an opportunity for an entire overhaul to an alternative software defined SAN and infrastructure.

"Not only was the EqualLogic box falling out of warranty and therefore carrying an increasingly high support price tag, it seemed to have a lot of features that we never, or rarely, used. We felt certain that even if we considered an upgrade to the latest EqualLogic box, we would still be constrained by the hardware's rigidity and lack the scalability and flexibility that would be required to accommodate the various construction phases."

Femern consulted their day-to-day supplier of all IT solutions, COMM2IG, to make an official technical proposal on alternative solutions to run in Copenhagen and at the construction sites and exhibition centre at the mouth of the tunnel in Rødbyhavn.

COMM2IG examined the existing environment in detail. Femern were running VMware ESX for server virtualisation, but availability, ease of management and automatic allocation of storage were lagging behind. COMM2IG grasped the opportunity to recommend installing the SANsymphony-V storage hypervisor to support the latest version of VMware vSphere, delivering Femern's tier 1 applications virtually. To overcome performance issues, COMM2IG recommended that Femern incorporate flash-based technology from Fusion-io for their most important applications, achieving far faster speeds than spinning disks can offer and removing performance bottlenecks.

The software-based solution - offering performance, flexibility and investment protection: Two SANsymphony-V nodes were implemented on two HP DL380 G7 servers. Each server was provisioned with two CPUs, 96GB of RAM for a high-speed caching performance boost, and 320GB Fusion-io PCIe ioDrive2 cards for flash integration in the storage pool as part of a hybrid, automatically tiered storage configuration.

VMware's server virtualisation was upgraded to vSphere 5.1. Overall, the environment offered 50TB of mirrored storage based on HP direct-attached SAS and SATA drives, augmented by a flash storage layer. Use of the Fusion-io acceleration cards would be optimised through DataCore's auto-tiering capabilities: reserving the more expensive flash layer for the most demanding applications, and ensuring that other, less frequently accessed data is relocated automatically to Femern's existing SAS- and SATA-based storage, creating a three-level (1/2/3) auto-tiered environment.
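The tiering flow described above follows a general pattern: frequently accessed blocks are promoted to the flash tier, rarely accessed blocks are demoted to SATA, and the rest sit on SAS. The sketch below models that pattern only; it is an illustrative toy, not DataCore's actual algorithm, and the thresholds and tier names are invented for the example.

```python
# Illustrative sketch of a three-tier auto-tiering policy (hot -> flash,
# warm -> SAS, cold -> SATA). Not DataCore's implementation; the thresholds
# and tier names are hypothetical.
from collections import defaultdict

class AutoTieringPool:
    def __init__(self, hot_threshold=100, cold_threshold=10):
        self.access_counts = defaultdict(int)  # per-block access frequency
        self.placement = {}                    # block id -> tier name
        self.hot_threshold = hot_threshold
        self.cold_threshold = cold_threshold

    def record_access(self, block_id):
        self.access_counts[block_id] += 1
        self.placement.setdefault(block_id, "sas")  # new blocks land mid-tier

    def rebalance(self):
        """Periodic pass: promote hot blocks to flash, demote cold ones to SATA."""
        for block_id, count in self.access_counts.items():
            if count >= self.hot_threshold:
                self.placement[block_id] = "flash"
            elif count <= self.cold_threshold:
                self.placement[block_id] = "sata"
            else:
                self.placement[block_id] = "sas"
        self.access_counts.clear()  # start a fresh measurement interval

pool = AutoTieringPool()
for _ in range(150):
    pool.record_access("db-index")   # heavily accessed block
pool.record_access("old-archive")    # rarely accessed block
pool.rebalance()
print(pool.placement["db-index"])    # hot data promoted to flash
print(pool.placement["old-archive"]) # cold data demoted to SATA
```

Real systems track access heat per chunk over sliding windows and move data in the background, but the promote/demote decision is the same shape as this rebalance pass.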

Results: Installation commenced with the help of COMM2IG, and twelve months down the line Femern are well placed to comment on the success of their software-defined data centre. Most noticeable is the increased performance of their SQL applications, which previously struggled with latency issues.

"From a user perspective, they used to experience slow response times and a performance lag from their SQL & Exchange applications running virtually on VMware. Today, and in the future, with DataCore in the background, the applications appear robust, instantaneous and seamless. We have achieved this without the cost prohibitive price tag of pure flash. That's what any IT department strives to achieve."

Olsson summarises: "As data volumes increase up to tenfold when construction starts in 2015, intelligent allocation to flash will extend the lifespan of the Fusion-io ioDrive and offset the overall cost of ownership, delivering less critical data to the most cost-effective tier that can deliver acceptable performance. With this multi-faceted approach to storage allocation using DataCore's SANsymphony-V, Femern is able to manage all devices under one management interface, regardless of brand and type. Given that DataCore has robustly supported VMware environments for many years, SANsymphony-V is viewed as the perfect complement, offering enhanced storage management and control intelligence. For Femern, this has manifested in a reduction of downtime; adding new VMs, taking backups, migrating data and expanding capacity can all now be done without outages.

"What we have achieved here with DataCore storage virtualisation software sets us on the road to affordable, flexible growth and eliminates storage-related downtime. Add the blistering speed of Fusion-io acceleration and we have created a super-performing, auto-tiered storage network that does what the tunnel itself will do: connect others reliably, super fast and without stoppages," concludes Olsson.


Monday 8 July 2013

DataCore Updates and Advances its Proven SANsymphony-V Storage Virtualization Platform

DataCore Software Builds on its Software-Defined Storage Lead with Enhancements to its Proven SANsymphony-V Storage Virtualization Platform
 
DataCore continues to advance and evolve its device-independent storage management and virtualization software, while maintaining focus on empowering IT users to take back control of their storage infrastructure. To that end, the company announced today significant enhancements to the comprehensive management capabilities within version R9 of its SANsymphony™-V storage virtualization platform.

 New advancements in SANsymphony-V include:
  • Wizards to provision multiple virtual disks from templates
  • Group commands to manage storage for multiple application hosts
  • Storage profiles for greater control and auto-tiering across multiple levels of flash, solid state (SSDs) and hard disk technologies
  • A new database repository option for recording and analyzing performance history and trends
  • Greater configurability and choices for incorporating high-performance “server-side” flash technology and cost-effective network attached storage (NAS) file serving capabilities
  • Preferred snapshot pools to simplify and segregate snapshots from impacting production work
  • Improved remote replication and connectivity optimizations for faster and more efficient performance
  • Support for higher speed 16Gbit Fibre Channel networking and more.
For more details, please read: What's New in SANsymphony-V R9.0.3
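The first two items in the list above, template-driven provisioning and group commands, share one idea: define a disk's settings once, then stamp out many virtual disks from it. The toy sketch below illustrates that idea; all class and function names are hypothetical and bear no relation to DataCore's actual management API.

```python
# Toy sketch of template-based virtual disk provisioning. All names are
# hypothetical illustrations of the concept, not DataCore's API.
from dataclasses import dataclass

@dataclass
class DiskTemplate:
    size_gb: int
    tier: str       # e.g. "flash", "sas", "sata"
    mirrored: bool  # synchronously mirrored across nodes

@dataclass
class VirtualDisk:
    name: str
    size_gb: int
    tier: str
    mirrored: bool

def provision_from_template(template, names):
    """Create one virtual disk per name, all inheriting the template's settings."""
    return [VirtualDisk(n, template.size_gb, template.tier, template.mirrored)
            for n in names]

# One template, many disks: the wizard/group-command workflow in miniature.
sql_template = DiskTemplate(size_gb=500, tier="flash", mirrored=True)
disks = provision_from_template(
    sql_template, ["sql-data-01", "sql-data-02", "sql-log-01"])
print([d.name for d in disks])
```

The payoff is consistency: every disk cut from the template carries identical tiering and mirroring settings, so a group of application hosts can be provisioned in one command instead of disk by disk.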

“Storage is undergoing a sea-change today and traditional hardware manufacturers are suffering because they are in catch-up mode to meet the ‘new world order’ for software-defined storage where automation, fast flash technologies and hardware interchangeability are standard,” said George Teixeira, co-founder, president and CEO of DataCore Software. “We have listened to our customers and stayed true to our vision. With the latest release of SANsymphony-V, we are well-positioned to help organizations manage growth and leverage existing investments, while making it simple to incorporate current and future innovations. Our software features and flexibility empower CIOs and IT admins to overcome the many storage challenges faced in a dynamic virtual world.”

Real-World Software-Defined Storage: Customer-driven Enhancements Overcome Challenges
Many of the new features which extend the scope and breadth of storage management would not even occur to companies just developing a software-defined package. They are the product of DataCore’s 15 years of customer feedback and field-proven experience in broad scenarios across the globe.

The enhancements introduced in the latest version of SANsymphony-V take on major challenges faced by large-scale IT organizations and more diverse mid-size data centers. Aside from confronting explosive storage growth (multi-petabyte disk farms), organizations are experiencing massive virtual machine (VM) sprawl, where provisioning, partitioning and protecting disk space taxes both staff and budget. Problems are further aggravated by the insertion of flash technologies and SSDs used to speed up latency-sensitive workloads. The time and resources required to manage a broadening diversity of storage models, disk devices and flash technologies – even when standardized on a single manufacturer – are a growing burden for organizations already struggling to meet application performance needs on limited budgets.

The bottom line is that companies are forced to confront many unknowns in terms of storage. With traditional storage systems, the conventional practice has been to oversize and overprovision storage with the hope that it will meet new and unpredictable demands, but this drives up costs and too often fails to meet performance objectives. As a result, companies have become smarter and have realized that it is no longer feasible or sensible to simply throw expensive, purpose-built hardware at the problem. Companies today are demanding a new level of software flexibility that endures over time and adds value over multiple generations and types of hardware devices. What organizations require is a strategic – rather than an ad hoc – approach to managing storage.
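The overprovisioning problem described above is what thin provisioning addresses: hosts are presented with generous virtual capacity, but physical space is drawn from a shared pool only as data is actually written. The sketch below is a minimal model of that allocate-on-write idea, with invented names and GB-granularity bookkeeping for brevity; real systems allocate in fixed-size chunks with far more machinery.

```python
# Minimal sketch of thin provisioning: present large virtual capacity,
# consume physical space only on write. Illustrative model only; names
# and granularity are hypothetical.
class ThinPool:
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb
        self.allocated_gb = 0  # physical space actually consumed

    def create_volume(self, virtual_gb):
        return ThinVolume(self, virtual_gb)

class ThinVolume:
    def __init__(self, pool, virtual_gb):
        self.pool = pool
        self.virtual_gb = virtual_gb  # capacity the host sees
        self.written_gb = 0

    def write(self, gb):
        # Physical space is drawn from the pool at write time, not creation time.
        if self.pool.allocated_gb + gb > self.pool.physical_gb:
            raise RuntimeError("pool exhausted: add physical capacity")
        self.pool.allocated_gb += gb
        self.written_gb += gb

pool = ThinPool(physical_gb=1000)
# Present 10TB of virtual capacity backed by a 1TB physical pool.
vols = [pool.create_volume(virtual_gb=2000) for _ in range(5)]
vols[0].write(300)
print(pool.allocated_gb)  # only the 300GB actually written is consumed
```

Instead of buying capacity up front for a guessed peak, the pool is grown only when real consumption approaches the physical limit, which is the "strategic rather than ad hoc" posture the paragraph above argues for.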

Notable Advances with SANsymphony-V Update 9.0.3