Monday, 7 April 2014

Menora Foods, Australia's Leading Food Distributor, Achieves 100% Uptime and Performance with DataCore Software-Defined Storage

DataCore’s SANsymphony-V Software-Defined Storage Platform Delivers Increased Application Performance and Reduced Costs

Menora Foods, Australia’s leading food marketing and distribution business, has successfully implemented DataCore’s SANsymphony-V storage virtualization platform. Backed by DataCore, Menora Foods’ software-defined storage architecture now ensures that its critical business applications, including the Menora-developed ERP system, the company’s warehousing systems, distribution systems and more, remain available and do not interrupt business productivity.
“DataCore’s SANsymphony-V storage virtualization platform protects and mirrors all of our vital storage and VMs across our campus-wide sites and has made our lives far easier,” explained Vikash Reddy, IT Manager at Menora Foods. “Now we can sleep peacefully without the worry of our systems failing as SANsymphony-V provides the high availability and business continuity results we were seeking.”
Menora Foods owns and distributes some of Australia’s favorite and most trusted brands. The leading food marketing and distribution company imports food products from many different countries and distributes them to major retail supermarket chains. Without SANsymphony-V, Menora Foods risked outages from physical disk failures. Prior to DataCore, if a server went down, a day or two would be spent trying to restore the primary branch, and secondary branches would consequently fail. The results were damaging to the company’s production overhead and costs. All of Menora Foods’ data and VMs are now supported by SANsymphony-V storage.
“With DataCore’s virtualized storage platform, we noticed an increase in performance in the reports coming out of our ERP system. When the ERP system was on physical storage, some of the reports were noticeably slow to generate,” explained Vikash. “However, with DataCore in place there has been around a 30% improvement in report generation times. The performance has definitely increased and DataCore was the difference.”
DataCore’s SANsymphony-V has increased Menora Foods’ uptime to 99.999%. Two servers are now running on DataCore’s virtualization platform and synchronously mirrored, so all systems remain accessible and operational even if one server room loses power. According to Menora Foods, the cost savings of DataCore versus a traditional hardware SAN system were between $60,000 and $100,000. Menora Foods will continue to cut costs in the future because it can run DataCore on existing hardware rather than upgrading to new models every three to five years, as a traditional SAN vendor would require.
“As the largest food distributor in Australia, Menora Foods needs to ensure that their critical business applications are always on and running at peak performance,” said Steve Houck, COO at DataCore. “By leveraging DataCore’s storage virtualization platform, Menora Foods has noticeably increased application uptime and performance, all while reducing their storage related costs.”

Friday, 4 April 2014

Italy's Ministry of Economy and Finance Chooses DataCore Software-Defined Storage to Modernize Its Storage Infrastructure




The "Ministero dell'Economia e delle Finanze" (the Italian Ministry of Economy and Finance, MEF) located in Rome deployed DataCore's SANsymphony-V solution to improve productivity and modernize its mission-critical IT infrastructure. The DataCore software-defined storage platform has been implemented to centralize storage management and improve the utilization across a wide range of storage hardware systems and devices from different vendors including EMC and HP. DataCore consolidates and simplifies the provisioning of storage resources, significantly accelerates performance and adds high availability and a new level of flexibility to the existing mix as well as to future storage additions.

Italy's Ministry of Finance software-defines its storage

"We chose DataCore because we wanted a solution that would allow MEF to modernize and virtualize our storage and IT infrastructure without being locked in to specific hardware vendors or technologies. This gives us flexibility, scalability and freedom in our choice. SANsymphony-V allows us to use the most appropriate and innovative offerings on the market and makes it easy, if needed, to grow or adapt our environment to meet future requirements. DataCore not only reduces our storage-related costs by consolidating management, it enables us to buy less expensive hardware and it also enables us to protect our existing investments. Moreover, the software-defined layer in our infrastructure gives us the flexibility to optimize whatever we use and puts us back in control to shop for the best storage value, allowing us to cost-effectively deal with growth," comments the information systems manager at MEF.

The Ministry of Economy and Finance, also known by the acronym MEF, is one of the most important and influential ministries within the Italian Government. It is the executive body responsible for economic, financial and budget policy. The organization manages the planning of public investments, coordinates public expenditures and verifies its trends, revenue policies and the overall tax system. The MEF operates the State's public land and heritage, land register and customs; it plans, coordinates and verifies operations to foster economic, local and sectorial development, and is responsible for setting out cohesive policies, processes and the requirements pertaining to the public budget.

As part of the project to optimize and consolidate IT data centers, MEF selected DataCore's software-defined storage solution with the primary objective of preserving its existing and very diverse set of storage investments made over many years, comprising a range of systems from EMC VMax and EMC Centera to HP EVAs. In addition, it was critical for MEF to streamline and centralize management in order to gain productivity and to provision highly available storage capacity quickly, within minutes, when and where needed.

The DataCore SANsymphony-V software is deployed on four standard x86 server platforms, providing redundancy and data protection together with centralized management of over 200TB of storage across multiple EMC VMax, EMC Centera and HP EVA systems. MEF will also benefit from the addition of high-end advanced storage features including thin provisioning, metro-wide mirroring, high-speed adaptive caching, replication and auto-tiering, all of which can be applied to their existing and future storage investments.

Tuesday, 25 March 2014

IMATION DELIVERS A COMPELLING COST-EFFECTIVE SOLUTION TO EXPENSIVE STORAGE SYSTEMS WITH DATACORE READY NEXSAN E-SERIES STORAGE ARRAYS

Nexsan E-Series storage arrays coupled with the SANsymphony-V storage virtualization software provide powerful and cost-effective storage infrastructure for Microsoft, VMware, Oracle and other virtualized environments


“The combined DataCore SANsymphony-V and Nexsan E-Series solution allows IT administrators to create the optimal virtualized storage architecture on highly reliable, proven Nexsan E-Series hardware,” said Mike Stolz, vice president of marketing and technical services for Imation’s Nexsan solutions.



Mike Stolz continues: “Without virtualization, storage and compute capacity are often wasted in application-specific silos. In a virtualized environment, hardware resources can be pooled and optimized from heterogeneous devices, which significantly improves flexibility for the IT administrator.”

Imation (NYSE:IMN), a global data storage and information security company, announced that its Nexsan™ E-Series™ storage arrays have been certified as DataCore Ready. When combined with the DataCore SANsymphony-V storage virtualization platform, the solution offers data centers rapid ROI through significant improvements in data availability, agility, performance and flexibility.

The integrated solution offers users a variety of benefits, including:

Performance – Using Nexsan E-Series arrays as a high-performing storage foundation, SANsymphony-V further increases performance by supporting RAM caches of up to a terabyte to dramatically accelerate reads and writes. SANsymphony-V also automatically rebalances loads due to hotspots to further improve response and throughput.

Efficient provisioning – The Nexsan E-Series system allows IT professionals to mix and match SAS, SATA and solid state drives. SANsymphony-V then provisions virtual disks as required to support different types of workloads. Granular thin provisioning and automated capacity reclamation give administrators more options for improving efficiency.

Auto-tiering – SANsymphony-V continuously analyzes what blocks of data need higher I/O throughput and automatically assigns those blocks to the appropriate storage tier. Throughout this automated process, priority workloads like SQL databases can be given preference to fast storage including flash, while cooler data or lower prioritized workload data can be moved to lower-cost drives. E-Series storage arrays deliver virtually unlimited flexibility for configuring storage tiers according to capacity, performance and price characteristics, and the solution enables multiple tiers to be configured in a single Nexsan E-Series system or across other existing storage hardware.

Business continuity – By keeping data in two physically separate locations at the same time with the help of synchronous mirroring, the combined solution prevents storage from becoming a single point of failure for stretched cluster configurations. Imation and DataCore offer an appealing alternative to very expensive storage arrays. Disaster recovery scenarios also are supported by enabling asynchronous replication across distant sites.

Migration – Storage investments risk creating more complexity and islands of incompatible devices. DataCore SANsymphony-V and Nexsan E-Series work together to pool existing storage assets, eliminating risk of incompatibility and improving ROI across the entire infrastructure. Also, DataCore SANsymphony-V fully virtualizes data from the underlying hardware, enabling migration of data from legacy systems to Nexsan E-Series with minimal or no interruption to running workloads.
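The RAM caching described under Performance can be illustrated with a minimal, hypothetical sketch (the class and method names here are illustrative assumptions, not DataCore's implementation): a write-back block cache serves hot reads from memory and acknowledges writes from RAM, flushing dirty blocks to disk only on eviction.

```python
from collections import OrderedDict

class RamBlockCache:
    """Minimal write-back block cache sketch: recently used blocks are
    served from RAM; least-recently-used blocks are evicted, and dirty
    blocks are flushed to the backing disk before being discarded."""

    def __init__(self, backing_store, capacity_blocks):
        self.backing = backing_store       # dict-like: block_id -> bytes
        self.capacity = capacity_blocks
        self.cache = OrderedDict()         # block_id -> (data, dirty)

    def read(self, block_id):
        if block_id in self.cache:         # cache hit: no disk I/O
            self.cache.move_to_end(block_id)
            return self.cache[block_id][0]
        data = self.backing[block_id]      # cache miss: fetch from disk
        self._insert(block_id, data, dirty=False)
        return data

    def write(self, block_id, data):
        # Writes land in RAM and are acknowledged immediately; the
        # dirty block is flushed later, on eviction.
        self._insert(block_id, data, dirty=True)

    def _insert(self, block_id, data, dirty):
        self.cache[block_id] = (data, dirty)
        self.cache.move_to_end(block_id)
        while len(self.cache) > self.capacity:
            old_id, (old_data, was_dirty) = self.cache.popitem(last=False)
            if was_dirty:
                self.backing[old_id] = old_data  # flush before discard
```

A real storage hypervisor layers far more on top (coalescing, read-ahead, failure handling), but the sketch shows why a large RAM cache accelerates both reads and writes.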
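The auto-tiering behaviour described above, ranking blocks by measured I/O activity and placing the hottest ones on the fastest tier, can be sketched in a few lines. This is a simplified illustration under assumed data structures, not DataCore's actual algorithm:

```python
def assign_tiers(block_io_counts, tiers):
    """Place the hottest blocks on the fastest tier.

    block_io_counts: dict mapping block_id -> observed I/O count.
    tiers: list of (tier_name, capacity_in_blocks), fastest first,
           e.g. [("flash", 1), ("sas", 2), ("sata", 10)].
    Returns a dict mapping block_id -> tier_name.
    """
    # Rank blocks from hottest (most I/O) to coolest.
    ranked = sorted(block_io_counts, key=block_io_counts.get, reverse=True)
    placement, index = {}, 0
    for tier_name, capacity in tiers:
        for block_id in ranked[index:index + capacity]:
            placement[block_id] = tier_name
        index += capacity
    return placement
```

In production such a process runs continuously and migrates blocks non-disruptively as their heat changes; the sketch captures only the ranking-and-placement step.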
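The synchronous mirroring described under Business continuity follows a simple rule: a write is acknowledged only after both copies have accepted it, so either copy can serve the data if the other site fails. A minimal sketch (hypothetical class, not DataCore's code) of that write path:

```python
class SynchronousMirror:
    """Sketch of a synchronous mirror: every write is applied to both
    copies before it is acknowledged, so either copy can satisfy reads
    if the other fails (no single point of failure for the data)."""

    def __init__(self, primary, secondary):
        self.primary = primary      # dict-like block stores, one per site
        self.secondary = secondary

    def write(self, block_id, data):
        # Both copies must accept the write before acknowledgement.
        self.primary[block_id] = data
        self.secondary[block_id] = data
        return "ack"                # issued only after both writes landed

    def read(self, block_id):
        # Prefer the primary copy; fall back to the mirror.
        if block_id in self.primary:
            return self.primary[block_id]
        return self.secondary[block_id]
```

Asynchronous replication for disaster recovery relaxes the rule: the write is acknowledged after the first copy lands and shipped to the remote site later, trading a small recovery-point window for distance.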

“DataCore and Imation have a long-standing history of collaboration as evidenced by numerous joint customers worldwide who leverage our solutions as a part of their storage architectures,” said Carlos M. Carreras, vice president of alliances and business development at DataCore. “Nexsan E-Series storage arrays combined with SANsymphony-V offer IT departments a highly efficient blueprint for optimizing IT resources and maximizing the effective value of storage investments.”


For the complete announcement, please see:  Imation News Release

Wednesday, 12 March 2014

DataCore at Cebit 2014 showcases new solutions and the many advantages of DataCore’s Software-defined Storage platform

DataCore’s latest SANsymphony-V platform is being demonstrated at Cebit in Germany, one of the world's largest IT exhibitions; DataCore's booth is within the “Virtualisation & Storage Forum” (hall 2, booth A44).







Additionally, DataCore has introduced its next release of DataCore Virtual Desktop Server (VDS), with new enhancements and pricing that make it simple to deploy cost-effective, easy-to-use virtual desktop environments.



Saturday, 8 March 2014

ARN: IDS snares A/NZ distributorship for DataCore Software Defined Storage; SANsymphony-V storage virtualisation solutions

The Australian value-added distributor, which was founded in 2004, now has offices in Sydney, Brisbane and Canberra and is close to expanding to Melbourne.
IDS has signed a distributorship with storage virtualisation company DataCore to distribute SANsymphony-V, and lists tier-one resellers Fujitsu, Datacom, Data#3, Di Data, Hitachi and AlphaWest on its books.
“We doubled our revenue last year and we expect to do the same again this year,” IDS director Ian Deane said.
“We are an expanding company. Two years ago we were unheard of, now the big boys are sitting up and listening.”
Full article: ARN: IDS snares A/NZ distributorship for DataCore Software

Wednesday, 5 March 2014

SiliconAngle: What’s the future of the data center? What about Software-defined Storage?

 
As detailed in a recent article on Wikibon, “Data centers are at the center of modern software technology, serving a critical role in the expanding capabilities for enterprises.” Data centers have enabled the enterprise to do much more with much less, both in terms of physical space and the time required to create and maintain mission-critical information.

But the technology surrounding the data center is positioned to evolve even more dramatically, in terms of conception, configuration, and utilization. More importantly, the technologies surrounding the data center will both have an impact and be impacted over time. Towards the end of last year, Gartner identified eight areas to consider when developing a data center strategy that balances cost, risk and agility. We wanted to take that analysis further, reaching out to thought leaders across the enterprise technology space, for their perspectives on the future of data center technology.

DataCore CEO on why the traditional storage model is broken
George Teixeira, President, DataCore Software: The traditional storage model is broken. Behavior has shifted and disrupted how businesses buy storage, as they can no longer afford to rip out the old and throw more costly new hardware at their problems. Instead, they are seeking smart automated software that runs on lower-cost hardware, optimizes their existing investments and provides the agility to easily add new technologies non-disruptively. Bottom line: the path to a software-defined data center, where users gain freedom of hardware choice and control of their resources, is inevitable, and that means storage must also be software-defined.

Mr. Teixeira creates and executes the overall strategic direction and vision for DataCore Software. Mr. Teixeira co-founded the company and has served as CEO and President of DataCore Software since 1998.

Full story: What’s the future of the data center? The big list of thought leadership perspective

Friday, 28 February 2014

The Register interviews DataCore CEO who waxes lyrically on software-defined storage and future storage technologies

El Reg: DataCore had a terrific year in 2013, breaking the 10,000 customer site number. It's a solidly successful supplier with a technology architecture that's enabled it and its customers to take advantage of industry standard server advances and storage architecture changes gracefully. 2014 should be no different with DataCore being stronger still by the end of it.

Article: Future storage tech should KILL all-in-one solutions

El Reg had a conversation with DataCore president, CEO and co-founder George Teixeira about what’s likely to happen in 2014 with DataCore. Software-defined storage represents a trend that his company is well-positioned to take advantage of and he reckons DataCore could soar this year.

El Reg: How would you describe 2013 for DataCore?

George Teixeira: 2013 was the ‘tip of the iceberg’, in terms of the increasing complexity and the forces in play disrupting the old storage model, which has opened the market up. As a result, DataCore is positioned to have a breakout year in 2014 … Our momentum surged forward as we surpassed 25,000 license deployments at more than 10,000 customer sites [last year].

What’s more, the EMC ViPR announcement showcased the degree of industry disruption. It conceded that commoditisation and the movement to software-defined storage are inevitable. It was the exclamation point that the traditional storage model is broken.

El Reg: What are the major trends affecting DataCore and its customers in 2014?

George Teixeira: Storage got complicated as flash technologies emerged for performance, while SANs continued to optimise utilisation and management - two completely contradictory trends. Add cloud storage to the mix and all together, it has forced a redefinition of the scope and flexibility required by storage architectures. Moreover, the complexity and need to reconcile these contradictions put automation and management at the forefront, thus software.

A new refresh cycle is underway … the virtualisation revolution has made software the essential ingredient to raise productivity, increase utilisation, and extend the life of … current [IT] investments.

Last year set the tone. Sales were up in flash, commodity storage, and virtualization software. In contrast, look what happened to the expensive, higher margin system sales last year – they were down industry-wide. Businesses realized they no longer can afford inflexible and tactical hardware-defined models. Instead, they are increasingly applying their budget dollars to intelligent software that leverages existing investments and lower cost hardware – and less of it!

Server-side and flash technology for better application performance has taken off. The concept is simple. Keep the disks close to the applications, on the same server and add flash for even greater performance. Don’t go out over the wire to access storage for fear that network latency will slow down I/O response. Meanwhile, the storage networking and SAN supporters contend that server-side storage wastes resources and that flash is only needed for five percent of the real-world workloads, so there is no need to pay such premium prices. They argue it is better to centralize all your assets to get full utilization, consolidate expensive common services like back-ups and increase business productivity by making it easier to manage and readily shareable.

The major rub is obvious: local disk and flash resources improve performance, but unhooking them from the centralised storage infrastructure defeats the management and productivity gains that come from centralisation. Software that can span both worlds appears to be the only way to reconcile the growing contradiction and close the gap. Hence the need for an all-inclusive software-defined storage architecture.

A software-defined storage architecture must manage, optimise and span all storage, whether located server-side or over storage networks. Both approaches make sense and need to be part of a modern software-defined architecture.

Why force users to choose? Our latest SANsymphony-V release allows both worlds to live in harmony since it can run on the server-side, on the SAN or both. Automation in software and auto-tiering across both worlds is just the beginning. Future architectures must take read and write paths, locality, cache and path optimizations and a hundred other factors into account and generally undermine the possibility of all-in-one solutions. A true ‘enterprise-wide’ software-defined storage architecture must work across multiple vendor offerings and across the varied mix and levels of RAM devices, flash technologies, spinning disks and even cloud storage.

El Reg: How will this drive DataCore's product (and service?) roadmap this year?

George Teixeira: The future is an increasingly complex but hidden environment, which will not allow nor require much, if any, human intervention. This evolution is natural. Think about what one expects from today’s popular virtualization offerings and major applications: exactly this type of sophistication and transparency. Why should storage be any different?
Unlike others who are talking roadmaps and promises, DataCore is already out front and has set the standard for software-defined storage: our software is in production at thousands of real-world customer sites today. DataCore offers the most comprehensive and universal set of features, and these features work across all the popular storage vendor brands, models of storage arrays and flash devices, automating and optimizing the use of these devices no matter where they reside, within servers or in storage networks.

DataCore will continue to focus on evolving its server-side capabilities, enhance the performance and productive use of in-memory technologies like DRAM, flash and new wave caching technologies across storage environments. [We'll take] automation to the next level.

DataCore already scales out to support up to 16-node grid architectures, and we expect to quadruple that number this year.

[We] will continue to abstract complexity away from the users and reduce mundane tasks through automation and self-adaptive technologies to increase productivity. ... For larger scale environments and private cloud deployments, there will be a number of enhancements in the areas of reporting, monitoring and storage domain capabilities to simplify management and optimise ‘enterprise-wide’ resource utilisation.

VSAN "opens the door without walking through it."
El Reg: How does DataCore view VMware's VSAN? Is this a storage resource it can use?

George Teixeira: Simply put, it opens the door without walking through it. It introduces virtual pooling capabilities for server-side storage that meet lower-end requirements while delivering promises of things to come for VMware-only environments. It sets the stage for DataCore to fulfil customers’ need to seamlessly build out a production-class, enterprise-wide software-defined storage architecture.

It opens up many opportunities for DataCore for users who want to upscale. VSAN builds on DataCore concepts but is limited and just coming out of beta, whereas DataCore has a 9th generation platform in the marketplace.

Beyond VSAN, DataCore spans a composite world of storage running locally in servers, in storage networks, or in the cloud. Moreover, DataCore supports both physical and virtual storage for VMware, Microsoft Hyper-V, Citrix, Oracle, Linux, Apple, Netware, Unix and other diverse environments found in the real world.

Will you be cheeky enough to take a bite at EMC?
El Reg: How would you compare and contrast DataCore's technology with EMC's ScaleIO?
George Teixeira: We see ourselves as a much more comprehensive solution. We are a complete storage architecture. I think the better question is how will EMC differentiate its ScaleIO offerings from VMware VSAN solutions? Both are trying to address the server-side storage issues using flash technology, but again, why have different software and separate islands or silos of management: one for the VMware world, one for the Microsoft Hyper-V world, one for the physical storage world, one for the SAN?
This looks like a divide and complicate approach when consolidation, automation and common management across all these worlds are the real answers for productivity. That is the DataCore approach. Instead, this silo approach of many offerings appears to be driven by the commercial requirements of EMC versus the real needs of the marketplace.
El Reg: Does DataCore envisage using an appliance model to deliver its software?
George Teixeira: Do we envisage an appliance delivery model? Yes, in fact, it already exists. We work with Fujitsu, Dell and a number of system builders to package integrated software/hardware bundles. We are rolling out the new Fujitsu DataCore SVA storage virtualization appliance platform within Europe this quarter.
We also have the DataCore appliance builder program in place targeting the system builder community and have a number of partners including Synnex Hyve Solutions which provides virtual storage appliances using Dell, HP or Supermicro Appliances Powered by DataCore and Fusion-io.
El Reg: How will DataCore take advantage of flash in servers used as storage memory? I'm thinking of the SanDisk/SMART ULLtraDIMM used in IBM's X6 server and Micron's coming NVDIMM technology.
George Teixeira: We are already positioned to do so. New hybrid NVDIMM type technology is appealing. It sits closer to the CPU and promises flash speeds without some of the negatives. But it is just one of the many innovations yet to come.
Technology will continue to blur the line between memory and disk technologies going forward, especially in the race for faster application response times by getting closer to the CPU and central RAM.
Unlike others who will have to do a lot of hard coding, and suffer the delays and reversals of naivety trying to keep up with new innovations, DataCore learned from experience and designed early on software to rapidly absorb new technology and make it work. Bring on these types of innovations; we are ready to make the most of them.
El Reg: Does DataCore believe object storage technology could be a worthwhile addition to its feature set?
George Teixeira: Is object storage worthwhile? Yes, we are investing R&D and building out our OpenStack capabilities as part of our object storage strategy; other interfaces and standards are also underway. It is part of our software-defined storage platform, but it makes no sense to bet simply on the talk in the marketplace and on these emerging strategies exclusively.
The vast bulk of the marketplace is file and block storage capabilities and that is where we are primarily focused this year. Storage technology advances and evolves, but not necessarily in the way we direct it to, therefore object storage is one of the possibilities for the future along with much more relevant, near term advances that customers can benefit from today.
El Reg: DataCore had a terrific year in 2013, breaking the 10,000 customer site number. It's a solidly successful supplier with a technology architecture that's enabled it and its customers to take advantage of industry standard server advances and storage architecture changes gracefully. 2014 should be no different with DataCore being stronger still by the end of it. ®

Wednesday, 26 February 2014

Opinion and Insights: Revenues Trump Innovation at Large Tech Companies

Let’s face it. Large B2B tech companies don’t come up with groundbreaking innovations. They are too big and are too busy with action items, corporate politics and intense pressure from the street to develop new innovations and bring those ideas to market in a reasonable timeframe. They also have a big reason not to monkey with the status quo, billions of reasons actually. Faster, cheaper, better typically means less revenue.
Turns out Wall Street isn’t a big fan of cool innovations that decrease revenue, which shouldn’t come as a surprise. Perched high on the list of career limiting moves in Silicon Valley is driving a plan to develop innovative products and services that will decrease revenue.
The truth is, most great ideas from the major hardware and software vendors leave with their creators, stop on Sand Hill road for funding and end up in a little office in the Valley. Big ideas can mean big money. Visionary engineers aren’t turning their moment of genius over to a big political machine to get kicked around meeting rooms for three years only to resurface neutered and too late to matter anyway.
Big tech companies, especially hardware vendors, will continue to fight to the death to keep their antiquated products moving off the shelves. Sluggish economics made 2013 a great year to buy dated technology; 70, 80, even 90 percent discounts on six-figure hardware deals. What’s happening here? While these products are selling for next to nothing, vendors are now charging 20 percent of the list price for support and updates.
This is a plan that certainly works for Wall Street and it is hard for customers to resist. What would you do if you bought a new car two years ago for $50,000 and your dealer called you at the end of the year saying they’d give you a new car for $5,000? Anyone would take that deal!
VMware had this same problem getting their groundbreaking technology off the ground. Customers could install their technology and use 90 percent less server hardware. While this was great for the customers, HP, Dell and IBM certainly weren’t happy with VMware. The same thing could be said for resellers, as they certainly weren’t interested in selling 90 percent less hardware.
The resellers are just as wary about messing with their nest egg. However large or small, the few VARs that actually survive their first two years in business did so for a reason: they realized that running a tech resale business is about making money. The rash of leads, rebates and SPIFs coming from their big vendors is too lucrative for the resellers with influence to take a chance on anything new.
The hardest part of getting new innovations to market is cutting through the big tech companies’ marketing machines and their incredibly talented (and well-paid) salespeople to get prospective customers to consider new technologies. There really isn’t a villain to blame, but there is a hero.
Ultimately, the buyers of technology are the heroes of our industry. They are our saving grace. Despite pressure from their upper management, the people in the trenches that do this because they love it are the ones that keep us moving in the right direction.
To those visionary technologists, don’t believe all the marketing hype from big tech. Keep innovating. By doing so, you will help shape the future of the industry.
Paul Murphy is VP of Worldwide Marketing at DataCore.


Thursday, 20 February 2014

Legacy Pharmaceuticals Trusts DataCore to Scale Out Its Virtual Infrastructure and Support its Critical ERP, Oracle and VMware Systems


“DataCore SANsymphony-V gave us more flexibility and scalability than a hardware SAN, and it was able to meet our high availability and performance requirements perfectly. Not only did we gain a redundant SAN architecture, but we achieved a 40 percent savings on the initial investment in comparison to conventional hardware SAN solutions, and now we can adjust our storage infrastructure to meet our requirements without any storage vendor lock-in,” said Sascha Fritz, IT Expert at Legacy Pharmaceuticals Switzerland.

Legacy Pharmaceuticals has entrusted DataCore Software’s SANsymphony-V to implement and scale out its virtual infrastructure to support the introduction of its critical ERP system based on IFS and Oracle databases. The software-defined virtual infrastructure is based on DataCore for storage virtualisation and VMware for server virtualisation.

Overall, the DataCore solution provided higher performance and higher availability compared to all other evaluated alternatives and the total cost, including what was required for a redundant SAN hardware and software design, was around 60 per cent lower than the price of alternative non-redundant single SAN systems.

As a leading company in the Swiss pharmaceutical industry, Legacy Pharma is committed to the highest quality and safety standards. These are mandated by the Swiss regulatory and supervisory authority and have specific compliance requirements and remedies which are regularly audited.

The compliance guidelines, those required by the state as well as those established based on voluntary standards, must also fit within the practical bounds of economic feasibility with regard to the information technology that is implemented. Compliance, cost-effectiveness and innovative IT productivity are key competitive advantages for maintaining leadership in the industry. For these reasons, Legacy Pharma decided to implement a new IT infrastructure and validated ERP system according to the latest GAMP 5 guidelines. "Good Automated Manufacturing Practice" (GAMP) is a standard framework for the validation of computerised systems within the pharmaceutical industry. However, for the ERP system with IFS, the older AS400 environment with NetApp storage, which critically lacked full redundancy, was no longer suitable.

"Therefore our IT environment had to be upgraded. In cooperation with IT service provider Steffen Informatik, we decided to go with a virtual infrastructure and put our ERP system on virtual machines running on VMware," says Sascha Fritz.

At the same time, a new storage solution with appropriate availability and performance needed to be found. The SAN solutions from NetApp, EMC and HP that were evaluated did not allow a fully redundant environment within budget. Finally, the virtualisation experts at Steffen Informatik presented DataCore’s SANsymphony-V storage virtualisation software, which provides sophisticated enterprise storage features while using industry-standard storage components.

SANsymphony-V virtualises storage resources regardless of manufacturer or model, provisions virtual disks to physical and virtual servers, and mirrors them synchronously to achieve high availability between any two DataCore-powered x86-based servers. Storage management for all resources in the infrastructure can be centralised, made flexible and automated. Furthermore, DataCore integrates seamlessly into the VMware server management systems.
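As a rough illustration of the synchronous mirroring behaviour described above, the sketch below (hypothetical Python with invented names such as `StorageNode` and `MirroredVirtualDisk`; not DataCore's actual API or implementation) acknowledges a write only once both nodes hold the block, so either node can serve reads if the other fails.

```python
# Illustrative sketch of synchronous mirroring between two storage nodes.
# All names and logic here are hypothetical, for explanation only.

class StorageNode:
    def __init__(self, name):
        self.name = name
        self.blocks = {}          # block address -> data

    def write(self, addr, data):
        self.blocks[addr] = data  # persist the block locally
        return True               # acknowledge the write


class MirroredVirtualDisk:
    """A virtual disk that mirrors every write to two nodes synchronously."""
    def __init__(self, primary, secondary):
        self.primary = primary
        self.secondary = secondary

    def write(self, addr, data):
        # The write completes only when BOTH nodes have acknowledged it,
        # so a surviving node always holds an up-to-date copy.
        ok1 = self.primary.write(addr, data)
        ok2 = self.secondary.write(addr, data)
        return ok1 and ok2

    def read(self, addr):
        # Reads can be served from either copy; prefer the primary here.
        if addr in self.primary.blocks:
            return self.primary.blocks[addr]
        return self.secondary.blocks.get(addr)


disk = MirroredVirtualDisk(StorageNode("node-a"), StorageNode("node-b"))
disk.write(0, b"payload")
assert disk.primary.blocks[0] == disk.secondary.blocks[0]  # both copies match
```

The point of the sketch is only the ordering guarantee: the caller's write returns after both copies exist, which is what allows transparent failover between the two servers.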

The flexibility of DataCore would have allowed the existing NetApp systems to be easily integrated, but they were instead replaced because of their high ongoing maintenance costs.

The new DataCore and VMware based software-defined virtual infrastructure was preconfigured by Steffen Informatik and installed in parallel to the old solution. Then the subsequent migration of physical servers to virtual machines took place gradually. The new IFS ERP system with all the components for production, development and test systems, including the underlying Oracle databases and MS Exchange, MS SQL, domain server and print services were transferred to the virtual environment. In addition, the Citrix XenApp environment used for virtualisation of applications on thin clients was set up to run on the high-availability DataCore storage.


"The project was managed by our staff and by Steffen Informatik who were very competent and efficient. Importantly they were able to implement the GAMP 5-validation successfully. Compared to the previous solution, we now have a virtual infrastructure that combines VMware and DataCore as a highly flexible and thus more economical platform to meet our needs. DataCore SANsymphony-V enabled greater performance and higher availability at a cost point that other solutions could not equal or provide. We are very satisfied," says Sascha Fritz.

Monday, 17 February 2014

DataCore SANsymphony-V R9.0.3 Storage Virtualization Software Picks Up Storage Magazine Software-defined Storage Award

Please note: Storage Magazine reviewed submissions from earlier in the year and selected DataCore SANsymphony-V R9.0.3 for review and the award. Since then, DataCore has released a major update, R9.0.4, which greatly enhances the functionality; please see: DataCore Software’s Newest SANsymphony-V R9.0.4 Update Release Sets the Standard for Software-Defined Storage Platforms
See today's announcement: Storage Magazine Storage Product of the Year 2013 Awards http://searchvirtualstorage.techtarget.com/feature/DataCore-SANsymphony-V-R903 
The DataCore SANsymphony-V update scored highest in functionality among the finalists in the category. New features included wizards to provision multiple virtual disks from templates, group commands to manage storage for multiple application hosts, storage profiles for improved control of resources, auto-tiering and replication, heat maps to optimize performance, and a database repository option for recording and analyzing performance history.
One judge called DataCore SANsymphony-V R9.0.3 a "nice upgrade to scale-out SAN storage as a virtual appliance" from "one of the original storage virtualization vendors."
DataCore Software SANsymphony-V R9.0.3
"Their GUI has made administration easier," said one long-time user. "Performance Counter Recording is helpful when troubleshooting issues and is easy to use, and Performance Counter keeps tabs on everything to see quickly if there are any issues."
SANsymphony's ninth release also boosted scalability to eight nodes per centralized group, improved response time through I/O tuning and DRAM cache optimizations, and added support for persistent Tier-0 flash memory storage, 16 Gbps Fibre Channel ports and Windows Server 2012. The product scope expanded from external SANs to server-side virtual SANs with internal and direct-attached storage.

Friday, 7 February 2014

Why All the Buzz Around Software Defined Storage?

Check out this video presentation and Q&A session on Software-defined Storage.

Paul Murphy, VP of Marketing at DataCore Software explains why the idea of software-defined storage has picked up momentum in the storage industry and covers the key features and benefits of a software-defined storage solution. View: Why All the Buzz Around Software Defined Storage?



Monday, 23 December 2013

DataCore deployed at over 10,000 customer sites and selected as Software-defined Storage Vendor in Advantage Phase of Gartner Group “IT Market Clock”

DataCore Surpasses 10,000 Customer Sites Globally as Companies Embrace Software-Defined Storage

Customers Realize Software, Not Hardware, Key to Increasing Performance, Reducing Cost and Simplifying Management
DataCore has experienced significant customer adoption of its ninth-generation SANsymphony-V platform in 2013. As the company surpassed 10,000 customer sites globally, new trends have materialized around the need for businesses to rethink their storage infrastructures with software architecture becoming the real blueprint for the next wave of data centers.
“The remarkable increase in infrastructure-wide deployments that DataCore experienced throughout 2013 reflects an irreversible market shift from tactical, device-centric acquisitions to strategic software-defined storage decisions. Its significance is clear when even EMC concedes the rapid commoditization of hardware is underway. Their ViPR announcement acknowledges the ‘sea change’ in customer attitudes and the fact that the traditional storage model is broken,” said George Teixeira, president and CEO at DataCore. “We are clearly in the age of software defined data centers, where virtualization, automation and across-the-board efficiencies must be driven through software. Businesses can no longer afford yearly ‘rip-and-replace’ cycles, and require a cost-effective approach to managing storage growth that allows them to innovate while getting the most out of existing investments.”
In addition to the mass customer adoption, DataCore’s software was recently selected as a software-defined storage vendor in Gartner’s “IT Market Clock for Storage, 2013,” published September 6, 2013. The report, by analysts Valdis Filks, Dave Russell, Arun Chandrasekan et al., identifies software-defined storage vendors in the Advantage Phase, and recognizes two main benefits of software-defined storage:
“First, in the storage domain, the notion of optimizing, perhaps often lowering, storage expenses via the broad deployment of commodity components under the direction of robust, policy-managed software has great potential value. Second, in the data center as a whole, enabling multitenant data and workload mobility among servers, data centers and cloud providers without disrupting application and data services would be transformational.”
Three major themes in 2013 shaped the software-defined storage market and defined the use cases of DataCore’s new customers:
Adoption and Appropriate Use of Flash Storage in the Data Center
As more companies rely on flash to achieve greater performance, a unique challenge arises when redesigning storage architectures. While the rule of thumb is that roughly five percent of workloads require top-tier performance, flash vendors are doing their best to convince customers to go all-flash despite the low ROI. Instead, businesses have turned to auto-tiering software to ensure applications share flash and spinning disk based on the need to optimize both performance and investment. Going beyond other implementations, DataCore has redefined the automation and mobility of data storage with a new policy-managed paradigm that makes auto-tiering a true ‘enterprise wide’ capability that works across multiple vendor offerings and the many levels and varied mix of flash devices and spinning disks.
Host.net is a multinational provider of managed infrastructure services focusing on cloud computing and storage, colocation, connectivity and business continuity for enterprise organizations.
“Flash gives us the greatest levels of performance for our mission critical applications,” said Jeffrey Slapp, CTO of Host.net. “While integral, flash is only a small piece of our storage architecture. In order to help ensure our applications are using the right type of storage for peak performance, we use DataCore's SANsymphony-V platform. The software's intelligence makes sure the more demanding applications use flash and less demanding applications use hard disk. We’ve been able to reduce operational expenses by 35% because of the software’s intelligence capabilities, which allows us to tackle other key business initiatives by leveraging the time we never had previously.”
Virtualizing Storage while Accelerating Performance for Tier-One Applications
Demanding business applications like databases, ERP and mail systems create bottlenecks in any storage architecture due to their rapid activity and intensive I/O and transactional requirements. To offset this, many companies buy high-end storage systems while leaving terabytes of storage unused. Now, though, businesses are able to combine all of their available storage and virtualize it, independent of vendor – creating a single storage pool. Beyond virtualization and pooling, DataCore customers report faster application response times and significant performance increases – accelerating I/O speeds up to five times.
Pee Dee Electric Cooperative is a non-profit, electric cooperative located in Darlington, South Carolina that supplies electricity and other services to more than 30,000 consumers.
“Tier-one applications demand high performance and in the past this translated directly into expensive and overprovisioned storage,” said Robbie Howle, IT Manager at Pee Dee Electric Cooperative. “To help allocate the necessary storage that meets the performance demands of the application, without buying new storage, we leverage the SANsymphony-V platform as it accelerates and virtualizes all of the available storage within the organization. In fact, DataCore’s software-based approach to storage virtualization has drastically reduced costs by enabling us to virtualize storage devices we already had – eliminating the need to pay upwards of $500,000 for a traditional, hardware-based SAN. Moreover, the benefits of the DataCore virtualized storage infrastructure continue to manifest themselves perpetually and we have been able to get twice as much storage, better performance and achieve high availability in going with DataCore.”
Software Management of Incompatible Storage Devices and Models
Many data centers feature a wide variety of storage arrays, devices and product models from a number of different vendors – including EMC, NetApp, IBM, Dell and HP – none of which are directly compatible. Interestingly, DataCore customers report that the issue of incompatibility generally surfaces more when dealing with different hardware models from the same vendor than between different vendors, and thus have turned to management tools that treat all hardware the same.
Maimonides Medical Center, based in Brooklyn, N.Y., is the third-largest independent teaching hospital in the U.S. The hospital has more than 800 physicians relying on its information systems to care for patients around-the-clock.
“Over the past 12 years, our data center has featured eight different storage arrays and various other storage devices from three different vendors,” said Gabriel Sandu, chief technology officer at Maimonides Medical Center. “By using DataCore’s SANsymphony software and currently with its latest iteration of SANsymphony-V R9, we have been able to seamlessly go from one storage array to the next with no downtime to our users. We are able to manage our SAN infrastructure without having to worry or be committed to any particular storage vendor. DataCore’s technology has also allowed us to use midrange storage arrays to get great performance – thereby not needing to go with the more expensive enterprise-class arrays from our preferred manufacturers. DataCore’s thin provisioning has also allowed us to save on storage costs as it allows us to be very efficient with our storage allocation and makes sure no storage goes unused.”

QLogic certifies adapters for software-defined storage; announces Fibre Channel Adapters and FabricCache are DataCore Ready

QLogic FlexSuite Gen 5 Fibre Channel and FabricCache Adapters certified DataCore Ready

The emerging software-defined storage space hit a new milestone after QLogic announced that it has added support for DataCore’s SANsymphony-V virtualization offering. 
http://siliconangle.com/blog/2013/12/18/qlogic-certifies-adapters-for-software-defined-storage/ 

QLogic® FlexSuite™ 2600 Series 16Gb Gen 5 Fibre Channel adapters and FabricCache™ 10000 Series server-based caching adapters are now certified as DataCore Ready, providing full interoperability with SANsymphony-V storage virtualisation solutions from DataCore Software.


DataCore SANsymphony-V is a comprehensive software-defined storage platform that solves many of the difficult storage-related challenges raised by server and desktop virtualisation in data centres and cloud environments. The software significantly improves application performance and response times, enhances data availability and protection to provide superior business continuity and maximises the utilisation of existing storage investments. QLogic FabricCache adapters and FlexSuite Gen 5 Fibre Channel adapters, combined with SANsymphony-V, allow data centres to maximise their network infrastructure for a competitive advantage.

“QLogic channel partners and end-users can now confidently deploy award-winning QLogic adapters with SANsymphony-V to optimise network performance and make the most of their IT investments,” said Joe Kimpler, director of technical alliances, QLogic Corp. “Customers can choose the best QLogic solution—FabricCache adapters for high-performance clustered caching or FlexSuite Gen 5 adapters for ultra-high performance—to best handle their data requirements.”

“DataCore has a long history of collaborating with QLogic to help solve the storage management challenges of our mutual customers,” said Carlos M. Carreras, vice president of alliances and business development at DataCore Software. “QLogic high-performance Gen 5 Fibre Channel adapters and innovative, server-based caching adapters combine with SANsymphony-V to cost-effectively deliver uninterrupted data access, improve application performance and extend the life of storage investments, while providing organisations with greater peace of mind.”

Wednesday, 11 December 2013

DataCore Software Defined Storage and Fusion-io Reduce Costs and Accelerate ERP, SQL, Exchange, SharePoint Applications

BUHLMANN GRUPPE, a leader in steel piping and fittings, headquartered in Bremen, Germany has implemented a storage management and virtual SAN infrastructure based on DataCore’s SANsymphony-V software. SANsymphony-V manages and optimizes the use of both the conventional spinning disks (i.e. “SAS” drives) and the newly integrated flash memory-based Fusion-io ioDrives through DataCore’s automatic tiering and caching technology. With the new DataCore solution in place, the physical servers, the VMware virtual servers and most importantly the critical applications needed to run the business – including Navision ERP software, Microsoft Exchange, SQL and SharePoint – are now failsafe and run faster.

DataCore and Fusion-io = Turbo Acceleration for Tier 1 applications
After successfully testing the implementation, the migration of the physical and virtual servers onto the DataCore-powered SAN was carried out. A number of physical servers, the Microsoft SQL and Exchange systems and other file servers now access the high-performance DataCore storage environment. In addition, DataCore now manages, protects and boosts the performance of storage serving 70 virtual machines under VMware vSphere that host business-critical applications – including the ERP system from Navision, Easy Archive, Easy xBase, Microsoft SharePoint and BA software.

"The response times of our mission-critical Tier 1 applications have improved significantly; performance has doubled through the use of DataCore and Fusion-io," says Mr. Niebur. "The hardware vendor independence provides storage purchasing flexibility. Other benefits include higher utilization of disk space, the performance of flash-based hardware, and faster response times to meet business needs today – and in the future – which combined save us time and money. Even with these new purchases involved, we have realized savings of 50 percent in costs compared to a traditional SAN solution."

Read the full Case Study:
http://www.datacore.com/Libraries/Case_Study_PDFs/Case_Study_-_Buhlmann_Gruppe_US.sflb.ashx

Wednesday, 4 December 2013

A Defining Moment for the Software-Defined Data Center

Article from: http://elettronica-plus.it/a-defining-moment-for-the-software-defined-data-center/

George Teixeira
For some time, enterprise IT heads heard the phrase, “get virtualized or get left behind”, and after kicking the tires, the benefits couldn’t be denied and the rush was on. Now there’s a push to create software-defined data centers. However, there is some trepidation about whether these ground-breaking, more flexible environments can adequately handle the performance and availability requirements of business-critical applications, especially when it comes to the storage part of the equation.

While decision-makers had good reason for concern, they now have an even better reason to celebrate as new storage virtualization platforms have proven to overcome these I/O obstacles.

Just as server hypervisors provided a virtual operating platform, a parallel approach to storage is quickly transforming the economics of virtualization for organizations of all sizes by offering the speed, scalability and continuous availability needed to realize the full benefits of software-defined data centers. Additional benefits being widely reported include:
  • Elimination of storage-related I/O bottlenecks in virtualized data centers
  • Harnessing flash storage resources effectively for even greater application performance
  • Ensuring fast and always available applications without a major storage investment
Performance slowdowns caused by I/O bottlenecks and downtime attributed to storage-related outages are two of the foremost reasons why enterprises have held back from virtualizing their tier-1 applications, like SQL Server, Oracle, SAP and Exchange. This fact comes across clearly in the recent Third Annual State of Virtualization Survey conducted by my company.

In the survey, 42 percent of respondents cited performance degradation or an inability to meet performance expectations as an obstacle preventing them from virtualizing more of their workloads. Yet effective storage virtualization platforms are now overcoming these issues by using device-independent adaptive caching and performance-boosting techniques to absorb wildly variable workloads, enabling applications to run faster when virtualized.

To further increase tier-1 application responsiveness, companies often spend excessively on flash memory-based solid state disks (SSDs). The survey also reveals that 44 percent of respondents found disproportionate storage-related costs to be an obstacle to virtualization. Again, effective storage virtualization platforms now provide a solution with features such as auto-tiering, which optimizes the use of these premium-priced resources alongside more modestly priced, higher-capacity disk drives.

Such a software platform constantly monitors I/O behavior and can intelligently auto-select between server memory caches, flash storage and traditional disk resources in real time to ensure the most suitable class or tier of storage device is assigned to each workload based on its priority and urgency. As a result, a software-defined data center can now deliver unmatched tier-1 application performance with optimum cost efficiency and maximum return on existing storage investments.
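The heat-driven tier selection described above can be sketched in a few lines of hypothetical Python. The thresholds and names (`choose_tier`, `TIERS`) are invented for illustration and are not DataCore's actual logic; the idea is simply that recent I/O activity per block decides its placement.

```python
# Illustrative sketch of heat-based auto-tiering: hot blocks are promoted to
# faster tiers, cold blocks demoted. Hypothetical thresholds, not real code.

TIERS = ["dram_cache", "flash", "spinning_disk"]  # fastest to slowest

def choose_tier(io_per_minute, hot_threshold=100, warm_threshold=10):
    """Pick a storage tier from a block's recent I/O activity (its 'heat')."""
    if io_per_minute >= hot_threshold:
        return "dram_cache"      # hottest data lives in server memory cache
    if io_per_minute >= warm_threshold:
        return "flash"           # warm data goes to flash/SSD
    return "spinning_disk"       # cold data stays on capacity disks

# A background monitor would periodically re-evaluate each block's placement
# from a heat map of observed I/O rates:
heat_map = {"erp_db_block": 250, "mail_index": 40, "old_archive": 1}
placement = {blk: choose_tier(heat) for blk, heat in heat_map.items()}
# The busy ERP block ends up in the fastest tier, the cold archive on disk.
```

In a real platform the policy would also weigh administrator-assigned priorities and move data incrementally, but the core decision is this kind of threshold test repeated over time.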

Once I/O intensive tier-1 applications are virtualized, the storage virtualization platform ensures high availability. It eliminates single points of failure and disruption through application-transparent physical separation, stretched across rooms or off-site with full auto-recovery capabilities for the highest levels of business continuity. The right platform can effectively virtualize whatever storage is on a user’s floor, whether direct-attached or SAN-connected, to achieve a robust and responsive shared storage environment necessary to support highly dynamic virtual IT environments.

Yes, the storage virtualization platform is a defining moment for the software-defined data center. The performance, speed and high availability needed for mission-critical databases and applications in a virtualized environment have been realized. Barriers have been removed and there’s a clear, supported path to greater cost efficiency.

Still, selecting the right platform is critical to a data center. Technology that is full-featured and has been proven “in the field” is essential. Also, it’s important to go with an independent, pure software virtualization solution in order to avoid hardware lock-in, and to take advantage of the future storage developments that undoubtedly will come.