Friday 28 February 2014

The Register interviews DataCore CEO, who waxes lyrical about software-defined storage and future storage technologies

Article: Future storage tech should KILL all-in-one solutions

El Reg had a conversation with DataCore president, CEO and co-founder George Teixeira about what 2014 is likely to hold for DataCore. Software-defined storage is a trend his company is well placed to take advantage of, and he reckons it could soar this year.

El Reg: How would you describe 2013 for DataCore?

George Teixeira: 2013 was the ‘tip of the iceberg’, in terms of the increasing complexity and the forces in play disrupting the old storage model, which has opened the market up. As a result, DataCore is positioned to have a breakout year in 2014 … Our momentum surged forward as we surpassed 25,000 license deployments at more than 10,000 customer sites [last year].

What’s more, the EMC ViPR announcement showcased the degree of industry disruption. It conceded that commoditisation and the movement to software-defined storage are inevitable. It was the exclamation point that the traditional storage model is broken.

El Reg: What are the major trends affecting DataCore and its customers in 2014?

George Teixeira: Storage got complicated as flash technologies emerged for performance while SANs continued to optimise utilisation and management: two completely contradictory trends. Add cloud storage to the mix and, taken together, these forces have redefined the scope and flexibility required of storage architectures. Moreover, the complexity and the need to reconcile these contradictions put automation and management, and thus software, at the forefront.

A new refresh cycle is underway … the virtualisation revolution has made software the essential ingredient to raise productivity, increase utilisation, and extend the life of … current [IT] investments.

Last year set the tone. Sales were up in flash, commodity storage, and virtualization software. In contrast, look what happened to expensive, higher-margin system sales last year: they were down industry-wide. Businesses realized they can no longer afford inflexible and tactical hardware-defined models. Instead, they are increasingly applying their budget dollars to intelligent software that leverages existing investments and lower-cost hardware, and less of it!

Server-side and flash technology for better application performance has taken off. The concept is simple: keep the disks close to the applications, on the same server, and add flash for even greater performance. Don't go out over the wire to access storage, for fear that network latency will slow down I/O response. Meanwhile, storage networking and SAN supporters contend that server-side storage wastes resources and that flash is only needed for five percent of real-world workloads, so there is no need to pay such premium prices. They argue it is better to centralize all your assets to get full utilization, consolidate expensive common services like back-ups, and increase business productivity by making storage easier to manage and readily shareable.

The major rub is obvious: local disk and flash resources improve performance, but unhooking them from the centralised storage infrastructure defeats the management and productivity gains that centralisation provides. Software that can span these worlds appears to be the only way to reconcile the growing contradiction and close the gap. Hence the need for an all-inclusive software-defined storage architecture.

A software-defined storage architecture must manage, optimise and span all storage, whether located server-side or over storage networks. Both approaches make sense and need to be part of a modern software-defined architecture.

Why force users to choose? Our latest SANsymphony-V release allows both worlds to live in harmony, since it can run on the server side, on the SAN, or both. Automation in software and auto-tiering across both worlds is just the beginning. Future architectures must take read and write paths, locality, cache and path optimizations and a hundred other factors into account, factors that generally undermine the possibility of all-in-one solutions. A true 'enterprise-wide' software-defined storage architecture must work across multiple vendor offerings and across the varied mix and levels of RAM devices, flash technologies, spinning disks and even cloud storage.
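
To make that concrete, here is a minimal sketch in Python of the kind of heat-based auto-tiering decision such an architecture implies: blocks accrue a decayed "heat" score as they are accessed, and the hottest blocks are periodically promoted to the fastest tier with free capacity, whether that tier lives server-side or out on the SAN. The tier names, decay factor and capacity model are illustrative assumptions, not DataCore's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    latency_us: int            # rough access latency, used only for ordering

@dataclass
class Block:
    block_id: int
    tier: Tier
    heat: float = 0.0          # exponentially decayed access count

# Tiers ordered fastest to slowest: server-side flash, SAN flash, SAN disk.
TIERS = [Tier("server-flash", 100), Tier("san-flash", 500), Tier("san-disk", 5000)]

def record_access(block: Block, decay: float = 0.9) -> None:
    """Bump a block's heat on each access, letting old accesses fade over time."""
    block.heat = block.heat * decay + 1.0

def retier(blocks: list, capacity: dict) -> None:
    """Move the hottest blocks to the fastest tiers that still have room."""
    remaining = dict(capacity)              # tier name -> free block slots
    for block in sorted(blocks, key=lambda b: b.heat, reverse=True):
        for tier in TIERS:                  # fastest tier with free capacity wins
            if remaining.get(tier.name, 0) > 0:
                remaining[tier.name] -= 1
                block.tier = tier
                break                       # blocks that find no room keep their tier
```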

El Reg: How will this drive DataCore's product (and service?) roadmap this year?

George Teixeira: The future is an increasingly complex but hidden environment, one which will neither allow nor require much, if any, human intervention. This evolution is natural. Think about what one expects from today's popular virtualization offerings and major applications: exactly this type of sophistication and transparency. Why should storage be any different?
Unlike others who are talking roadmaps and promises, DataCore is already out front and has set the standard for software-defined storage: our software is in production at thousands of real-world customer sites today. DataCore offers the most comprehensive and universal set of features, and these features work across all the popular storage vendor brands, models of storage arrays and flash devices, automating and optimizing the use of these devices no matter where they reside, within servers or in storage networks.

DataCore will continue to focus on evolving its server-side capabilities and on enhancing the performance and productive use of in-memory technologies like DRAM, flash and new-wave caching technologies across storage environments. [We'll take] automation to the next level.
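
As a rough illustration of what DRAM caching in front of slower storage involves, here is a minimal sketch of a least-recently-used read cache: hits are served from memory, misses go out to flash or disk and displace the coldest cached block. The backing-store interface and cache size are assumptions for illustration, not DataCore's implementation.

```python
from collections import OrderedDict

class CachedVolume:
    """Illustrative DRAM read cache in front of a slower backing store."""

    def __init__(self, backing_store, cache_blocks: int = 1024):
        self.backing = backing_store   # assumed to expose read_block(n) -> bytes
        self.cache = OrderedDict()     # block number -> data, kept in LRU order
        self.capacity = cache_blocks

    def read_block(self, n: int) -> bytes:
        if n in self.cache:                 # hit: serve from DRAM, refresh recency
            self.cache.move_to_end(n)
            return self.cache[n]
        data = self.backing.read_block(n)   # miss: fetch from flash/disk
        self.cache[n] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least-recently-used block
        return data
```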

DataCore already scales out to support up to 16-node grid architectures, and we expect to quadruple that number this year.

[We] will continue to abstract complexity away from the users and reduce mundane tasks through automation and self-adaptive technologies to increase productivity. ... For larger scale environments and private cloud deployments, there will be a number of enhancements in the areas of reporting, monitoring and storage domain capabilities to simplify management and optimise ‘enterprise-wide’ resource utilisation.

VSAN "opens the door without walking through it."
El Reg: How does DataCore view VMware's VSAN? Is this a storage resource it can use?

George Teixeira: Simply put, it opens the door without walking through it. It introduces virtual pooling capabilities for server-side storage that meet lower-end requirements while delivering promises of things to come for VMware-only environments. It sets the stage for DataCore to fulfil customers' need to seamlessly build out a production-class, enterprise-wide software-defined storage architecture.

It opens up many opportunities for DataCore among users who want to scale up. VSAN builds on DataCore concepts but is limited and just coming out of beta, whereas DataCore has a ninth-generation platform in the marketplace.

Beyond VSAN, DataCore spans a composite world of storage running locally in servers, in storage networks, or in the cloud. Moreover, DataCore supports both physical and virtual storage for VMware, Microsoft Hyper-V, Citrix, Oracle, Linux, Apple, NetWare, Unix and the other diverse environments found in the real world.

Will you be cheeky enough to take a bite at EMC?
El Reg: How would you compare and contrast DataCore's technology with EMC's ScaleIO?

George Teixeira: We see ourselves as a much more comprehensive solution. We are a complete storage architecture. I think the better question is how EMC will differentiate its ScaleIO offerings from VMware's VSAN.

Both are trying to address server-side storage issues using flash technology, but again, why have different software and separate islands or silos of management: one for the VMware world, one for the Microsoft Hyper-V world, one for the physical storage world, one for the SAN?

This looks like a divide-and-complicate approach when consolidation, automation and common management across all these worlds are the real answers for productivity. That is the DataCore approach. The silo approach of many offerings appears to be driven by the commercial requirements of EMC rather than the real needs of the marketplace.
El Reg: Does DataCore envisage using an appliance model to deliver its software?

George Teixeira: Do we envisage an appliance delivery model? Yes, in fact, it already exists. We work with Fujitsu, Dell and a number of system builders to package integrated software/hardware bundles. We are rolling out the new Fujitsu DataCore SVA storage virtualization appliance platform in Europe this quarter.

We also have the DataCore appliance builder program in place, targeting the system builder community, with a number of partners including Synnex Hyve Solutions, which provides virtual storage appliances powered by DataCore and Fusion-io on Dell, HP or Supermicro hardware.
El Reg: How will DataCore take advantage of flash in servers used as storage memory? I'm thinking of the SanDisk/SMART ULLtraDIMM used in IBM's X6 server and Micron's coming NVDIMM technology.

George Teixeira: We are already positioned to do so. New hybrid NVDIMM-type technology is appealing. It sits closer to the CPU and promises flash speeds without some of the negatives. But it is just one of many innovations yet to come.

Technology will continue to blur the line between memory and disk going forward, especially in the race for faster application response times by getting closer to the CPU and central RAM.

Unlike others who will have to do a lot of hard coding, and suffer the delays and reversals that come from naively trying to keep up with new innovations, DataCore learned from experience and designed its software early on to rapidly absorb new technology and make it work. Bring on these types of innovations; we are ready to make the most of them.
El Reg: Does DataCore believe object storage technology could be a worthwhile addition to its feature set?

George Teixeira: Is object storage worthwhile? Yes, we are investing R&D in building out our OpenStack capabilities as part of our object storage strategy; other interfaces and standards are also underway. It is part of our software-defined storage platform, but it makes no sense to bet exclusively on the talk in the marketplace and on these emerging strategies.

The vast bulk of the marketplace is file and block storage, and that is where we are primarily focused this year. Storage technology advances and evolves, but not necessarily in the way we direct it to; object storage is therefore one possibility for the future, alongside more relevant, near-term advances that customers can benefit from today.
El Reg: DataCore had a terrific year in 2013, breaking the 10,000 customer site number. It's a solidly successful supplier with a technology architecture that's enabled it and its customers to take advantage of industry standard server advances and storage architecture changes gracefully. 2014 should be no different with DataCore being stronger still by the end of it. ®

Wednesday 26 February 2014

Opinion and Insights: Revenues Trump Innovation at Large Tech Companies

Let’s face it. Large B2B tech companies don’t come up with groundbreaking innovations. They are too big and are too busy with action items, corporate politics and intense pressure from the street to develop new innovations and bring those ideas to market in a reasonable timeframe. They also have a big reason not to monkey with the status quo, billions of reasons actually. Faster, cheaper, better typically means less revenue.
Turns out Wall Street isn’t a big fan of cool innovations that decrease revenue, which shouldn’t come as a surprise. Perched high on the list of career limiting moves in Silicon Valley is driving a plan to develop innovative products and services that will decrease revenue.
The truth is, most great ideas from the major hardware and software vendors leave with their creators, stop on Sand Hill Road for funding, and end up in a little office in the Valley. Big ideas can mean big money. Visionary engineers aren't turning their moment of genius over to a big political machine to get kicked around meeting rooms for three years, only to resurface neutered and too late to matter anyway.
Big tech companies, especially hardware vendors, will continue to fight to the death to keep their antiquated products moving off the shelves. Sluggish economics made 2013 a great year to buy dated technology; 70, 80, even 90 percent discounts on six-figure hardware deals. What’s happening here? While these products are selling for next to nothing, vendors are now charging 20 percent of the list price for support and updates.
This is a plan that certainly works for Wall Street, and it is hard for customers to resist. What would you do if you bought a new car two years ago for $50,000 and your dealer called you at the end of the year saying they'd give you a new car for $5,000? Anyone would take that deal!
VMware had this same problem getting its groundbreaking technology off the ground. Customers could install VMware's technology and use 90 percent less server hardware. While this was great for the customers, HP, Dell and IBM certainly weren't happy with VMware. The same thing could be said for resellers, as they certainly weren't interested in selling 90 percent less hardware.
The resellers are just as wary about messing with their nest egg. However large or small, the few VARs that actually survive their first two years in business did so for a reason: they realized that running a tech resale business is about making money. The rash of leads, rebates and SPIFs coming from their big vendors is too lucrative for the resellers with influence to take a chance on anything new.
The hardest part of getting new innovations to market is cutting through the big tech companies' marketing machines and their incredibly talented (and well-paid) salespeople to get prospective customers to consider new technologies. There really isn't a villain to blame, but there is a hero.
Ultimately, the buyers of technology are the heroes of our industry. They are our saving grace. Despite pressure from their upper management, the people in the trenches who do this because they love it are the ones who keep us moving in the right direction.
To those visionary technologists, don’t believe all the marketing hype from big tech. Keep innovating. By doing so, you will help shape the future of the industry.
Paul Murphy is VP of Worldwide Marketing at DataCore.


Thursday 20 February 2014

Legacy Pharmaceuticals Trusts DataCore to Scale Out Its Virtual Infrastructure and Support Its Critical ERP, Oracle and VMware Systems


“DataCore SANsymphony-V gave us more flexibility and scalability than a hardware SAN, and it was able to meet our high availability and performance requirements perfectly. Not only did we gain a redundant SAN architecture, but we achieved a 40 percent saving on the initial investment in comparison to conventional hardware SAN solutions, and now we can adjust our storage infrastructure to meet our requirements without any storage vendor lock-in,” said Sascha Fritz, IT Expert at Legacy Pharmaceuticals Switzerland.

Legacy Pharmaceuticals has entrusted DataCore Software's SANsymphony-V with implementing and scaling out its virtual infrastructure to support the introduction of its critical ERP system based on IFS and Oracle databases. The software-defined virtual infrastructure is based on DataCore for storage virtualisation and VMware for server virtualisation.

Overall, the DataCore solution provided higher performance and higher availability compared to all other evaluated alternatives and the total cost, including what was required for a redundant SAN hardware and software design, was around 60 per cent lower than the price of alternative non-redundant single SAN systems.

As a leading company in the Swiss pharmaceutical industry, Legacy Pharma is committed to the highest quality and safety standards. These are mandated by the Swiss regulatory and supervisory authority and have specific compliance requirements and remedies which are regularly audited.

The compliance guidelines, both those mandated by the state and those adopted as voluntary standards, must also fit within the practical bounds of economic feasibility with regard to the information technology that is implemented. Compliance, cost-effectiveness and innovative IT productivity are key competitive advantages in maintaining leadership in the industry. For these reasons, Legacy Pharma decided to implement a new IT infrastructure and a validated ERP system according to the latest GAMP 5 guidelines. "Good Automated Manufacturing Practice" (GAMP, for short) is a standard framework for the validation of computerised systems within the pharmaceutical industry. However, for the new IFS-based ERP system, the older AS/400 environment with NetApp storage, which critically lacked full redundancy, was no longer suitable.

"Therefore our IT environment had to be upgraded. In cooperation with IT service provider Steffen Informatik, we decided to go with a virtual infrastructure and put our ERP system on virtual machines running on VMware," says Sascha Fritz.

At the same time, a new storage solution with appropriate availability and performance needed to be found. The SAN solutions from NetApp, EMC and HP that were evaluated did not allow a fully redundant environment within budget. Finally, the virtualisation experts at Steffen Informatik presented DataCore's SANsymphony-V storage virtualisation software, which provides sophisticated enterprise storage features using industry-standard storage components.

SANsymphony-V virtualises storage resources regardless of manufacturer or model, provisions virtual disks to physical and virtual servers, and mirrors them synchronously to achieve high availability between any two DataCore-powered x86-based servers. Storage management for all resources in the infrastructure can be centralised, made flexible and automated. Furthermore, DataCore integrates seamlessly into VMware's server management systems.
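
Synchronous mirroring of this kind follows a simple rule: a write is acknowledged to the host only after both nodes have committed it, so either node can serve the data if its partner fails. Below is a minimal sketch of that write path; the node objects and their write_block method are hypothetical stand-ins for illustration, not DataCore's API.

```python
class MirrorWriteError(Exception):
    """Raised when either side of the mirror fails to commit a write."""

def mirrored_write(primary, secondary, block: int, data: bytes) -> None:
    """Acknowledge a host write only after BOTH nodes have committed it.

    `primary` and `secondary` are hypothetical node objects exposing
    write_block(block, data) -> bool. A real product would issue the two
    writes in parallel and trigger resynchronisation on failure; they are
    sequential here for simplicity.
    """
    ok_primary = primary.write_block(block, data)
    ok_secondary = secondary.write_block(block, data)
    if not (ok_primary and ok_secondary):
        raise MirrorWriteError(f"block {block}: mirror out of sync")
    # Only at this point may the host's write be acknowledged: both copies
    # are identical, so either node can survive the loss of its partner.
```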

The flexibility of DataCore would have allowed the existing NetApp systems to be integrated easily, but they were instead replaced due to their high ongoing maintenance costs.

The new DataCore and VMware based software-defined virtual infrastructure was preconfigured by Steffen Informatik and installed in parallel with the old solution. The migration of physical servers to virtual machines then took place gradually. The new IFS ERP system with all its components for production, development and test, including the underlying Oracle databases as well as MS Exchange, MS SQL, domain server and print services, was transferred to the virtual environment. In addition, the Citrix XenApp environment used for virtualisation of applications on thin clients was set up to run on the highly available DataCore storage.


"The project was managed by our staff and by Steffen Informatik who were very competent and efficient. Importantly they were able to implement the GAMP 5-validation successfully. Compared to the previous solution, we now have a virtual infrastructure that combines VMware and DataCore as a highly flexible and thus more economical platform to meet our needs. DataCore SANsymphony-V enabled greater performance and higher availability at a cost point that other solutions could not equal or provide. We are very satisfied," says Sascha Fritz.

Monday 17 February 2014

DataCore SANsymphony-V R9.0.3 Storage Virtualization Software Picks Up Storage Magazine Software-defined Storage Award

Please note: Storage Magazine reviewed submissions from earlier in the year and selected DataCore SANsymphony-V R9.0.3 for review and the award. Since then, DataCore has shipped a major update, R9.0.4, which greatly enhances the functionality. Please see: DataCore Software's Newest SANsymphony-V R9.0.4 Update Release Sets the Standard for Software-Defined Storage Platforms
See today's announcement: Storage Magazine Storage Product of the Year 2013 Awards http://searchvirtualstorage.techtarget.com/feature/DataCore-SANsymphony-V-R903 
The DataCore SANsymphony-V update scored highest in functionality among the finalists in the category. New features included wizards to provision multiple virtual disks from templates, group commands to manage storage for multiple application hosts, storage profiles for improved control of resources, auto-tiering and replication, heat maps to optimize performance, and a database repository option for recording and analyzing performance history.
One judge called DataCore SANsymphony-V R9.0.3 a "nice upgrade to scale-out SAN storage as a virtual appliance" from "one of the original storage virtualization vendors."
"Their GUI has made administration easier," said one long-time user. "Performance Counter Recording is helpful when troubleshooting issues and is easy to use, and Performance Counter keeps tabs on everything to see quickly if there are any issues."
SANsymphony's ninth release also boosted scalability to eight nodes per centralized group, improved response time through I/O tuning and DRAM cache optimizations, and added support for persistent Tier-0 flash memory storage, 16 Gbps Fibre Channel ports and Windows Server 2012. The product scope expanded from external SANs to server-side virtual SANs with internal and direct-attached storage.

Friday 7 February 2014

Why All the Buzz Around Software Defined Storage?

Check out this video presentation and Q&A session on Software-defined Storage.

Paul Murphy, VP of Marketing at DataCore Software, explains why the idea of software-defined storage has picked up momentum in the storage industry, and covers the key features and benefits of a software-defined storage solution. View: Why All the Buzz Around Software Defined Storage?
