Tuesday, 28 March 2017

Lenovo launches turnkey software-defined storage (SDS) appliances Powered by DataCore

Lenovo has announced a series of turnkey software-defined storage (SDS) appliances that leverage its robust server platform combined with advanced storage virtualization software and multi-core optimizing parallel I/O – SANsymphony™ from DataCore Software. These new offerings are the latest milestone in Lenovo’s effort to drive adoption of software-defined data center technology, which offers compelling customer benefits such as higher agility and simplicity, as well as better cost economics.

Press Release: LENOVO DX8200D TO SIMPLIFY SOFTWARE-DEFINED STORAGE ADOPTION



For more details, product guides and videos, please click: 
http://www3.lenovo.com/au/en/data-center/storage/software-defined-storage/Lenovo-Storage-DX8200D-powered-by-DataCore/p/77X2SHESH08

Thursday, 23 March 2017

Big Day for DataCore: Strategic Partnership with Lenovo to Deliver Turnkey Simple Appliances to Fuel Mainstream Market Adoption of Software-defined Storage

Contributed by George Teixeira, President and CEO, DataCore Software
In every company’s life there are inflection points that create new possibilities, introduce new ways to serve customers and make the company an even greater force in the marketplace. Last week, with the announcement of the Lenovo Storage DX8200D Powered by DataCore, DataCore had that kind of moment.
While this exciting announcement is the culmination of a lot of work, and there are many people to thank, it is also a new beginning. 
DataCore has always been a software-only company, and in fact, the flexibility inherent in software has always been a strength. That’s still true, but we know there are customers who prefer a fast-to-deploy all-in-one hardware appliance, with everything pre-installed, configured, optimized, and validated. With the Lenovo DX8200D, they can have it.
This new turnkey appliance makes it simple to gain flexibility and reduce costs, and it’s a great fit:
The combined DataCore and Lenovo appliance includes multi-core optimizing parallel I/O software that set the world record for storage performance in response time and achieved the best price-performance, according to the industry’s most respected and audited benchmark from the Storage Performance Council.
We’re very proud that Lenovo is leveraging our adaptive and self-tuning storage management and parallel I/O technology, experience and know-how to make it easier for the broader market to deploy software-defined storage in a simple, turnkey appliance backed by Lenovo’s one-stop shopping and support.
Whether you’re an end-user IT professional, a channel partner, or just an interested industry follower, we’d love to tell you more. Please take a moment to learn more or watch this brief video overview.

Tuesday, 21 March 2017

LENOVO DX8200D TO SIMPLIFY SOFTWARE-DEFINED STORAGE ADOPTION WITH APPLIANCES POWERED BY DATACORE

Lenovo announced a turnkey software-defined storage (SDS) appliance that leverages its robust server platform combined with advanced storage virtualization software and multi-core optimizing parallel I/O – SANsymphony™ from DataCore Software. This new offering is the latest milestone in Lenovo’s effort to drive adoption of software-defined data center technology, which offers compelling customer benefits such as higher agility and simplicity, as well as better cost economics.



Lenovo plans to offer the DX8200D as a pre-integrated appliance, which promises to greatly simplify deployment and reduce management expenses while providing a single point of support.  The latest offering in Lenovo’s SDS portfolio, it enables data centers to rapidly deploy a turnkey solution that harnesses the capabilities of existing SAN arrays.  It can optimize heterogeneous storage infrastructures, enabling them to scale as needs grow and easily replace older storage arrays. Through the centralized interface, the DX8200D provides data protection, replication, de-duplication, compression, and other enterprise storage capabilities at a much lower price point than traditional SAN arrays.

Storage for Software Defined Data Centers

http://shop.lenovo.com/us/en/systems/storage/sds/datacore/ 



 As data growth continues to outpace IT budgets, you need a brand new approach to storage. The Lenovo Storage DX8200D powered by DataCore appliance allows you to architect block and file storage for faster performance while simplifying operations. This validated, turnkey solution provides easy scalability and single pane management at a fraction of the cost of legacy systems—without trade-offs in availability, reliability or functionality. Lenovo’s global leadership and innovation includes industry-leading reliability and world-class performance for business-critical applications and cloud deployments. You can directly obtain a single point of contact for 24/7 technical assistance from Lenovo’s support organization, which is rated #1 for overall x86 server customer satisfaction.


The Register: Lenovo gets into server-based storage virtualization with DataCore

https://www.theregister.co.uk/2017/03/15/lenovo_server_based_storage_virtualization_datacore/

Our server, your software - let's play a SAN symphony together

The two have history: their set of SPC-1 benchmark results shows how parallel I/O-processing DataCore software running on simple Lenovo servers apparently equals the performance of multi-million-dollar storage array rigs from other suppliers.
The DX8200D is a turnkey system, coming with SANsymphony software from DataCore, and Lenovo says it ships preconfigured, tested and optimised, and harnesses the capabilities of existing SAN arrays. The software virtualizes other arrays and adds their storage to the SANsymphony pool.
The server is a Lenovo x3650 M5, as used in the SPC-1 benchmark runs, and comes with predictive failure analysis and a diagnostic panel for serviceability. It comes with Lenovo XClarity management software which automates discovery, inventory tracking, real-time monitoring, configuration, fault detection, and alert handling.

The SANsymphony software provides storage virtualization, data protection, replication, de-duplication, compression, and other enterprise storage capabilities, we’re told, at a much lower price point than traditional SAN arrays.
Lenovo cites a TechValidate research effort to claim customers can “realise lower total cost of ownership, with an up to 90 per cent decrease in time spent on storage management and support tasks, up to a 75 per cent reduction in storage costs and up to 100 per cent reduction in storage-related downtime. With a 10-fold increase in performance, data centres also can realise higher availability of mission-critical data.” There’s grist for the channel mill.

Radhika Krishnan, Lenovo’s executive director and GM for software-defined data centre and networking in its Data Centre Group, issued a quote from the cannery, saying the DX8200D “is in stark contrast to traditional storage offerings, from legacy vendors, which often-times require compromises in performance, availability, reliability and functionality — limiting the ability to scale and increasing CAPEX, power, cooling and footprint costs.”

Lenovo said its support provides 24 x 7 technical assistance for both hardware and software questions...

Tuesday, 21 February 2017

Healthcare Customers Gain Critical Performance and Flexibility Benefits with DataCore Hyperconverged and Software-Defined Storage Solutions


Healthcare IT departments are challenged every day to deliver life-saving system performance while keeping costs within budget. That's why a growing number of healthcare institutions are turning to DataCore Software, a leading provider of Hyper-converged Virtual SAN, Software-Defined Storage and Adaptive Parallel I/O software. DataCore enables these organizations to address mission-critical IT challenges while maximizing the performance, availability and utilization of IT resources - enhancing patient outcomes while keeping costs low.
"In healthcare, ultra-fast application response times are critical," said George Teixeira, president and CEO of DataCore Software. "Slow response from systems such as X-Rays, MRIs, or CAT scans, or the inability to immediately access critical patient information can have life-altering consequences, and as a result, delays are simply unacceptable."
Furthermore, with the ongoing and massive data growth from medical images, including multi-dimensional, 3D and even motion-based image formats, as well as the continuing move to electronic health records, storage requirements and the cost to manage them are also on the rise.
DataCore software delivers record-breaking performance via its Parallel I/O technology, which pairs well with Lenovo's powerful servers featuring x86-64 processors. The combination offers the industry's fastest I/O response time and the best price-performance with self-tuning features that automatically move data between spinning disks and flash based on workload priorities. An example of a healthcare customer that relies on DataCore and Lenovo is the Comprehensive Cancer Center at the University of Puerto Rico (CCCUPR) -- one of the most advanced hospitals and cancer research facilities in North America.
The Comprehensive Cancer Center at the University of Puerto Rico (CCCUPR)
CCCUPR needed a powerful, easy to operate and flexible solution to manage its critical medical records and the growing oncology imaging requirements from its Picture Archive & Communications System (PACS). Since the hospital and the research center are separated by about two miles, patient information also needed to be shared and protected from unplanned events at all times.  
According to Luis M. Wilkes, director of Information Systems for CCCUPR, "The combination of DataCore and Lenovo has maximized IT infrastructure performance, availability and utilization by delivering a high availability, software-defined storage solution to support our operations. The DataCore-Lenovo solution ensures that critical health information systems, such as our PACS, are available online and on demand. Going forward, we have the flexibility to meet changing demands with DataCore software running on Lenovo and virtualizing, protecting and accelerating our systems and applications."
CCCUPR now has six Lenovo Series x3650 servers running DataCore software at the primary site. For disaster recovery, the solution includes advanced DataCore replication to two additional Lenovo Series x3650 servers at the secondary location.
A common thread among the many new healthcare organizations that have deployed DataCore is that all have done so to achieve significant gains in performance, scalability and reliability. DataCore enables users to:
  • Speed Up Applications - Faster applications (databases, critical applications, virtualized applications, etc.) means more transactions are processed in less time, and more data is analyzed faster, leading to increased productivity.
  • Scale within Budget - DataCore ensures the lowest TCO to scale-up or scale-out. This enables users to run more workloads, with better performance and availability, on far fewer servers and to utilize the infrastructure already in place for remarkable cost savings, both direct and indirect (less power, cooling and space). Hardware-independent software ensures that storage services live beyond the current generation of infrastructure technology and survive change.
  • Protect Data and Applications - DataCore provides the highest availability with the fewest nodes. Highly-available infrastructure reduces disruptions to business operations and decreases risk.
The result is greater consolidation savings, better performance and higher availability for critical healthcare applications, databases, and other virtualized applications.

For more information about DataCore's healthcare customer experiences with hyper-converged and software-defined storage,                                    

Tuesday, 14 February 2017

Digitalisation World: Learning Loves Core IT Challenges

DataCore Software says that a growing number of UK educational institutions are deploying its scalable storage services platform, SANsymphony™ to address their critical IT challenges, increase performance and reduce infrastructure costs.
From leading data and research-rich university seats of learning - including Oxford University and the University of Birmingham - through to independent and state secondary schools, DataCore’s solutions are seeing increased uptake for storage savings, failover, resilience and much faster performance. Spencer Webb of the University of Birmingham noted:

 “Prior to the install of DataCore, failover was complicated and fully manual. We needed automatic resilience without human intervention. It needed to be fast, easy and cost effective, to work with our existing storage and support VMware.”

Modern Educational Institutions Today Demand a Faster and More Reliable Architecture to Drive their Key Applications and Workloads without the Enterprise Price Tag:

Just as enterprises need better performance and higher SLAs from their key applications, educational institutions now need the same, but typically have stricter budget considerations than their enterprise cousins. Successful education establishments need to seamlessly run any application, 24x7, deploying apps on any storage across many different environments. They can now do this and protect their existing investments by allowing legacy storage to sit behind DataCore to gain intelligent storage services such as auto-tiering, faster performance and a single management interface and view of their storage infrastructure.
So no matter how diverse the storage may be, or which topology the education establishment has chosen or inherited, DataCore’s software-defined solution offers the following benefits:

- Applications run faster and uninterrupted.
- Existing storage is pooled, tiered and data protected automatically.

- Storage assets are centralised and managed universally.

The net result is better performance and availability for databases, VDI, and other applications, both virtualised and physical, at a much lower cost. That’s a critical point as another leading London University noted:

“DataCore is installed at both data centres and is critical to keep services running. We used to suffer downtime, now we can fail over to either site and we can still meet our SLAs – upgrades and maintenance can occur without any downtime on critical apps. As a result of running DataCore, Regents University have reduced storage related costs by 25%.” Zubair Fakir, Regents University, London.

Meanwhile, in the secondary school sector, whilst the number of users and data sets is smaller and planned maintenance windows are longer during the school holidays, availability of data and apps is now deemed critical. One leading independent grammar school in the North of England noted:

“We are now thrilled with our optimised and highly available virtual environment. You simply get what you pay for in life. With DataCore, the install has been a breath of fresh air and I’m very confident in its ability to protect and optimise us for years to come.” Simon Thompson, Network Manager, Bradford Grammar School.

Benefits these institutions are receiving with DataCore include the ability to:

-        Maximise the value from storage investments, current and future.
-        Optimise performance of latency-sensitive applications.
-        Automate and centralise storage management.
-        Enable “zero downtime, zero touch” availability of data.

Schools & Further Education Colleges Need Hyper-converged Too – To Gain Greater Productivity, Ease and Flexibility:

One such further education college in Southampton that wanted all the benefits of compute, storage, networking and virtualisation from a single hardware appliance was Richard Taunton Sixth Form College, with over 1,300 students. Here, the IT team was keen to adopt hyper-converged across 40TB of usable storage space, which was delivered as a mirrored pair of hyper-converged hosts using DataCore’s Virtual SAN Hyper-converged solution. At the sixth form college, local storage is presented as an iSCSI target to local VMs and mirrored in an active-active configuration to the other host. Using DataCore’s inbuilt Auto Tiering functionality, the college is now able to utilise SSD flash technology for rapid access to all its hot data, while seamlessly apportioning less utilised ‘cooler’ data to large SAS disks, saving on budget. The sixth form college was also able to downgrade its former maintenance contracts from costly 24x7, 4-hour-response SLAs to next-business-day SLAs, given that its hyper-converged mirrored system now seamlessly fails over to the other host.
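To make the auto-tiering behaviour described above a little more concrete, here is a minimal, hypothetical sketch of the general technique only: extents of a virtual disk are ranked by recent access "heat", the hottest are promoted to the flash tier and the rest settle on large SAS disks. All class names, thresholds and the rebalancing policy are illustrative assumptions, not DataCore's actual SANsymphony implementation.

```python
# Conceptual sketch of auto-tiering: hot extents are promoted to the SSD/flash
# tier, cooler extents are demoted to large SAS disks. Illustration of the
# general technique only -- names and thresholds are hypothetical.
from dataclasses import dataclass, field
from time import time

SSD_TIER, SAS_TIER = "ssd", "sas"
HOT_THRESHOLD = 100      # accesses within the window needed to count as "hot"
WINDOW_SECONDS = 3600    # how far back accesses are counted

@dataclass
class Extent:
    """A fixed-size chunk of a virtual disk tracked for tiering decisions."""
    extent_id: int
    tier: str = SAS_TIER
    access_times: list = field(default_factory=list)

    def record_access(self) -> None:
        now = time()
        self.access_times.append(now)
        # Keep only accesses inside the sliding window.
        self.access_times = [t for t in self.access_times if now - t <= WINDOW_SECONDS]

    @property
    def heat(self) -> int:
        return len(self.access_times)

def rebalance(extents: list, ssd_capacity: int) -> None:
    """Promote the hottest extents to SSD (up to capacity); demote the rest to SAS."""
    ranked = sorted(extents, key=lambda e: e.heat, reverse=True)
    for i, extent in enumerate(ranked):
        target = SSD_TIER if i < ssd_capacity and extent.heat >= HOT_THRESHOLD else SAS_TIER
        if extent.tier != target:
            extent.tier = target  # a real system would migrate the data here

if __name__ == "__main__":
    pool = [Extent(extent_id=i) for i in range(8)]
    for _ in range(150):          # extent 0 becomes hot
        pool[0].record_access()
    rebalance(pool, ssd_capacity=2)
    print([(e.extent_id, e.tier) for e in pool])
```

In a production system the "heat" statistics would be gathered continuously by the storage stack and the rebalance step would physically move data between tiers; the sketch only shows the decision logic.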

“For schools and colleges of all sizes, it’s about maximising IT infrastructure performance, availability and utilisation by productively using smart software to virtualize and add the needed flexibility to meet changing demands,” said George Teixeira, CEO and Co-Founder of DataCore. “Educational institutions’ IT infrastructures are often complex and decentralized, being the product of many years of accumulation and different departmental agendas. Additionally, their high-performance applications require predictable performance and scalability for a wide variety of mission-critical workloads such as Oracle and SQL Server databases. By offering the best price-performance on the market, DataCore can ensure industry-best response times – making institutions’ IT faster and meaning their infrastructure can be massively consolidated, eliminating complexity.”

The last word goes back to Spencer Wood of the University of Birmingham:

“Within the data centres, DataCore has exceeded our requirements. Day to day, we are not sure how we would operate without it. It immediately improved the performance of the VMs. We are now Auto Tiered to make the most of our storage which has lowered our costs, as we have been able to move off Fibre Channel. Downtime is now really simple for us. We can simply take a data centre offline and no services will be disrupted.”

Tuesday, 7 February 2017

Promise Technology's VTrak E5000 Series of Storage Solutions Certified as DataCore Ready

Leading storage solutions provider Promise Technology Inc. today announced that its VTrak E5000 Series of Fibre Channel to SAS storage solutions have been certified as DataCore Ready for DataCore Software's SANsymphony software-defined storage and virtualization platform.
As consumers and businesses become more mobile, the need for data access, retrieval, and distribution from anywhere at any time means that data must be protected and available at all times. The changing nature of the data center also means that growing infrastructures are pushing the limits of bandwidth. A full-featured, affordable enterprise-level storage system that can accommodate business environments of all sizes, Promise's E5000 Series is versatile and scalable enough to meet the demands of IT departments, data centers, virtual environments, and high-performance computing. Redundant and active-active components of controllers, power supplies and cooling units provide optimal data availability and ensure continuous operation. The E5000 gives IT managers the ability to deploy 6/12 Gb SAS/SATA hard drives and SSDs, and contains flash arrays to optimize speed for key enterprise applications that need high-speed transfer rates and reduced latency.
When combined with DataCore's SANsymphony software-defined storage virtualization solution, the VTrak E5000 maximizes the performance, availability and utilization of IT infrastructures by virtualizing the storage hardware. This enables the E5000 to leverage SANsymphony's data services, and further augment reliability, functionality and performance. Data services supported by SANsymphony include synchronous mirroring, asynchronous replication, CDP, snapshots/backups, storage pooling, thin provisioning, data migration, and deduplication/compression. To learn more about SANsymphony, visit https://www.datacore.com/products/SANsymphony.aspx.
"DataCore's key strengths, in addition to parallel processing of I/O to increase workload productivity, include speeding up the response of mission-critical, enterprise-level applications and reducing the cost to meet performance expectations," said Carlos Carreras, senior vice president of worldwide business development and strategic alliances, DataCore Software. "As a result, we are pleased to certify the VTrak E5000 Series as DataCore Ready to help Promise Technology deliver the ultimate benefits of an affordable, high-performance Fibre Channel to SAS storage solution with advanced enterprise-level reliability and functionality."
The DataCore Ready Program identifies solutions that are trusted to enhance DataCore SANsymphony infrastructures. While DataCore solutions interoperate with common open and industry standard products, those that earn the "DataCore Ready" designation have completed additional verification testing. The DataCore Ready designation is awarded to third party products that have successfully met the verification criteria set by DataCore through the successful execution of a functional test plan and performance envelope tests.
"Promise has been working closely with DataCore for years to bring our customers additional capabilities to meet the bandwidth and storage requirements of IT departments, data centers and virtual environments," noted Vijay Char, president, Promise Technology USA. "With the VTrak E5000 Series now certified DataCore Ready, customers can seamlessly integrate our solutions with SANsymphony storage virtualization software for a superior level of compatibility and optimized performance."

Thursday, 2 February 2017

The Register: NetApp Launches Two New All-Flash Arrays and Comparison with DataCore's Recent SPC-1 Results

SPC-1 result
NetApp claims that the A700s is the fastest enterprise storage, citing a Storage Performance Council SPC-1 Result, saying:
The AFF A700s achieved 2,400,059.26 SPC-1 IOPS at an average response time of 0.69 milliseconds. It is the top-performing enterprise all-flash array among the major storage providers and in the top three overall on the SPC-1 Performance list.
Huawei OceanStor 18800 V3 is number 2, scoring 3,010,007.37 IOPS at an average 0.92ms and a price/performance rating of $0.79. The A700s’ price/performance was better, at $0.62.
A 2-node DataCore Parallel Server holds the SPC-1 record, scoring 5,120,098.98 SPC-1 IOPS with an average 0.28ms response time and a $0.10 price-performance rating. It did so with a pair of Lenovo X3650 M5 servers, a mix of SSDs and HDDs mounted internally and externally, 1.54TB of DRAM for caching, plus parallel I/O-serving software that has multiple CPU cores handle the I/O.
The A700s configuration in the benchmark featured a 12-node cluster (6 x 2-node HA pairs), each node having 512GB of DRAM/cache, meaning a total of 6TB DRAM.
How would NetApp describe DataCore's SPC-1 result coming from a system costing $506,525.24 while the A700s was priced at $1,493,103.71? Roughly speaking, that's NetApp offering about half the DataCore performance at nearly three times the price.
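As a quick sanity check, the ratios follow directly from the published SPC-1 figures quoted above; the short Python snippet below simply reproduces the arithmetic.

```python
# Arithmetic check of the published SPC-1 figures quoted in this article
# (2-node DataCore Parallel Server vs. NetApp AFF A700s).
datacore_iops, datacore_price = 5_120_098.98, 506_525.24
netapp_iops, netapp_price = 2_400_059.26, 1_493_103.71

print(f"NetApp performance vs DataCore: {netapp_iops / datacore_iops:.0%}")    # ~47%
print(f"NetApp price vs DataCore:       {netapp_price / datacore_price:.2f}x") # ~2.95x
print(f"DataCore $/SPC-1 IOPS: {datacore_price / datacore_iops:.2f}")          # ~$0.10
print(f"NetApp   $/SPC-1 IOPS: {netapp_price / netapp_iops:.2f}")              # ~$0.62
```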
Adam Fore, NetApp's director for product and solutions marketing, said: "We can't speculate on how DataCore got its results. However, in our view, comparing NetApp and DataCore's offerings is like comparing apples to oranges. The NetApp AFF A700s brings the full suite of enterprise-grade data management and data protection that customers are looking for as they build out the cloud-connected data centre."
Comment
The point of the SPC-1 benchmark is to compare systems in an apples-to-apples way with submitted systems subject to review. Here's what the Storage Performance Council says:
The SPC-1 Benchmark is designed to be vendor/platform independent and are applicable across a broad range of storage configuration and topologies. Any vendor should be able to sponsor and publish an SPC-1 Result, provided their tested configuration satisfies the requirements of the SPC-1 benchmark specification.
In effect NetApp says, yes, the DataCore system is faster and costs less but it doesn't run our proprietary software, and that makes it unsuitable for enterprises.
It would be very interesting to see Dell EMC VMAX, Unity and XtremIO SPC-1 benchmarks, as well as ones for HPE's 3PAR, IBM's FlashSystems and also Pure's FlashArray. We're not holding our breath.
The rate of AFF innovation looks high, and this leaves us wondering when a new generation SolidFire array, one designed and engineered under NetApp ownership of SolidFire, will emerge.
We haven't seen any pricing but expect that the A200 will significantly lower the AFF entry-level pricing, while the A700s should do the same for entering the A700 performance level.

Wednesday, 1 February 2017

Oxford University and University of Birmingham - UK Educational Institutions Turn to DataCore to Overcome Critical IT Challenges

DataCore Software has announced that a growing number of UK educational institutions are deploying its scalable storage services platform, SANsymphony, to address their critical IT challenges, increase performance and reduce infrastructure costs. From leading data and research-rich university seats of learning - including Oxford University and the University of Birmingham - through to independent and state secondary schools, DataCore's solutions are seeing increased uptake for storage savings, failover, resilience and much faster performance. Spencer Webb of the University of Birmingham noted:
"Prior to the install of DataCore, failover was complicated and fully manual. We needed automatic resilience without human intervention. It needed to be fast, easy and cost effective, to work with our existing storage and support VMware."
Modern Educational Institutions Today Demand a Faster and More Reliable Architecture to Drive their Key Applications and Workloads without the Enterprise Price Tag:
Just as enterprises need better performance and higher SLAs from their key applications, educational institutions now need the same, but typically have stricter budget considerations than their enterprise cousins. Successful education establishments need to seamlessly run any application, 24x7, deploying apps on any storage across many different environments. They can now do this and protect their existing investments by allowing legacy storage to sit behind DataCore to gain intelligent storage services such as auto-tiering, faster performance and a single management interface and view of their storage infrastructure.
So no matter how diverse the storage may be, or which topology the education establishment has chosen or inherited, DataCore's software-defined solution offers the following benefits:
  • Applications run faster and uninterrupted.
  • Existing storage is pooled, tiered and data protected automatically.
  • Storage assets are centralised and managed universally.
The net result is better performance and availability for databases, VDI, and other applications, both virtualised and physical, at a much lower cost. That's a critical point as another leading London University noted:
"DataCore is installed at both data centres and is critical to keep services running. We used to suffer downtime, now we can fail over to either site and we can still meet our SLAs - upgrades and maintenance can occur without any downtime on critical apps. As a result of running DataCore, Regents University have reduced storage related costs by 25%." Zubair Fakir, Regents University, London.
Meanwhile, in the secondary school sector, whilst the number of users and data sets is smaller and planned maintenance windows are longer during the school holidays, availability of data and apps is now deemed critical. One leading independent grammar school in the North of England noted:
"We are now thrilled with our optimised and highly available virtual environment. You simply get what you pay for in life. With DataCore, the install has been a breath of fresh air and I'm very confident in its ability to protect and optimise us for years to come." Simon Thompson, Network Manager, Bradford Grammar School.
Benefits these institutions are receiving with DataCore include the ability to:
  • Maximise the value from storage investments, current and future.
  • Optimise performance of latency-sensitive applications.
  • Automate and centralise storage management.
  • Enable "zero downtime, zero touch" availability of data.
Schools & Further Education Colleges Need Hyper-converged Too - To Gain Greater Productivity, Ease and Flexibility:
One such further education college in Southampton that wanted all the benefits of compute, storage, networking and virtualisation from a single hardware appliance was Richard Taunton Sixth Form College, with over 1,300 students. Here, the IT team was keen to adopt hyper-converged across 40TB of usable storage space, which was delivered as a mirrored pair of hyper-converged hosts using DataCore's Virtual SAN Hyper-converged solution. At the sixth form college, local storage is presented as an iSCSI target to local VMs and mirrored in an active-active configuration to the other host. Using DataCore's inbuilt Auto Tiering functionality, the college is now able to utilise SSD flash technology for rapid access to all its hot data, while seamlessly apportioning less utilised 'cooler' data to large SAS disks, saving on budget. The sixth form college was also able to downgrade its former maintenance contracts from costly 24x7, 4-hour-response SLAs to next-business-day SLAs, given that its hyper-converged mirrored system now seamlessly fails over to the other host.
"For schools and colleges of all sizes, it's about maximising IT infrastructure performance, availability and utilisation by productively using smart software to virtualize and add the needed flexibility to meet changing demands," said George Teixeira, CEO and Co-Founder of DataCore. "Educational institutions' IT infrastructures are often complex and decentralized, being the product of many years of accumulations and different departmental agendas. Additionally, their high-performance applications require predictable performance and scalability for a wide variety of mission-critical workloads such Oracle and SQL Server databases. By offering the best price-performance on the market, DataCore can ensure industry-best response times - making institutions' IT faster and meaning their infrastructure can be massively consolidated, eliminating complexity."
The last word goes back to Spencer Wood of the University of Birmingham:
"Within the data centres, DataCore has exceeded our requirements. Day to day, we are not sure how we would operate without it. It immediately improved the performance of the VMs. We are now Auto Tiered to make the most of our storage which has lowered our costs, as we have been able to move off Fibre Channel. Downtime is now really simple for us. We can simply take a data centre offline and no services will be disrupted."

Friday, 20 January 2017

DABCC: Hyper-converged Storage Podcast with Doug Brown and Sushant Rao

In episode 268, Douglas Brown interviews Sushant Rao, Sr. Director of Product & Solutions Marketing at DataCore Software. Sushant and Douglas discuss DataCore Software’s hyper-converged storage solution. Sushant is another deep technical expert from DataCore and does a great job diving deep into hyperconvergence, DataCore, storage, parallel processing and much more! This is a very technical deep dive from one of the industry’s true experts.

This is the 4th in a series of podcasts with DataCore Software; if you missed the previous episodes, then look no further:

Monday, 16 January 2017

Parallel Processing Software Will be a 'Productivity Disrupter' and Game Changer in 2017


VMblog 2017 Virtualization and Cloud Prediction 
Contributed by George Teixeira, President and CEO, DataCore Software
With so much computing power still sitting idle, despite all of the incredible technology advancements that have occurred, 2017 is the right time for parallel processing software to go mainstream and unleash the immense processing power of today's multicore systems - positively disrupting the economics and productivity of what computing can do and where it can be applied.
New software innovations will make 2017 a breakout year for parallel processing. The key is that the software must become simple to use and non-disruptive to applications, allowing it to move from specialized use cases to general application usage. The impact will be massive: application performance, enterprise workloads and greater consolidation densities on virtual platforms and in cloud computing, long stifled by the growing gap between compute and I/O, will no longer be held back. This will be realized with new parallel I/O software technologies, now available, that are easy to use, require no changes to applications and fully leverage the power of multicore processors to dramatically increase productivity and overcome the I/O bottleneck that has been holding back our industry; this is the catalyst of change.
Parallel processing software can now go beyond the realm of specialized uses such as HPC and areas like genomics that have focused primarily on computation, and impact the broader world of applications that require real-time responses and interactions. This includes mainstream applications and storage that drive business transactions, cloud computing, databases, data analytics, as well as the interactive worlds of machine learning and the Internet of Things (IoT).
The real driver of change is the economic and productivity disruption. Today, many new applications such as analytics are not practical because they require hundreds if not thousands of servers to get the job done; yet each server is becoming capable of supporting hundreds of multi-threaded computing cores that until now have sat idle, waiting for work to do. We are ushering in an era where one server will do the work of 10 - or 100 - servers of the past. This will be the result of parallel processing software that unlocks the full utilization of multicores, leading to a revolution in productivity and making a new world of applications affordable to mainstream IT in 2017.
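As a rough illustration of the multiplier effect being described here (a toy sketch of the general principle only, not DataCore's Parallel I/O technology): when I/O requests are independent, servicing many of them concurrently collapses the total wait time from the sum of all requests to roughly the time of the slowest batch.

```python
# Toy comparison of serial vs. parallel handling of independent I/O requests.
# Illustrates the general principle only; the worker pool and sleep() stand in
# for CPU cores draining real device I/O queues.
import time
from concurrent.futures import ThreadPoolExecutor

def handle_io(request_id: int) -> int:
    time.sleep(0.05)          # stand-in for a device-level I/O wait
    return request_id

requests = list(range(32))

start = time.perf_counter()
serial_results = [handle_io(r) for r in requests]             # one request at a time
serial_elapsed = time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=16) as pool:              # many requests in flight
    parallel_results = list(pool.map(handle_io, requests))
parallel_elapsed = time.perf_counter() - start

print(f"serial:   {serial_elapsed:.2f}s")    # ~1.6s (32 x 0.05s)
print(f"parallel: {parallel_elapsed:.2f}s")  # ~0.1s (2 batches of 16 x 0.05s)
```

In a real storage stack the workers are CPU cores processing I/O in parallel rather than Python threads sleeping, but the scaling behaviour is the same in spirit: the more independent work can be kept in flight at once, the less time applications spend waiting.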
The Impact on Real-time Analytics and Big Data Performance will be Profound
The combination of faster response times and the multiplying impact on productivity through parallelization will fuel the next step forward in ‘real-time' analytics, big data and database performance. DataCore sees this as the next step forward in 2017. Our background in parallel processing, real-time I/O and software-defined storage has made our company uniquely well positioned to take advantage of the next big challenge in a world that requires the rate and amount of interactions and transactions to happen at a far faster pace with much faster response times.
The ability to do more work by doing it in parallel -- and to react quickly -- is the key. DataCore sees itself as helping to drive the step function change needed to make real-time analytics and big data performance practical and affordable. The implications on productivity and business decision making based on insights from data in areas such as financial, banking, retail, fraud detection, healthcare, and genomics, as well as machine learning and Internet of Things type applications, will be profound.
The Microsoft Impact Arrives: Azure Stack, Hybrid Cloud, Windows and SQL Server 2016
The success and growth of Microsoft's Azure Cloud has already become evident; however, the real impact is the larger strategy of how Microsoft has worked to reconcile the worlds of on-premise and cloud computing. Microsoft was one of the first cloud vendors to recognize that the world is not just public clouds but will continue to be a mix of on-premise and cloud. Microsoft's Azure Stack continues to advance in making it seamless to get the benefits of cloud-like computing whether in the public cloud or within a private cloud. It has become the model for hybrid cloud computing. Likewise, Microsoft continues to further integrate its Windows and server solutions to work more seamlessly with cloud capabilities.
While Windows and Azure get most of the attention, one of the most dramatic changes at Microsoft has been how it has reinvented and transformed its database offerings into a true big data and analytics platform for the future. It is time to take another look at SQL Server 2016; it is far more powerful and capable, and now deals with all types of data. As a platform, it is primed to work with Microsoft's large eco-system of marketplace partners, including DataCore with its parallel processing innovations, to redefine what is possible in the enterprise, the cloud, and with big data performance and real-time analytic use cases for traditional business applications, as well as new developing use cases in machine learning, cognitive computing and the Internet of Things.
Storage has Transformed; It's Servers + Software-Defined Infrastructure!
We are in the midst of an inevitable and accelerating trend in which servers are defining what storage is. Escalating this trend, DataCore has used parallel I/O software technologies to power off-the-shelf multicore servers into the world's fastest storage systems in terms of performance, lowest latencies and best price-performance. Traditional storage systems can no longer keep up and are in decline; as a result, they are increasingly being replaced by commodity servers and software-defined infrastructure solutions that can leverage their power to solve the growing data storage problem. The storage function and its associated data services are now driven by software and are becoming another "application workload" running on these cost-efficient server platforms, and this wave of flexible server-based storage systems is already having a disruptive industry impact.
Marketed as server-SANs, virtual SANs, web-scale, scale-out and hyper-converged systems, they are a collection of standard off-the-shelf servers, flash cards and disk drives - but it is the software that truly defines their value differentiation. Storage has become a server game. Parallel processing software and the ability to leverage multicore server technology is the major game-changer. In combination with software-defined infrastructure, it will lead to a productivity revolution and further solidify "servers as the new storage." For additional information, see the following report: http://wikibon.com/server-san-readies-for-enterprise-and-cloud-domination/
What's Beyond Flash?
Remember when flash was the next big thing? Now it's here. What is the next step - how do we go faster and do more with less? The answer is obvious: if flash is here and performance and productivity are still an issue for many enterprise applications, especially database use cases, then we need to parallelize the I/O processing. Why? Because many compute engines working in parallel can remove bottlenecks and queuing delays higher up in the stack, near the application, so we avoid as much device-level I/O as possible and drive performance and response times far beyond any single device-level optimization that flash/SSD alone can deliver. The power of the ‘many’ far exceeds what only ‘one’ can do - combining flash and parallel I/O enables users to run more applications faster, do more work and open up applications and use cases that were previously impossible.
Going Beyond Hyper-Convergence: Hyper-Productivity is the Real Objective
As 2017 progresses, hyper-converged software will continue to grow in popularity, but to cement its success, users will need to be able to take full advantage of its productivity promise. The incredible power of parallel processing software will enable users to take advantage of what their hardware and software can do (see this video from ESG as an example).
Hyper-converged systems today are in essence a server plus a software-defined infrastructure, but they are often severely restricted in terms of performance and use cases, and too often lack the needed flexibility and a path for integration within the larger IT environment (for instance, not supporting Fibre Channel, which is often key to enterprise and database connectivity). Powerful software-defined storage technologies that can do parallel I/O effectively provide a higher level of flexibility and leverage the power of multicore servers so fewer nodes are needed to get the work done, making them more cost-effective. Likewise, the software can incorporate existing flash and disk storage without creating additional silos; migrate and manage data across the entire storage infrastructure; and effectively utilize data stored in the cloud.
Data infrastructures, including hyper-converged systems, can all benefit from these advances: advanced parallel I/O software technologies can dramatically increase their productivity by unlocking the power that lies within standard multicore servers. While hyper-converged has become the buzzword of the day, let's remember that the real objective is to achieve the most productivity at the lowest cost; better utilization of one's storage and servers to drive applications is therefore the key.
The Next Giant Leap Forward - Leveraging the Multiplier Impact of Parallel Processing on Productivity
This combination of powerful software and servers will drive greater functionality, more automation, and comprehensive services to productively manage and store data across the entire data infrastructure. It will lead to a new era where the benefits of multicore parallel processing can be applied universally. These advances (which are already before us) are key to solving the problems caused by slow I/O and inadequate response times that have been responsible for holding back application workload performance and cost savings from consolidation. The advances in multicore processing, parallel processing software and software-defined infrastructure, collectively, are fundamental to achieving the next giant leap forward in business productivity.