Wednesday 19 December 2012

Jon Toigo Video on Disaster Recovery and Business Continuity: Use a storage hypervisor for data replication

Video: http://searchdisasterrecovery.techtarget.com/video/Toigo-Use-a-storage-hypervisor-for-data-replication

In this video, Jon Toigo, founder and CEO of Toigo Partners International, says that relying on a storage hypervisor like DataCore's SANsymphony-V provides greater ease of management and greater resistance to disruption.

Toigo said that storage environments have become isolated islands of capability: hardware from different vendors, plus software products that add functionality for services like provisioning and mirroring. Those capabilities don't scale as you need more capacity and have to purchase another array. But the hardware behind the different vendors' product names is "generic," said Toigo, and a storage hypervisor can bring those disparate elements together by presenting storage as logical volumes rather than physical volumes -- allowing users to pool disparate hardware into a common storage repository.
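As a concrete, if simplified, illustration of that pooling idea, here is a minimal Python sketch. The class names and allocation logic are invented for this example; they are not DataCore's implementation, just a way to picture logical volumes carved out of capacity contributed by dissimilar devices.

```python
# Minimal sketch: logical volumes are allocated from a common pool of capacity
# contributed by heterogeneous physical devices. Illustrative names only.

class PhysicalDisk:
    def __init__(self, vendor, capacity_gb):
        self.vendor = vendor
        self.capacity_gb = capacity_gb
        self.free_gb = capacity_gb

class StoragePool:
    def __init__(self, disks):
        self.disks = disks          # different vendors, treated uniformly
        self.volumes = {}

    def create_logical_volume(self, name, size_gb):
        """Build a logical volume by drawing extents from any disks with free space."""
        remaining, layout = size_gb, []
        for disk in self.disks:
            take = min(disk.free_gb, remaining)
            if take:
                disk.free_gb -= take
                layout.append((disk.vendor, take))
                remaining -= take
            if remaining == 0:
                break
        if remaining:
            raise RuntimeError("pool exhausted")
        self.volumes[name] = layout  # the host sees one volume, not the vendors behind it
        return layout

pool = StoragePool([PhysicalDisk("VendorA", 500), PhysicalDisk("VendorB", 300)])
print(pool.create_logical_volume("LUN1", 600))   # spans both arrays transparently
```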

And for disaster recovery, storage virtualization can be a real benefit because you do not need to replicate between identical devices. "I could make a copy of data over to any other platform. I don't have to use the most expensive gear to make a copy of the most expensive gear … it doesn't matter, because it's all virtual volumes that we're dealing with," said Toigo...

Friday 7 December 2012

Former VMware Managing Director/GM of APAC and Japan Joins DataCore; Virtualization Veterans to Spearhead Asia Pacific Expansion and Meet Rising Global Demand

Company’s Continued Growth in Software-Defined Storage Prompts Additional Sales and Service Emphasis in Asia Pacific, China and Other Rapidly Developing Markets
 
DataCore Software has named two well-recognized sales and operations executives with exemplary track records in the virtualization industry to spearhead the company’s business in Asia Pacific (APAC) and to develop sales in other rapidly emerging markets across the globe. The appointments facilitate DataCore’s continued worldwide growth, driven by high demand for its SANsymphony™-V Storage Hypervisor and the pivotal role it plays in software-defined data centers and cloud solutions.

 
Former VMware Managing Director/GM of APAC and Japan Joins DataCore
Manish Sharma brings extensive domain expertise and intimate familiarity with the region to his new role as general manager (GM) and VP for APAC. He adds more than two decades of continuous sales and management experience to DataCore’s established presence throughout the territory. Most recently, Sharma was VP and GM for Appirio Inc., a technology-enabled services company, where he led the organization to 85 percent year-over-year growth while helping enterprise customers adopt cloud solutions.
 
Prior to that, Sharma spent six years in executive management posts within VMware’s APAC and Japan (APJ) market. He was largely responsible for managing and building out the VMware virtualization partner network, overseeing enterprise product sales and integration, channel enablement and field operations. He started at VMware as GM of the APJ Partner Organization and progressed rapidly to become the Managing Director and GM of Emerging Products for all of APJ. Sharma is also very familiar with the countries within the Association of Southeast Asian Nations (ASEAN), having served as managing director for BEA, as well as director of sales and business development in Australia and New Zealand. His earlier sales management tenure took him regularly to Bangkok, Kuala Lumpur and India. He will be strategically based in Singapore.

New Emerging Markets Position
Assuming DataCore’s newly created role of VP of Emerging Markets is Peter Thompson. He will focus on driving the business expansion, key programs and partnerships needed within high-growth areas in APAC and the Americas, with a major emphasis on developing the Chinese marketplace. In addition, Thompson, who is based in Silicon Valley, will work with major industry partners, technology alliances and their global teams, both locally and overseas, to develop joint business opportunities and programs focused on emerging markets.

Thompson possesses more than two decades of expertise in APAC markets and was instrumental in developing DataCore’s storage virtualization business throughout the region, as well as its presence, partner network and offices in Japan, China and Australia. Formerly DataCore’s Managing Director, Asia Pacific, Thompson oversaw all sales, marketing and operations activities and was based in Japan for a number of years. He joined DataCore in 2000 to help develop the company’s APAC market strategy after a decade with Inabata & Co., a 200-year-old trading company based in Osaka, where he led sales and new business efforts, mainly for the semiconductor industry.

“We see tremendous global demand for our solutions and are investing in an executive team with the proven experience to drive and significantly grow our market share in APAC and other emerging regions,” said Steve Houck, COO, DataCore Software. “Manish and Peter have proven records of success and they understand the challenges of growing virtualization companies in these markets. Furthermore, the IT industry is rife with businesses looking for cost-effective solutions to overcome the storage challenges related to virtualizing Tier 1 business-critical applications, transitioning to cloud computing, and taking on Big Data. Bottom-line, there’s growing demand for the powerful software-defined storage infrastructure that our SANsymphony-V Storage Hypervisor can provide.”

Priorities for the new leaders include channel expansion, partner enablement and closer alignment with the adjacent server and desktop virtualization ISVs and hardware suppliers (e.g., SSD/Flash) critical to modern datacenters. Their proven instincts for building rich relationships throughout APAC and the worldwide virtualization and storage community will immediately be put to use in enabling customers and partners to maximize the performance, availability and utilization of their IT environments.

VARs, system builders, managed service providers and cloud hosters operating in APAC and other emerging markets can learn how to extend their product and services portfolios with complementary storage virtualization solutions from DataCore by contacting sanvantage@datacore.com.
 

Friday 30 November 2012

High-Performance Storage Virtualization: Streamlining the Virtualization of Tier 1 Apps

By Steve Houck

The magic that makes Tier 1 apps perform under virtualization happens in the adaptive technology known as high-performance storage virtualization. http://virtualizationreview.com/articles/2012/11/14/vv-streamline-tier-1-apps.aspx?sc_lang=en

Back in 2010 when I was at VMware, I would have bet you money that within months, the virtualization movement was going to sweep up enterprise apps with the roar of an unabated forest fire. But it didn't.

What seemed like a fait accompli at the time turned out to be far more elusive than any of us could have predicted. The naïve, invigorated by the thrill of consolidating dozens of test/development systems in a weekend, bumped hard against a tall, massive wall. On the vendor side, we fruitlessly threw more disk drives, sheet metal, and plumbing at it. The price climbed, but the wall would not yield.

Fast forward to late 2012. Many still nurse their wounds from those attempts, unwilling to make another run at the ramparts which keep Tier 1 apps on wasteful, isolated servers until someone they trust gets it done first. To this day, they put up with a good deal of ribbing from the wise systems gurus, who enjoy reminding us why business critical apps absolutely must have their own, dedicated machines.

The seasoned OLTP consultants offer a convincing argument. Stick more than one instance of a heavily loaded Oracle, SQL Server or SAP image on a shared machine and all hell breaks loose. You might as well toss out the secret book on tuning, because it just doesn't help.

To some degree, that's true, even though the operating systems and server hypervisors do a great job of emulating the bare metal. It's an I/O problem, Mr. Watson.

It's an I/O problem indeed, into and out of disks. Terms like I/O blending don't begin to hint at the complexity and chaos that such consolidation introduces. Insane collisions at breakneck speeds may be more descriptive. Twisted patterns of bursty reads queued up behind lengthy writes, overtaken by random skip-sequential tangents. This is simply not something one can manually tune for, no matter how carefully you separate recovery logs from database files.
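To make the "I/O blending" point concrete, here is a toy Python sketch (purely illustrative, not drawn from any vendor's code): two guests that are each perfectly sequential on their own produce an effectively random arrival pattern once their queues are merged at the shared storage layer.

```python
# Toy illustration (not a benchmark) of I/O blending: two guests each issue
# perfectly sequential reads, but once their queues are merged at the shared
# storage layer, the arrival pattern at the spindle is anything but sequential.
import random

vm_a = [("A", lba) for lba in range(0, 8)]          # sequential LBAs 0..7
vm_b = [("B", lba) for lba in range(1000, 1008)]    # sequential LBAs 1000..1007

merged = []
while vm_a or vm_b:
    src = random.choice([q for q in (vm_a, vm_b) if q])
    merged.append(src.pop(0))                       # the hypervisor interleaves the queues

print(merged)
# Each stream was sequential on its own; the blended stream seen by the disk
# jumps back and forth across the platter, which is what defeats manual tuning.
```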

That's before factoring in the added pandemonium when the shared array used by a DB cluster gets whacked by a careless construction worker, or a leaky pipe drips a little too much water on the redundant power supplies.

Enter the adaptive technology of high-performance storage virtualization. Whether by luck or design, the bedlam introduced when users collapse multiple enterprise apps onto clustered, virtualized servers mirrors the macro behavior of large scale, discrete servers banging on scattered storage pools. The required juice to pull this off spans several crafts. A chunk of it involves large scale, distributed caching. Another slice comes from auto-sensing, auto-tuning and auto-tiering techniques capable of making priority decisions autonomically at the micro level. Mixed into the skill set is the mysterious art of fault-tolerant I/O re-direction across widely dispersed resources. You won't find many practitioners on LinkedIn proficient in cooking up this jambalaya. More importantly, you won't have to.

In the course of the past decade, this enigmatic mojo and the best practices that surround it have been progressively packaged into a convenient, shrink-wrapped software stack. To play off the similarities with its predecessors, the industry calls it a storage hypervisor.

But I digress. What owners of business-critical apps need to know is that they can now confidently virtualize those enterprise apps without fear of slow, erratic service levels, provided, of course, that they employ a high-performance, fully redundant storage hypervisor to yield fast, predictable response from their newly consolidated environment. Instead of throwing expensive hardware at the problem, or giving up altogether, leave it to the intelligent software to manage the confluence of storage traffic that characterizes virtualized Tier 1 programs. The storage hypervisor cost-effectively resolves the contention for shared disks and the I/O collisions that had previously disappointed users. It takes great advantage of new storage technology like SSDs and Flash memories, balancing those investments with more conventional and lower-cost HDDs to strike the desired price/performance/capacity objectives.

The stuff shines in classic transactional ERP and OLAP settings, and beyond SQL databases does wonders for virtualized Exchange and SharePoint as well.

Sure, the advanced new software won't stop the veterans from showing off their scars while telling picturesque stories about how hard this was in the old days. Though, it will give the current pros in charge of enterprise apps something far more remarkable that they too can brag about -- without getting their bodies or egos injured on the way.

Thursday 29 November 2012

Brennercom Adds DataCore Storage Hypervisor for Business Continuity and High-Performance Storage for Its New VMware View Desktops and Cloud Services

Brennercom, an Italian Telecommunications Technology Company, Has Extended its Redundant, High-Availability Storage Infrastructure to Serve its Private Cloud and Virtual Desktop Requirements.
http://www.it-director.com/technology/storage/news_release.php?rel=35287

DataCore Software today announced that information and communication technology (ICT) company Brennercom has attained a new level of business continuity and performance for its virtual desktop infrastructure (VDI) and cloud services using the DataCore SANsymphony™-V Storage Hypervisor.

With the SANsymphony-V storage hypervisor, corporate data is centrally administered across a variety of different hardware storage solutions, and all of it is protected by DataCore’s synchronous mirroring capability. The virtualization software from DataCore required only a quarter of the investment that would have been needed for a hardware-based SAN to provide a stable, high-performance VMware View VDI for 160 desktop platforms, in addition to supporting the storage needs of all of its virtual machines running on VMware vSphere.
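For readers unfamiliar with synchronous mirroring, the idea can be sketched in a few lines of Python. This is a hedged illustration of the general technique, not Brennercom's configuration or DataCore's code: a write is acknowledged to the application only after both storage nodes have persisted it, so either copy can serve the data if the other fails.

```python
# Sketch of synchronous mirroring: the host only sees the write complete once
# both copies are persisted. Node names and behavior are illustrative only.

class StorageNode:
    def __init__(self, name):
        self.name = name
        self.blocks = {}

    def persist(self, lba, data):
        self.blocks[lba] = data
        return True                      # in reality: wait for the array to commit

def mirrored_write(primary, secondary, lba, data):
    ok_primary = primary.persist(lba, data)
    ok_secondary = secondary.persist(lba, data)
    if not (ok_primary and ok_secondary):
        raise IOError("write not acknowledged on both mirrors")
    return "ack"                         # only now does the application see success

node_a, node_b = StorageNode("node-A"), StorageNode("node-B")
print(mirrored_write(node_a, node_b, lba=42, data=b"customer record"))
```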

"The virtualization and common central management provided by the DataCore storage hypervisor considerably reduced our IT division's work load. New systems can now be fully set up for users within 10 minutes, unlike the laborious, many hours and days set-up required to install physical server storage systems in the past. Fast provisioning and capacity expansions can now be easily implemented via the central console with a few mouse clicks in the event of an acute need for storage," explained Roberto Sartin, head of Technical Division at Brennercom.

Establishment of Virtual Desktop and Cloud Infrastructure
At Brennercom, internal IT services are provided by the IT Management Division. The Division decided to extend its existing virtual VMware server infrastructure and to establish a VDI and cloud services based on VMware. To handle the expansion, it was decided that a new approach to managing data storage was needed.

The primary drivers were two-fold: First, Brennercom needed to expand its external computer center to accommodate new cloud computing services. Central and efficient system administration was a core objective of this extension. Second, Brennercom needed greater availability due to the increased business continuity requirements that would result from its move to centralization. The project plan and investments also encompassed the need for a later partial move of a number of the systems to a second location in Trento (about 30 miles away) to ensure that the immediate, high-availability system could be backed up by a two-site disaster recovery model for the purposes of required ISO audits.

The company also had to consider the planned consolidation of its current, heterogeneous IT landscape at the Bolzano site. While the central computer center services were based on a Fibre Channel infrastructure, some divisions were making use of iSCSI storage. Apart from vSphere virtual machines, Citrix XenServer was also used in some areas.

Cost-effective VDI with 160 desktops
The VDI with VMware View is based on an integrated system supporting virtualized storage and virtualized servers. The decision to use this platform was taken after the positive experience gained with the VMware hypervisor. The 160 desktops are successfully being migrated to the notebooks or thin clients of the field workers and the helpdesk, using the centralized infrastructure. The long-term benefits of VDI lie in the lower costs and the less cumbersome, centralized administration needed when it comes to the setup, updating and maintenance of these virtual desktops.

"On the storage side, the VDI and the virtual servers are supported by DataCore’s round-the-clock, failsafe storage infrastructure and performance has been enhanced by intelligent caching, fulfilling all our expectations," comments Sartin. "By using the DataCore storage hypervisor, we were able to integrate a technically complex solution with a universal range of services to meet the short-term performance and high-availability requirements of our VDI needs. In addition, the integrated migration and replication features have created the basis for efficiently implementing the planned model we need for disaster recovery."

Flexible Infrastructure for Cloud Services
The next large-scale project to be concluded by year-end 2012 is dividing and synchronizing the existing systems between the computer centers in Bolzano and Trento so that operations can continue at one location in the event of a catastrophe.

"As is the case in other industries, business continuity is an absolute necessity for us. By making use of the DataCore solution within the virtual infrastructure created by VMware vSphere and VDI, we cannot only ensure that we meet these corporate requirements, but also guarantee optimal cost efficiency as a result of the hardware independence of the solution. This affects both the direct investment and the indirect and long-term cost of refreshes, expansions and added hardware acquisitions. We have thus created the technical basis for our external IT services, and within this framework we are creating the most flexible and varied range of cloud services possible," concludes Brennercom CEO, Dr. Karl Manfredi.

To read more regarding this deployment, please access a complete case study concerning DataCore’s implementation at Brennercom: Brennercom SpA Case Study.

Thursday 22 November 2012

Storage virtualization solutions help make the most of virtualization

A good recent article worth sharing: a Gartner analyst recently spoke on why storage virtualization solutions make the most of virtualization, SSDs and auto-tiering: http://searchvirtualstorage.techtarget.com/news/2240169260/Storage-virtualization-solutions-help-make-the-most-of-virtualization

...Server virtualization allows much higher rates of system usage, but the resulting increases in network traffic pose significant challenges for enterprise storage. The simple "single server, single network port" paradigm has largely been displaced by servers running multiple workloads and using numerous network ports for communication, resiliency and storage traffic.
Virtual workloads are also stressing storage for tasks, including desktop instances, backups, disaster recovery (DR), and test and development.
At Gartner Symposium/ITxpo recently, Stanley Zaffos, a Gartner research vice president, outlined the implications of server virtualization on storage and explained how storage virtualization solutions, the right approach, and the proper tool set can help organizations mitigate the impact on enterprise storage.
Consider using storage virtualization. Gartner's Zaffos urges organizations to deploy storage virtualization as part of better storage practice, and he underscores the core benefits of the technology:
  • Storage virtualization supports storage consolidation/pooling, allowing all storage to be "seen" and treated as a single resource. This avoids orphaned storage, improves storage utilization and mitigates storage costs by reducing the need for new storage purchases. The benefits of storage consolidation increase with the amount of storage being managed.
  • Storage virtualization supports agile and thin provisioning, allowing organizations to create larger logical storage areas than the actual disk space allocated. This also reduces storage costs because a business does not need to purchase all of the physical storage up front -- it simply adds more storage as the allocated space fills up. Later tools may allow dynamic provisioning, where the logical volume size can be scaled up or down on demand. Management and capacity planning are important here (a minimal sketch of the thin-provisioning idea follows this list).
  • Storage virtualization supports quality of service (QoS) features that enhance storage functions. For example, auto-tiering can automatically move data from faster and more expensive storage to slower and less expensive storage (and back) based on access patterns. Another feature is prioritization, where some data is given I/O priority over other data.
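As a rough illustration of the thin-provisioning bullet above, here is a hedged Python sketch; the class and the numbers are invented for the example and are not taken from any product.

```python
# Minimal sketch of thin provisioning: a volume advertises more capacity than
# is physically allocated, and real space is consumed only as data is written.

class ThinVolume:
    def __init__(self, advertised_gb, backing_pool_gb):
        self.advertised_gb = advertised_gb      # what the host is told it has
        self.backing_pool_gb = backing_pool_gb  # what physically exists today
        self.written_gb = 0

    def write(self, gb):
        if self.written_gb + gb > self.backing_pool_gb:
            raise RuntimeError("pool exhausted -- time to add physical capacity")
        self.written_gb += gb

    def utilization(self):
        return self.written_gb / self.backing_pool_gb

vol = ThinVolume(advertised_gb=10_000, backing_pool_gb=2_000)
vol.write(500)
print(f"{vol.utilization():.0%} of physical pool used")  # 25%, despite a 10 TB volume
```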
Consider using solid-state drives (SSDs). One of the gating issues for storage is the lag time caused by mechanical delays that are unavoidable in conventional hard-disk technologies. This limits storage performance, and the effects are exacerbated for virtual infrastructures where I/O streams are randomly mixed together and funneled across the network to the storage array, creating lots of disk activity. Storage architects often opt to create large disk groups. By including many spindles in the same group, the mechanical delays are effectively spread out and minimized because one disk is writing/reading a portion of the data while other disks are seeking. Zaffos points to SSDs as a means of reducing spindle count and supplying much higher IOPS for storage tasks.
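A back-of-envelope calculation illustrates Zaffos's spindle-count point. The per-device IOPS figures below are common rules of thumb, not measured values, and the target is an assumed workload for the sake of the example.

```python
# Rough arithmetic behind "SSDs reduce spindle count": assumed rule-of-thumb
# figures of ~175 random IOPS per 15K RPM HDD and tens of thousands per SSD.
import math

target_iops = 20_000
hdd_iops = 175        # assumed per-spindle figure for a 15K RPM drive
ssd_iops = 40_000     # assumed figure for a single enterprise SSD

spindles_needed = math.ceil(target_iops / hdd_iops)
ssds_needed = math.ceil(target_iops / ssd_iops)

print(f"~{spindles_needed} HDD spindles vs {ssds_needed} SSD to reach {target_iops} IOPS")
# => ~115 HDD spindles vs 1 SSD, which is why SSDs shrink spindle counts so sharply.
```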
Plan the move to virtualization carefully. Data center architects must develop a vision of their infrastructure and operation as they embrace virtualization. Zaffos suggested IT professionals start by identifying and quantifying the impact server virtualization, data growth and the need for 24/7 operation will have on the storage infrastructure and services.
Next, determine what you actually need to accomplish and align storage services with the operational abilities and physical infrastructure. For example, if you need to emphasize backup/restoration capabilities, support data analytics, or handle desktop virtualization, it's important to be sure that the infrastructure can support those needs. If not, you may need to upgrade or make architectural changes to support those capabilities.
When making decisions for virtualization, Zaffos notes the difference between strategic and tactical issues. Strategic decisions create lock-in, and tactical decisions yield short-term benefits. For example, the move to thin provisioning is a tactical decision, but the choice to use replication like SRDF would be a strategic decision.
...Ultimately, Zaffos notes that storage virtualization solutions can be a key enabling technology for server and desktop virtualization -- both of which place extreme demands on the storage infrastructure. But, he said, the move to storage virtualization takes a thorough understanding of the benefits, careful planning to ensure proper alignment with business and technical needs, and judicious use of storage technologies like tiers and SSD.

Wednesday 21 November 2012

Storage hypervisor: Storage's future? It's the software that matters: Software-defined Storage; Storage Virtualization

By now you may have heard the term "storage hypervisor." You probably don't know exactly what it means, but that isn't your fault. Vendors that use the term to describe their products disagree on the exact meaning, although they mostly agree on why such a technology is useful.

A vendor panel at the Storage Networking World (SNW) show in Santa Clara, Calif., last month set out to define storage hypervisor. The represented vendors sell different types of products, though. The panel included array-based virtualization vendor Hitachi Data Systems Corp., network-based storage virtualization vendor IBM, software SAN virtualization vendor DataCore Software Corp. and virtual machine storage management vendor Virsto Software Corp.

Can all of these vendors' products be storage hypervisors? It's more accurate to say that, taken together, the storage hypervisor products make up an overview of storage virtualization under a new name. And that new name is already giving way to a newer term. "Software-defined storage" was used interchangeably with "storage hypervisor" during the SNW panel.

Software-defined storage is no better defined than storage hypervisor, but it includes the "software-defined" phrase taking over the data center and networking these days.

DataCore Software Corp. CEO George Teixeira said his company was ahead of the current trend when it started back in the 20th century with the premise that software gives storage its value.

"Today we have fancy terms for it like software-defined storage, but we started DataCore in 1998 with a very basic [PowerPoint] slide that said, 'It's the software that matters, stupid,'" Teixiera said. "And we've seen storage from the standpoint of really being a software design."

Teixeira said any talk of a storage hypervisor must focus on software.

"Can you download it and run it? And beyond that, it should allow users to solve a huge economic problem because the hardware is interchangeable underneath," he said. "Storage is no longer mechanical drives. Storage is also located in flash. Your architecture can incorporate all the latest changes, whether it's flash memory or new kinds of storage devices. When you have software defining it, you really don't care.

"Just like with VMware today," said Teixiera, you really don't care whether it's Intel, HP, Dell or IBM servers underneath. Why should you care about the underlying storage?"

Read more at: http://searchvirtualstorage.techtarget.com/Storage-hypervisor-Hypothetical-or-storages-future

The Red Cross Embraces DataCore Software's SANsymphony-V Storage Hypervisor to Accelerate Data Mining Speed by 300 Percent

http://finance.yahoo.com/news/red-cross-embraces-datacore-softwares-120000196.html

DataCore Significantly Optimizes Online Analytical Processing Performance

DataCore Software today announced that the British Red Cross Society has deployed the SANsymphony™-V Storage Hypervisor to deliver a significant performance acceleration for its new Online Analytical Processing (OLAP) system, dramatically shortening response times and increasing the reliability of data extraction. The performance improvements were achieved by installing SANsymphony-V on an HP ProLiant DL370 server. The DataCore™ software has reduced the time window needed to perform the Extract, Transform and Load (ETL) operations from an average of 12 hours down to four, with the load spread across half the original number of internal hard disk drives -- an efficiency achieved through SANsymphony-V's thin provisioning.

The British Red Cross Society is the United Kingdom's registered charity arm of the worldwide humanitarian organization, the International Red Cross. Formed in 1870, the Red Cross has over 31,000 volunteers and 3,300 staff providing assistance and aid to all people in crisis, both in the UK and overseas, without discrimination and regardless of their ethnic origin, nationality or religion.

"In order to sustain the Charity's considerable ongoing work worldwide, the Red Cross needs to continually generate additional income from new and existing donors," said Kevin Bush, technical architect for the Charity's MIS Enterprise Architecture Team in London. "It is our function in MIS to ensure the relevant departmental units have the appropriate infrastructure available to allow them to complete automated processes in time to fulfil marketing campaigns to drive further donations."

To help facilitate ongoing fundraising, a new suite of hardware and business intelligence tools was deployed six months ago for the British Red Cross utilizing OLAP -- an approach that swiftly answers multi-dimensional analytical queries through accurate Business Intelligence (BI) tools deployed on the British Red Cross's SQL Server database. BI data marts are created to track behavioral changes, creating campaign relevancy trends for business units. This level of data profiling, specifying individual campaigns with matched targets, entails significant I/O (input/output) processing demands and depends on a stable, optimized infrastructure.

Working in conjunction with the MIS Enterprise Architecture Team, the British Red Cross's partner, Adapto, recommended deploying DataCore's SANsymphony-V software to significantly decrease I/O strain and increase performance in a cost-effective, non-invasive way. The SANsymphony-V storage hypervisor could dramatically improve performance by increasing the speed of read/write requests across the entire British Red Cross storage infrastructure, using the storage server's memory as the caching engine. This caching would accelerate application response times, translating into much faster database queries and data extraction for the business units.

Critical to the effectiveness of the Extract, Transform and Load (ETL) process from the database is achieving ongoing consistency within a predefined extraction window. I/O speed determines both: slow I/O equates to a long and erratic extraction window. In practice, prior to the performance caching gains, each ETL was taking between nine and 15 hours, set to run overnight so that the resultant data marts were ready in time for the next working day.

Following Adapto's suggestion, Bush downloaded the easy-to-install SANsymphony-V test drive and right away ran a test ETL that showed immediate benefits from DataCore's caching, with the software recognizing I/O patterns and anticipating which blocks to read next into RAM from the back-end disks. Requests were fulfilled quickly from memory at electronic speeds, eliminating the delay associated with physical disk I/O. The findings were impressive. The production-ready, easy-to-use GUI allowed the ETL to perform at a blistering pace, similar to that achieved by SSD but without the associated cost overheads. The result was a shorter, four-hour extraction timeframe.
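The prefetching behaviour described above -- recognizing an I/O pattern and reading the next blocks into RAM ahead of time -- can be sketched roughly as follows. This is an illustrative model of read-ahead caching in general, not the actual SANsymphony-V algorithm; the class, prefetch depth and data are invented for the example.

```python
# Sketch of read-ahead caching: when a run of sequential block requests is
# detected, the next blocks are prefetched into RAM so later reads are served
# from memory instead of disk.

class ReadAheadCache:
    def __init__(self, disk, prefetch_depth=8):
        self.disk = disk                # dict-like: block number -> data
        self.prefetch_depth = prefetch_depth
        self.cache = {}
        self.last_block = None

    def read(self, block):
        if block in self.cache:
            return self.cache[block]            # served at memory speed
        data = self.disk[block]                 # slow path: physical I/O
        if self.last_block is not None and block == self.last_block + 1:
            # Sequential pattern detected: pull the next few blocks into RAM now.
            for b in range(block + 1, block + 1 + self.prefetch_depth):
                if b in self.disk:
                    self.cache[b] = self.disk[b]
        self.last_block = block
        return data

disk = {b: f"block-{b}" for b in range(100)}
cache = ReadAheadCache(disk)
cache.read(10); cache.read(11)      # second sequential read triggers prefetch
print(12 in cache.cache)            # True: block 12 now waits in RAM
```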

"From the point of evaluation onwards, we haven't looked back with SANsymphony-V," said Bush. "It's caching and performance acceleration has certainly addressed the consistency of extraction, whilst reducing the window to an acceptable level, so that as a Charity, we can concentrate on effective fundraising to help those most in need. We are so impressed that we are now looking at installing another node of SANsymphony-V for high availability and mirroring."

Saturday 17 November 2012

Check out all the latest Product Reviews on SANsymphony-V

Most recent: SANsymphony-V R9.0 Product Review by NT4ADMINS Magazine

Greater scalability, improved administration functions and close integration with vSphere environments and system management suites are the main core characteristics of Release 9 of the SANsymphony™-V storage hypervisor. Above all, the 'group operations' make life easier for the administrator...

Friday 16 November 2012

Set your data free with Dell Fluid Data™ and DataCore SANsymphony-V

When businesses change, whether in response to a new opportunity or a competitive challenge, the applications and the data they depend on have to change too. That can be really hard with legacy storage solutions, whose rigid boundaries tend to hold data captive. This is especially true if, as is often the case, the storage infrastructure has been built up over time out of various “point solutions.” This creates inefficient data silos that make it hard to optimize the match between storage capabilities and application needs or take advantage of new hardware capabilities. Availability and disaster recovery capabilities can suffer as well.

The Fluid Data™ architecture from Dell is designed to overcome these storage challenges by making data as dynamic as the businesses that depend upon it. DataCore is a long-time Dell ISV partner, and we’ve been working with our reseller partners around the world to help our customers realize the benefits of Fluid Data. “We have thousands of DataCore storage hypervisor customers using Dell storage platforms,” says Carlos Carreras, DataCore’s Vice President of Alliances & Business Development. “We see many DataCore partners like The Mirazon Group and Sanity Solutions penetrating non-Dell accounts and leveraging SANsymphony-V to make it easier for customers to meet their storage needs with Dell solutions.”

The DataCore SANsymphony-V storage hypervisor lets Dell resellers seamlessly harness the Dell Fluid Data architecture and its wide range of products -- including platforms such as Compellent, EqualLogic and the PowerVault MD Series -- to address the storage appetite of their customers. Customers can add these cost-effective Dell solutions to their storage portfolio without a forklift upgrade, preserving their storage investments and prolonging the useful life of existing storage (e.g., moving it down-tier) while leveraging the power of Fluid Data for increased storage efficiency and performance. The DataCore storage hypervisor and its enterprise-wide auto-tiering make it easy to penetrate and refresh existing storage installations and add new Dell storage to modernize the infrastructure and lower overall costs. With the SANsymphony Cloud Gateway, customers can even add popular public cloud hosting services as a low-cost tier in their storage strategy. With DataCore and Dell, customers get infrastructure-wide storage management and the compelling benefits of Fluid Data across all their storage investments.

For DataCore partners, the new DataCore SANsymphony-V Migration Suite makes it easy to introduce new customers to Dell storage with completely non-disruptive data migration. The suite enables a DataCore partner to set up a temporary dual-node SANsymphony-V installation that can turn a hardware refresh into a zero-impact process. A pass-through architecture assures that the customer’s environment remains “hot” the entire time. Users never even know a migration has taken place.

“In customer meetings, I am often met with skepticism that there is no way to do an easy migration without a lot of disruption. After they see the power of DataCore storage virtualization software in action, their jaws literally drop because they cannot believe that it can be that simple to migrate their storage and VMs,” said Barry Martin, partner and chief technology officer at The Mirazon Group.

Barry also notes that the heat map recently introduced in SANsymphony-V 9.0 is an especially powerful analytical tool. “While the migration suite is in place, you could show the customer all their storage I/O ‘hot spots,’ and where, for instance, a SSD tier could boost the performance of critical applications. Being able to give that kind of strategic advice is key to our business success, and the visual impact makes it all the more powerful.”

These and other features make SANsymphony-V a natural complement to the Dell Fluid Data architecture. You can start learning more about SANsymphony-V here, or check out case studies in a variety of industries and applications to see how the DataCore storage hypervisor can go to work for you.

SEE DATACORE SOFTWARE AT THIS UPCOMING DELL EVENT

Dell Storage Forum Paris 2012

DataCore will be a Petabyte Sponsor at the upcoming Dell Storage Forum.

The event takes place 14-16 November 2012 in Paris. Address follows:

Marriott Rive Gauche Hotel & Conference Center
17 Boulevard Saint Jacques
Paris, 75014
France

Description
This is a channel partner and an end-user focused event. DataCore will present the newest version of its storage hypervisor – SANsymphony-V 9.0.

For more information on the show, visit Dell Storage Forum Paris 2012.

Tuesday 13 November 2012

SC12 Supercomputing Conference: Storage technology leaders Fusion-io and DataCore Software team up to showcase new joint solution for data-intensive, HPC applications

DataCore Software Featured in Fusion-io Booth #2201 at SC12 Conference

DataCore Software, the storage hypervisor leader and premier provider of storage virtualization software, invites attendees of SC12, the international conference for high performance computing (HPC), networking, storage and analysis, to explore innovative new ways to take advantage of Fusion-io flash memory technologies in data-intensive environments. DataCore will be exhibiting in Fusion-io booth #2201 at the Salt Palace Convention Center in Salt Lake City, Utah, November 12-15, 2012.

DataCore will showcase its SANsymphony™-V storage hypervisor integrated with the Fusion ioMemory platform to meet the large scale, low-latency needs of HPC applications common to many SC12 visitors. Attendees will learn how DataCore applies state-of-the-art auto-tiering technology to dynamically distribute I/O workloads between blazing fast Fusion-io flash memory and conventional high-density disk farms for an optimal price/performance balance.

Experts will also be on hand to give advice on how to eliminate crippling single points of failure by using the SANsymphony-V software to mirror data between redundant, multi-tier storage pools.

Fusion-io products are well known for accelerating databases, cloud computing, big data and HPC applications in a variety of industries, including e-commerce, social media, finance, government and telecommunications. Combined with DataCore’s™ storage hypervisor, customers not only enjoy higher performance and availability, but also superior flexibility and exceptional value from their storage investments.

“High performance computing requires applications to process data at speeds that transform data into discovery,” said Tyler Smith, Fusion-io vice president of alliances. “Like other data-driven webscale and enterprise organizations, HPC innovators are also cost-conscious and mindful of data protection. Our collaboration with DataCore Software provides a powerful integrated solution that ensures data and applications are available and ready to efficiently deliver peak performance.”

“SC12 provides a fantastic backdrop to convey the joint value resulting from DataCore’s long-standing relationship with Fusion-io. We are seeing great results with many customers leveraging our combined hardware and software capabilities as the centerpiece for their most demanding workloads,” adds Carlos Carreras, vice president of alliances and business development at DataCore Software.

SC12 is the premier international conference for high-performance computing, networking, storage and analysis. The conference is expecting 10,000 attendees representing more than 50 countries and 366 exhibitors. Exhibits and technical presentations at SC12 will offer a look at state-of-the-art solutions in high performance computing and a glimpse of the future.
  • What: DataCore and Fusion-io demonstrations at SC12 Conference
  • Where: Booth 2201, Salt Palace Convention Center, Salt Lake City, Utah
  • When: November 12-15, 2012 

Network Computing: DataCore's Storage Hypervisor - An Overview & Customer Use Cases – New Release Features

Network Computing: DataCore's Storage Hypervisor - An Overview – Part 1: New Release Features

By David Hill. David Hill is an IT author and Chief Analyst and Principal of Mesabi Group LLC. DataCore Software is not a client of David Hill or the Mesabi Group.

A storage hypervisor is an emerging term used by some vendors to describe their approach to storage virtualization. Several companies offer storage hypervisors, including IBM, Virsto and DataCore. I've already written about IBM and Virsto in previous blogs.

Now it's DataCore's turn. DataCore is an independent software vendor (ISV), so it has no financial interest in selling the underlying storage hardware. It supports both virtualized servers and traditional physical hosts and legacy storage with the same feature stack and automation. DataCore's storage hypervisor is a software product called SANsymphony-V. This blog will examine some of the enhanced and new features of the version 9 release.

Auto-tiering
Auto-tiering is a "hot" topic (pun intended!), covering not only tier-0 solid state devices but also performance (SAS or FC) hard disk drives, capacity (SATA) drives, and archive storage that can even be rented from public cloud providers at a distance. This feature also includes automatic tuning that creates heat maps to reveal heavy disk activity, so that the hottest data gets the most attention (in order to meet performance service-level requirements). It also automates load balancing across the available disk resources.
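To make the heat-map idea concrete, here is a rough Python sketch. The tier names, thresholds and sampling approach are assumptions for illustration only, not how SANsymphony-V actually implements auto-tiering: per-block access counts accumulate into a "heat map", and the hottest blocks are promoted to the fast tier while cold blocks settle on capacity disks.

```python
# Illustrative heat-map / auto-tiering mechanics: count accesses per block,
# then place the hottest blocks on the fast tier. Names are assumptions.
from collections import Counter

heat = Counter()                   # block id -> access count over a sampling window

def record_io(block):
    heat[block] += 1

def place_tiers(fast_tier_slots):
    """Promote the most-accessed blocks to SSD; everything else stays on HDD."""
    hottest = [blk for blk, _ in heat.most_common(fast_tier_slots)]
    return {blk: ("ssd" if blk in hottest else "hdd") for blk in heat}

for blk in [1, 1, 1, 2, 3, 1, 2]:   # simulated I/O trace
    record_io(blk)

print(place_tiers(fast_tier_slots=1))   # block 1 lands on SSD, blocks 2 and 3 on HDD
```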


Network Computing: DataCore's Storage Hypervisor - An Overview – Part 2: Two Customer Use Cases

Host.net is a service provider that offers VM and enterprise storage platforms in multiple virtual private data centers (i.e., Host.net hosts customer compute and storage resources at its data centers) that are all connected to a Cisco-based 10 Gbps multinational backbone. Among the many services the company offers are virtual enterprise servers, storage, backup/restore, disaster recovery and colocation.

DataCore is at the heart of Host.net's enterprise SAN storage platform. Host.net believes DataCore offers the necessary performance and data integrity (every byte of data is written twice within a synchronous mirror) at a competitive price. Among the things Host.net likes about DataCore are hardware independence (for example, in a SAN hardware refresh it can add and migrate data on the fly with no downtime), operating system independence and robust I/O performance, as DataCore's use of hundreds of gigabytes of high-speed cache essentially turns a traditional SAN into a high-speed hybrid solid-state SAN at a fraction of the cost.

X-IO (formerly Xiotech) builds hardware around its Hyper ISE (Intelligent Storage Elements) storage system. Drawing on a great deal of engineering experience and innovation, its goal is to deliver high performance that accelerates enterprise applications at a good price/performance level. However, X-IO has decided to shed the storage and data management software (such as snapshot and replication software) that typically characterizes enterprise-class storage.

But customers still need storage and data management software. DataCore provides those capabilities for X-IO products. As a result, X-IO can keep a hardware-intensive focus and improve price/performance while DataCore picks up the slack.

Wednesday 31 October 2012

DataCore is a Platinum Sponsor and Keynote presenter at SNW Europe, Datacenter & Virtualization World 2012; Learn What’s New and Why Thousands of Customers Choose to Run their Business with DataCore Software

“Thousands of customers throughout the world have already realized the compelling performance and productivity advantage of DataCore™ SANsymphony-V,” states Christian Hagen, Vice President and Managing Director EMEA. "The DataCore storage hypervisor greatly improves the economics and harnesses the full power of server caches, solid state disks (SSDs) and existing storage assets so that application owners no longer need to 'rip and replace' storage infrastructures and pay much higher costs to meet their performance and uptime objectives."

At SNW Europe 2012, attendees are learning 'How to Run Faster Virtualized: By Eliminating the I/O Bottlenecks in Clouds and Virtualized Data Centers':

30 Oct, 11:20 – 11:50
Keynote: How new storage solutions enable organizations to reinvent themselves
Christian Hagen, Vice President EMEA & Managing Director, DataCore Software

30 Oct, 11:55 and 15:20; 31 Oct, 11:20
Hands-On Labs: How to configure your SAN with the one and only true Storage Hypervisor SANsymphony-V
Presented by Christian Marczinke, Director Strategic Systems Engineering & Chief Solutions Architect EMEA

30 Oct, 14:40 – 15:00
Focus Session: Speeding the Transition to a Responsive, Virtualized Storage Infrastructure
Alexander Best, Director Technical Business Development EMEA, DataCore Software

31 Oct, 10:15 – 10:50
Vendor Update: SANsymphony™-V 9.0 – What's New in the Storage Hypervisor for the Enterprise
Alexander Best, Director Technical Business Development EMEA, DataCore Software

SNW Europe 2012: DataCore Software Presents 'What's New in the Storage Hypervisor for the Enterprise' and Powers Cloud and Business Applications to Run Faster Virtualized

DataCore is a Platinum Sponsor and Keynote presenter at SNW Europe, Datacenter Technologies & Virtualization World 2012; Learn What’s New and Why Thousands of Customers Choose to Run their Business with DataCore Software



Today at the SNW Europe, Datacenter Technologies & Virtualization World 2012 event in Germany, DataCore Software showcased many of the powerful storage management capabilities of SANsymphony™-V 9.0, "The Storage Hypervisor for the Cloud." These include 'heat maps' and tools to pinpoint storage bottlenecks and optimize storage pool management, high-availability stretch-site mirroring, automatic Continuous Data Protection features for fast application recovery, and enterprise-wide flash SSD storage auto-tiering and adaptive caching capabilities that significantly boost the speed, throughput and availability of virtualized, I/O-intensive business applications like SAP, Oracle, Microsoft SQL Server, Microsoft SharePoint and Microsoft Exchange. Attendees can stop by booth #2, under the motto "DataCore Software: Elevator to the Cloud," and find out why thousands of customers report significantly faster performance and better than 99.999% uptime after virtualizing their existing storage with SANsymphony-V.

 
“Thousands of customers throughout Europe have already realized the compelling performance and productivity advantage of DataCore™ SANsymphony-V,” states Christian Hagen, Vice President and Managing Director EMEA. "The DataCore storage hypervisor greatly improves the economics and harnesses the full power of server caches, solid state disks (SSDs) and existing storage assets so that application owners no longer need to 'rip and replace' storage infrastructures and pay much higher costs to meet their performance and uptime objectives."

 
Run Faster Virtualized: By Eliminating the I/O Bottlenecks in Clouds and Virtualized Data Centers
"DataCore's impact on performance was dramatic in every metric we measured. Even more impressive is how SANsymphony-V simplifies management and how easily it can make data center storage more resilient. With a single mouse click disk capacity is served and all the normal error-prone steps to configure, tune and set best paths for high availability get done auto-magically," said Tony Palmer, senior engineer and analyst with Enterprise Strategy Group Lab.
In the ESG Lab Validation report, the benchmark tests confirmed that Microsoft SQL Server and Exchange workloads were able to improve their performance by nearly 5x as compared to running the same workloads on non-virtualized physical servers.

To further increase Tier 1 business-critical application responsiveness, companies often spend excessively on flash memory-based SSDs. SANsymphony-V's auto-tiering and adaptive caching features optimize performance and the use of these premium-priced flash resources alongside more modestly priced, higher-capacity disk drives. SANsymphony-V constantly monitors I/O behavior and intelligently auto-selects between server memory caches, flash storage and traditional disk resources in real time to ensure that the most suitable class or tier of storage device is assigned to each workload based on priorities and urgency.
Keynote Presentation, Hands-on-lab demos and ‘What’s new and important to know’ sessions
At SNW Europe, DataCore is demonstrating the functional versatility and performance scalability of its storage hypervisor under the motto "Elevator to the Cloud" at booth 2, together with its distribution partner ADN, and in the Hands-On Lab. The Platinum sponsor's conference program is complemented by a keynote from Vice President and Managing Director EMEA Christian Hagen and further presentations and demos on the architecture of dynamic virtual storage infrastructures.
Additional DataCore programs and features being spotlighted at the event:
  • New system builder partners and DataCore’s commitment to working with partners to build an ecosystem of appliance-focused, value-add versions of its storage hypervisor that meet their individual customer needs.
  • New Cloud Service Providers and hosters that are using SANsymphony-V under the DataCore Cloud Service Provider Program.
  • A SANsymphony-V 9.0.1 update release this month to enable support for Windows Server 2012 application hosts and a follow-up update release expected early next year to support running SANsymphony-V on Windows Server 2012.
  • How to empower VMware SRM and VAAI benefits across heterogeneous storage arrays and the announcement of an update release of the vSphere plug-in that supports storage reclamation and enables VMware administrators to control and schedule SANsymphony-V services, provisioning, snapshots and other tasks directly from their VMware vCenter Server management platform.
  • New, simpler-to-use and time-saving recovery with Continuous Data Protection (CDP) to rapidly roll back in time and recover critical Tier 1 business workloads and VMs.
  • A sneak preview of partner-integrated ‘datacenter in a box’ unified SAN/NAS storage systems and a pre-packaged Virtual Desktop Server reference architecture that features the cost and performance advantages of running Microsoft Hyper-V and DataCore SANsymphony-V co-resident on the same platform.
Test Drive the DataCore Storage Hypervisor; Free License Key of SANsymphony-V

Download a free 30-day trial of SANsymphony-V at: www.datacore.com/Software/Closer-Look/Demos.aspx

Tuesday 30 October 2012

DataCore Software Awarded “Innovation Leader of the Year” at CIO Summit 2012


DataCore Software has been awarded “Innovation Leader of the Year” by CIO Europe Summit 2012. IDC analysts and C-level executives of over 60 European IT departments who attended the summit selected DataCore based on the latest release of its storage software, SANsymphony-V 9.0 -- the storage hypervisor for the cloud, optimized for private clouds, cloud service providers and large-scale data centres.

“CIO Europe Summit Frankfurt 2012 gathered over 60 C-level executives from the CIO industry across Europe. It is the arena for senior level executives to engage in focused dialogue with their peers, examine management objectives and meet with the solution providers who can best meet their needs,” said Marc Baker, EMEA CIO Summit Director at GDS International. “One of the most impressive participants at the event in 2012 was DataCore. A large number of the CIOs and delegates were really impressed with the workshop and one-to-one meetings. Huge congratulations to DataCore for being awarded ‘Innovation Leader of the Year’ following the release of SANsymphony-V 9.0.”
 
Introduced in June 2012, DataCore’s SANsymphony™-V 9.0 offers customers superior flexibility, powerful automation and exceptional value. It is transforming the economics of virtualization for organizations of all sizes worldwide, by delivering flexibility, performance, value and scale, regardless of the storage hardware they use. "The Storage Hypervisor for the Cloud" optimizes storage pool management, high availability stretch-site mirroring and automatic continuous data protection features for fast application recovery. SANsymphony-V 9.0 also features enterprise-wide flash SSD storage auto-tiering and adaptive caching capabilities that significantly boost the speed, throughput and availability of virtualized business applications like SAP, Oracle, Microsoft SQL Server, Microsoft SharePoint and Microsoft Exchange.

“We are very pleased with the CIO Summit 2012 Award, as it attests to the acceptance and market footprint of our storage hypervisor technology in larger-scale enterprises. Thousands of customers throughout Europe have already realized the compelling performance and productivity advantage of DataCore™ SANsymphony-V,” said Stefan von Dreusche, Sales Director EMEA, Central Europe Region. “The DataCore storage hypervisor greatly improves the economics of new and existing storage assets so that application owners no longer need to 'rip and replace' storage infrastructures and pay much higher costs to meet their performance and uptime objectives."

In October 2012, 65 of Europe's most influential CIOs attended the eighth Chief Information Officer Europe Summit (CIO EU 8) to strategize on IT issues and share best practices and key subject presentations. The CIOs and the event’s analyst partner IDC focused on managing data challenges, the ubiquitous question of cloud computing and the array of regulatory and security requirements surrounding it. https://twitter.com/cioeurope / http://pic.twitter.com/xHdhqkAd


 

Saturday 27 October 2012

Video: George Teixeira, CEO of DataCore Software talks about Storage Virtualization and DataCore

In this video, recorded during a recent visit to DataCore’s HQ, George Teixeira (CEO, president and co-founder) talks about his company and the relationship between DataCore and its business in Europe.
http://juku.it/en/articles/video-george-teixeira-talks-datacore.html


George Teixeira talks DataCore from Juku on Vimeo.

Friday 26 October 2012

CIO Summit Presents 'Innovation Leader of the Year' Award to DataCore for SANsymphony-V 9.0 Storage Hypervisor

CIO Summit: DataCore was presented the 'Innovation Leader of the Year' Award for SANsymphony-V 9.0 Storage Hypervisor.

This week, 65 of Europe's most influential CIOs attended the eighth Chief Information Officer Europe Summit (CIO EU 8) to strategize on IT issues and share best practices and key subject presentations. The CIOs and the analyst firm IDC focused on managing data challenges, the ubiquitous question of cloud computing and the array of regulatory and security requirements surrounding it.
Huge Congratulations to @DataCore awarded Innovation Leader of the Year following the release of SANsymphony(tm)-V 9.0 #cioeu

Thursday 25 October 2012

DataCore on why it's the last man standing

By Chris Mellor
Read the full article: http://www.theregister.co.uk/2012/10/24/datacore_picture/
It's also down to a solid product, though other companies with great products have failed where DataCore did not.

DataCore has absorbed $100m worth of funding and developed nine generations of storage virtualisation software – which it calls storage hypervisors – while other software storage virtualisation developers have withered and died in the face of storage array and server vendors selling their own storage virtualisation software paired with hardware; IBM's SAN Volume Controller (SVC), for example. How has DataCore managed to prosper - it has prospered - and survive in the face of such competition? The answer lies in the unique nature of the company and its founders.

DataCore was founded in 1998 with the realisation that a SAN disk drive array was, or could be, an x86 server running drive array controller code, using its attached storage and presenting it as virtual disks of networked, block-access storage. Customers could buy standard servers, provision them with commodity disk drives, and so have freedom of choice rather than being restricted to storage array manufacturers' often quite high disk drive prices, software licensing and functionality. They can also buy external storage arrays and have DataCore's SANsymphony software control them too. Thus DataCore helped pioneer the SAN virtualisation appliance idea.

SANsymphony, a storage hypervisor as DataCore views it, and its associated products support a variety of server operating systems and hypervisors, and provide modern SAN array features such as virtualisation, high-availability and thin provisioning. It can run in a dedicated server or as a virtual machine and there are more than 6,000 DataCore customers with more than 20,000 software licenses bought.

Why is DataCore the last man standing?

Firstly, the product is good; it does what it says it does on the box...

Read the full 3 page article: http://www.theregister.co.uk/2012/10/24/datacore_picture/

Law firms gain peace of mind and more with DataCore storage hypervisor


The practice of law makes stringent demands on storage technology, most notably in terms of availability, disaster recovery, and efficiency. A modern law firm's income -- based on hourly billing -- depends on constant access to critical applications for case management, e-discovery and litigation support, billing, document management, and more. Both the application servers and the storage underlying these applications must be able to run non-stop. Preservation of the firm's data, and perhaps more important, that of their clients, against any threat from user error to a hurricane, is of course a top priority. And unless data storage assets are used efficiently, the growing volume of data involved in even a small-to-medium-sized legal practice -- encompassing anything from PDF contracts to multi-gigabyte video depositions -- can quickly overwhelm any reasonable storage budget.

Read More: Rennert Vogel Mandler & Rodriguez, P.A. and Stikeman-Elliott share their DataCore experience...

Sunday 21 October 2012

Storage Strategies Now Analyst Report: Virtualizing business-critical applications without hesitation using the DataCore SANsymphony™-V storage hypervisor

"DataCore practically invented the concept of storage virtualization and has the years of experience in the field across thousands of customers and multiple generations of its product to claim a leadership position in the space now known as software-defined storage infrastructure. With SANsymphony-V R9, this experience is embodied in the comprehensive functionality and scalability of the product. The benefits it yields with regards to performance and availability are even more pronounced in scenarios where business-critical (Tier 1) applications must be virtualized and consolidated."
Read the full Storage Strategies NOW Snapshot Report by James E. Bagley, Senior Analyst, and Deni Connor, Founding Analyst.
 
The virtualization and consolidation of business-critical applications is a high priority for IT operations in organizations of all sizes. But the owners of these applications often balk at virtualization because of a set of unknowns that surround the loss of dedicated server hardware.

The truth is that applications perform differently in a virtualized environment than on dedicated server hardware. Virtualization causes contention for shared storage resources, and the performance of formerly well-behaved applications can become unpredictable. When performance becomes erratic and response times suffer, users grumble and application owners want their physical machines back. Storage equipment outages for routine maintenance, upgrades and expansion compound the problem because many virtual machines rely on those same resources.

Wednesday 17 October 2012

Critical Business Applications: Storage Virtualization Can Boost Performance, Too

Interesting DataCenter Acceleration Post: http://www.datacenteracceleration.com/author.asp?section_id=2412&doc_id=252221

Storage virtualization appears to be a good way to give key apps greater performance.

The good thing about the computer industry is that every so often, the stars align and the opportunity arises for an explosion of new products in a particular sector. Over the past decade, for instance, all sorts of new data-storage schemes have come into being, each with its own special function or strength...

...The result for many IT operations has been a landscape of disparate storage solutions, not all of which are on such good speaking terms with each other. Can you say, proprietary stacks? Heterogeneity?

And right now, a whole new tier of storage is coming into play: flash-based solid-state drives (SSD), which add still more complexity and difference to the mix.

Fortunately, there's a good solution at hand, a way to effectively pave over many of the differences between disparate storage products and create what amounts to a unified architecture. It's virtualization, the same idea that has been playing out so well in the server world, making it possible to decouple applications and their software stacks from underlying hardware.

Or, put another way, virtualization can effectively hide and insulate applications from many if not all complexities, peculiarities, and incompatibilities by using software to abstract a set of resources and create the illusion of a single resource. That can bring down operational costs.

One of the many companies pushing the idea of cross-vendor storage virtualization has been DataCore Software, which sells to midsized and large enterprises running Windows Server. And lately, it has been pushing storage virtualization as a way to boost performance, too, especially of apps that have been virtualized on the server.

Company COO Steve Houck explains the problem: Once a bunch of apps have been virtualized and made to share a physical host under control of a hypervisor, their varying I/O patterns may easily conflict with each other. One app's fairly random traffic spikes can easily disrupt another's more regular I/Os, for instance. Indeed, it's widely accepted that this I/O problem is one of the most difficult hurdles facing newcomers to server virtualization.

Once storage resources themselves are virtualized, however, the movement and caching of data can be better managed and adjusted on the fly to meet the changing needs of specific apps and maintain or even boost their performance, Houck tells us. In fact, a storage hypervisor like DataCore's can work hand-in-hand with the server hypervisor -- supplied by VMware, for instance -- to resize caches in its own RAM memory, watch I/O patterns with an eye to identifying most-used data for caching closer to or even inside the server, and quickly redirect I/Os away from defective storage devices.

Typically, DataCore's hypervisor runs on a pair of Windows-based servers situated between the enterprise's compute servers and its physical storage devices. These servers manage their own in-memory data caches while also overseeing data mirroring, backups, and other operational chores.
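The redirection behaviour described in the two preceding paragraphs can be sketched in general terms. The following Python fragment is an illustration of mirrored-path failover as a concept, not DataCore's product logic; the class names and failure simulation are invented for the example.

```python
# Sketch of I/O redirection: two paths lead to mirrored copies of the data,
# and when one device stops responding, the same logical read is retried on
# the surviving path, so the application never notices the failure.

class Path:
    def __init__(self, name, healthy=True):
        self.name, self.healthy = name, healthy
        self.store = {}

    def read(self, lba):
        if not self.healthy:
            raise IOError(f"{self.name} not responding")
        return self.store.get(lba)

def redundant_read(paths, lba):
    for path in paths:
        try:
            return path.read(lba)          # first healthy path wins
        except IOError:
            continue                       # transparently redirect to the mirror
    raise IOError("no path to data")

a, b = Path("node-A"), Path("node-B")
a.store[7] = b.store[7] = b"payroll row"   # synchronously mirrored copies
a.healthy = False                          # simulate a failed array behind node A
print(redundant_read([a, b], 7))           # the application still gets its data
```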

According to Enterprise Strategy Group, an IT consulting outfit, benchmark tests (performed on behalf of DataCore) showed Microsoft SQL Server and Exchange workloads enjoying a nearly fivefold improvement in performance when storage was managed by DataCore's SANsymphony-V product.

Numbers like that -- and I imagine DataCore's not alone in achieving such performance gains through storage virtualization -- are hard to ignore. This clearly is a technology that any enterprise should investigate if it's truly interested in accelerating the performance of critical applications.