Wednesday 29 May 2013

DataCore’s SANsymphony-V Storage Virtualization Technical Deep Dive Series – Focus on Replication

DataCore’s SANsymphony-V – Replication

In this post I want to introduce you to the “Replication” feature offered by the SANsymphony-V storage hypervisor. A replication solution gives you the ability to keep a “copy” of your production data at a remote site, usually for disaster recovery scenarios. But a replication solution does not simply copy the data occasionally; it continuously keeps the copy up to date, staying as close as possible to the production data in order to offer a good recovery point objective (RPO).
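The idea above can be sketched in a few lines of Python. This is a minimal, illustrative model of continuous asynchronous replication (not DataCore’s actual implementation): writes are acknowledged locally, queued, and shipped to the remote copy in the background, and the RPO at any moment is simply how far the remote copy lags behind production.

```python
import time
from collections import deque

class AsyncReplicator:
    """Toy model of continuous asynchronous replication."""

    def __init__(self):
        self.primary = {}     # production data
        self.replica = {}     # remote copy
        self.queue = deque()  # pending changes, oldest first

    def write(self, key, value):
        # Acknowledge locally right away; ship to the remote site later.
        self.primary[key] = value
        self.queue.append((time.time(), key, value))

    def ship_one(self):
        """Transfer one queued change to the remote site."""
        if self.queue:
            _, key, value = self.queue.popleft()
            self.replica[key] = value

    def current_rpo(self):
        """Age of the oldest un-shipped write: data newer than this
        would be lost if the primary site failed right now."""
        return time.time() - self.queue[0][0] if self.queue else 0.0

r = AsyncReplicator()
r.write("invoice-42", "paid")
r.ship_one()                    # background transfer catches the replica up
assert r.replica["invoice-42"] == "paid"
```

The more frequently changes are shipped, the smaller the gap between replica and production, and hence the better the RPO.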

Read More: http://vtricks.com/?p=744

To be able to replicate data, you need to partner your server group with a replication group.
[Screenshot: Partner With Replication Group]
Once the server groups are connected, your SANsymphony-V console will look similar to this:
[Screenshot: Server Group Overview]

Tuesday 28 May 2013

DataCore Survey Finds Cause for Data Storage Pause

Article from IT BusinessEdge: DataCore Survey Findings

With the rise of Flash memory and cloud computing, there have never been more options for managing data storage effectively. Obviously, that’s a very good thing, given the amount of data that needs to be managed.

A new survey of 477 IT professionals conducted by DataCore Software, a provider of storage virtualization software, finds that the move to embrace new approaches to storage is growing at a slow but steady pace.

Datacore CEO George Teixeira says that despite some of the inherent performance benefits of Flash and the promise of reduced storage costs in the cloud, issues such as the cost of Flash memory and the fact that applications are not optimized for Flash mean that the broad transition to Flash memory has thus far been hampered. As for the cloud, Teixeira says IT organizations are still struggling with any number of compliance and performance issues.

Add the fact that most IT organizations seem predisposed to build their own private cloud and it becomes clear that a large number of cultural and process issues still have to be worked out.
As Flash memory continues to get less expensive, it will change the way primary storage is managed, and the cloud will increasingly be relied on for backup and archiving. What’s not as clear is to what degree storage administrators will lead this charge versus having it forced upon them by developers and senior IT managers.

In either case, for now it looks like these changes will take place at an evolutionary rather than revolutionary pace.

Tuesday 21 May 2013

DataCore storage virtualization software boosts telecom's DR strategy

When Chris Jones, manager of IT services at Blair, Neb.-based Great Plains Communications Inc., sought to improve his disaster recovery strategy, he knew exactly what he needed: synchronous mirroring of data hosted at two locations approximately 10 miles from each other. He also knew that he wanted to keep his storage array, even if it didn't support synch mirroring.

The independent local exchange service communications company has about 50 TB of storage, nearly all of which is virtualized through 200 virtual servers and 200 virtual desktops. Jones set about investigating options for synchronous mirroring capabilities, and said he learned quickly that "only the large manufacturers had that ability."
Jones's shop was running an EqualLogic iSCSI SAN array. "It was a very nice storage box," he said. "We didn't want to pull it out of service early." With VMware ESX and vMotion running at both locations, he wanted a way to balance workloads dynamically between the two data centers without buying a new storage system.
"DataCore [Software] had this offering that you could layer on top [of our existing system]," Jones said. He purchased DataCore's SANsymphony-V storage virtualization software, billed by the vendor as having the ability to auto-tier and manage storage in enterprises using incompatible devices from multiple suppliers. SANsymphony-V's feature list includes synchronous mirroring, disk pooling, high-speed caching and RAID pooling, among others. "We saw some pretty good performance improvement through [DataCore's] caching technology," Jones said. "Our primary interest was in the mirroring."
According to Jones, "If I did have storage failure, which has occurred, everything would fail over [to the other DataCore node.] Should one of those storage systems fail, the VMs [virtual machines] immediately fail over to the remote storage system and retain their operational store. At that point, you would be looking at recovering from a snapshot or, heaven forbid, you actually go back to backup these days." That means "all our VMs reside in two storage systems at any given time," he said.
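The failover behavior Jones describes follows from how synchronous mirroring works. As an illustrative sketch (assumed behavior, not DataCore's code): every write must land on both nodes before it is acknowledged, so either copy alone is always complete and can serve the VMs if the other fails.

```python
class SyncMirror:
    """Toy model of synchronous mirroring across two storage nodes."""

    def __init__(self):
        self.nodes = [{}, {}]       # two copies, e.g. one per data center
        self.alive = [True, True]

    def write(self, key, value):
        # A write is acknowledged only after every live copy has it.
        for node, up in zip(self.nodes, self.alive):
            if up:
                node[key] = value

    def fail(self, idx):
        self.alive[idx] = False     # simulate a storage failure

    def read(self, key):
        # Any surviving node can serve the data unchanged.
        for node, up in zip(self.nodes, self.alive):
            if up:
                return node[key]
        raise RuntimeError("no surviving copy")

m = SyncMirror()
m.write("vm-disk-7", "block data")
m.fail(0)                              # one storage system fails...
assert m.read("vm-disk-7") == "block data"  # ...the mirror still has everything
```

Because both copies are identical at the moment of failure, the VMs keep their operational state with no data loss, which is exactly why "all our VMs reside in two storage systems at any given time."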
SANsymphony-V enabled Jones to "break out of a single-layer approach. We no longer have to buy big, complex systems. We can buy an x86 off the shelf and DataCore adds all the SAN technologies on top of that."
DataCore's tiering works similarly to that of Dell's Compellent Data Progression, EMC's FAST VP, Hewlett-Packard's 3PAR Adaptive Optimization, Hitachi Data Systems' Dynamic Tiering and IBM's System Storage Easy Tier. But among those vendors, only Hitachi supports arrays outside of its own. SANsymphony-V's tiering works across any storage device.
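The core idea behind all of these auto-tiering products can be sketched generically. The following is a simplified illustration (not any vendor's actual algorithm): track how "hot" each block is over a sampling window, then place the hottest blocks on the fastest tier until it fills, spilling the rest down to slower tiers.

```python
def auto_tier(block_heat, tier_capacities):
    """Assign the hottest blocks to the fastest tiers.

    block_heat:      {block_id: access_count} from a sampling window
    tier_capacities: list of (tier_name, block_slots), fastest first
    Returns {block_id: tier_name}.
    """
    placement = {}
    # Rank blocks hottest-first, then fill tiers fastest-first.
    ranked = iter(sorted(block_heat, key=block_heat.get, reverse=True))
    for tier, slots in tier_capacities:
        for _ in range(slots):
            block = next(ranked, None)
            if block is None:
                return placement
            placement[block] = tier
    return placement

heat = {"b1": 900, "b2": 15, "b3": 450, "b4": 2}
tiers = [("ssd", 1), ("sas", 2), ("sata", 10)]
placement = auto_tier(heat, tiers)
# b1 (hottest) lands on ssd; b3 and b2 on sas; b4 on sata
```

The differentiator the article points to is not the algorithm but its scope: running it across any attached device rather than only within one vendor's arrays.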
Jon Toigo, CEO and managing principal at Toigo Partners International, runs DataCore in his own environment and tells IT customers to look at DataCore to avoid "buying feature-encrusted gear that jacks up the price."
"DataCore can overlay on top of anything that connects to the server and manage it all as one pool," Toigo said. "It leverages all the load balance, receives all the writes into memory on the server and writes to non-volatile RAM. The system thinks your storage is four times faster because it's going down to RAM."
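Toigo's point about writes landing in server memory describes write-back caching. A minimal sketch of that pattern (illustrative only, not DataCore's implementation): writes are acknowledged as soon as they reach RAM, and a background flush destages them to the slower array later, so the application sees RAM-speed writes.

```python
class WriteBackCache:
    """Toy model of a server-side write-back cache over a slow array."""

    def __init__(self, backing_store):
        self.ram = {}                 # dirty blocks held in memory/NVRAM
        self.backing = backing_store  # the slower physical array

    def write(self, block, data):
        # Acknowledge as soon as the data is in RAM.
        self.ram[block] = data

    def read(self, block):
        # Serve from RAM when possible; fall back to the array.
        return self.ram.get(block, self.backing.get(block))

    def flush(self):
        # Destage dirty blocks to the array in the background.
        self.backing.update(self.ram)
        self.ram.clear()

disk = {}
cache = WriteBackCache(disk)
cache.write("blk0", b"data")
assert cache.read("blk0") == b"data"  # served from RAM, before any flush
cache.flush()
assert disk["blk0"] == b"data"        # destaged to the slow array later
```

In a real product the RAM copy would be protected (e.g. mirrored to the partner node or backed by non-volatile memory) so an acknowledged write cannot be lost before it is destaged.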

Thursday 16 May 2013

DataCore Introduction and Technical Deep Dive


Check out this site for a number of Deep Dive topics

DataCore deepdive

Here you can find an overview of my technical deep dive series about DataCore’s SANsymphony-V “storage hypervisor”:  VTricks DataCore Deep Dive   

The first article in the series is below:

DataCore Software – Introduction to SANsymphony-V
Let’s start this series about the DataCore storage server with a short introduction to DataCore Software and its SANsymphony-V solution. What is DataCore, and can it help you fulfill your storage requirements? DataCore’s latest solution is SANsymphony … Continue reading 

Wednesday 15 May 2013

TechTarget Interviews Great Plains on DataCore storage virtualization software and its impact on the telecom's DR strategy


Friday 10 May 2013

Virtualization Survey Identifies Storage Obstacles to Virtualizing Critical Business Apps and Flash memory SSD technology Trends


This week the findings of DataCore Software’s Third Annual State of Virtualization Survey were published; 477 IT professionals from a broad range of industries participated in the survey, conducted at the end of March 2013.
What was surprising?
Surprisingly, nearly one in three respondents said they are still avoiding virtualization projects altogether because the related storage concerns and costs are too high.

Despite the incredible attention the industry is placing on solid state technologies and Cloud storage, there are major issues impacting wider adoption:

[Chart: % of capacity on Flash/SSD]
  • Integration difficulties associated with Flash memory and solid state disks (SSDs) rank high among the factors discouraging organizations from applying these fast technologies to their most latency-sensitive virtualized business workloads and desktop virtualization (VDI) programs.
  • One in two respondents said they are not planning to use Flash/SSD for their virtualization projects.
  • When asked about what classes of storage they are using across their environments, nearly six in 10 respondents said they aren’t using Flash/SSD at all.
  • Organizations are not flocking to public cloud storage in droves for their storage needs. Eight in 10 said they are not using any public cloud storage. This is particularly notable given the amount of recent attention devoted to public cloud options.
What was not surprising?
Storage continues to be the biggest chunk of the investment in virtualization projects. These include both server and desktop initiatives.
  • 52 percent of those surveyed said storage accounted for more than 25 percent of their virtualization budget.
  • Storage virtualization software use is growing.
[Chart: Storage Virtualization Software use]
Organizations are still eager to virtualize their mission-critical applications, but the findings point out that storage-related costs and I/O performance issues remain significant obstacles to benefiting from these virtualization initiatives. The results revealed that SQL Server, Exchange, SharePoint, Oracle and SAP represent the most prevalent Tier-1 business applications targeted for consolidation. These popular business applications are also the most storage-intensive, and therefore the hardest and costliest to virtualize, due to the level of storage needed to ensure that their demanding performance and availability requirements are met.
  • 44 percent of respondents said the disproportionate storage-related costs were a “serious obstacle” or “somewhat of an obstacle” preventing them from virtualizing more of their workloads. 42 percent of respondents said the same about performance degradation or inability to meet performance expectations.
  • IT storage budgets are on a relatively tight leash in 2013 compared to 2012. More than half (51 percent) of respondents said their storage budget has remained stable, but 20 percent said their storage budget has been reduced, compared to just 14 percent in 2012. While 38 percent said their storage budget grew in 2012, only 30 percent said the same this year.
IT professionals from organizations across the globe participated in the survey:
  • 56 percent of respondents from organizations with fewer than 1,000 employees;
  • 23 percent from organizations with 1,000 to 5,000 employees; and
  • 21 percent from organizations with more than 5,000 employees.

Respondents represented a range of industries, including Financial Services (11 percent), Healthcare (12 percent), Government (13 percent), Manufacturing (15 percent), Education (13 percent) and IT services (14 percent). The average capacity managed by these organizations was 234 terabytes.

DataCore's 2013 State of Virtualization Survey was conducted at the end of March 2013. The survey asked a series of questions about virtualization and its impact on storage.
The full report may be found here: http://pages.datacore.com/StateofVirtualizationSurvey.html

Thursday 2 May 2013

DataCore Software Study: Virtualization Hindered by High Storage Costs, Performance


DataCore Software has released the results of its third annual State of Virtualization Survey.
Full survey results are available at:  DataCore Software’s Third Annual State of Virtualization Survey 