Saturday, 30 April 2011

DataCore Unveils Features of Forthcoming Storage Pooling Add-in for Microsoft Home Server

The Home Server Show interviewed DataCore’s Carlos Carreras this week about the company’s forthcoming storage pooling add-in for Windows Home Server 2011. As revealed back in February, DataCore, the enterprise storage virtualization specialist, was approached by Microsoft late last year to look at ways of replicating some of the Drive Extender features that were removed from Windows Home Server 2011.

The company subsequently announced support for Windows Home Server some weeks later, but there has been little detail on which features from DataCore’s enterprise software would be enabled for Windows Home Server users and, indeed, how the software’s storage pooling capabilities would work. In the course of this week’s interview, some of those questions were answered.

What is Tranquil PC Doing with Windows Home Server 2011?

...I already said a few weeks ago that DataCore’s add-in for Windows Home Server 2011 was the one I was most interested in testing and using, and now that Tranquil is going to be incorporating it as well, that can only be yet another tick in the box for DataCore.

Friday, 29 April 2011

DataCore Software's SANsymphony-V Wins 2011 Network Computing Award for "Data Center Product of the Year"

“With the launch of SANsymphony-V, DataCore has redesigned its software for virtual server and desktop environments. However, DataCore’s key benefits of extending the useful life of hardware, cutting hardware costs and removing vendor lock-in are still intact, and apply to both the physical and virtual world.” – Carla Arend, IDC

DataCore Software, the industry’s premier provider of storage virtualization software, announced that it has won one of the UK’s top networking accolades, scooping the Data Center Product of the Year Award at the prestigious 2011 Network Computing Awards, held at the Russell Hotel in London.

Keith Joseph, Regional Manager, Northern Europe at DataCore Software, happily collected the award and commented, “This is the first award that SANsymphony-V has gained since its launch in January. Part of what enticed the Network Computing readers to vote for the new solution was the favorable independent review that appeared in the magazine recently. The review and this award endorse just how easy SANsymphony-V has made the installation of virtualization across desktop and server environments for network administrators and managers.”

Indeed, Carla Arend, program manager, European infrastructure software research at IDC, underscores Joseph’s comments, “With the launch of SANsymphony-V, DataCore has redesigned its software for virtual server and desktop environments. However, DataCore’s key benefits of extending the useful life of hardware, cutting hardware costs and removing vendor lock-in are still intact, and apply to both the physical and virtual world.”

In brief, the award recognizes that with SANsymphony-V, network managers can achieve:
  • Much lower risk to deploy virtualization projects since a complete storage and process overhaul is not required; changes can be implemented incrementally, tried or undone selectively.
  • Cost-effective Business Continuity and Disaster Recovery to safeguard business.
  • Infrastructure-wide administration and automation to increase staff productivity.
  • Greater savings from consolidation, repurposing and better utilization of existing resources.
  • Reduced complexity and greater control to manage growth and optimize budgets.
  • Enduring value from a software solution that lives beyond devices that “come and go.”

The Network Computing Awards were established in 2005 to recognize best-of-breed, easy-to-use solutions that make the working lives of network administrators and managers easier and more effective. Voting takes place independently across a two-month time frame, with just under 4,000 voters contributing.

“DataCore Tackles Storage Virtualization Barrier”
Access the Network Computing review on SANsymphony-V:

Test-drive SANsymphony-V:

Thursday, 28 April 2011

New Microsoft Case Study Details How DataCore Storage Virtualization Software and Hyper-V Combine to Enable Cloud Service Provider AcXess to Save Millions

SANsymphony and Hyper-V software support Microsoft’s global field and technical force sales teams who rely on AcXess V-Works™ Cloud to demonstrate and sell their solutions to customers worldwide.

At the recent Microsoft Management Summit 2011 in Las Vegas, DataCore Software announced that cloud services provider AcXess is using DataCore’s SANsymphony™ software to create a dynamic, resilient virtual storage infrastructure that cost-effectively supports the company’s Microsoft® Hyper-V™ server virtualization environment and its business model.

AcXess has built its business around flexible virtualization technologies designed to meet the needs of key customers such as Microsoft, who rely on AcXess to provide reasonably priced services such as on-demand sales demos, proof-of-concept (POC) labs, and training sessions in virtualized environments to thousands of employees, partners, and customers. Leveraging the combination of Hyper-V and DataCore, AcXess has doubled the number of users year over year and has grown its business 300 percent, while saving $5 million in hardware costs. A full case study is available on Microsoft’s website.

“DataCore storage virtualization software has proven itself to be the ideal solution. We run a number of large systems for Microsoft in the cloud today and it was a very logical choice to choose DataCore to run the SAN component of the AcXess V-Works™ platform,” states Tom Elowson, president and cofounder of AcXess. “The reason that we chose DataCore was the flexibility and cost savings we gained from their storage virtualization solution and the way that it worked within our system to provide very powerful experiences to accelerate revenue for Microsoft sales people and other clients around the world.”

ESG Lab Reports: DataCore SANsymphony-V: Compelling Storage Virtualization Software

ESG Lab Validation Highlights:

  • DataCore SANsymphony-V R8 was very quick and easy to set up and manage. ESG Lab set up a two-node, high-availability SANsymphony-V environment and was serving storage to virtual machines in minutes.
  • With one click of a mouse, ESG Lab was able to serve a thin provisioned, performance tuned, fully protected, synchronously mirrored 1TB virtual disk to a Windows server in less than a minute.
  • Migrating and importing disk drives from a working physical server into the virtual disk pool was a quick and easy process. The DataCore software provided a seamless transition from physical to virtual infrastructure while actually enhancing performance and availability.
  • DataCore synchronous mirroring provided hosts with continuous access to virtual disks through a node outage with zero downtime and no interruptions to service.
  • DataCore Continuous Data Protection (CDP) was easy to configure and use, enabling rollback to a specific point in time without having to create multiple snapshots.
  • DataCore asynchronous remote replication was also easy to configure; compression and multi-streaming provided a 2x throughput enhancement over a simulated T1 link.
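The CDP behavior ESG highlights above — rollback to an arbitrary point in time without pre-created snapshots — is usually built on a write journal. The sketch below is a simplified, hypothetical model of that idea; the class and method names are invented for illustration and do not reflect DataCore’s actual implementation.

```python
# Illustrative sketch of journal-based continuous data protection (CDP).
# Unlike discrete snapshots, a CDP journal logs every write with a
# timestamp, so the volume can be rolled back to ANY point in time.
# All names here are hypothetical, not DataCore's real code.

class CdpVolume:
    def __init__(self):
        self.blocks = {}    # current block contents
        self.journal = []   # (timestamp, block_no, previous_value)

    def write(self, t: float, block_no: int, data: bytes):
        # Record the block's prior contents before overwriting (undo log).
        self.journal.append((t, block_no, self.blocks.get(block_no)))
        self.blocks[block_no] = data

    def rollback(self, t: float):
        # Undo writes newer than t, most recent first.
        while self.journal and self.journal[-1][0] > t:
            _, block_no, old = self.journal.pop()
            if old is None:
                self.blocks.pop(block_no, None)
            else:
                self.blocks[block_no] = old

vol = CdpVolume()
vol.write(1.0, 0, b"alpha")
vol.write(2.0, 0, b"beta")
vol.write(3.0, 1, b"gamma")
vol.rollback(2.5)   # only the t=3.0 write is undone
```

The key design point is that no snapshot schedule is needed: the journal itself defines a continuum of recovery points, and any timestamp between the first and last write is a valid rollback target.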

Tuesday, 26 April 2011

451 Group Impact Report: DataCore Renews Storage Virtualization, Plays Up Virtual Desktop Synergies

Analyst: John Abbott, Simon Robinson

DataCore Software recently introduced the first complete rewrite of its core storage virtualization technology since the company was founded in 1998. The newly launched SANsymphony-V will eventually replace the original SANsymphony product line and also incorporate the previously separate entry-level SANmelody software, first introduced back in 2005. It has been designed to better support more recent trends in computing, such as cloud, virtual servers and virtual desktops, but it also simplifies the core functionality with a better user interface, auto-tuned caches and built-in features such as workflow, continuous data protection and automated traffic-compressing replication.

Check out all the latest analyst reports on DataCore at:

Tips to Overcome 3 Potential NAS Issues

Network attached storage (NAS) is an attractive idea. It is relatively inexpensive and can deliver high performance, but unless managed carefully, with a product like DataCore’s virtualizing SANsymphony-V, it can create its own problems.

Saturday, 23 April 2011

Virtualization TCO: Fact or Fiction?

“The nine-month ROI that vendors might advertise for their latest and greatest technologies can actually average three or four years, according to our clients that have crunched the numbers for their hosted desktop virtualization deployments,” Forrester wrote. “Why? Because the upfront infrastructure and licensing costs far outweigh the upfront benefits.”

This is especially true when you consider the high cost of the storage infrastructure needed to support virtual desktop deployments. In fact, Wikibon recently posted an intriguing video interview on how storage represents the biggest drag to virtual desktops. It points out that for every $1 spent on virtual desktop deployments, $3 to $10 is spent on storage.

To offset these numbers and make the cost of storage seem reasonable, vendors have spread the large cost over thousands of desktops. But as desktop guru Brian Madden points out on his blog, we can clearly see the problem when ...

Full story

Thursday, 21 April 2011

Server Virtualization Faces Real Problems

Storage costs and needed upgrades slowing adoption, surveyed IT managers say

Valerie Valentine: Information Management Online

Jon Toigo, CEO of Toigo Partners International, said the figures in the survey underscore anecdotal issues facing IT managers. “The fundamental issue confounding hypervisor-based virtualization is storage. Inflexible storage fabrics are becoming a source of application performance pain and represent a single point of failure in the high-availability story touted by hypervisor vendors,” Toigo says.

Toigo points out that virtualizing storage is less expensive than ripping out and replacing current storage infrastructure.

“Virtualized storage is also the key, going forward to successful desktop virtualization initiatives and to cloud service integrations. As more companies, compelled by reasons of cost or enhanced user mobility, begin to explore these technologies, hopefully the lessons learned around stalled server virtualization projects will not be forgotten.”

Unanticipated storage costs, availability concerns and performance bottlenecks are hindering server consolidation and desktop virtualization projects, according to storage vendor DataCore’s survey of more than 450 IT organizations across North America and Europe.

Of those that have deployed server virtualization, 66 percent cited a substantial increase in storage costs as the biggest problem they are facing. Nearly 40 percent say the storage infrastructure is either slowing application performance or limiting its availability, while more than 20 percent indicate that business continuity has become more difficult, according to the survey.

DataCore reports that despite hype surrounding cloud initiatives, a majority (73 percent) of organizations have yet to take advantage of cloud services for storage needs. However, 70 percent said access to more disk space would be the most important characteristic they would want from cloud-related storage.

Almost half (48 percent) of those surveyed are now using storage virtualization software with their server and desktop virtualization initiatives. Nearly three in four (74 percent) rely on it to improve disaster recovery and business continuity practices.

Storage: Virtualization’s wallflower?

DataCore Software, known for its storage virtualization software, has released a survey of over 450 IT organizations across North America and Europe, “The State of Virtualization.” The findings can be a little disturbing, especially to a company that creates a product many medium and large enterprise IT organizations are leaving out of their virtualization plans: storage. The study found that 43 percent had misjudged the impact storage would have on server and desktop virtualization or had shied away from a virtualization project because storage-related costs were too high.

Enterprise Strategy Group Hands-On Testing and Benchmarking Validates Key Value Propositions of DataCore Software’s Storage Virtualization Platform

DataCore Software has announced the results of comprehensive hands-on testing and performance benchmarking conducted on its SANsymphony™-V R8 product by the Lab team at leading industry analyst firm Enterprise Strategy Group (ESG).

“DataCore’s impact on performance was dramatic, in every metric we measured,” said Tony Palmer, senior engineer and analyst with ESG Lab. “Even more impressive is how SANsymphony-V simplifies management and how easily it can make data center storage more resilient. With a single mouse click disk capacity is served and all the normal error-prone steps to configure, tune, and set best paths for high-availability get done auto-magically.”

ESG Lab summarized its test conclusions as follows: “ESG Lab firmly believes that it would benefit any organization considering or implementing an IT virtualization project to take a long look at DataCore SANsymphony-V R8 storage virtualization software. It is robust, flexible, and responsive and delivers major value in terms of utilization, economics, improved response times, high-availability, and easy administration.”

To download the full ESG Lab Validation Report, please visit: DataCore Software – FEATURED ANALYST REPORTS.

Powerful, Intuitive, and Automated

ESG Lab found that the software was “very quick and easy to set up and manage” and “virtualizing storage was intuitive and straightforward.” In fact, ESG analysts configured a high-availability SANsymphony-V environment and were serving storage to virtual machines in minutes. “The volume was thin provisioned, mirrored for high-availability, and preferred and alternate paths to storage resources were set without administrator intervention. ESG Lab was impressed with the speed, simplicity, and completeness of the configuration.”

SANsymphony-V software enables data centers to use existing equipment and conventional storage devices to achieve the robust and responsive shared storage environment necessary to support highly dynamic virtual IT environments. This contrasts sharply with the expensive “rip and replace” approaches being proposed to support desktop and server virtualization projects. In its findings, ESG Lab verified that DataCore SANsymphony-V software can be deployed to cost-effectively provide easy-to-configure, performance-enhanced storage virtualization, offering affordable scalability and performance acceleration for virtualized servers and applications.

“Enterprises have long been concerned about underutilization of servers in their data centers,” continued Palmer. “Virtualization helps them consolidate data by reducing the number of servers used and leaving those that remain as a pool of resources to be drawn on as needed. Now, enterprises are turning their attention to storage, and DataCore is tapping into that demand.”

SANsymphony-V Delivering Compelling Value, Innovation and Faster Performance

The ESG Lab Validation Report on DataCore SANsymphony-V also specifically highlights the following:
  • Performance improved significantly for every workload, with the File Server workload posting the most significant improvement (more than 6x) and the Exchange 2007 workload showing a 5x improvement.
  • With one click of a mouse, ESG Lab was able to serve a thin provisioned, performance tuned, fully protected, synchronously mirrored 1TB virtual disk to a Windows server in less than a minute.
  • Migrating and importing disk drives from a working physical server into the virtual disk pool was a fast and effortless process. DataCore storage virtualization software provided a seamless transition from physical to virtual infrastructure while actually enhancing performance and availability.
  • DataCore synchronous mirroring provided hosts with continuous access to virtual disks through a node outage with zero downtime and no interruptions to service.
  • DataCore Continuous Data Protection (CDP) was simple to configure and use, enabling rollback to a specific point in time without having to create multiple snapshots.
  • DataCore asynchronous remote replication was also easy to configure; compression and multi-streaming provided a 2x throughput enhancement over a simulated T1 link.
“The results of the ESG Lab test strongly validate the core value proposition we hoped to achieve with SANsymphony-V, which is to cost-effectively shape the shared storage infrastructure required by virtual IT environments,” said Alex Earhard, product manager of DataCore Software’s SANsymphony-V. “By providing high-availability storage virtualization that multiplies the performance of what’s already in place, we are solving the big problem that is bringing many of today’s server and desktop virtualization projects to a standstill.”

For more information and to view the full report, please visit:

Tuesday, 19 April 2011

Virtualization Strategies Must Include Storage

Virtualization of desktops and servers has so far saved some businesses a bundle on their IT costs, but when overhauling a data center or integrating virtualization to existing networks, businesses need to consider storage needs as well.

According to George Teixeira, president and chief executive of DataCore Software Corp, the cost appeal of virtualization can often distract from other priorities in the company.

"You get excited about saving all this money with consolidating many servers to a few and completely forget that part of that has to be used to offset getting additional capabilities on the storage side," he told IT World Canada.

Teixeira's company conducted a survey of medium to large scale businesses throughout North America and Europe, with one third of respondents indicating they mis-estimated budgets for storage when deploying virtualization solutions.

Monday, 18 April 2011

DataCore Adds NAS Performance Acceleration, File Sharing Support to SANsymphony-V Storage Platform

Storage virtualization software specialist DataCore Software announced it has integrated Network Attached Storage (NAS) performance acceleration and high-availability file sharing support into its SANsymphony-V platform, making it possible to employ the NAS services built into Microsoft Windows Server 2008 R2 in those environments. The integrated combination is designed to speed up performance and add a level of fault tolerance to clustered Network File System (NFS) and Common Internet File System (CIFS) sharing.

DataCore’s design employs mirrored copies of the SANsymphony-V software layered beneath the clustered file shares (NAS) integral to Windows Server 2008 Enterprise. Any standard storage device can be used for disk space, ranging from the basic internal hard drives packaged by server manufacturers to the larger external disk arrays offered by the popular storage systems vendors.

“SANsymphony-V can be easily configured to significantly accelerate performance and add a new level of data protection to Microsoft’s Clustered File Shares,” said Jeff Boles, senior analyst and director of validation services, Taneja Group. “This resulting combination of SANsymphony-V and Microsoft is additive; it is simple to set up, requires no additional purchases and best of all it allows organizations to meet both their NAS and SAN requirements from one virtual infrastructure.”

In the smallest configuration, the software co-resides with the clustered file share functions on each of two redundant Windows Enterprise servers. SANsymphony-V performs adaptive I/O caching directly on top of Hyper-V to speed up block disk access while mirroring updates to the other server’s virtual disk copy: All disk space is thin provisioned. Snapshots and Continuous Data Protection (CDP) can be turned on for added protection helping customers recover from events like ‘virus attacks’ and inadvertent deletions.
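The write path described above — cache the block locally, mirror the update to the partner node, then acknowledge — can be sketched roughly as follows. This is an illustrative model under my own assumptions; the class and method names are invented and do not represent DataCore’s actual software.

```python
# Hypothetical sketch of a synchronous-mirroring write path, loosely
# modeled on the two-node configuration described above. All names
# are illustrative, not DataCore's actual implementation.

class MirroredNode:
    def __init__(self, name: str):
        self.name = name
        self.cache = {}     # adaptive I/O cache (modeled as a plain dict)
        self.disk = {}      # backing virtual-disk blocks
        self.partner = None

    def local_write(self, block_no: int, data: bytes):
        self.cache[block_no] = data   # serve subsequent reads from cache
        self.disk[block_no] = data    # persist to the local virtual disk

    def write(self, block_no: int, data: bytes) -> bool:
        # The write is acknowledged only after BOTH copies are updated,
        # so either node can serve the data if the other one fails.
        self.local_write(block_no, data)
        self.partner.local_write(block_no, data)
        return True   # acknowledge to the host

a, b = MirroredNode("node-a"), MirroredNode("node-b")
a.partner, b.partner = b, a
a.write(7, b"payload")
# node-b now holds an identical copy of block 7, so an outage of
# node-a would not interrupt access to the data.
```

The synchronous aspect is the whole point: because the acknowledgement waits for both copies, a single node outage never exposes the host to stale or missing data, which is what allows the zero-downtime failover behavior ESG Lab observed.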

For larger environments, the file share cluster remains split across two machines, while the DataCore software runs on two separate servers dedicated to block storage virtualization. The cluster then accesses the DataCore virtual storage pool over an iSCSI or Fibre Channel SAN. Customers who start small and later outgrow the two servers can reassign their software licenses to the two additional servers. Incremental capacity licenses may also be added at any time to cover expansion.

Existing SANsymphony-V customers running Windows Server 2008 R2 Enterprise with Hyper-V do not need to buy additional software to support the clustered NAS file sharing features. For companies new to DataCore, SANsymphony-V software, including the NAS capabilities, can be professionally installed through the company’s authorized solution providers and resellers.

Saturday, 16 April 2011

Solving the Data Center’s Next Big Problem

By George Teixeira, CEO & President, DataCore Software

The transition to a virtualized infrastructure environment has moved to the forefront of many IT discussions. Organizations of all sizes are continuing to pursue the benefits that virtualization can offer, from increased efficiencies to maximization of resources to lower costs. But virtualization is not just about servers or desktops.

Virtualization is about creating agile, cost-effective and enduring IT infrastructures that can evolve to support the enterprise over time. To date, virtualization has largely failed to live up to this promise. Many server and desktop virtualization projects have stalled or failed because of cost overruns or unanticipated infrastructure issues. But why is this happening?

What is being missed is that server and desktop virtualization are about more than merely virtualizing servers and desktops. They are both about virtual IT infrastructure—and if organizations do not virtualize the entire infrastructure, they will not achieve the anticipated return on investment (ROI) and business benefits they are seeking. What they often overlook is the part of the infrastructure upon which both virtual servers and desktops depend: storage.

Today, the “Big Problem” in virtualization projects is storage, and enterprises have learned some hard lessons. They are being sold forklift upgrades of their storage infrastructure and vendor-specific virtualization strategies to support virtualized server and desktop infrastructures. This has moved the storage issue—its high cost, inadequate performance, inflexibility and vendor lock-in—to the front burner in virtualization discussions. That’s why this has become IT’s next “Big Problem.”

Why Virtualize?

Why not? It is universally agreed that server virtualization is a huge leap forward...

Read the full article at:

B.Y.O.S. (Bring Your Own Storage); Thoughts on Wine and Virtualization

On a recent trip, I sat with a group of partners talking about wine and virtualization (a natural pairing, if I do say so myself). I love a restaurant where I can bring my own wine if I choose, but this isn’t the norm in most restaurants and certainly wasn’t the case on that particular night. Instead, I was locked in to their wine list and could only order what they offered, at a steep markup.

So, what does that have to do with virtualization? Everything, when it comes to virtualizing the storage tier.

Everyone wants flexibility and choice without being locked-in to specific hardware requirements. This choice largely exists today in the server and desktop tiers, thanks to the widespread adoption of virtualization software. But when it comes to the storage tier, hardware vendors act like restaurants – they want to lock you into their proprietary product ...

Full story

Saturday, 9 April 2011

HA! (We’re not laughing at you, Dilbert.)

Ever since hard disks were deemed critical to data processing, storage suppliers have devoted much effort to circumvent hardware failures. It started with basic disk mirroring and then evolved into the various RAID protection levels in attempts to reduce the cost of redundancy. As external disk subsystems became popular, vendors added redundancy to other components whose failure was considerably more catastrophic; fans, power supplies and disk controllers come to mind.

It’s now commonplace to regard storage products as offering “high-availability” (HA) simply because they have internally redundant hardware. This interpretation creates the expectation that you can always get to data on disks. But this is far from reality.

In effect, better storage products have shifted the risk from hardware failures to data outages. Protecting against these outages is particularly important in environments where numerous workloads depend on centralized storage devices. Take, for example, server and desktop virtualization. ...

Full story

What They Won’t Tell You About Virtualization … We Will.

Well done! You’ve consolidated 40 applications from 25 servers down to just four servers. Sure, you’ll realize hardware cost savings over time, but surprisingly, these partial steps can also increase the complexity, and thus the cost, of managing your IT environment. And in too many cases, early successes have given way to unmet expectations.

Why is this happening? You have to remember that virtualization isn’t just about servers or desktops. Virtualization is about creating agile, cost-effective, and enduring IT infrastructures that can evolve to support the enterprise over time. But there is a looming problem that is often overlooked, and one which can bring a virtualization deployment to a standstill – the storage problem.

You mean you didn’t know that server/desktop virtualization actually increases the complexities of storage management? And at the same time, you didn’t know it can actually stress the overhaul needed for high availability and disaster recovery? ...

Full story

Friday, 8 April 2011

How Old Man Kryder Gets Back his Giddyup

As the hypervisor wars continue to rage, much attention is being paid to the high end of the market. There is a reason for this – Microsoft is flexing its marketing muscles promoting successes in large, mission-critical deployments. The most recent example of this was a deal it announced with Target, in which Target is migrating numerous mission-critical applications to its Hyper-V environment.

While the large-enterprise sector currently receives the lion’s share of the glitz, we all know that 95% of the economy is the small- and medium-sized enterprise (SME) sector. And, we also know that the SME sector is where the first shots of the hypervisor wars were fired, with Microsoft’s launch of Windows Server 2008.

What we have seen since then is a classic case study in why competition is a good thing for customers. VMware responded to Hyper-V with new SME-friendly products, and since then both companies have been “sweetening the deal” with even more attractive bundling and pricing of products and features. Thus, it’s no surprise that as we sit here in 2011, virtualization adoption in the SME market is skyrocketing.

But there is a rub – because while desktop and server virtualization technology costs have never been more appealing, ancillary costs have never been more daunting – particularly storage costs. Kryder’s Law, which states that storage density doubles every 13 months, was remarkably accurate until 2005. Since then, however, capacity gains have slowed, and today one can expect no more than 30 or 40 percent annual gains in capacity at any given price point.

What this means is that traditional storage economics are not keeping pace with growing storage demands. And with virtualization projects causing huge increases in storage consumption, the age-old approach of “throwing hardware at the problem” is fatally flawed.
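The gap between the two growth rates compounds quickly. A rough back-of-the-envelope comparison, using only the figures quoted above (doubling every 13 months versus a 35 percent midpoint of the 30–40 percent range):

```python
# Rough comparison of the two growth rates cited in the text:
# density doubling every 13 months (historical Kryder's Law) versus
# ~35% annual gains (midpoint of the post-2005 figure quoted above).
kryder = 2 ** (12 / 13)   # annual multiplier if density doubles every 13 months
slowed = 1.35             # 35% annual gain

years = 5
print(round(kryder ** years, 1))  # capacity multiple at the old rate: 24.5
print(round(slowed ** years, 1))  # capacity multiple at the slowed rate: 4.5
```

In other words, over five years the old trajectory would have delivered roughly five times more capacity per dollar than the slowed one, which is the shortfall "throwing hardware at the problem" now runs into.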

Riding Kryder’s Law is no longer a viable way to keep storage costs in check – particularly when one considers the exotic and expensive hardware that is often required to achieve sufficient performance and availability in a virtualized environment. A far more practical solution lies in getting more out of existing and lower-cost off-the-shelf hardware through the use of device-independent storage virtualization software.

The time has never been better for SMEs to move forward into the brave new world of virtualization. But they need to understand that Old Man Kryder needs some help these days because he just doesn’t move as fast as he used to. He needs a healthy dose of storage virtualization software to get back his giddyup.

Thursday, 7 April 2011

DataCore Software Receives Second Consecutive 5-Star Rating in CRN’s 2011 Partner Programs Guide

Annual guide and 5-Star designation identifies DataCore as having an exceptional vendor partner program

DataCore Software has been named to CRN’s 2011 Partner Programs Guide and has also been awarded a 5-Star Partner rating for the second straight year. CRN’s Partner Programs Guide and 5-Star Partner ratings serve as the definitive authority on vendors that have robust partner programs or products serving IT channel solution providers. The annual 5-Star rating, produced by Everything Channel, recognizes the DataCore Partner Program as providing the best possible partnering elements for channel success.

With its SANsymphony™-V next-generation storage virtualization software solution, DataCore makes it easy to address the essential storage dimension to virtualization, enabling solution providers to accelerate sales cycles, increase margins and address the lucrative and growing storage market. DataCore provides a compelling software advantage delivering hardware independence and cost saving benefits to storage in much the same way as VMware, Microsoft and Citrix software do to servers and desktops. Virtualization solution providers can easily leverage their virtual server and virtual desktop practices with a software solution that makes it practical to address the demanding storage needs – cost, performance and availability – of their virtualization projects.

DataCore also has strong partnerships with the software industry’s leading developers of virtualization technologies. These partnerships offer customers full-featured, best-of-breed virtualized IT solutions across the entire data center. To read what resellers have to say about the value of selling SANsymphony-V, please see: Resellers Herald DataCore Software’s SANsymphony-V Solution for Making Storage Virtualization Practical for All. To view SANsymphony-V reseller video testimonials, please visit: What Virtualization Resellers Are Saying About SANsymphony-V.

“The companies listed on the 2011 Partner Programs Guide represent the best channel programs in the market today. Of those, only a few get our 5-Star award, based on their commitment to the channel, breadth of program offerings and services offered to their partners,” said Kelley Damore, VP and Editorial Director, for Everything Channel’s CRN. “Each of these organizations understands that technology alone does not make for a successful channel program. By focusing on delivering a comprehensive partner program, vendors and solution providers can work together to drive business opportunities and revenue.”

DataCore has undertaken a major effort to streamline its Partner Program over the last two years to make it easier for partners to reap the rewards and benefits of featuring DataCore’s storage virtualization solutions among their product offerings. The 5-Star recognition from Everything Channel validates this partner-centric approach. DataCore’s Solution Advisor Resource Center includes a wealth of sales training and positioning tools and materials to help resellers and solution advisors grow their virtualization business with DataCore's storage virtualization solutions. To learn more, please visit: