http://www.drunkendata.com/?p=3393
I don’t know about you, but I really don’t like those “For Dummies” books. Maybe I am put off by the idea of being seen carrying a book around that casts aspersions on what my detractors might call my already questionable skills and knowledge. Or maybe it is because a number of titles I have perused over the past couple of years — thin tomes passed out at trade show booths — seem to be more like vendor brochures than sources of meaningful information about technology.
In any case, when DataCore Software approached me to write a “For Dummies” guide to storage virtualization, my initial inclination was to pass on the project. For one thing, I had seen at least two other “Storage Virtualization for Dummies” books in that series (which has become a cottage industry for its publisher), and I had found neither particularly insightful or probative — unless you just wanted to learn about a particular vendor’s gear.
Then there was that negative title. ”…For Dummies.”
Heck, I might be a dummy. But I don’t want to go around advertising it.
Then, they came back to me with a new idea that changed my mind.
Instead of the familiar black-and-yellow “For Dummies” book, why not do things differently: write a “For Rock Stars” guide. Heck, everyone wants to be a rock star (or did once, before their teeth started falling out of their heads).
As we discussed the project, I grew more and more comfortable with it. I didn’t need to focus on the capabilities and limitations of any one vendor’s product in the guide. Nor did I have to explore the minutiae of differences between one product and another (especially not in a way that was calculated to tilt the discussion in favor of DataCore or any other vendor). DataCore was already quite secure about their technology and the overall superiority and value of their product.
The idea was to articulate the concepts and implementation alternatives available to any IT professional who wanted to get his or her head around the technology.
Moreover, we settled on a concept that would make the entire project a kind of mixed-media adventure. I would help develop the book as a series of white papers, each delivered in concert with a webcast in which I would do the “lead riff,” followed by experts from DataCore.
That sounded cool, prompting me to hit Photoshop hard to develop some seminal art. Here’s the cover:
[Intro slide for the webcast series: Storage Virtualization for Rock Stars]
Get ready to rock and roll this Thursday, assuming I can still talk. I will post the invite details shortly.
I should add that this will cover not only the arcana of storage virtualization technology, but also the business value case for those of you who have the practical challenge of cost-justifying all expenditures to the bean counters in the front office (a category that includes most of us today!).
Fact is, I am having a lot of fun helping to develop this program and hope it will get a lot of interest and attendance.
Special thanks to DataCore Software for sponsoring this endeavor. They are all true rock stars in my book.
Invitation to Would-Be Rock Stars
http://www.drunkendata.com/?p=3397
Per my previous post, it is my pleasure to pass along this invitation to participate in Thursday’s introductory installment of the Storage Virtualization for Rock Stars Webinar Series.
Greetings.
You are receiving this because I consider you part of the DataCore Software ecosystem — meaning that I have communicated with you either regularly, routinely, or for a specific purpose in the past. I wanted to “wave a flag” on behalf of a NEW webcast series we are doing in the hopes that you will attend the first one — Part 1: Hitting the Perfect Chord in Storage Efficiency — this Thursday, May 26th. For those who have not already done so, please register at your earliest convenience.
The Storage Virtualization for Rock Stars Webcast Series
Starting this week, DataCore will offer one webcast a month, running from May through October. Different topics under the storage virtualization umbrella will be tackled each time.
Please join us on Thursday, May 26th at 11:30 AM EDT for the first webcast in this series — Part 1: Hitting the Perfect Chord in Storage Efficiency. The webcast will last between 30 and 40 minutes and then we will conduct a Q&A at the end (hence the one-hour “duration” on the registration form). IT industry analyst and data storage luminary Jon Toigo will serve as host and moderator for the entire “For Rock Stars” webcast series.
Also please consider forwarding (or posting on your web site/blog) this short text to encourage any others in your organization to attend –
The Storage Virtualization for Rock Stars Webcast Series: Part 1: Hitting the Perfect Chord in Storage Efficiency
Thursday, May 26, 2011. 11:30 AM – 12:30 PM EDT.
While I don’t know how luminescent I will be given the condition of my dentata, I will give my solo everything I’ve got and will be delighted to hear George Teixeira, President and CEO of DataCore, and Tim Warden, Director of Systems Engineers, contribute their “tasty riffs.” And of course, I will moderate what I hope will be an energetic panel discussion at the end of the presos.
Be there or be square. (Did I say that?)
Information, commentary and updates from Australia / New Zealand on virtualization, business continuity solutions, FC SAN, iSCSI, high-availability, remote replication, disaster recovery and storage virtualization and SAN management solutions.
Tuesday, 24 May 2011
Don’t Build Your Cloud with Iron!
I suppose almost everyone remembers cloud-watching as a child: lying on your back outdoors watching them morph from horse to dragon to spaceship to—well, you could find almost anything in those solid-seeming billows drifting overhead. And if someone had asked you then what clouds are made of, I can imagine any number of answers, and not one of them would have involved iron.
But that’s precisely what big-iron storage vendors like EMC, NetApp, and others would like you to think. The hoopla at EMC World last week in Las Vegas got me thinking about this; consider this quote from EMC CEO Joe Tucci’s keynote: “virtualization is the key to the cloud. This is the year when most, if not all, mission critical applications get virtualized and run on cloud topologies. This will be the year when all IT professionals will understand the opportunities [presented by the cloud].”
I couldn’t agree more with the first sentence: virtualization is the key to the cloud. And I hope this is the year when everyone understands cloud opportunities: especially the opportunity to get out from under the iron clouds that storage hardware vendors are pushing. (That image makes me want to cover my head and run for cover!)
Virtualization software like VMware and Microsoft Hyper-V makes it easy to create virtual machines from a cloud of common CPU and memory resources without worrying about the iron behind the scenes. That’s a huge productivity boost for IT, so why not do the same with storage? Point-and-click creation of virtual disks—no matter whose storage hardware is behind them—would sure beat pushing heavy iron around.
Not surprisingly, private cloud providers like External IT, Host.net, and iomart hosting have been quick to recognize this opportunity. The single point of failure, hardware lock-in, and expense of big-iron storage solutions made it hard for them to provide the agile, cost-effective, highly-reliable infrastructure that their customers demand. So they’ve moved on to storage virtualization software that runs in the cloud like everything else and transforms all their storage assets into a single pool of storage that can very rapidly be “carved up” into virtual disks for customer applications or servers.
The benefits they talk about include (the first two are sketched in code just after the list):
- High availability with synchronous mirroring of all storage resources
- Disaster recovery with asynchronous replication
- Zero-downtime SAN hardware refreshes
- Improved storage performance from intelligent software cache
- No hardware lock-in and easy migration of older storage hardware to lower tiers
So forget the rivets and virtualize all the way down to keep your cloud soft and agile.
(Photograph taken by Karin Dalziel)
Saturday, 21 May 2011
It Ain’t Easy Being ‘Green’ … Or Is It? Virtualization software makes an impact on waste.
Since Earth Day was proclaimed in 1970, protecting the environment has advanced from a grassroots effort to a global cause. We’ve all pitched in and taken steps to ensure the planet will be a healthy place to live for generations to come. From recycling to alternative sources of energy, the pursuit of a green future is on.
With the daily power consumption of a typical data center equivalent to the monthly power consumption of thousands of homes, it’s no surprise that a major focus of the green movement has turned to the data center. In fact, just this week The Green Grid Association announced new steps organizations can take to transform their data centers from an operational burden to a source of prosperity and sustainability.
But most of the hype around green data centers has focused on data center architecture, server consolidation, and new cooling strategies – and somehow ...
Thursday, 19 May 2011
Photos from DataCore at Microsoft TechEd 2011; Dancing DataCore GoBots ‘Hot’ item at the Show
Check out the photos and click on the video links to see the DataCore Dancing GoBots…
Dancing GoBots:
Jousting GoBot
Hula GoBot
Flava Flav GoBot
Also, check out the latest releases from the show:
DataCore Showcases Use Cases and Next-Generation Storage Virtualization Software, SANsymphony-V at Microsoft TechEd
Microsoft Tech-Ed 2011: DataCore Software Adds Extra Data Protection, Flexibility and Speed to Windows Home Server with DriveHarmony
DataCore Showcases WW User Testimonials and New Software at Microsoft Tech-Ed
http://www.it-director.com/technology/sys_mgmt/news_release.php?rel=24964&ref=fd_info
“Microsoft’s full range of virtualization solutions provide partners and customers with an effective way to save costs, increase availability, and improve agility within their business infrastructure,” said Mike Schutz, senior director of product management for the Server and Cloud division at Microsoft Corp. “By using Microsoft Windows Server 2008 R2 Hyper-V SP1 and Microsoft System Center, customers can also take advantage of DataCore SANsymphony-V and explore ways to reduce the cost of storage while delivering improved performance and availability.”
Enterprise-class Solution Purpose-Built for Windows Server 2008 R2
Cloud services provider AcXess, for example, is using DataCore’s SANsymphony™ software to create a dynamic, resilient virtual storage infrastructure that cost-effectively supports the company’s Windows Server 2008 R2 Hyper-V virtualization environment and its business model.
AcXess has built its business around flexible virtualisation technologies designed to meet the needs of key customers who rely on AcXess to provide reasonably priced services such as on-demand sales demos, proof-of-concept (POC) labs, and training sessions in virtualised environments to thousands of employees, partners, and customers. Using Windows Server 2008 R2 Hyper-V and DataCore, AcXess has doubled the number of users year over year and has grown its business 300 percent, while saving $5 million in hardware costs.
A full case study is available on Microsoft’s website.
Wednesday, 18 May 2011
DataCore Showcases Use Cases And Next-Generation Storage Virtualization Software, SANsymphony-V At Microsoft TechEd
http://www.bsminfo.com/article.mvc/DataCore-Showcases-Use-Cases-And-Next-0001
DataCore Software is showcasing its SANsymphony™-V solution at this week’s Microsoft Tech•Ed North America 2011 conference. SANsymphony-V is DataCore’s next-generation storage virtualization software solution that enables IT organizations to eliminate storage-related barriers preventing them from realizing the financial and operational goals of their virtualization initiatives.
“By using Microsoft Windows Server 2008 R2 Hyper-V SP1 and Microsoft System Center, customers can also take advantage of DataCore SANsymphony-V and explore ways to reduce the cost of storage while delivering improved performance and availability.” – Mike Schutz, senior director of product management for the Server and Cloud division at Microsoft Corp.
Tech•Ed 2011 takes place at the Georgia World Congress Center in Atlanta from Monday, May 16 until Thursday, May 19, 2011. DataCore is a Silver Sponsor and is exhibiting in Booth 1629.
To read more about DataCore’s solutions and use cases in Microsoft environments, download the Taneja Group profile, “Building the Virtual Infrastructure with DataCore SANsymphony-V.”
DataCore™ SANsymphony-V enables data centers to use existing equipment and conventional storage devices to satisfy shared storage requirements introduced by highly dynamic, virtual IT environments. It provides a Windows Server 2008 R2 Hyper-V SP1-based set of advanced storage virtualization functions to provision, share, reconfigure, migrate, replicate, expand, and upgrade storage. This cost-effectively speeds up applications, delivers uninterrupted data access, and extends the useful life of storage devices, contrasting sharply with the expensive “rip and replace” approaches being proposed to support desktop and server virtualization projects.
“Microsoft’s full range of virtualization solutions provide partners and customers with an effective way to save costs, increase availability, and improve agility within their business infrastructure,” said Mike Schutz, senior director of product management for the Server and Cloud division at Microsoft Corp. “By using Microsoft Windows Server 2008 R2 Hyper-V SP1 and Microsoft System Center, customers can also take advantage of DataCore SANsymphony-V and explore ways to reduce the cost of storage while delivering improved performance and availability.”
Enterprise-class Solution Purpose-Built for Windows Server 2008 R2
DataCore spent two years developing SANsymphony-V to meet the needs of midmarket end-users and solution providers, especially those familiar with Windows Server 2008 R2 Hyper-V SP1 administration who are eager to adapt their IT operations to accommodate the many variables that server and desktop virtualization introduce. It is interoperable with Microsoft’s virtualization infrastructure, including Windows Server 2008 R2 Hyper-V SP1, System Center, and Failover Clustering.
While some vendors in the virtualization field recommend that customers lock themselves into specific storage hardware configurations and exotic purpose-built appliances, SANsymphony-V frees customers from hardware lock-in by decoupling the virtual infrastructure from the underlying disks. It also optimizes the I/O response obtained from standard storage devices that would have otherwise needed to be replaced. Then when customers need to expand capacity, they can simply choose the best deal at the time from a number of suppliers rather than be limited to one specific vendor’s hardware.
“Shared storage is often the ‘weak link’ in server and desktop virtualization initiatives; first as a performance bottleneck and then as a single point-of-failure. Many times, virtualization projects intended to reduce server or desktop costs hit the storage wall,” said George Teixeira, president and CEO of DataCore Software. “We employ sophisticated software portable and scalable across Windows Server 2008 R2 Hyper-V SP1 to overcome the storage-related roadblocks standing between IT organizations and their virtualization objectives.”
Real-world Successes Built on Virtualization Software
At Tech•Ed 2011, DataCore is also highlighting a number of customers that deploy Microsoft solutions in conjunction with a virtualized storage infrastructure powered by DataCore Software.
Cloud services provider AcXess, for example, is using DataCore’s SANsymphony™ software to create a dynamic, resilient virtual storage infrastructure that cost-effectively supports the company’s Windows Server 2008 R2 Hyper-V virtualization environment and its business model.
AcXess has built its business around flexible virtualization technologies designed to meet the needs of key customers who rely on AcXess to provide reasonably priced services such as on-demand sales demos, proof-of-concept (POC) labs, and training sessions in virtualized environments to thousands of employees, partners, and customers. Using Windows Server 2008 R2 Hyper-V and DataCore, AcXess has doubled the number of users year over year and has grown its business 300 percent, while saving $5 million in hardware costs. A full case study is available on Microsoft’s website.
More Information
Extensive reference material and supporting videos for SANsymphony-V may be found on DataCore’s website: http://www.datacore.com/SANsymphony-V.
Additional resources, solution overviews, and product documents may be found on the DataCore website: http://www.datacore.com/Solutions/storage-virtualization-and-virtual-server-desktop/Microsoft/Resources.aspx. There you can find out why the Taneja Group profile states, “For Hyper-V users that would like to build an enterprise-capable virtual infrastructure, DataCore SANsymphony-V is an ideal fit.”
Tuesday, 17 May 2011
Microsoft Tech-Ed 2011: DataCore Software Adds Extra Data Protection, Flexibility and Speed to Windows Home Server with DriveHarmony
http://usingwindowshomeserver.com/2011/05/17/datacore-driveharmony-public-beta-announced/
Welcomes beta testers to trial DriveHarmony’s new pooling, mirroring and caching features that boost performance and sidestep hard disk drive limitations
DataCore Software today announced the Beta release of DriveHarmony™, a simple-to-use add-on software package for Microsoft Windows Home Server (WHS). Users who are eager to protect, pool, accelerate, and easily expand storage capacity are welcome to request an early version of DriveHarmony for their personal beta testing by visiting DataCore’s booth (#1629) at this week’s Microsoft Tech•Ed 2011 or by sending a request to WHS@DataCore.com.
“With more and more content going digital, people increasingly want a simple way to access, store, and enjoy the wide range of photos, personal videos, music, and films they save at home”
Easily Pool and Expand Storage to Simplify File Sharing and Folder Management
Leveraging techniques learned from a decade of experience in large data centers across the globe, DataCore™ DriveHarmony maximizes the value home users get from their hard disk drives. The software integrates into the familiar WHS dashboard with a simple, easy-to-use control panel. From there, users can combine one or more physical disks of variable sizes and types into one “Virtual Big Disk” pool. When this virtual drive is created, it is automatically initialized, formatted, assigned a drive letter, and selectively mirrored for data protection. The virtual drive is then ready for use by any application running on the WHS 2011 operating system.
Growing with the Needs of Today’s Digital Lifestyle
As families and small businesses gather more digital assets, their disk space consumption naturally grows. DriveHarmony delivers the right combination of power, capacity, and ease-of-use for today’s home consumers. It allows them to expand appropriately by adding more drives to virtual disks without having to split up or re-organize their files. The rest is done behind the scenes without any need for intervention.
“With more and more content going digital, people increasingly want a simple way to access, store, and enjoy the wide range of photos, personal videos, music, and films they save at home,” said Carlos M. Carreras, vice president of alliances and business development, DataCore Software. “Through the gathering of feedback during the beta process, we look forward to working closely with Microsoft and the Windows Home Server Community to bring to market a powerful solution to help home users easily protect, organize, and manage their growing disk requirements in a digital age.”
Availability
DriveHarmony is currently in Beta release and will be rolled out via select OEMs later this year. Visitors to DataCore’s booth (#1629) at Microsoft Tech•Ed 2011 can request an early version of DriveHarmony for their personal beta testing. Tech•Ed takes place at the Georgia World Congress Center in Atlanta from Monday, May 16 until Thursday, May 19, 2011.
DataCore Attending Tech•Ed 2011
If you are in the Atlanta, GA area from May 16 through May 19, stop by Tech•Ed 2011 at the Georgia World Congress Center. Visit DataCore’s booth (#1629) where we’ll be “unlocking the mysteries” of storage virtualization and showcasing SANsymphony-V.
As part of a new initiative to extend our product line across the Microsoft Windows Server family, we will also be using Tech•Ed to highlight our newest software product, DriveHarmony, an easy-to-use add-on software application for Microsoft Windows Home Server (WHS). Due for general release in June, DriveHarmony offers WHS users pooling, mirroring and caching features that boost performance and sidestep hard disk drive limitations, functionality long familiar to DataCore enterprise customers. All visitors to our booth will be encouraged to sign up for the beta version for personal testing. We hope to see you there!
Visit our website for more information:
http://www.datacore.com/teched2011
See related blog post:
http://usingwindowshomeserver.com/2011/05/12/datacore-at-tech-ed/
Saturday, 14 May 2011
Guess Who’s Back, Back Again? … External Storage Fights the Hard Fight
It’s official. According to Gartner, external controller-based disk storage has recovered from the global recession, exceeding the record sales figure from 2008 with revenue in 2010 of more than $19.4 billion (US). Analyst Roger Cox commented that “enterprises and service providers alike are investing more in external storage as they virtualize servers and build cloud-based services.”
Here’s my prediction: the storage required to support the Clouds will remain too expensive until Clouds move to a hardware-independent, software-based storage virtualization model.
As with virtual desktops, potential private cloud customers are experiencing “sticker shock” due to demanding storage requirements. This has slowed commercial interest. Clouds by definition are software constructs that provide services as needed when needed. To do so, the combination of virtual servers, virtual storage and virtual networks working together as a combined virtualization layer is the obvious choice.
From our perspective, a cloud computing platform ...
Thursday, 12 May 2011
New Case Study on Ports of Auckland
“Performance has been nothing short of phenomenal.”
Read the complete Case Study
The IT infrastructure at Ports of Auckland is extensively virtualized, but its aging Storage Area Network (SAN) was limiting critical application performance, required too much downtime for physical maintenance or upgrades, and could not furnish the high availability demanded by port operations. By deploying DataCore SANsymphony storage virtualization software, the IT team was able to guarantee a failover time of mere minutes, increase storage performance and utilization, and greatly reduce their storage infrastructure costs and management burden.
“With SANsymphony, we have seen the benefits first-hand of improved uptime and being able to do operational maintenance without affecting the business. We’re getting better utilization out of our storage hardware, and need less of it to get the same performance. Lastly, we have peace of mind. We know we don’t have to throw away any functionality or intelligence because we get to keep using SANsymphony software even as the hardware underneath changes.”
- Craig Beetlestone, Lead System
Monday, 9 May 2011
It’s Not Rocket Science: The More Things Change, the More They Stay the Same
After the space shuttle Atlantis lifts off from the Kennedy Space Center in June, an era of NASA’s manned space exploration will come to a close. Nearly 50 years to the day after President John F. Kennedy called for a brave new era of space exploration, and 30 years after the launch of Columbia, NASA is winding down its three-decade-long Space Shuttle Program, making 2011 a significant, historic year for the U.S. space program.
After completing 135 missions, the space shuttle – the workhorse and pride of the American space program – has finally outlived its usefulness. It was originally devised because, at the time, a reusable spacecraft seemed much more economical than its one-and-done predecessors. And for 30 years, the space shuttle fulfilled its mission objectives thanks in part to one critical component: its design.
That’s right; the space shuttle’s basic design has been unchanged for 30 years. Certainly, upgrades have been made here and there, but to the casual observer, it appears very much the same vehicle as it did in 1981. The “software” has been updated, but the hardware remains much the same. We still see instances of this today, where the life of hardware can be extended with smarter software.
How fitting that we see one example in an area that was born from NASA and the space program – storage virtualization. When converting to a virtualized environment, many will tell you that you need to make a costly upgrade investment to the storage infrastructure, if you want to run the latest software. And of course you’ll need to have the latest hardware. This is just not true.
Smarter virtualization software has shown that it can take advantage of just about any storage hardware; you don’t need the most powerful, fastest, or biggest anymore, because the software is doing a lot of the heavy lifting. Just like with a car: if you want more oomph off the line, you don’t need a new engine. You just need to change the gear ratios and the shift points (the latter also a matter of software programming); in other words, use a smarter program to get much more from the same “old” engine. The same is true of storage virtualization projects.
Now don’t get me wrong, software cannot extend the life of hardware indefinitely. Eventually, your car engine will die, but hopefully after several hundred thousand miles. And the space shuttle will retire this year, only to be replaced by something newer.
Eventually, you may have to roll in some new devices, but smarter software can ensure that the investments you’ve already made won’t go obsolete anytime soon and can enjoy a long, healthy, and productive lifespan. Maybe people will be talking about your “old” disks, even 30 years from now.
Saturday, 7 May 2011
Is This The Year When Desktop Virtualization “Gets Real”? VMware Thinks So.
I recently came across the following article in Barron’s and I think it merits discussion. Financial analyst Louis Miscioscia of Collins Stewart gives some insight into what’s cooking at VMware, and interestingly, the virtualization leader believes that this is the year when desktop virtualization projects will transform from proofs of concept into real deployments.
The main reason for this adoption? The cost of deploying a virtualized desktop infrastructure (VDI) has dropped so significantly (about 70% cheaper) in just the past few years that it’s becoming a no-brainer from an investment perspective. The writer, Tiernan Ray, does point out that server virtualization deployments have far outpaced this uptick in VDI, so the bottom line is that there is a real uptick in the overall volume of virtualization projects going on right now. Maybe your own organization is dealing with this surge right now.
But as enterprises jump on the ...
Thursday, 5 May 2011
DataCore Software Powers Scottsdale Community College Virtual Storage Infrastructure Designed for Virtual Desktops
Read the full Case study:
http://www.datacore.com/Libraries/Case_Study_PDFs/Scottsdale_Community_College.sflb.ashx
DataCore Software Powers Scottsdale Community College’s Virtualized IT Environment
Complements Citrix XenDesktop to provide an end-to-end virtualization solution and an enhanced virtual desktop experience for approximately 12,000 students and 1,000 employees
http://www.wwpi.com/index.php?option=com_content&view=article&id=12712:datacore-powers-scottsdale-community-colleges-virtualized-it-environment&catid=231:storage-software&Itemid=2701181
http://www.istockanalyst.com/business/news/5108185/datacore-software-powers-scottsdale-community-college-s-virtualized-it-environment
Scottsdale Community College (SCC) has deployed DataCore’s SANsymphony™ storage virtualization software in conjunction with XenDesktop from Citrix in order to realize a fully virtualized IT environment. The responsiveness of virtualized environments – and in particular the virtual desktop infrastructure (VDI) – is closely tied to the availability and performance of the underlying storage systems.
DataCore SANsymphony and Citrix run with server hardware from HP and storage hardware from Xiotech. Collectively, they power the mySCC (my Scottsdale Community College) virtualized environment. With mySCC’s virtualization strategy, Scottsdale Community College has reduced the total cost of its IT operations by $250,000 a year, while increasing the number of services offered and dramatically improving access to them through a richer, more powerful desktop interface. The full case study is available on DataCore’s website: Virtual Storage Infrastructure Designed for Virtual Desktops.
“DataCore front-ends our Xiotech storage arrays and improves I/O performance”
“DataCore front-ends our Xiotech storage arrays and improves I/O performance,” said Dustin Fennell, vice president of IT and CIO, Scottsdale Community College. “And next year, if we get a better hardware option in terms of price-performance – we can hang that behind DataCore too. By embracing total virtualization of servers, desktops and storage, the college is now saving a quarter million dollars a year that would have been spent on hardware had we not deployed best-of-breed virtualization solutions from DataCore and Citrix.”
A Complete Virtual IT Environment Powered by DataCore and Citrix
DataCore’s SANsymphony software enables institutions of higher learning, such as SCC, to use their existing storage equipment and devices to achieve the robust and responsive shared storage infrastructure necessary to support highly dynamic virtual IT environments, including desktops. This contrasts sharply with the alternative “rip and replace” approach, which is often prohibitively expensive and results in a rigid infrastructure that cannot adapt to future storage needs.
DataCore software runs on HP DL380 servers, and these DataCore storage virtualization nodes connect to Xiotech Emprise 5000 hardware. The combination of Citrix’s unique provisioning technology and DataCore’s storage virtualization software dramatically reduces demands on storage and improves performance significantly – giving users maximum availability and performance from their virtual desktops.
Virtualization has turned technology operations at SCC from a cost center into a resource that actually funds innovation. SCC now offers technology innovation grants from the money it is saving in its technology operations budget.
The mySCC Virtual Environment: Today’s Online, Digitized University of the Future
mySCC is an end-to-end virtual IT environment whose virtual desktops enable access to over 230 applications across all of SCC’s faculty, staff and students. It serves the needs of approximately 12,000 students and 1,000 employees, including both full-time and part-time staffers, as well as adjunct and residential faculty. Through mySCC’s virtual desktop experience, users are presented with remote access to simple Windows applications like Microsoft Office – as well as more esoteric ones like AutoCAD. SCC sought to make its learning materials and resources completely accessible and affordable for any student – regardless of physical whereabouts, academic discipline and socioeconomic status.
IT administrators at SCC are now able to manage their older storage devices behind the same unified DataCore interface as the newer, higher-speed Xiotech arrays. SCC centrally controls DataCore’s comprehensive feature set across their entire infrastructure rather than being forced to deal with model-specific variations as is the case with device-by-device administration.
“The IT needs of our country’s colleges and universities are growing increasingly complex and expensive,” said George Teixeira, president and CEO of DataCore Software. “With tuition growing at alarming rates, colleges must drive operating costs down significantly to keep higher education affordable and accessible. By virtualizing their infrastructure, SCC has not only been able to extend their online resources to every student regardless of location and focus of study, they were able to generate incredible savings – funds that can be re-allocated to support critical assets and functions, such as the faculty and academic programs.”
DataCore offers 50% discounts to qualified educational institutions. To find out more, please visit: http://www.datacore.com/Solutions/Major-Verticals/storage-virtualization-software-for-education.aspx.
Full Scottsdale Community College Case Study: http://www.datacore.com/Libraries/Case_Study_PDFs/Scottsdale_Community_College.sflb.ashx
About Scottsdale Community College
Scottsdale Community College, a two-year college located in Scottsdale, Arizona, enrolls approximately 12,000 students and employs around 1,000 staff. The institution has recently expanded enrollment with non-traditional learners such as working adults and people who take courses online. Visit www.scottsdalecc.edu.
Wednesday, 4 May 2011
How To Sell Storage Virtualization To the CIO
http://community.crn.com/groups/how-do-i-info-for-starting-your-var-business/blog/2011/04/29/how-to-sell-storage-virtualization-to-the-cio#comments
By Dan Hascall, vice president of Americas channel and sales operations, DataCore Software
Your customers are becoming overrun with data—and those that must comply with federal regulations are struggling to store crushing amounts of information for long periods of time. The solution is, quite obviously, to sell those companies more storage. Here, Hascall, who has held senior-level management positions at VMware, Alliance Systems, and Sun Microsystems, discusses how to get buy-in from the top for a virtualized solution.
Data housed in enterprise data centers is growing at an astounding 60 percent per year, according to analyst firm IDC. For CIOs, this is not good news. Furthermore, Sarbanes-Oxley-compliant companies must store their data for at least seven years. The move to agile, virtual infrastructures has also exacerbated the cost of and demands placed on shared storage systems. Amid this quandary of data storage issues, storage virtualization software provides much-needed relief.
Virtual Resource Pools: Manage the growth and cost of storage as you do servers
Put simply, storage virtualization software makes it easy to pool and share storage resources just as server virtualization allows you to do the same for computing resources. It adds a new level of flexibility and responsiveness to meet user needs by ensuring that supply meets business demand. From a ‘lowering risk’ standpoint, storage virtualization software insulates users and applications from the inevitable upgrades, changes and disruptions that occur at the underlying hardware level. Storage virtualization software that works infrastructure-wide is the logical next step to complement and extend the business value of server and desktop virtualization. The combination acts as an enabler and an accelerator for businesses to fully benefit from virtualized environments, while at the same time giving users the freedom to choose the storage hardware technologies that best meet their needs and budgets.
How do you get CIOs to jump on the storage virtualization bandwagon if they’re not on it already? Learn five compelling points that make storage virtualization right for their organization. [Read the complete article]
3 Year Storage Implementations … Déjà Vu All Over Again?
We’ve been talking a lot lately about “The Big Problem” facing virtualization projects, which is the need to virtualize the often-overlooked storage tier when desktops and servers are being virtualized. Many organizations have fallen into the trap of totally overhauling their existing hardware storage infrastructure in order to accommodate large-scale projects, at great expense both financially and in terms of IT efficiency.
One example that stood out to me came from a recent CMIO Magazine article in which a New York-based rehab hospital spent $450K and three years implementing a storage disk array to facilitate the use of electronic medical records (EMRs).
The CIO of the hospital was dealing with 500 users and more than 1,000 devices on a network spanning 15 buildings across a 60-acre campus. One of the main problems he was facing was that his 10-person IT staff had 30 projects going on simultaneously and the server-attached ...
Monday, 2 May 2011
DataCore Adds NAS Performance Acceleration and High-Availability File Sharing Support to SANsymphony-V Storage Virtualization Software
http://vmblog.com/archive/2011/03/18/datacore-adds-nas-performance-acceleration-and-high-availability-file-sharing-support-to-sansymphony-v-storage-virtualization-software.aspx
DataCore Software is announcing that its SANsymphony™-V software addresses the three major roadblocks – cost, performance and business continuity – that have made it impractical for critical virtualization projects to utilize Network Attached Storage (NAS).
Why It Matters
DataCore, with its SANsymphony-V software, can now address customer NAS requirements in addition to meeting their SAN needs.
Enterprise-class NAS systems (high-performance, highly available, highly scalable hardware such as NetApp filers and EMC Isilon) have been successful in larger data centers, but their steep price points put them out of reach for small and mid-size businesses. DataCore addresses this dilemma with its SANsymphony-V storage virtualization software. The development announced today employs standard Windows servers with off-the-shelf disks to achieve the required levels of uninterrupted shared file services at a reasonable price. Moreover, these systems now scale to performance levels once achieved only by “enterprise” NAS hardware.
Quotes --
Jeff Boles, senior analyst and director of validation services, Taneja Group:
“SANsymphony-V can be easily configured to significantly accelerate performance and add a new level of data protection to Microsoft’s Clustered File Shares. This resulting combination of SANsymphony-V and Microsoft is additive; it is simple to set up, requires no additional purchases and best of all it allows organizations to meet both their NAS and SAN requirements from one virtual infrastructure.”
John Bocskor, vice president of product management, DataCore Software:
“Unlike other attempts at converged SAN/NAS, the DataCore approach optimizes each layer for what it does best, yet both are managed from the familiar Windows Server administration console via a user-friendly interface with self-guided wizards.”
Key points --
- SANsymphony-V overcomes shared storage shortcomings and makes it feasible to employ the widely used NAS services already built into the Microsoft Windows Server 2008 R2 platform in virtualization environments.
- DataCore’s powerful feature set can be easily configured to enhance these Microsoft services. The integrated combination cost-effectively speeds up performance and adds a new level of fault tolerance to clustered Network File System (NFS) and Common Internet File System (CIFS) sharing (see the mirroring sketch after these key points).
- The new solution is particularly well-suited for IT organizations that prefer to store their VMware vSphere, Microsoft Hyper-V and Citrix XenDesktop virtual machine images under NFS or CIFS. Many of them are eager to eliminate bottlenecks and disruptions in their underlying storage systems, but not willing to spend a lot of money doing it.
Existing DataCore SANsymphony-V customers running Windows Server 2008 R2 Enterprise with Hyper-V do not need to buy additional software to support the clustered NAS file sharing features. For those new to DataCore, SANsymphony-V software, including the NAS capabilities, can be professionally installed through DataCore-authorized solution providers across the globe.
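The fault-tolerance point in the key points above boils down to synchronous mirroring: a write is acknowledged only after it is durable on two independent nodes, so either node can fail without losing acknowledged data. Here is a minimal Python sketch of that acknowledgement rule, heavily simplified by me and in no way representative of SANsymphony-V's internals:

```python
# Minimal sketch of the synchronous-mirroring rule behind highly
# available file shares: acknowledge a write only after both copies
# are durable. A deliberate simplification, not a real implementation.

class Node(dict):
    def put(self, block_id, data):
        self[block_id] = data
        return True

class MirroredVolume:
    def __init__(self, primary, secondary):
        self.nodes = [primary, secondary]   # two independent stores

    def write(self, block_id, data):
        results = [node.put(block_id, data) for node in self.nodes]
        if not all(results):
            raise IOError("mirror degraded: write not acknowledged")
        return True  # ack only after both nodes confirm

vol = MirroredVolume(Node(), Node())
vol.write(42, b"payroll record")  # survives the loss of either node
```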
Storage by the Barrel
Every time you drive past a gas station, it’s painfully clear that the oil and gas industry is experiencing extraordinary supply and demand issues … which translate to escalating prices. According to the International Energy Agency’s World Energy Outlook report, global energy demand is expected to increase by 50% by 2030, and oil- and gas-based energy will account for approximately 60% of that increase.
That’s why there may be no other industry today that demands a more diverse set of technological capabilities than oil and gas exploration and production. Mark P. Mills said it best in his recent Forbes.com column, “Whither oil prices? 150 in sight. So follow Bill Gates into digital barrels”:
Sit in an oil conference today and you hear about digital technology and software, cloud computing, remote servers, bandwidth constraints, high-speed wireless, terabytes of storage, GPS, laser mapping, virtualization, virtual-reality caves, satellite imaging, hyperspectral cubes, reverse-time software, and virtualization.
Did you catch that: terabytes of storage? Let me put it into perspective. One terabyte can hold 1,000 copies of the Encyclopedia Britannica and 10 terabytes can hold the printed collection of the Library of Congress. That’s a lot of data.
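Those comparisons are easy to sanity-check with simple unit arithmetic; the 1 GB-per-Britannica and 10 TB Library of Congress figures below are the commonly cited approximations behind the claim above, not precise measurements:

```python
# Sanity-checking the capacity comparisons with unit arithmetic.
GB_PER_TB = 1000                      # decimal units, as storage vendors count

britannica_gb = 1 * GB_PER_TB / 1000  # 1 TB / 1,000 copies = 1 GB per copy
loc_tb = 10                           # cited size of the LoC print collection

print(f"One Britannica copy: ~{britannica_gb:.0f} GB")
print(f"Britannica copies per LoC: {loc_tb * GB_PER_TB / britannica_gb:,.0f}")
# -> 1 GB per copy, and 10,000 copies in 10 TB: the two claims line up.
```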
There’s little doubt that data and storage are the “beating heart” of today’s oil and gas organizations. But really, oil and gas is just a “poster child” for what’s happening across enterprises of all types and sizes. As enterprises become increasingly data-driven, they consume and create ever more data from an increasingly diverse universe of sources. And while storage prices continue to decline, those declines are not keeping pace with the rate of data growth, creating new problems in manageability, usability and data center resources. This all adds up to systemic storage problems.
With the emergence of the virtualized IT infrastructure, as well as the explosive data growth we’ve experienced over the past 10-plus years, storage management is now in the spotlight as storage volume has become the biggest cost driver for IT. And it’s become evident that a new approach is needed.
In this quandary of data storage issues, we believe virtualization-driven storage consolidation has provided much-needed grounding. Storage virtualization is a practical and effective way to manage the increasing complexity of data and the growing demand for high availability. It also allows organizations to meet this challenge in a manner that reduces operational costs, improves efficiency, provides an expanded choice of hardware alternatives, and extends the life of existing storage assets.
And it puts real money back into the IT budget. Don’t you have some encyclopedias to buy or something?