Welcome!


VMware or Microsoft? Agentless Backup for Virtual Environments

Today’s post comes to us courtesy of an old friend and coworker, who is still a friend but who now works for Veeam. I’m talking about none other than Chris Henley.

Thank you, Chris, for this excellent write-up on backing up virtual environments!

---

One of the most important things to remember when talking about backup best practices in virtual environments is that virtual environments are not physical environments. I know that sounds silly, but it is really quite important, because physical environments have a different architecture than virtual environments. In a physical environment, one operating system runs on one underlying hardware set, and that one-hardware, one-disk, one-operating-system relationship demands a specific architectural design from the software used to back it up. Backup designers answered that demand with an architecture that focused heavily on agents to handle the interaction between the backup software and the physical hardware being protected. This agent-based approach was incredibly successful for a very long time. Decades! It is still successful in physical environments today, and probably represents the best possible backup solution for them.

The problem is that virtual environments are not physical environments, and the world of IT is headed for the virtual environment. Virtual environments differ because the hypervisor, whether that’s VMware or Hyper-V, provides a layer of abstraction between the underlying hardware and the operating systems that run above it. The important consequence of this architectural change is that the agent-based approach of the past just doesn’t work in a virtual environment. The reason is not that you couldn’t force the old model in by adding an agent to every virtual machine and then monitoring, managing, administering, and maintaining those agents. The challenge is that doing so demands a dramatic measure of additional work to get backup operations running properly, and frankly it is not necessary.

VMware and Microsoft, the two major players in the hypervisor space with ESX and Hyper-V respectively, have each recommended that we do not use agents in the virtual machines! Instead, the recommendation is to use an open set of APIs, and to connect to those APIs using standards that allow the backup software to interact with a virtual machine through its underlying ESX or Hyper-V host. The host provides the tracking mechanisms needed for data protection. Agentless data protection is a big deal.

The Agentless Backup Approach

When we think about Hyper-V, we want to make certain that we take an agentless approach to backup, replication, restoration, monitoring, and management, so that we maximize the capabilities Microsoft has built into the hypervisor while minimizing the resource impact that data protection has on the virtual machines themselves. The Microsoft VSS process allows virtual machines to be imaged in their entirety, along with the associated binary, configuration XML, snapshot, settings, and other virtual machine files, which lets you make a very complete copy of a virtual machine and its data for backup or other data protection uses. The cool thing is that all of this happens without any agent installed inside the virtual machine. Of course, it all relies on taking the standards-based approach: building a set of tools that work directly with VSS and with the way Hyper-V is built.
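To make the host-side idea concrete, here is a minimal sketch, in C++ against the documented Windows WMI COM interfaces, of agentless interaction with a Hyper-V host: it queries the Hyper-V WMI provider (the root\virtualization\v2 namespace and its Msvm_ComputerSystem class) to list the virtual machines on the local host, with nothing installed inside any guest. Error handling is trimmed to keep the shape visible; treat it as an illustration, not production code.

```cpp
// Sketch: list the VMs on the local Hyper-V host, agentlessly, via WMI.
// Compile with Visual C++; wbemuuid.lib is pulled in by the pragma below.
#define _WIN32_DCOM
#include <iostream>
#include <comdef.h>
#include <Wbemidl.h>
#pragma comment(lib, "wbemuuid.lib")

int main()
{
    // Initialize COM and process-wide security (required before any WMI call).
    CoInitializeEx(0, COINIT_MULTITHREADED);
    CoInitializeSecurity(NULL, -1, NULL, NULL, RPC_C_AUTHN_LEVEL_DEFAULT,
                         RPC_C_IMP_LEVEL_IMPERSONATE, NULL, EOAC_NONE, NULL);

    // Connect to the Hyper-V virtualization namespace on this host.
    IWbemLocator* pLoc = NULL;
    CoCreateInstance(CLSID_WbemLocator, 0, CLSCTX_INPROC_SERVER,
                     IID_IWbemLocator, (LPVOID*)&pLoc);
    IWbemServices* pSvc = NULL;
    pLoc->ConnectServer(_bstr_t(L"ROOT\\virtualization\\v2"),
                        NULL, NULL, NULL, 0, NULL, NULL, &pSvc);
    CoSetProxyBlanket(pSvc, RPC_C_AUTHN_WINNT, RPC_C_AUTHZ_NONE, NULL,
                      RPC_C_AUTHN_LEVEL_CALL, RPC_C_IMP_LEVEL_IMPERSONATE,
                      NULL, EOAC_NONE);

    // Msvm_ComputerSystem describes the host *and* every VM; the Caption
    // filter keeps only the virtual machines.
    IEnumWbemClassObject* pEnum = NULL;
    pSvc->ExecQuery(
        _bstr_t("WQL"),
        _bstr_t("SELECT * FROM Msvm_ComputerSystem WHERE Caption = 'Virtual Machine'"),
        WBEM_FLAG_FORWARD_ONLY | WBEM_FLAG_RETURN_IMMEDIATELY, NULL, &pEnum);

    IWbemClassObject* pObj = NULL;
    ULONG uReturned = 0;
    while (pEnum && SUCCEEDED(pEnum->Next(WBEM_INFINITE, 1, &pObj, &uReturned))
           && uReturned == 1)
    {
        VARIANT vtName;
        pObj->Get(L"ElementName", 0, &vtName, 0, 0);  // the VM's display name
        std::wcout << vtName.bstrVal << std::endl;
        VariantClear(&vtName);
        pObj->Release();
    }

    if (pEnum) pEnum->Release();
    pSvc->Release();
    pLoc->Release();
    CoUninitialize();
    return 0;
}
```

This is exactly the kind of host-level conversation a data protection product has with Hyper-V: everything happens against the host's management interfaces, and the guests never know they are being enumerated.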

When we think about agentless systems, we don’t necessarily mean that no agents are used anywhere in the architecture. What we mean is that no agents are installed in the virtual machines. In most cases, the software providing data protection for a virtual environment running Hyper-V will have some kind of interactive component that is installed or configured on the Hyper-V host. These “agents,” and I use the term loosely, run in conjunction with the Windows operating system supporting that Hyper-V host. Generally they come in the form of drivers and/or services; they are really not agents in the traditional sense. The key is that the components we install do not go into the virtual machines. That means no additional overhead on a running virtual machine, its application workloads, or its services, and no additional administrative time and resources spent updating and managing in-guest agents.

The VSS process

Microsoft has a really cool process called the Volume Shadow Copy Service (VSS), and it is the basis for agentless backup of VMs in Hyper-V. The Volume Shadow Copy Service is not new; in fact, it has been around since 2003. Microsoft introduced it with Windows Server 2003, where it was designed to provide just what its title suggests: shadow copies, or previous-version copies, of existing documents inside the Windows Server operating system. Today we rely on that same functionality, and in fact the same VSS service, to make image copies of virtual machines in Hyper-V. It’s important that you have a basic understanding of the Volume Shadow Copy Service, so let’s talk about it now.

The Volume Shadow Copy Service is made up of three essential components: first the VSS service itself, second the VSS Requestor, and finally the VSS Writer.

The VSS service is responsible for taking requests from a VSS Requestor and fulfilling them. In this case, the requests are associated with virtual machines and image copies of those VMs. VSS is installed with every version of Windows Server.

The VSS Requestor formulates requests to the VSS service for a specific image to be created of a specific virtual machine. The VSS Requestor is not written by Microsoft; instead, it’s a piece of software written by a third party to formulate a request that is then passed to the VSS service. You can even make your own VSS Requestor with a little help from Microsoft, which provides code samples and guidance for anyone interested in writing one.
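To give a feel for what writing one involves, here is a skeleton of a VSS Requestor’s canonical call sequence against the public VSS API (vss.h, vswriter.h, vsbackup.h, linked against VssApi.lib). The D:\ volume is purely illustrative (imagine it holds the Hyper-V VM files), and the HRESULT checking that a real requestor must do on every call is omitted so the sequence stays readable; this is a sketch, not a finished requestor.

```cpp
// Sketch: the canonical VSS Requestor sequence for an image-level backup.
#include <windows.h>
#include <vss.h>
#include <vswriter.h>
#include <vsbackup.h>
#pragma comment(lib, "VssApi.lib")

int wmain()
{
    CoInitializeEx(NULL, COINIT_MULTITHREADED);

    // 1. Create the backup-components object: the requestor's handle into VSS.
    IVssBackupComponents* pBackup = NULL;
    CreateVssBackupComponents(&pBackup);
    pBackup->InitializeForBackup();

    // 2. Ask every registered writer (the Hyper-V writer included)
    //    to describe the data it protects.
    IVssAsync* pAsync = NULL;
    pBackup->GatherWriterMetadata(&pAsync);
    pAsync->Wait(); pAsync->Release();

    // 3. Declare a full, component-based backup.
    pBackup->SetBackupState(true, true, VSS_BT_FULL, false);

    // 4. Build a snapshot set over the volume holding the VM files.
    VSS_ID snapshotSetId, snapshotId;
    pBackup->StartSnapshotSet(&snapshotSetId);
    pBackup->AddToSnapshotSet(const_cast<wchar_t*>(L"D:\\"),
                              GUID_NULL, &snapshotId);

    // 5. Let the writers quiesce their data, then take the shadow copy.
    pBackup->PrepareForBackup(&pAsync);
    pAsync->Wait(); pAsync->Release();
    pBackup->DoSnapshotSet(&pAsync);
    pAsync->Wait(); pAsync->Release();

    // The shadow copy now exists; backup software would read the VM's files
    // from it, then tell the writers the backup succeeded.
    pBackup->BackupComplete(&pAsync);
    pAsync->Wait(); pAsync->Release();

    pBackup->Release();
    CoUninitialize();
    return 0;
}
```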

The VSS Writer is responsible for the image copy of the data that is requested; it does the work of getting that data to disk in a consistent state. Depending on exactly what is requested, there are a number of different VSS Writers that might be used. For example, if you wanted to make an image of a virtual machine running on Hyper-V, the Volume Shadow Copy Service would use the Hyper-V VSS Writer to produce the image of the virtual machine that the Requestor asked for.
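Continuing the requestor sketch above (and reusing its pBackup object after GatherWriterMetadata has completed), the fragment below shows how a requestor might walk the gathered writer metadata and pick out the Hyper-V writer by the display name it registers on Hyper-V hosts, “Microsoft Hyper-V VSS Writer”.

```cpp
// Fragment: find the Hyper-V writer among the writers that reported metadata.
// Assumes pBackup from the sketch above, after GatherWriterMetadata completed.
UINT writerCount = 0;
pBackup->GetWriterMetadataCount(&writerCount);

for (UINT i = 0; i < writerCount; i++)
{
    VSS_ID instanceId;
    IVssExamineWriterMetadata* pMetadata = NULL;
    pBackup->GetWriterMetadata(i, &instanceId, &pMetadata);

    VSS_ID idInstance, idWriter;
    BSTR bstrWriterName = NULL;
    VSS_USAGE_TYPE usage;
    VSS_SOURCE_TYPE source;
    pMetadata->GetIdentity(&idInstance, &idWriter, &bstrWriterName,
                           &usage, &source);

    // Each VM that this writer can image is listed as a component
    // beneath its metadata document.
    if (wcscmp(bstrWriterName, L"Microsoft Hyper-V VSS Writer") == 0)
        wprintf(L"Found the Hyper-V writer: %s\n", bstrWriterName);

    SysFreeString(bstrWriterName);
    pMetadata->Release();
}
```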


For more information on the VSS process, see this TechNet article: http://technet.microsoft.com/en-us/library/cc785914(v=WS.10).aspx

Fast recovery

Agentless backup is cool, the VSS process is cool, and new ways to implement the 3-2-1 rule (three copies of your data, on two different media, with one copy offsite) are cool, but none of it makes any difference if we can’t get that data back quickly. The defining point in any disaster recovery plan is the ability to recover the data. When we think about recovering data, it’s important not only that we understand where the data is located, but also that we know and can work with the format in which it is stored, and can extend new capabilities to enable advanced data recovery options at a moment’s notice. Virtual machines are built to run application workloads, and those workloads support lots of individual users. A virtual machine running Microsoft Exchange is providing e-mail services to the users in an organization, and those users do not want downtime of the virtual machine that supports their email. In the event of data loss (small or large scale), as administrators we need a way to recover e-mail items directly from the backup into the running virtual machine that supports the Microsoft Exchange email application. The data protection market has changed dramatically over the past two years, with companies focusing more and more on application-specific tools and less and less on legacy methods of data restoration.

With innovative tools like Veeam’s Explorer for Exchange, an organization might receive a request from a user who needs to recover an erroneously deleted email message and its attachment. The tool allows the Exchange .edb database to be mounted directly from within the backup file. Once it is mounted, the helpdesk professional can search for the desired email, or simply select the user’s mailbox and browse to it. At that point the email can be restored to the running Exchange VM, emailed directly back to the user, or saved as an .msg or .pst file. All of this is done in seconds, while the user is on the phone and while the Exchange server is still running and providing services to the rest of the network.

This new paradigm of agentless data protection at the application level is changing the way we think about data protection and disaster recovery in virtual environments. Best of all, it’s free!

Get the Veeam Backup Free Edition tools at http://www.veeam.com


More Stories By Kevin Remde

Kevin is an engaging and highly sought-after speaker and webcaster who has landed several times on Microsoft's top 10 webcast list, and has delivered many top-scoring TechNet events and webcasts. In his past outside of Microsoft, Kevin has held positions such as software engineer, information systems professional, and information systems manager. He loves sharing helpful new solutions and technologies with his IT professional peers.

A prolific blogger, Kevin shares his thoughts, ideas and tips on his “Full of I.T.” blog (http://aka.ms/FullOfIT). He also contributes to and moderates the TechNet Forum IT Manager discussion (http://aka.ms/ITManager), and presents live TechNet Events throughout the central U.S. (http://www.technetevents.com). When he's not busy learning or blogging about new technologies, Kevin enjoys digital photography and videography, and sings in a band. (Q: Midlife crisis? A: More cowbell!) He continues to challenge his TechNet Event audiences to sing Karaoke with him.
