PostPosted: Mon May 04, 2009 8:08 am 
JohnBick (MVP/Moderator)

Joined: Sat Dec 15, 2007 8:21 am
Posts: 4725
Location: Dutchess County, NY
Thanks: 77
Thanked: 273 times in 268 posts
Good thoughts, George...

I have no problem with using SyncToy (v1.4, NOT v2.0) or RoboCopy except that the RoboCopy vs Xcopy time differential is so radical! If you want deletions to carry through to the backup you would, of course, need to synchronize rather than copy. I have not hit that point yet but had been thinking that when it got close I would probably reformat the target drive and let it start over. Would take time but would also result in a defragmented target drive!

As for the EXCLUDE ... I've always thought users would tailor the script, and this is one of the ways it can be done. I believe the EXCLUDE parameter takes a list of FILES (not folders), where each file contains the list of strings to be excluded. The description for this parameter indicates that the parameter:
Quote:
Specifies a list of files containing strings. Each string
should be in a separate line in the files. When any of the
strings match any part of the absolute path of the file to be
copied, that file will be excluded from being copied. For
example, specifying a string like \obj\ or .obj will exclude
all files underneath the directory obj or all files with the
.obj extension respectively.

The concept is fine, but you should confirm the implementation!
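To make this concrete, here is a minimal sketch of how the EXCLUDE parameter would be wired up, built on the XCopy line from the first post. The list file name and match strings are examples only, and (as noted above) the matching behavior should be confirmed by testing:

Code:
REM Build the exclusion list -- one match string per line
ECHO \Recorded TV\> C:\Scripts\ExcludeList.txt
ECHO .tmp>> C:\Scripts\ExcludeList.txt
REM Any file whose absolute path contains "\Recorded TV\" or ".tmp" is skipped
XCopy D:\shares E:\Shares /D /E /V /C /I /G /H /R /K /X /Y /EXCLUDE:C:\Scripts\ExcludeList.txt >C:\Temp\External-Shares.log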

I don't understand your setting "Task Manager" to call SyncToy, etc. Did you mean the Task Scheduler? Personally I would use the RoboCopy command line (syntax in the first post here) and have it right in the same script (CMD/BAT file) as the XCOPY commands. I prefer to have everything in a single place for ease of long-term maintenance (both of the actions in the script and of the scheduling of the script itself). But scheduling it separately would work.
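For what it's worth, the scheduling itself can also be done from the command line instead of clicking through the Task Scheduler, which keeps even the setup scriptable. A sketch only (the task name and script path are examples; on XP/2003 the /ST time is given as HH:MM:SS):

Code:
REM Run the combined backup script every night at 2:00 AM
SCHTASKS /Create /TN "NightlyShareBackup" /TR "C:\Scripts\BackupShares.cmd" /SC DAILY /ST 02:00:00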

Sure, this is another level of backup above and beyond duplication, and you COULD turn off duplication for the files backed up automatically. Philosophically I see this level of backup as support for backing up things OFF-SITE. I would probably NOT be backing up recorded TV or ripped DVDs this way. It is to protect against a major home disaster. If that happened I would not be too worried about TV shows or things I can purchase. I worry about the pictures of the family, financial records, correspondence, home movies, etc. My goal would be to keep the size of the backup to what can be placed on a single disk drive for off-site (safe deposit box) storage. But the beauty of a server with the functionality of WHS/MSS is that you can tailor it to YOUR philosophy and not be forced into mine (or Cavediver's or Yakuza's or...).

For testing I just specified a limited source specification, which is easier than setting up the Exclude.

At the end your "waffling" indicates you are beginning to understand my philosophy!

By the way, there is another reason for doing all this in a single script (CMD/BAT file) and NOT scheduling separate tasks for synchronizing. ALL this is intended to ultimately be wrapped within a few commands to enable the target disk to be encrypted. The timing of opening an encrypted volume (disk), using it, and then closing it again is important and not easily done via the Task Scheduler.

_________________
....JBick

EX475, 2 GB, LE-1640
PC1: Vista-->W7 Ultimate/32, (D-Drive RAID-5 Array)
PC2: Lenovo Laptop, Win XP Home SP3
2xLinksys WRT54G v1.1 and 2xNetGear GS105 Gbit switch


PostPosted: Mon May 04, 2009 10:30 am 
SeaRay33 (Top Contributor)

Joined: Tue Apr 01, 2008 5:33 pm
Posts: 902
Location: Florida
Thanks: 53
Thanked: 35 times in 35 posts
JohnBick wrote:
I have not hit that point yet but had been thinking that when it got close I would probably reformat the target drive and let it start over. Would take time but would also result in a defragmented target drive!


More about this below, because I know you don't hold large (reproducible, as you say) files in high regard :) , but you reach the maximum target drive capacity a lot quicker when you are dealing with a lot of 4GB (or larger) files. So if you were me :shock: you would have to deal with "hitting that point" a lot sooner than you do with the file sizes and types of files you consider "necessary" to back up. I don't think I made that point very clearly, but I think you'll probably get my point.

Actually, it is not the fact that the files you consider "unnecessary" (take no offense... said with good will :) ) are easily reproduced, but the time you have spent getting these files to the point where you can back them up :-k For instance, I could wait until the series "The Revolution" comes back around on PBS, but I don't want to, and even if I did, I could not choose to remind myself what a problem we had getting this country together, and why it was worth it, if I am caught without the series on my WHS while waiting for it to come back around again. Well, you say, just go buy it! Well, I say, if you build a good backup system, you won't have to :wink: . As you said in another comment, WHS/MSS allows each of us to do things the way WE want to, not how someone else does. No criticism intended, BTW. 8)

JohnBick wrote:
As for the EXCLUDE ... I've always thought users would tailor the script, and this is one of the ways it can be done. I believe the EXCLUDE parameter takes a list of FILES (not folders), where each file contains the list of strings to be excluded. The description for this parameter indicates that the parameter:
Quote:
Specifies a list of files containing strings. Each string
should be in a separate line in the files. When any of the
strings match any part of the absolute path of the file to be
copied, that file will be excluded from being copied. For
example, specifying a string like \obj\ or .obj will exclude
all files underneath the directory obj or all files with the
.obj extension respectively.

The concept is fine, but you should confirm the implementation!


Thanks for the clarification, John. I think you are right. I will have to test to make sure it works as you suggested. I wasn't entirely clear on what was said about the parameters for the Exclude option. I did get that putting a string like "\FolderName\" in the list would exclude the folder called FolderName, because of the Windows/DOS convention of using "\" around a folder name in a path. It's really hard for me to write about this and say what I mean. It's probably because I haven't studied the technical language necessary to describe this well. :beerme:

JohnBick wrote:
I don't understand your setting "Task Manager" to call SyncToy, etc. Did you mean the Task Scheduler? Personally I would use the RoboCopy command line (syntax in the first post here) and have it right in the same script (CMD/BAT file) as the XCOPY commands. I prefer to have everything in a single place for ease of long-term maintenance (both of the actions in the script and of the scheduling of the script itself). But scheduling it separately would work.


Thanks for catching that mistake. Yes, I meant Task Scheduler, not Task Manager. (changed in original post).

I am not familiar with the details of RoboCopy. Do you mean RoboCopy has a synchronize command-line parameter? If so, that would be great, and I would just use that in the script as you suggested.
JohnBick wrote:
Sure, this is another level of backup above and beyond duplication, and you COULD turn off duplication for the files backed up automatically. Philosophically I see this level of backup as support for backing up things OFF-SITE. I would probably NOT be backing up recorded TV or ripped DVDs this way. It is to protect against a major home disaster. If that happened I would not be too worried about TV shows or things I can purchase. I worry about the pictures of the family, financial records, correspondence, home movies, etc. My goal would be to keep the size of the backup to what can be placed on a single disk drive for off-site (safe deposit box) storage. But the beauty of a server with the functionality of WHS/MSS is that you can tailor it to YOUR philosophy and not be forced into mine (or Cavediver's or Yakuza's or...).


Well, I guess I addressed most of what I wanted to say here previously. But I need to think about this in a home-disaster context and see if I should just recreate the stuff by purchasing it. My initial reaction is that the time invested does not allow me to think like you do. But, on the other hand, maybe I should just purchase the DVDs and then not spend any time putting them on the MSS. It's kind of a hard choice for me to make. I will just have to think about that some more. Those massive files (and I don't even deal with HD videos) are a real PITA sometimes. I guess some of us are just gluttons for punishment :crazy:
JohnBick wrote:
For testing I just specified a limited source specification, easier than setting up the Exclude.


Do you mean you just use
XCopy D:\shares\SmallFolderSize E:\Shares /D /E /V /C /I /G /H /R /K /X /Y >C:\temp\External-Shares.log
for testing? Can you clarify?

Thanks for your answers and thoughts, John.
George

_________________
It is the mark of an educated mind to be able to entertain a thought without accepting it.
Aristotle - Greek critic, philosopher, physicist, & zoologist (384 BC - 322 BC)


PostPosted: Mon May 04, 2009 11:07 am 
SeaRay33 (Top Contributor)
Has anybody ever used RichCopy? It's referenced as a replacement for the RoboCopy GUI that John references in his original post in this thread.
Here is the link to the article discussing RichCopy. It allows file "filtering", so maybe it has some sort of sync option. Anyone know?

As John has found, however, XCopy is much faster than RoboCopy, so we may be getting back to the slower option with RichCopy. Since I have not used it I don't know. The article about RichCopy is from April 2009, so the utility may be fairly new :?:

BTW, John, part of your "link string" got pushed to the next line in your original post on the RoboCopy GUI article. You can piece it together and get to the linked article, but maybe you want to edit and fix it?
George



PostPosted: Mon May 04, 2009 3:36 pm 
JohnBick (MVP/Moderator)
Yes, RoboCopy can be executed in a script (CMD/BAT file) as shown in my first post in this thread, and I do believe it can synchronize as opposed to doing a straight copy (echo).
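For reference, the synchronize behavior in RoboCopy is the /MIR switch (shorthand for /E plus /PURGE): it copies new and changed files AND deletes files from the target that no longer exist on the source. A sketch with example paths:

Code:
REM Mirror a share: deletions on the source are carried through to the backup
RoboCopy D:\shares\Photos E:\Shares\Photos /MIR /R:2 /W:5 /LOG:C:\Temp\Photos-Mirror.log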

For testing with small amounts of data you can either exclude a large amount of stuff or restrict the operation, as in your example. (Obviously later testing stages required more data!)

The line breaks in the example that you believe are in error are actually due to wrapping in your display. When I open to full screen width they are fine.



PostPosted: Mon May 04, 2009 5:10 pm 
SeaRay33 (Top Contributor)
Thanks for the info, John. I will set up some tests and let you know what I find out through this thread.
George



PostPosted: Mon May 04, 2009 5:50 pm 
JohnBick (MVP/Moderator)
I appreciate the feedback....

By the way, I had never heard of RichCopy before your post. If you try it please let us know how it compares to RoboCopy, SyncToy, XCopy, etc. Also please let us know if it has a command line interface. Thanks in advance!



PostPosted: Mon May 04, 2009 8:46 pm 
SeaRay33 (Top Contributor)
JohnBick wrote:
By the way, I had never heard of RichCopy before your post. If you try it please let us know how it compares to RoboCopy, SyncToy, XCopy, etc. Also please let us know if it has a command line interface. Thanks in advance!


John,
The last paragraph in the quoted text below addresses one of your questions: RichCopy does have a command line interface. I will get back here with more info when I know more. I have downloaded and installed the app but have not used it yet. From the "readme file" embedded in the download, it appears version 3.5 of RichCopy was already much faster than "other file copy tools", and this 4.0 version improves on that by 10%. So it looks like it might be a replacement for RoboCopy and maybe even XCopy. Like RoboCopy and SyncToy, RichCopy is provided "as is" and is not supported by Microsoft.
George
Update: The more I look at this the better it sounds. It could definitely show a performance improvement over XCopy, but it would need to be tweaked, as the default settings are not tuned for performance according to the Help File. Here is a statement about "Performance" and "When to Use" that looks promising (from the Help File):

When to use...
If you often copy many files over a low-bandwidth or high-latency network, RichCopy will help relieve your stress and accelerate your daily business. RichCopy removes the issue of low bandwidth and high latency by executing multiple tasks simultaneously and shortening the wait for ACKs from the target machine. There was a case where RichCopy was 8 times faster than XCOPY. Of course it is faster than RoboCopy, with more features.

More Good Info: It has a Purge option, so files not found in the source directory will be purged from the destination directory, meaning synchronization is possible. And for us not-so-talented script writers it has a great option: as you use the GUI, it writes out the command line syntax for you, so you can copy the finished syntax from the GUI to a script without having to write the syntax directly.

John, I am pretty excited about this utility. I hope you will look into it as well for your XCOPY script replacement.
\:D/ \:D/ \:D/

Link to Download exe file from the Microsoft Download Site.


Quote:
Microsoft has now removed those restrictions and released RichCopy as a public download though the associated license agreement (EULA) is outdated and still refers to RichCopy as an "internal tool".

RichCopy - A Better File Copying Tool
There are several reasons why you may want to use RichCopy for file copying operations on your Windows computer. First, it can copy files from multiple places (or folders) into a single location. You don’t have to run multiple copy processes for consolidating files (like music, photos, etc.) that may be spread across different directories.

And unlike the default copy process in Windows that copies files in sequence one after another, RichCopy can copy multiple files in parallel thus speeding the process.

RichCopy is especially useful when copying a large number of files or for moving big files - the tool lets you pause and resume file copy operations so if something bad happens, like a dropped network connection, you don’t have start the copy process all over again.

There’s something for geeks as well who use XCOPY on the command line. RichCopy is not just faster than XCOPY but all features available in the GUI version of RichCopy can be used from command line as well.




PostPosted: Tue May 05, 2009 3:08 pm 
JohnBick (MVP/Moderator)
SeaRay33 wrote:
John, I am pretty excited about this utility. I hope you will look into it as well for your XCOPY script replacement. \:D/ \:D/ \:D/

Link to Download exe file from the Microsoft Download Site.

Feel free to drop it into the procedure and try it!

I'll add this to my ToDo list. What I really like about XCopy is that there is nothing else to download, install and set up; from a usability standpoint that is a BIG advantage. I suspect that, at most, this will be an OPTIONAL replacement for XCopy in the procedure.



PostPosted: Wed May 06, 2009 4:42 pm 
Mach1 (2.5TB storage)

Joined: Tue Jan 15, 2008 10:44 pm
Posts: 372
Thanks: 4
Thanked: 12 times in 8 posts
Why not use the backup that's already built into the Server 2003 O.S.? I realize it's maybe limited compared to other tools, but it works and you can run it automatically as a scheduled task. I used to have my server copy a backup of the shares to each of my client computers back when my server only had one drive in it. Now that my server has 3 drives in it, I count on folder duplication instead.

I thought it would be smarter to use drive space I already had instead of purchasing more drives. There was plenty of drive space in my client PCs that would never be used anyhow. I figured between the server and 3 clients, my data was safe. It's very unlikely that all of those drives could die on me at the same time. 8)


PostPosted: Wed May 06, 2009 6:47 pm 
SeaRay33 (Top Contributor)
Mach1 wrote:
Why not use the backup that's already built into the Server 2003 O.S.? I realize it's maybe limited compared to other tools, but it works and you can run it automatically as a scheduled task. I used to have my server copy a backup of the shares to each of my client computers back when my server only had one drive in it. Now that my server has 3 drives in it, I count on folder duplication instead.

I thought it would be smarter to use drive space I already had instead of purchasing more drives. There was plenty of drive space in my client PCs that would never be used anyhow. I figured between the server and 3 clients, my data was safe. It's very unlikely that all of those drives could die on me at the same time. 8)


Not a bad idea, Mach1. I am considering all backup software/alternatives now. My thinking is still open on this :D

I ran into trouble using RichCopy. It was probably my fault, but I could not even get to first base with access to the server shares using UNC paths from one of the clients I installed it on. I figured I might need to use drive letters and would eventually need to install it on the server anyway, but it was not intuitive, and if I was having that much trouble getting it going just "out of the box" I gave up (for a while). RichCopy definitely has potential, but in the brief looking that I did I found no one to discuss it with... no forums or anything like that. My initial experience was not good, though that was probably me rather than the software. I'll check it out more another day unless I find something I like in the meantime.

Right now I am interested in software Dianebrat recommended called Second Copy. As with RichCopy, it looks good on the surface, and the GUI interface seems like it would be good. I have not downloaded it yet, but probably will when I decide to fool with backup strategies again soon. It has a 15-day free trial, so I will probably at least check it out. The cost is only $29, so it will not break the bank either.

Frankly, I was not aware of the backup software that comes with Server 2003. Don't know why I did not think of that; any server OS would come with something like it. #-o If it has a sync capability and a scheduler, it will certainly be in the running for my evaluation. If you don't mind, where is it located, or at least what is it called? Edit: I think I found it in the logical place: Administration->System Tools->Backup, right? :roll:

I don't want to hijack John's thread here, so maybe PM me, or if we need to discuss further I could start another thread, unless John wants the discussion here. He will probably let us know. John... :P

We could have a long talk too, Mach1, on server backup inside the server vs. duplication vs. backup outside the server (or tower enclosures), but that is another thread. I have lots of large files taking lots of space, and I don't want backups of the backup database or the share folders inside the server or the tower holding my data-pool drives. Those drive bays are (IMO) just too valuable to use for backups. I will do some folder duplication once I have an automated backup scheme working like I want, but I will have to develop the strategy for which folders are duplicated and which are not at that time. Right now I have duplication ON for almost every folder, but that number of folders will go down when my full backup strategy is developed and complete.

Thanks for the idea of the Server 2003 backup software.
George



PostPosted: Wed May 06, 2009 8:39 pm 
SeaRay33 (Top Contributor)
Mach1,
I did not see a "Purge" or sync-style backup option. I also had a problem saying that D:\Shares\FolderName\File was not copied, for all files in folder "FolderName"... reason... Unable to open... device is not connected. I am running the backup now and am signed on to the server as Administrator.

John, if Mach1 or someone else decides to answer, should we take this to another thread, or do you want this kind of thing discussed here? Your call...
George



PostPosted: Wed May 06, 2009 9:00 pm 
Mach1 (2.5TB storage)
I think the backup built into the Server O.S. might have all the features you are looking for. To access it, you RDP into your server. Go into "My Computer" and right click on the "D" drive, and go to "Properties". Click on the "Tools" tab, then the "Backup now"... button. Follow through with the wizard, and it will give you a bunch of options. It will let you choose:

*What to Backup
*When to Backup
*How often to backup
*Where to put the backup
*Type of backup (copy, incremental, etc)

If you pick daily backups, it ends up creating a scheduled task to do the backup. You can run through and create a fake backup job just to get the feel of it. You actually should probably do it once or twice, because there are a LOT of options to choose from when you go through the wizard. If you want to get rid of the scheduled backup, go to START > PROGRAMS > ACCESSORIES > SYSTEM TOOLS > SCHEDULED TASKS, then right-click and delete the task.

This backup creates a single large zipped-up backup file, which may or may not be useful to you. I've never tried to restore one of these backups before, but I'm pretty sure you would need to run through the same wizard to do a restore from the backup.
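For anyone who wants to script it rather than use the wizard, the same engine (NTBackup) also has a command line, so in principle it could slot into a CMD/BAT file like the XCOPY/ROBOCOPY approaches in this thread. A sketch only -- the job name, source path, and target file are examples, and I have not verified this on WHS:

Code:
REM Back up the shares to a single .bkf file on the external drive
NTBackup backup D:\shares /J "Shares to E" /F "E:\Backups\Shares.bkf"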


PostPosted: Thu May 07, 2009 7:19 pm 
JohnBick (MVP/Moderator)
Mach1: You ask why not use the built-in backup. The answer is simple, for me: the backup (or synchronization) tool MUST have a full command line interface. The goal I am shooting for is an encrypted backup that can be taken off-site without fear of sensitive data being used by others. To do this, the automation will be part of a script (CMD/BAT file) that also opens and closes an encrypted disk. I don't want to leave files open longer than necessary, which pretty much rules out having separate scheduled tasks to open the encrypted volume, then create the backups, and then close the volume. The sequential nature of the steps must be enforced.
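To illustrate the sequencing (a sketch, NOT a finished implementation): with an encryption tool that has a command line, TrueCrypt for example, the script can enforce the open/copy/close order directly. The volume path, drive letter, and share name below are examples only:

Code:
REM 1. Mount the encrypted volume as X: (prompts for the password)
TrueCrypt /v E:\Vault.tc /l X /q
REM 2. Run the backups against the mounted volume
RoboCopy D:\shares\Documents X:\Documents /MIR /LOG:C:\Temp\Vault-Documents.log
REM 3. Dismount so the volume is never left open longer than necessary
TrueCrypt /d X /q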

There is another reason as well. To ensure data integrity of the client backup database the Backup Service must be turned off before the files are accessed and turned on again when the operation is complete. This introduces two more steps in the serial process, making the timing even harder when scheduling at least 6 and as many as 9 separate tasks.

And what happens when you get it all timed correctly and then drop in a few hundred GB of new data? The timing is all off again.

And if you try to pass this on to others to use, it is a nightmare to explain.

Under carefully controlled circumstances I am sure that the built-in backup solution can be made to work. I do NOT believe it is a general solution.

I don't mind some discussion of alternatives in this thread. Evaluating alternatives is what helps us all learn how to do things better. Discussion of the rationale above would be fair game here, but if this is going to get into a discussion of how to set up the built-in backup tool, I would prefer that be in a new thread -- and definitely cross-reference it with a link here, please.



PostPosted: Sun May 10, 2009 3:16 pm 
SeaRay33 (Top Contributor)
I have been thinking about a Disaster Recovery Server Backup Strategy to “recover my data” in case my MSS and its connected “data pool” eSATA tower were lost. Most of the thinking has revolved around the use of JohnBick’s ROBOCOPY and XCOPY scripts and how they could be adapted to my needs. There is the distinct possibility that my needs may be similar to yours, so I wanted to openly discuss my decisions and to ask you to either validate or criticize my strategy. I seek primarily your constructive :D criticisms. Your validation will confirm I am on the right track, but your criticisms could lead to improving the strategy. My initial post is placed here since we will be discussing modifications to John’s script; I have permission from the thread owner to start the discussion here. If the discussion moves outside that area, I will move further discussions to a new thread.

My work in recent days has been focused on adapting John’s script to my needs. My needs differ from John’s because I need to synchronize some of my Shared Folders that contain a substantial number of large DVR-MS and WTV files, and I am concerned about the time it will take each day for the script to back up the volume of data involved. Since some of my server share folders contain lots of large files, I need to back these folders up to more than one target drive, even if one of those target drives is a 1.5 TB (or larger) external drive. These are the main reasons John’s script needs some modification to meet my needs.

Other (but not as critical) considerations had to do with speed as well, but also took into consideration my interest in automating the backup of the WHS Backup Database (BDBB, a term “coined” by Alex/Yakuza) and the Server Shared Folders. Unfortunately, in the first implementation of my strategy, I have been unable to “completely” meet the “fully automated” objective. However, full automation is easily achieved with the addition of some hardware... more about that later.

I considered several backup products like RichCopy, SyncToy, Second Copy, and the Server 2003 OS backup software to name a few. For one reason or another I finally settled on using XCOPY and ROBOCOPY. We can discuss this decision in another thread if you like. It’s conceivable I may modify my Disaster Recovery Server Backup Plan to include some of this software in the future. But, at least temporarily, I am implementing my Backup Strategy with XCOPY and ROBOCOPY.

I chose XCOPY (mainly for its speed over ROBOCOPY, as discussed in JohnBick’s post above) for some backups I will likely move from a manual weekly process to a daily command-file-driven script in the future. I say “in the future” because I do not presently use XCOPY in the script shown below, but I have shown how it would be used. For now the XCOPY command lines are “remarked out” with REM commands. When my strategy moves to “complete automation”, the XCOPY commands will become relevant. I show them now so we can discuss them if you like.

I also considered backup target hardware options, but since I already had enough external drives to get my plan started with the hardware I presently own, I chose to go with three or maybe eventually four external drives. In my adapted script shown below I only use two of these drives (E: and F: ), but one other drive is presently being used (G: ) outside this script. In the current implementation of my strategy I have to manually run a PP1-style Server Backup, which uses the G: drive. And as my backup volume grows I may eventually use a fourth drive (H: ), either as part of this script or outside it. I have these four external drives now. Eventually I want to go to a single 4-bay RAID 5 enclosure with 2TB drives. This enclosure will be seen as a single target drive, E: . When the enclosure is available the script will be somewhat simplified. The presence of a single large RAID 5 target drive will allow me to “completely” automate the backup strategy and to execute it on a daily instead of a weekly basis. Right now, my strategy requires a manual weekly step that is simple to run yet keeps my present plan from being fully automated.
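The multi-target split described above can be expressed as one mirror command per share, pointed at the appropriate target drive. The share names below are examples only; /MIR carries deletions through, so these lines synchronize rather than just copy:

Code:
REM Large media shares go to the bigger drive F:
RoboCopy "D:\shares\Recorded TV" "F:\Shares\Recorded TV" /MIR /LOG:C:\Temp\F-RecTV.log
RoboCopy D:\shares\Videos F:\Shares\Videos /MIR /LOG:C:\Temp\F-Videos.log
REM Smaller, irreplaceable shares go to E:
RoboCopy D:\shares\Photos E:\Shares\Photos /MIR /LOG:C:\Temp\E-Photos.log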

Before I show the script here is a copy of what my Shared Folders Tab looks like. You can see Folder Name, Sizes and duplication status here. When I have the Server Backups in place I plan to turn duplication OFF on many of these folders and free up some more drive space inside my server for data pool files.

Attachment:
ScreenHunter_07 May. 03 21.30.gif
ScreenHunter_07 May. 03 21.30.gif [ 57.96 KiB | Viewed 14255 times ]


A copy of my script is shown below for your review. At this point I have not run the script on my server, but I will be doing that sometime today. I have some questions that could be answered by trial and error, but I will ask some specific questions in another post here if I have trouble getting the script to execute properly. Hopefully, if that does happen, I can ask any of you who may be reading this post (including JohnBick) for a few pointers on getting the script to work as I planned it. I have tried to “comment” the script with notes, but please feel free to ask any questions you may have about it if I have not made my intentions clear.

REM ***begin script***
REM Copy WHS Backup Database and Shared Folders to an unmanaged disk drive
REM This script assumes the targets are disks E: and F: -- modify the script if that is to be changed
REM The logs are placed in the C:\Temp directory. That can also be changed (22 places)
REM A Critical Error will be present on the server while the Backup Database is being backed up, because the Backup Service is shut down.
REM - This may be safely ignored, as the service is restarted at the completion of the database backup



REM Note: I have attempted to "boldface" notes and changes/additions I made to JohnBick's script.
REM Any information not in bold is part of JohnBick's original XCOPY or ROBOCOPY scripts as shown above in this thread.




REM Note: Throughout this script I have "remarked out" lines with "REM***" to show lines of the script
REM that will or will not be included depending on the test conditions. When this script goes into "production",
REM the "REM***" prefixes will be removed from those lines so they execute.


REM Backup the WHS BACKUP DATABASE

REM Stop the Backup Service
REM***NET STOP PDL
REM***NET STOP WHSBackup

REM Copy the "backup" folder

REM***XCopy D:\folders\{00008086-058D-4C89-AB57-A7F909A47AB4} F:\Backups\{00008086-058D-4C89-AB57-A7F909A47AB4} /D /E /V /C /I /G /H /R /K /X /Y >C:\Temp\External-Backups.log


REM Start the Backup Service
REM***NET START WHSBackup
REM***NET START PDL

REM Rename the log files to save last NINE
REM***erase C:\Temp\External-Backups-1.log
REM***rename C:\Temp\External-Backups-2.log External-Backups-1.log
REM***rename C:\Temp\External-Backups-3.log External-Backups-2.log
REM***rename C:\Temp\External-Backups-4.log External-Backups-3.log
REM***rename C:\Temp\External-Backups-5.log External-Backups-4.log
REM***rename C:\Temp\External-Backups-6.log External-Backups-5.log
REM***rename C:\Temp\External-Backups-7.log External-Backups-6.log
REM***rename C:\Temp\External-Backups-8.log External-Backups-7.log
REM***rename C:\Temp\External-Backups-9.log External-Backups-8.log
REM***rename C:\Temp\External-Backups.log External-Backups-9.log

REM Backup Database backed up, backup service has been restarted, and logs saved


REM Windows Home Server backup of the "selected" SHARES folders that "require synchronization"

REM Use RoboCopy to Copy the “synchronized shares” folders

REM Note: I am using RoboCopy here instead of XCOPY because of the Mirroring parameter it offers.
REM I have also changed the LOG parameter to LOG+ in the second (and any following, if one needs to synchronize more than two folders) RoboCopy command
REM to append the status to the log file rather than overwrite it. In this script I have two RoboCopy commands because there
REM are two D:\shares subfolders that require synchronization. These subfolders are mirrored to target drive E:


REM Copy the shares folder in Mirror Mode, Restartable, All attributes, No progress, 4 retries, 5 second wait between retries, Log to file in c:\temp, Console output
REM***c:\Robocopy "D:\shares\Recorded TV" "E:\Shares\Recorded TV" /MIR /Z /COPYALL /NP /R:4 /W:5 /LOG:"C:\Temp\External-Shares.log" /TEE
c:\Robocopy "D:\shares\TV Collections-Permanent" "E:\Shares\TV Collections-Permanent" /MIR /Z /COPYALL /NP /R:4 /W:5 /LOG+:"C:\Temp\External-Shares.log" /TEE


REM Note: Ignore the next line for now. It is in this script for potential future use, should I decide to replace the PP1 weekly backups with daily automated backups using this script.
REM XCopy D:\shares E:\Shares /EXCLUDE:ExcludedShareFolders.txt /D /E /V /C /I /G /H /R /K /X /Y >C:\Temp\External-Shares.log
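
REM For reference, the exclusion file named above would simply contain one match string per line;
REM per the XCOPY documentation quoted earlier in this thread, any file whose absolute path contains
REM one of those strings is skipped. A sketch of what my ExcludedShareFolders.txt might hold
REM (the exact contents are an assumption, not yet tested) is to list the two shares already
REM handled by the RoboCopy commands above:
REM
REM     \Recorded TV\
REM     \TV Collections-Permanent\
REM
REM Since the strings match any part of the absolute path, the surrounding backslashes keep a
REM string from accidentally matching a file name rather than a folder.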


REM Rename the log files to save last NINE

REM***erase C:\Temp\External-Shares-1.log
REM***rename C:\Temp\External-Shares-2.log External-Shares-1.log
REM***rename C:\Temp\External-Shares-3.log External-Shares-2.log
REM***rename C:\Temp\External-Shares-4.log External-Shares-3.log
REM***rename C:\Temp\External-Shares-5.log External-Shares-4.log
REM***rename C:\Temp\External-Shares-6.log External-Shares-5.log
REM***rename C:\Temp\External-Shares-7.log External-Shares-6.log
REM***rename C:\Temp\External-Shares-8.log External-Shares-7.log
REM***rename C:\Temp\External-Shares-9.log External-Shares-8.log
REM***rename C:\Temp\External-Shares.log External-Shares-9.log

REM Shared Folders backed up and logs saved
REM ***end script***

_________________
It is the mark of an educated mind to be able to entertain a thought without accepting it.
Aristotle - Greek critic, philosopher, physicist, & zoologist (384 BC - 322 BC)


Last edited by SeaRay33 on Mon May 11, 2009 8:03 am, edited 1 time in total.

PostPosted: Sun May 10, 2009 6:46 pm 
Top Contributor

Joined: Tue Apr 01, 2008 5:33 pm
Posts: 902
Location: Florida
Thanks: 53
Thanked: 35 times in 35 posts
John or anyone who is familiar with Robocopy command line parameters:
I am unable to get the RoboCopy command to work if it has the /MIR, /COPYALL, /LOG+:"C:\External-Shares.log" or /TEE parameters included. If only the parameters /Z /NP /R:4 /W:5 are used, the copy goes as expected. The script will work with a single file in D:\shares\TV Collections-Permanent when the E:\Shares\TV Collections-Permanent folder is empty. If the second directory already contains the file I am trying to copy over, the file will not copy, no matter what parameters are used. However, this last behavior is as expected, since I would not want to copy a file over to the backup that has already been copied.

Guess I am stuck until I figure this out. I have read what these parameters do and I don't see why they are stopping RoboCopy from working. When I get the copy working with one file, I will move to testing with many files and measure how long it takes (in MB/sec).
George
P.S. John, sorry to bug you. If you get a chance, let me know. If not, and I can't figure this out and no one else who knows RoboCopy syntax weighs in, we will just wait until you are back home and all the festivities are over. :D
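
One way to narrow this down (a sketch, not a confirmed fix) is to add the suspect switches back one at a time, using RoboCopy's /L switch for a list-only dry run so nothing is actually copied while testing. Something like:

REM Dry run: /L lists what WOULD be copied without copying anything.
REM Add /MIR, /COPYALL, /LOG+: and /TEE back one at a time to find the culprit.
c:\Robocopy "D:\shares\TV Collections-Permanent" "E:\Shares\TV Collections-Permanent" /L /Z /NP /R:4 /W:5

Also worth checking: /COPYALL copies NTFS security, owner, and auditing information in addition to the data, so it can fail if the target drive is not formatted NTFS.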


