At our corporate office we have about 25 computers. We have to archive email to local machines because we have no better way to manage space on Exchange at this time. The .pst files don't really get backed up, and I want to put a reliable, automated policy in place so they do. I can write a robocopy script to back them up on a schedule (no problem), but that would mean a full backup of their .pst files every time, and we have a LOT of .pst data at this office. It could be a strain on the network, even though I would make sure it runs after hours.
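For what it's worth, robocopy only recopies files whose size or timestamp has changed, so a nightly job is effectively incremental at the file level; the catch is that Outlook touches the .pst daily, so each changed file is still transferred in full. A minimal sketch, assuming hypothetical local and server paths (adjust to your environment), and assuming Outlook is closed so the .pst isn't locked:

```bat
@echo off
rem Hypothetical paths - substitute your own share and folder names.
rem Unchanged PSTs are skipped automatically (robocopy compares size/timestamp).
rem /R:2 /W:5  retry a locked file twice, waiting 5 seconds, instead of hanging.
rem /LOG+      append to a log file so you can audit that the job actually ran.
robocopy "C:\Users\%USERNAME%\Documents\Outlook Files" ^
         "\\fileserver\pst-backup\%COMPUTERNAME%" ^
         *.pst /R:2 /W:5 /LOG+:"C:\Logs\pst-backup.log"
```

Because a .pst is open (and locked) whenever Outlook is running, an after-hours window is pretty much mandatory anyway.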

How do you guys prefer to manage backing up large files to your local file server? Just curious whether you've had a similar problem to contemplate and what you would do.






You're stuck between a rock and a hard place. PSTs can be troublesome, and when things go wrong, users will normally give you grief.

Copy the data to the server every night. Apparently Remote Differential Compression doesn't work with .pst files, which is unfortunate, so it's a full file copy each time. Let your backup solution handle retention; ideally you'll be doing a keep-forever monthly, a weekly full, and daily differentials.
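If you go the nightly-copy route, the job can be registered with the built-in schtasks tool; the task name, script path, and start time below are placeholders:

```bat
rem Run a (hypothetical) backup script nightly at 11 PM under the SYSTEM account,
rem after users have gone home and Outlook has released its lock on the .pst.
schtasks /Create /TN "PST Nightly Backup" /TR "C:\Scripts\pst-backup.bat" ^
         /SC DAILY /ST 23:00 /RU SYSTEM
```

Deploying the task via Group Policy (Computer Configuration > Preferences > Scheduled Tasks) saves you from touching all 25 machines by hand.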

With the above in place, you should be able to restore a .pst from any of those retention points if something goes wrong.

The other solution is to buy software that will look after the .pst files for you.

Err, good luck.
Answered 02/09/2016 by: rileyz
Red Belt
