How to fix all of those duplicate files?

  • Question

  • I am consolidating a ton of files from many years past. I was backing up to tape, then to small drives, then to a dedicated 160 GB drive, and that spilled over to discarded drives converted to USB drives. Now I am trying to get everything onto the server for backup. The obvious problem is that some of these files are duplicated two, three, and four times... and WHS knows it and is barking about it. The WHS has a 1 TB drive in it, which just was not enough for my needs, and it also barked about the dupe files.

    I have purchased a second 1 TB drive and put it in my workhorse PC for break-in and to help resolve the issue. (It will end up in the server once the issue has been resolved.)

    I copied all of my files on the server back to the new drive because the server was barking about duplicate files, then deleted them from the server to make it happy again.

    Does anyone have an idea of a utility, or a function within WHS, to help weed out all of these dupe files? I am sure that some of them are duplicated two or three times, wasting a lot of space as well as pissing the server off, because it does not like dupe files.

    Tuesday, May 26, 2009 12:19 AM

All replies

  • Let me clarify one thing. There is enough room on the single 1 TB drive in the server to handle all of the duplicated files; that is not a problem, since I still have 400-plus gigs left over after the duplication. It is the barking, etc. that is the issue. However, when I said 1 TB is not enough, I was referring to the backups from all of the home PCs, which would squeeze the space, so there were a couple of PCs that would not get backed up until I added another 1 TB drive... so I blocked the backups from them.
    Tuesday, May 26, 2009 12:33 AM
  • I am consolidating a ton of files from many years past. I was backing up to tape, then to small drives, then to a dedicated 160 GB drive, and that spilled over to discarded drives converted to USB drives. Now I am trying to get everything onto the server for backup. The obvious problem is that some of these files are duplicated two, three, and four times... and WHS knows it and is barking about it. The WHS has a 1 TB drive in it, which just was not enough for my needs, and it also barked about the dupe files.

    WHS does not care about duplicate files (unless they are in the same folder, and that's true of any OS, not just WHS). I can store the exact same file in 2, 3, or 100 different shares/folders and it will work just fine. Exactly what error message are you seeing?

    Tuesday, May 26, 2009 1:05 AM
    Moderator
  • Sorry, can't fully remember... it said something about writing over files, and that's all I can recall at this time. That was a remark in response to a health status warning. I have since found a recommendation from a Max PC "guru", a guy that I trust, for a dupe file finder that works on PCs; they also have one that works on server folders, run from a remote PC. If this works out, I would be willing to bet that I shed 100-150 gigs of dupe files.

    If you go back to my original post on this: this is stuff that goes back to when I could back up my whole PC on half a dozen floppies...
    • Marked as answer by paschal123 Wednesday, May 27, 2009 5:54 PM
    • Unmarked as answer by paschal123 Wednesday, May 27, 2009 5:54 PM
    Wednesday, May 27, 2009 4:54 PM
  • How did you copy the data to the server? Did you use the UNC path, or did you copy directly to D:\shares (which is predestined to give you errors)?
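    For illustration, copying via the UNC path means writing to the share as the clients see it (e.g. \\SERVER\Photos) rather than to D:\shares on the server itself, so Drive Extender can decide where the files actually land. Below is a minimal sketch of such a copy, assuming Python is available on the client; the server and share names are placeholders, not anything specified in this thread.

        import shutil

        # Copy an old backup folder into a WHS share via its UNC path.
        # Server and share names below are placeholders; substitute your own.
        # shutil.copytree requires that the destination folder does not exist yet.
        source = r"C:\OldBackups\Photos2005"
        destination = r"\\SERVER\Photos\2005"
        shutil.copytree(source, destination)
        print("Copied", source, "to", destination)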
    Best greetings from Germany
    Olaf
    Wednesday, May 27, 2009 10:02 PM
    Moderator
  • Well, I'm not sure I totally understand the problem, but if you have duplicate files, I've been really pleased with the following program so far. I installed it on my WHS and it has helped me find and delete these dupes:
    http://www.ashisoft.com/df.htm
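    For anyone who would rather script the same idea than install a tool, the core of most duplicate finders is simply hashing file contents and grouping files whose hashes match. Here is a minimal sketch along those lines, assuming Python is available; the share path is a placeholder, not one mentioned in this thread.

        import hashlib
        import os
        from collections import defaultdict

        def find_duplicates(root):
            """Group files under 'root' by SHA-256 hash; identical hashes mean identical content."""
            by_hash = defaultdict(list)
            for dirpath, _dirnames, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    digest = hashlib.sha256()
                    try:
                        with open(path, "rb") as f:
                            # Read in 1 MB chunks so large files don't exhaust memory.
                            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                                digest.update(chunk)
                    except OSError:
                        continue  # skip files we can't read (locked, permissions, etc.)
                    by_hash[digest.hexdigest()].append(path)
            # Keep only hashes that map to more than one file.
            return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

        if __name__ == "__main__":
            # Placeholder UNC path to a WHS share; point this at your own share or folder.
            for digest, paths in find_duplicates(r"\\SERVER\Photos").items():
                print(digest)
                for p in paths:
                    print("   ", p)

    Run against a share (or a local folder), it prints each group of identical files so you can decide by hand which copies to delete.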

    http://artisconsulting.com/Blog/GregGalloway
    Thursday, May 28, 2009 12:36 PM