
Moving large amounts of data within the network

Adam10541
(@adam10541)
Posts: 550
Honorable Member
Topic starter
 

I'm wondering what software solutions people use within their own network/lab to move around large data sets?

At the moment I'm using a simple copy&paste type method to move large amounts of data, and while relatively reliable it means I'm doubling up and sometimes have to redo the process if one of the computers crashes or the USB caddy drops out.

I'm looking at backup solutions, FTP software or any software really that operates sort of like Tera Copy, but that is designed for network data transfer.

Speed and reliability are key, and it must run on Windows 7 and be GUI-based. I have neither the time nor the inclination to mess around with Linux options, as I need it to work and be simple for non-techs to use as well.

The software would ideally meet the following:
MUST run on Windows 7
Set up a shared location on the server (will be a RAID 5 array)
Have the ability to send files to that network share either via some GUI or right click menu option on the client computers
Have the ability to monitor the progress
Have the ability to either auto or manually retry any failed files
Have the ability to report why a file transfer may have failed (loss of connection, long file name etc)
Have the ability to pause/resume a process
Have the ability to continue a process in case of unexpected power loss or crash

Any thoughts?

 
Posted : 18/06/2014 8:55 am
(@mscotgrove)
Posts: 938
Prominent Member
 

Some of what you want is standard in Windows 8, e.g. pausing a copy when you have multiple streams copying, plus a simple performance chart of the copy.

I often copy disk images (500GB to 2TB) and they just take time. I will be interested to hear if there are better programs available.

The most important aspect when copying large amounts of data is to make sure you do not 'thrash' the drive by trying to read or write to different areas of the disk at the same time. With multiple copies to/from a single drive, performance is very, very poor. (SSDs will probably cope better, but will not have 3 or 4 TB capacity without being RAIDed.)
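The point above can be sketched in a few lines of Python: a minimal, hypothetical helper (not any particular tool's implementation) that queues copy jobs and runs them through a single worker, so only one sequential read/write stream touches the drive at a time.

```python
# Serialize copy jobs through a single worker so only one stream reads or
# writes at a time; concurrent copies on one spinning drive cause head
# thrashing as the heads seek between competing streams.
import queue
import shutil
import threading

def serialized_copy(jobs):
    """Run (src, dst) copy jobs strictly one at a time."""
    q = queue.Queue()
    for job in jobs:
        q.put(job)

    def worker():
        while True:
            try:
                src, dst = q.get_nowait()
            except queue.Empty:
                return
            # One sequential read/write at a time.
            shutil.copy2(src, dst)

    t = threading.Thread(target=worker)
    t.start()
    t.join()
```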

 
Posted : 18/06/2014 11:55 am
Adam10541
(@adam10541)
Posts: 550
Honorable Member
Topic starter
 

I've been talking to a close friend who works in IT, and he had the amusing comment that if I find the "holy grail" of data transfer software I should let him know, as he'd be interested too.

There appear to be a few Linux apps out there dedicated to this, so I may have naively assumed that there would be something like this for Windows.

The best I've found so far (apart from good old Windows) is Comodo backup which allows what they call a 'simple file copy' mode which basically just copies the files to the location of your choice.

I've done some testing on small data sets to confirm it works as expected, so the next test I guess will be a real-world comparison on a large data set to see if it's actually saving me any time... the only problem is finding time to do that 😯

I may have to lower my expectations. I guess I'm just not overly keen on Windows, as if something fails or I lose power it's difficult to pin down exactly why a copy failed. Not to mention that if I select 1TB of data to copy, Windows sits there for a good 20 minutes or longer just 'finding files' before it even starts the copy process; this causes a red film to wash down my eyes and exposes episodes of wild rage... not healthy. I'm sure there is a good reason why Windows needs to 'discover' the files before it transfers them, but it still bugs the hell out of me.

 
Posted : 18/06/2014 12:56 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

I believe that it also depends on the "nature" of the "data set".

If I recall correctly from some tests made some time ago, the same app that dealt fine (fast) with large files had issues (was slower) with zillions of small files, and vice versa.

But Robocopy (MS) is usually fine, and there are GUIs for it.
Or RichCopy (also MS):
http://technet.microsoft.com/en-us/magazine/2009.04.utilityspotlight.aspx

There is this tool Qcopy, which was "designed" for transfer over a network (but I never tested it):
http://www.flit.com.au/web/content/files/qcopy

What I personally use "normally" for "backup" is an old piece of software, Samedir, that has proved very handy and reliable over the years, but it may not completely fit your bill, and I cannot say how (and if) it will work under Windows 7:
https://web.archive.org/web/20071219234134/http://samedir.sbn.bz/base/view/document/1102950594
You will have to test it, as the documentation is a bit scarce and it is not the easiest to set up the way one wants, but it has quite a few nice features that I didn't find in other programs.

jaclaz

 
Posted : 18/06/2014 3:17 pm
Bulldawg
(@bulldawg)
Posts: 190
Estimable Member
 

Your requirements are pretty lofty. Let us know if you find such a program.

I just use robocopy. It's built into Windows 7, and any examiner should be able to learn to use it with little trouble. It's slow with many very small files, just like I expect any other tool to be, but with the 2+ GB image files I'm usually transferring, it runs at near wire speed.

I used to also use NUIX Evidence Mover, but when it started crashing when copying the root of a drive, I stopped. Evidence Mover is slower than other tools because it's copying, then hashing the source, then hashing the destination.

A typical robocopy command:
robocopy <source dir> <destination dir> /E /LOG:<log file location>
Note: if you're copying from the root of an NTFS volume, use /XD to exclude the Recycle Bin, System Volume Information, and any other system directories. Windows doesn't like copying those files to a new location.

There's no pause and resume.
It will automatically retry on failures up to 1,000,000 times by default (can be modified with /R and /W).
Can run from any location, but since it's there by default in Windows 7, why bother.
No GUI.
It will output what file it's copying currently, which means you can sometimes tell how much time is left, but there's no explicit monitoring. Doing explicit monitoring would slow things down.
Log will show failed files.
Will skip files that already exist, which is pretty much the same as a resume process.
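The skip-existing "resume" and automatic-retry behaviour described above can be sketched in Python. This is a rough, hypothetical approximation of what robocopy does (here using matching file size as the "already copied" check, and a failure list standing in for the /LOG output), not robocopy's actual logic:

```python
# Sketch of robocopy-style behaviour: walk a tree, skip destination files
# that already exist with the same size (cheap "resume"), retry each
# failed copy a few times, and return the files that never made it.
import os
import shutil
import time

def copy_tree_resumable(src_root, dst_root, retries=3, wait=0.1):
    failed = []
    for dirpath, _dirnames, filenames in os.walk(src_root):
        rel = os.path.relpath(dirpath, src_root)
        dst_dir = os.path.join(dst_root, rel)
        os.makedirs(dst_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dst_dir, name)
            # Skip files already copied, like robocopy's default behaviour.
            if os.path.exists(dst) and \
                    os.path.getsize(dst) == os.path.getsize(src):
                continue
            for _attempt in range(retries):
                try:
                    shutil.copy2(src, dst)
                    break
                except OSError:
                    time.sleep(wait)  # analogous to robocopy's /W wait
            else:
                failed.append(src)  # would go to the /LOG file
    return failed
```

Re-running the same call after a crash simply skips everything already copied and picks up where it left off.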

 
Posted : 18/06/2014 5:47 pm
(@inspectaneck)
Posts: 57
Trusted Member
 

I have used TeraCopy, which performs a CRC check during transfer. I also like that if certain files do not transfer successfully, you can easily restart those files. TeraCopy can be set to replace the standard Windows copy, so that it is used each time. I have had instances where TeraCopy has caught a CRC error. When moving evidence files, it may still be prudent to verify the hash after copying.
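That post-copy hash verification is easy to script. A minimal sketch (hypothetical helper names, SHA-256 chosen for illustration; an evidence workflow would compare against the acquisition hash too):

```python
# Copy a file, then verify the copy by hashing both source and
# destination and comparing the digests.
import hashlib
import shutil

def sha256_of(path, chunk=1 << 20):
    """Stream a file through SHA-256 in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def copy_and_verify(src, dst):
    shutil.copy2(src, dst)
    if sha256_of(src) != sha256_of(dst):
        raise IOError("hash mismatch after copy: %s" % dst)
    return True
```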

I have also used Richcopy (the GUI for Robocopy). There are plenty more options, so I only use this in certain cases and when I wish to select multiple files or directories in different locations.

Both run fine on Windows 7.

 
Posted : 18/06/2014 6:33 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

I have also used Richcopy (the GUI for Robocopy).

Just for the record, Richcopy is NOT a GUI for Robocopy.

It is a separate (though similar in functions/scope) program that, BTW, also has its own set of command-line options.

A (IMHO good) GUI for Robocopy is CopyriteXP:
http://copyrite.dyndns.biz/

jaclaz

 
Posted : 18/06/2014 7:43 pm
(@inspectaneck)
Posts: 57
Trusted Member
 

Thanks for the clarification, Jaclaz.

 
Posted : 19/06/2014 1:05 am
Adam10541
(@adam10541)
Posts: 550
Honorable Member
Topic starter
 

Thanks for the input all, Qcopy looks worth a try as well, nice simple looking interface and has some of the features I'm looking for.

Something else I've been considering to counter the 'millions of small files' problem that can bring some copy processes to a crawl, is using 7zip (or the like) to create archives and then transfer the much larger archives which should in theory be much quicker.

Obviously the kicker will be how much time it takes to create those archives. Even the 'store' option, which I think doesn't use any compression, will still take some time.
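The 'store' idea is straightforward to try with Python's standard library before committing to a tool: pack the tree into one uncompressed zip, so the network sees a single large sequential stream instead of millions of tiny transfers. Function name and paths here are hypothetical.

```python
# Bundle a directory tree into one uncompressed ("store") archive.
# ZIP_STORED skips compression entirely, matching 7-Zip's store mode.
import os
import zipfile

def bundle_store(src_root, archive_path):
    with zipfile.ZipFile(archive_path, "w",
                         compression=zipfile.ZIP_STORED) as zf:
        for dirpath, _dirnames, filenames in os.walk(src_root):
            for name in filenames:
                full = os.path.join(dirpath, name)
                # Store paths relative to the tree root.
                zf.write(full, os.path.relpath(full, src_root))
    return archive_path
```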

Looks like some testing will be in order if things ever get quiet enough.

 
Posted : 19/06/2014 9:25 am
(@forveux)
Posts: 20
Eminent Member
 

… It's slow with many very small files, just like I expect any other tool to be, but with the 2+ GB image files I'm usually transferring, it runs at near wire speed.

Is it slow because, with lots of little files, it has to run a CRC on each one to verify the integrity of the file? Since many smaller files could potentially mean more fragmentation?

Something else I've been considering to counter the 'millions of small files' problem that can bring some copy processes to a crawl, is using 7zip (or the like) to create archives and then transfer the much larger archives which should in theory be much quicker.

As with the question I've posed in response to the quote above, do you reckon a disk defrag would help bring the smaller files into their own sequential/contiguous address space?

 
Posted : 19/06/2014 10:46 am