For what it's worth, I just finished moving close to 24TB of data consisting of hundreds of millions of files… Use rsync.
I agree about rsync – at least using the 'native' protocol (rsync://), but not the fallback to SSH.
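For the native protocol you need an rsync daemon on one end; a minimal sketch of what I mean (the module name 'data' and the paths are made up):

    # /etc/rsyncd.conf on the server
    [data]
        path = /srv/data
        read only = yes

    # start the daemon on the server
    rsync --daemon

    # pull from a client over the native protocol (no SSH encryption overhead)
    rsync -av rsync://server/data/ /local/dst/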
Some attention must be paid to bandwidth capping or any other LAN-specific methods of traffic priority. Running multi-terabyte transfers on a LAN is a good way of upsetting other users who rely on LAN bandwidth. (Never do it on a LAN over which your salary payment orders are sent to the bank! … rsync does do this by means of the --bwlimit option. For other solutions on Unix, trickle may solve the problem.)
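For the record, both amount to one line; the limits here are arbitrary examples:

    # cap rsync itself at roughly 20 MB/s (--bwlimit is in KB/s)
    rsync -av --bwlimit=20000 /src/ server:/dst/

    # or wrap any program with trickle (-u upload, -d download, both in KB/s)
    trickle -u 20000 -d 20000 scp bigfile server:/dst/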
However … over a normal LAN (i.e. no multiple paths between src and dst), it is often faster to do transfers with simpler protocols than TCP. While TCP guarantees (in theory) that all sent packets arrive, and in the correct order, that kind of problem rarely appears on that kind of LAN. Packet loss must still be handled, though.
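udpcast is one existing tool in that spirit: it streams a file over UDP (optionally multicast to many receivers at once) and handles lost packets itself, so you don't pay the full TCP overhead. A sketch, with made-up paths:

    # start the receiver(s) first
    udp-receiver --file /dst/backup.img

    # then the sender; with multicast, one pass feeds every receiver
    udp-sender --file /src/backup.img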
I've tested Bittorrent for smaller jobs – up to a TB or so. It takes time to set up, but once everything is in place, it can be left pretty much alone. And there are clients for just about any platform. (I see Bittorrent now has an application called 'sync' that uses the same P2P protocol(?), but takes less setup, intended for syncing data – it would be interesting to test how well that works in comparison.)
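For anyone who wants to try the torrent route, the setup can be as small as this (the tracker URL and paths are placeholders):

    # create the torrent; any private LAN tracker will do
    mktorrent -a http://tracker.example.lan:6969/announce -o dataset.torrent /src/dataset/

    # seed on the source box (-w points at the directory holding the data)
    transmission-cli -w /src dataset.torrent

    # fetch on every other box
    transmission-cli -w /dst dataset.torrent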
RichCopy seems to perform well, but it's quite old …
And? ….
What is the point, do you want to call it "legacy"? 😯
http://homepage.ntlworld.com/jonathan.deboynepollard/FGA/legacy-is-not-a-pejorative.html
dd is also old, yet it works fine, and rsync (which also works fine) is not "exactly a kid", as it was born in 1996.
jaclaz
+1 for RichCopy if you are using Windows 7. Copy and paste in Windows 8 rocks…..
For what it's worth, it sounds like more of a process issue here. If large data transfers are slowing your workflow, try to design the problem out of your workflow rather than trying to find a better tool.
With ever-increasing data sets, the best scenario is to write once, i.e. image to the network and leave it there. Work on and extract from the network copy. If you need second copies, they can be made in slow time without affecting production (see the sketch below).
Easier said than done on limited budgets, I know…..
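To make that concrete: the off-peak second copy can be a single cron line; the paths and user here are made up, and --bwlimit keeps it polite if the job overruns into working hours:

    # /etc/cron.d/second-copy: mirror the master at 02:00, capped at ~10 MB/s
    0 2 * * * backup rsync -a --delete --bwlimit=10000 /srv/master/ /srv/second-copy/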