Hello,
I recently had to acquire a user's network share and mailbox without having physical access to the server (shared hosting/VPN, plus it was important to have an immediate copy of the available data). However, I kept having issues with FTK Imager and getmail getting stuck due to an unreliable network connection, so in the end I had to use a virtual machine with robocopy and Thunderbird.
Any idea if there's some other software that wouldn't get stuck so easily in similar situations?
… I kept having issues with FTK Imager and getmail getting stuck due to an unreliable network connection, so in the end I had to use a virtual machine with robocopy and Thunderbird.
I don't get any particular 'aha – of course!' feeling from that description, so I fail to see why that approach was successful. What does 'unreliable' mean in this context? Did this solution replace the unreliable connection in some way, or reduce the load on it? Or what particular part of the problem did it address?
Any idea if there's some other software that wouldn't get stuck so easily in similar situations?
Similar … what degree of packet loss did you experience? Was it stable, did it fluctuate, or did it even increase? If the packet loss is too high, you can't do anything about it except find the cause of the loss and fix *that*.
UDP offers no delivery guarantees, so it doesn't cope with a partially working connection unless the application adds its own logic to handle loss. Avoid anything that uses UDP and doesn't have such logic built-in; some poorly written amateur software falls into this category.
TCP was designed to handle packet loss etc., but in practice there are failures that TCP (or, more precisely, implementations of TCP) can't cope with, such as connections that drop entirely and have to be re-established. Again, you have to add logic at the application level to handle that.
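By 'add logic' I mean something like this – a rough, untested sketch of a copy routine that resumes from wherever it got to and retries after each drop (the chunk size, retry count and example paths are arbitrary placeholders):

```python
import os
import time

def resumable_copy(src, dst, chunk_size=1 << 20, retries=20, delay=30):
    """Copy src to dst, resuming from the destination's current size
    after every failure (e.g. a share that briefly disappears)."""
    for _ in range(retries):
        try:
            total = os.path.getsize(src)
            done = os.path.getsize(dst) if os.path.exists(dst) else 0
            if done >= total:
                return                          # nothing left to do
            with open(src, 'rb') as fin, open(dst, 'ab') as fout:
                fin.seek(done)                  # continue where we left off
                while True:
                    chunk = fin.read(chunk_size)
                    if not chunk:
                        return                  # reached the end of src
                    fout.write(chunk)
        except OSError:
            time.sleep(delay)                   # wait for the link to come back
    raise RuntimeError('gave up after %d attempts' % retries)

# e.g. resumable_copy(r'\\server\share\user\archive.pst', r'D:\acquired\archive.pst')
```

(Note that blindly appending like this says nothing about the integrity of what was already written, so you'd still want a hash check at the end – see below.)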
Robocopy is probably useful – but as it does not guarantee file integrity (which seems to be a strange oversight for that kind of software), you need to add some kind of data validation yourself.
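For the validation part, hashing both trees after the copy is usually enough. A rough sketch – it assumes the directory layout is identical on both sides, and of course re-reading the source over the same flaky link costs time:

```python
import hashlib
import os

def sha256_of(path, chunk_size=1 << 20):
    """SHA-256 of a single file, read in chunks."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()

def verify_tree(src_root, dst_root):
    """Compare every file under src_root with its copy under dst_root
    and return the paths that are missing or don't match."""
    mismatches = []
    for dirpath, _, filenames in os.walk(src_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dst_root, os.path.relpath(src, src_root))
            if not os.path.exists(dst) or sha256_of(src) != sha256_of(dst):
                mismatches.append(src)          # re-copy these afterwards
    return mismatches
```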
If you have something like rsync available, it may be a little easier to use that, as you don't have to handle retransmissions manually when a file fails its integrity check.
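In practice I'd just re-run rsync in a loop until it exits cleanly: --partial keeps partially transferred files so each run continues them, and a final --checksum pass re-transfers anything that doesn't verify. A sketch, with placeholder paths:

```python
import subprocess
import time

def rsync_until_done(src, dst, max_runs=50, delay=60):
    """Keep re-running rsync until it exits cleanly, then do one extra
    pass with full checksums to catch silent corruption."""
    cmd = ['rsync', '-a', '--partial', '--timeout=60', src, dst]
    for _ in range(max_runs):
        if subprocess.call(cmd) == 0:
            return subprocess.call(['rsync', '-a', '--checksum', src, dst]) == 0
        time.sleep(delay)                       # wait, then try again
    return False

# e.g. rsync_until_done('user@remote:/home/user/', '/evidence/user_share/')
```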
In more complex situations, P2P solutions like BitTorrent may work better (I'm thinking of software that chops the data up into several parts and transfers each one under checksum protection). Of course, you need to have the appropriate software on both sides of the connection. But if the magnitude of the fault is too large, the correct solution is of course not to use that particular connection at all and, in a longer perspective, possibly even to redesign the remote site to allow for other alternatives.
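You don't necessarily need the whole BitTorrent machinery for that, by the way – the useful principle is just 'split into pieces, checksum each piece, re-fetch only the pieces that fail'. A toy illustration of the verification side (the piece size is arbitrary, and actually re-fetching the bad pieces is left to whatever transfer mechanism you use):

```python
import hashlib

PIECE_SIZE = 4 * 1024 * 1024        # arbitrary piece size

def piece_hashes(path):
    """Return the SHA-256 of every fixed-size piece of a file."""
    hashes = []
    with open(path, 'rb') as f:
        while True:
            piece = f.read(PIECE_SIZE)
            if not piece:
                break
            hashes.append(hashlib.sha256(piece).hexdigest())
    return hashes

def bad_pieces(original_hashes, copy_path):
    """Indices of pieces that are missing or corrupt in the copy."""
    copied = piece_hashes(copy_path)
    return [i for i, want in enumerate(original_hashes)
            if i >= len(copied) or copied[i] != want]
```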
Actually, I think it would be useful to examine various 'forensic mass transfer' scenarios (hasn't anyone done this already?) and then induce various degrees of packet loss or other malfunctions.
I know that VMware Workstation allows me to set up a virtual network with a predefined packet loss rate, and on Un*x I could do it with netem or the iptables statistic module. Thus, simulating packet loss seems possible and even easy.
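On the Un*x side that's essentially a tc/netem one-liner; wrapped in Python so a test script can switch the impairment on and off around each tool under test (the interface name is a placeholder, and it needs root):

```python
import subprocess

IFACE = 'eth0'                      # placeholder interface name

def set_packet_loss(percent):
    """Add (or replace) a netem qdisc that randomly drops the given
    percentage of outgoing packets on IFACE. Requires root."""
    subprocess.check_call(['tc', 'qdisc', 'replace', 'dev', IFACE,
                           'root', 'netem', 'loss', '%s%%' % percent])

def clear_packet_loss():
    """Remove the netem qdisc again."""
    subprocess.check_call(['tc', 'qdisc', 'del', 'dev', IFACE, 'root'])

# e.g. set_packet_loss(5); run the tool under test; clear_packet_loss()
```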
(Added: A Windows service I sometimes wish I knew more about is 'Background Intelligent Transfer Service' (BITS). It's used by Windows Update, so it should have a certain degree of robustness. It might be useful in this kind of scenario, but I know too little about it to make any promises. I found an interesting client called BITSync that uses BITS and could be worth investigating …)
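If someone wants to experiment with it: the BitsTransfer PowerShell module is the easiest way I know of to poke at BITS, and it can be driven from a script as well. Whether BITS actually copes with the kind of flaky link described here is exactly what would need testing – this is only a starting point, with placeholder paths:

```python
import subprocess

def bits_copy(src, dst):
    """Hand a transfer to BITS via PowerShell's BitsTransfer module.
    By default Start-BitsTransfer waits until the job completes."""
    ps = ("Import-Module BitsTransfer; "
          "Start-BitsTransfer -Source '%s' -Destination '%s'" % (src, dst))
    subprocess.check_call(['powershell', '-NoProfile', '-Command', ps])

# e.g. bits_copy(r'\\server\share\data.zip', r'D:\acquired\data.zip')
```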
Sorry for the lack of details; the situation was pretty complicated. Basically, the company was bought by another company and they temporarily shared part of the equipment, but kept everything separated in different VLANs and VPNs for privacy reasons. The sysadmin would have gladly extracted the data and backups in our presence, but not that day, since he was out of the country. All he could give us were the VPN credentials (no access to anything else due to privacy concerns), and we were limited to accessing it through the same network port as the workstation.
They were using some kind of wireless bridge that looked perfectly stable at first but crumbled under load, resulting in short disconnections once or twice every hour, but that's all we could use. It wasn't really a "success"; after weighing the pros and cons, it just happened to be enough not to warrant going back another day to acquire a more forensically sound copy of the data (since what we needed was mostly information).
Anyway, with everybody moving to IMAP and cloud storage it's only going to get worse, so I'd really prefer to be prepared for when I end up in similar situations. I'll see if I can set up a virtual machine for some tests, but my main concern was figuring out what other software to test (e.g. I couldn't find any better replacement for getmail besides desktop clients).
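Worst case I suppose I could script the mailbox side myself with imaplib: fetch one message at a time by UID, and reconnect and carry on whenever the connection drops. Something along these lines (server, credentials and folder are placeholders, completely untested, and it only does a straight dump – nothing forensically fancy):

```python
import imaplib
import os
import time

HOST, USER, PASSWORD = 'imap.example.com', 'someone', 'secret'   # placeholders
OUT_DIR = 'mail_dump'

def dump_mailbox(folder='INBOX', delay=30):
    """Save every message in `folder` as a .eml file, one UID at a time,
    reconnecting and carrying on whenever the connection drops."""
    os.makedirs(OUT_DIR, exist_ok=True)
    while True:
        try:
            imap = imaplib.IMAP4_SSL(HOST)
            imap.login(USER, PASSWORD)
            imap.select(folder, readonly=True)
            _, data = imap.uid('SEARCH', None, 'ALL')
            for uid in data[0].split():
                uid = uid.decode()
                path = os.path.join(OUT_DIR, uid + '.eml')
                if os.path.exists(path):
                    continue                    # already fetched on an earlier run
                _, msg = imap.uid('FETCH', uid, '(RFC822)')
                with open(path, 'wb') as f:
                    f.write(msg[0][1])          # raw RFC822 message body
            imap.logout()
            return
        except (imaplib.IMAP4.error, OSError):
            time.sleep(delay)                   # wait a bit, then reconnect
```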