Does anyone use a standard methodology or piece of software to capture an entire live website, so it is available offline, in a forensically sound manner?
Not just the page itself, but every HTML file on the site.
Regards
I have used the following protocol:
1. Clean system
2. HTTrack (grabs the entire site), wget, or another web collection app (a wget sketch follows the list)
3. Video and image capture of the session
4. Possibly a pcap/Wireshark capture of the session
5. Hash the individually collected pages, files, and supporting data such as images/video and the pcap (a hashing sketch follows the list)
6. Copy the original evidence into some sort of encrypted archive (an archive sketch follows the list)
7. Create a working copy of the original evidence
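For step 2, here is a minimal sketch of driving wget from Python. The target URL and case directory names are placeholders, it assumes wget is on the path, and the --warc-file option is only in reasonably recent wget builds; HTTrack or another collector would slot in the same way.

import os
import subprocess
from datetime import datetime, timezone

# Placeholder values - substitute the site under investigation and your case folder.
TARGET_URL = "http://example.com/"
CASE_DIR = "case_2024_001_site_capture"

os.makedirs(CASE_DIR, exist_ok=True)

# Record when the collection started so the time is documented with the evidence.
started = datetime.now(timezone.utc).isoformat()

# --mirror          : recursive download with timestamping
# --page-requisites : also grab images, CSS, etc. needed to render each page
# --no-parent       : stay within the target site/path
# --warc-file       : additionally write a WARC archive of the raw HTTP traffic
cmd = [
    "wget",
    "--mirror",
    "--page-requisites",
    "--no-parent",
    "--directory-prefix", CASE_DIR,
    "--warc-file", f"{CASE_DIR}/capture",
    TARGET_URL,
]

result = subprocess.run(cmd, capture_output=True, text=True)

# Keep wget's own output as part of the record of the session.
with open(f"{CASE_DIR}_wget_log.txt", "w") as log:
    log.write(f"Started (UTC): {started}\n")
    log.write(result.stdout)
    log.write(result.stderr)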
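For step 5, a minimal hashing sketch: it walks the capture directory and writes a SHA-256 manifest for every file, which covers the pages, the supporting images/video, and the pcap if you keep it in the same tree. The directory and manifest names are placeholders.

import hashlib
import os

# Placeholder paths - point these at the capture directory and the manifest to write.
CASE_DIR = "case_2024_001_site_capture"
MANIFEST = "case_2024_001_hashes.txt"

def sha256_of(path, chunk_size=1024 * 1024):
    # Hash in chunks so large videos and pcaps never need to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

with open(MANIFEST, "w") as out:
    for root, _dirs, files in sorted(os.walk(CASE_DIR)):
        for name in sorted(files):
            full = os.path.join(root, name)
            out.write(f"{sha256_of(full)}  {full}\n")

Re-running the same script against the working copy later and diffing the two manifests is a quick way to demonstrate nothing has changed.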
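For step 6, one way to script the encrypted archive, again only a sketch: it shells out to 7-Zip (assumed to be installed) and prompts for the passphrase rather than hard-coding it. Any encrypted container (VeraCrypt volume, encrypted ZIP, etc.) would serve the same purpose.

import getpass
import subprocess

# Placeholder names - the capture directory from the earlier steps and the archive to create.
CASE_DIR = "case_2024_001_site_capture"
ARCHIVE = "case_2024_001_original.7z"

# Prompt for the passphrase so it never ends up in the script or shell history.
passphrase = getpass.getpass("Archive passphrase: ")

# 7z: a = add to archive, -p = passphrase, -mhe=on = also encrypt file names.
subprocess.run(
    ["7z", "a", f"-p{passphrase}", "-mhe=on", ARCHIVE, CASE_DIR],
    check=True,
)

print(f"Created {ARCHIVE}; hash it, store it separately, then work only from a copy.")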
I used Teleport Pro too.
Check out WebCase. Todd Shipley was on the 02-07-10 episode of CyberSpeak and gave a bit of an overview.
http://veresoftware.com/