Not a Linux guru, but I have a split dd image in about 20 parts that I want to hash. I know I can cat the parts together and create a combined image, but this is not very elegant. I could also mount the volume and access it from Windows - but again this sort of defeats the object (bit of a learning exercise).
Can someone give me a clue - using Red Hat or Slackware?
Slackware…cool. 13.1 was released today, BTW…
Anyway, to get a hash of the combined files:
cat /path/to/split.* | md5sum
This will stream each split, one after the other, and pipe the entire thing to md5sum, giving you a hash. The "-" at the end of the output (in place of a file name) indicates the input came through the pipe rather than from a file.
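For example, the output will look something like this (the hash value here is made up, just to show the form):
3f78a2c19b0d4e5f6a7b8c9d0e1f2a3b  -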
HTH,
Barry
Thanks Barry
Access Data now offers command line versions of FTK Imager for Linux, Mac OS X and Win32 which will allow you to calculate the hash for a multi-part image, though the solution posted already works just fine.
Access Data now offers command line versions of FTK Imager for Linux…
Oh good God, why? That's almost as perverted as GUI versions of "dd"…
D
OK slight variation on a theme.
How do I use dd to create a hash of multiple tape files and pipe the output to dd?
Not smellin' what you're cooking here.
Can you rephrase the question? What is the objective? What kind of "tape files"? Do you mean tape devices or actual files obtained from a tape? Are you saying you want to pipe a hash to dd for some reason?
Barry
Ah OK Barry
When a backup is made to tape using some sort of commercial archive package, the data is written as what are usually called tape files. If, for instance, you had a backup of C, D and, say, an Exchange server, then (simplifying slightly) you would get three tape files on your tape.
If you run dd with a command such as dd if=/dev/nst0 of=file1, you will get the first file only and dd will halt at the first file mark; run dd again and you get the second file, and so on.
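For example (device and output file names here are just placeholders), each run picks up where the previous one stopped because /dev/nst0 is the non-rewinding device:
dd if=/dev/nst0 of=file1   # first tape file, dd stops at the file mark
dd if=/dev/nst0 of=file2   # second tape file
dd if=/dev/nst0 of=file3   # third tape file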
I can write each of the files out to disk and then cat the files through md5sum as above, but there are times when I am just interested in a hash of all of the user data on a tape. I have written my own Windows software that can do this, and I guess I could knock something up under Linux, but I am wondering whether there is some way of essentially throwing multiple consecutive runs of dd into md5sum.
(hope this is clear and is not teaching you to suck eggs).
Got it…makes perfect sense. Problem is that it's been eons since I've actually dealt with a tape.
If I recall correctly, you can use mt -f /dev/nst0 with a command like "tell" or "status" (I can't remember, and I don't have any tape devices here to test on). Once you get the tape contents listed with a start block and EOF block for each tape file, you can use dd with "count" and pipe (without using "of=") straight to md5sum. It may not be mt at all…but there was a tape command that would give this info.
It's been a long loooong time.
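Something along these lines, maybe (untested, and the device name is only an example):
mt -f /dev/nst0 status   # drive status, including current file and block number
mt -f /dev/nst0 tell     # report the current block position (driver dependent)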
Can you try tar?
tar xOf /dev/nst0 | md5sum
I'm tossing out ideas here with no real knowledge of whether they work - can't test. So I'll stop now.
wink
OK thanks Barry
It is mt, but it won't (as far as I am aware) work as you describe.
Knowing the total number of blocks is not enough, as dd will still exit when it reaches the next file mark. It is also a problem when you have variable-sized blocks (but this is not too common).
Let's try another way that is analogous to my problem to see if it sparks something. Let's assume that there was no cat command and you had 20 disk images that you wanted to verify as one large image. Could you use dd, maybe as part of a script, to run through all of the images and create a single hash for all files?
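Something along these lines is what I am imagining (an untested sketch; the paths, block size and device name are just placeholders):
( for part in /path/to/image.* ; do
    dd if="$part" bs=64k 2>/dev/null
  done ) | md5sum
And I am guessing the same subshell trick could cover the tape case, by repeating dd if=/dev/nst0 inside the parentheses once per tape file, so md5sum sees one continuous stream.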