FMEM and DD segmentation fault

Forensic software discussion (commercial and open source/freeware). Strictly no advertising.

banderas20
Member
 

FMEM and DD segmentation fault

Posted: May 03, 2019 11:00

Hi all,

I am trying to acquire a live memory dump from an Ubuntu system. This is what I do:

1. Download the fmem tool
2. Compile it with make and run ./run.sh
3. A /dev/fmem device is created

I know this is a special file and I have to specify the size for dd. However, I either end up with a small file or I get a Segmentation Fault error.

The RAM size is 2 GB. My commands are:

Code:
dd if=/dev/fmem of=./dumpfile.raw count=400

It runs, but the file is incomplete.

Code:
dd if=/dev/fmem of=./dumpfile.raw bs=1MB count=2000

dd if=/dev/fmem of=./dumpfile.raw count=500

Error: Segmentation fault

I'm using fmem version 1.5.

On top of that, if I download version 1.6 and run 'make', it fails with compilation errors :(

Any clue?

Thanks in advance!  
 
  

jaclaz
Senior Member
 

Re: FMEM and DD segmentation fault

Posted: May 15, 2019 11:50

In my experience there are more (slightly) different versions of dd than stars in the sky, so I wouldn't even THINK of using "a" dd (unless I am very familiar with, and have thoroughly tested, that specific version in the specific environment) without specifying all the needed parameters, and specifying them in the most "basic" way. The "default" blocksize may differ from what is expected, and/or the "translation" from (say) 1MB to 1048576 bytes may simply not happen (i.e. the specific build/version may use 1M instead of 1MB, etc.).
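The suffix behaviour is easy to check directly (a sketch assuming GNU coreutils dd, where 1MB means 1000*1000 bytes and 1M means 1024*1024; other dd builds may reject or reinterpret these suffixes):

```shell
# With GNU dd, bs=1MB and bs=1M produce different amounts of data.
dd if=/dev/zero of=mb.bin bs=1MB count=1 2>/dev/null
dd if=/dev/zero of=m.bin  bs=1M  count=1 2>/dev/null
wc -c < mb.bin   # 1000000 on GNU systems
wc -c < m.bin    # 1048576 on GNU systems
rm -f mb.bin m.bin
```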

Personally I would try:
1) dd if=/dev/fmem of=./dumpfile.raw count=4194304 bs=512
2) dd if=/dev/fmem of=./dumpfile.raw count=2097152 bs=1024
3) dd if=/dev/fmem of=./dumpfile.raw count=1048576 bs=2048
4) dd if=/dev/fmem of=./dumpfile.raw count=524288 bs=4096
5) dd if=/dev/fmem of=./dumpfile.raw count=2048 bs=1048576

2 GB are 2147483648 bytes, so:
4194304 * 512 = 2147483648
2097152 * 1024 = 2147483648
1048576 * 2048 = 2147483648
524288 * 4096 = 2147483648
2048 * 1048576 = 2147483648
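Those products can be double-checked in the shell (plain arithmetic, nothing dd-specific):

```shell
# Each count*bs pair should multiply out to 2 GiB = 2147483648 bytes.
for pair in "4194304 512" "2097152 1024" "1048576 2048" "524288 4096" "2048 1048576"; do
  set -- $pair
  echo "$1 * $2 = $(( $1 * $2 ))"
done
```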

Of course, the dump will be (in some cases not-so-slightly) faster the bigger the blocksize, so you may want to try the above list of commands in reverse order.
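If you'd rather not hard-code the RAM size at all, the count can be derived from /proc/meminfo (a sketch assuming a Linux system; note that MemTotal slightly understates physical RAM, so rounding up, or adding a margin, may be needed for a complete dump):

```shell
# Derive a dd count from MemTotal (reported in KiB), rounding up to whole blocks.
bs=1048576
mem_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
count=$(( (mem_kb * 1024 + bs - 1) / bs ))
echo "dd if=/dev/fmem of=./dumpfile.raw bs=$bs count=$count"
```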

jaclaz
_________________
- In theory there is no difference between theory and practice, but in practice there is. - 
 
