
Using SSD as storage for DF

12 Posts
8 Users
0 Likes
689 Views
SilesianMan
(@silesianman)
Posts: 15
Active Member
Topic starter
 

Hello everyone.

Is anyone using SSD drives (standalone or in an array) as storage for images? If so, have you run any tests comparing the speed of data analysis? I did some small tests analysing small binary copies (file signature recovery, indexing, MD5 and SHA-1 hashing) and got results three times faster than the same job done on an image stored on an HDD. I just wanted to check whether anyone else has tried the same thing.
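A minimal sketch of that kind of hashing test (Python; the image paths are hypothetical placeholders, not from the post) might look like:

```python
import hashlib
import time

def hash_image(path, chunk_size=1 << 20):
    """Read an image file once, computing MD5 and SHA-1 in a single pass."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    start = time.perf_counter()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            md5.update(chunk)
            sha1.update(chunk)
    elapsed = time.perf_counter() - start
    return md5.hexdigest(), sha1.hexdigest(), elapsed

# Run once per drive and compare the elapsed times, e.g. (hypothetical paths):
# hash_image("D:/hdd_images/evidence.E01")
# hash_image("E:/ssd_images/evidence.E01")
```

Hashing both digests in one read pass keeps the comparison about the drive, not about reading the file twice.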

What do you think about upgrading lab machines with SSD storage? Is it a good idea?

Regards,
Karol

 
Posted : 16/02/2016 7:00 pm
(@dandaman_24)
Posts: 172
Estimable Member
 

I have a SSD for the OS and a separate SSD acting as a temp cache where I store files that I'm working on.

On my RAID I use "normal" HDDs.

 
Posted : 17/02/2016 12:55 am
BraindeadVirtually
(@braindeadvirtually)
Posts: 115
Estimable Member
 

A couple of 10k or 15k spinners in RAID 1 for OS and apps - 300GB is OK - and as many decent-quality SSDs as budget, case and RAID card will allow, probably in RAID 0. Unless you've got a local DB, you can stand to lose temp data and regroup without too many issues, and you should be backing up cases often anyway. I've only seen one or two SSDs unexpectedly die in years of running similar setups.

 
Posted : 17/02/2016 3:56 am
Passmark
(@passmark)
Posts: 376
Reputable Member
 

If you can afford them, then using SSDs to hold images for your active projects is a no brainer.

The more interesting question is whether you should use PCIe-based (or M.2) SSDs, and whether you should use RAID with SSDs.

Also, if you are already using decent SSDs, then RAID 0 doesn't add much. You don't tend to get the queue depths in real-life software to really push RAID.

There is a good summary here:
http://www.tomshardware.com/reviews/ssd-raid-benchmark,3485-13.html

Quote: "If you're planning an upgrade and want to know whether to buy a couple of 128 GB drives and put them in RAID 0 or just grab a single 256 GB SSD, for example, the answer still seems clear enough to us: just grab the large drive and use one."
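The queue-depth point can be checked with a single-stream read, which is roughly what one tool reading one image looks like. A rough sketch (Python; the path is a hypothetical placeholder):

```python
import os
import time

def sequential_read_mb_s(path, block_size=1 << 20):
    """Measure single-stream (queue depth 1) sequential read speed in MB/s.

    Caveat: a file that fits in the OS page cache will report RAM speed,
    not drive speed - use a file much larger than RAM, or flush the cache,
    for honest numbers.
    """
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(block_size):
            pass
    elapsed = time.perf_counter() - start
    return (size / (1 << 20)) / elapsed
```

At queue depth 1 a RAID 0 pair and a single large SSD tend to measure much closer together than their combined spec sheets suggest, which is the article's point.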

 
Posted : 17/02/2016 5:34 am
BraindeadVirtually
(@braindeadvirtually)
Posts: 115
Estimable Member
 

Also, if you are already using decent SSDs, then RAID 0 doesn't add much. You don't tend to get the queue depths in real-life software to really push RAID.

Disagree. Well, it depends what you mean by real-life software, I guess. A lot of forensic tools, especially those with local DBs, will absolutely hammer your sequential reads/writes. I suspect a lot of comparisons are based on suboptimal onboard RAID controllers, not enterprise-grade hardware, which cannot provide the I/O I'm talking about.

 
Posted : 18/02/2016 12:24 am
minime2k9
(@minime2k9)
Posts: 481
Honorable Member
 

Which tool did you do your testing with?
I suspect that with EnCase 7, an SSD for your cache will speed up your processing much more than an SSD for your image files. The same goes for X-Ways: if your case file (and hash database) is on an SSD, then your processing speed will increase.
However, most software will max out your CPU long before it hits the limit of your I/O speed. Look at IEF or a similar tool that uses all cores on a system and see if there's much difference in processing speed with image files on SSD, although again, having the output on an SSD will probably help.
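A quick way to see which side of that line a given workload falls on is to time a pure read against a read-plus-hash over the same file. A rough sketch (Python; the path would be your own test file):

```python
import hashlib
import time

def profile_bottleneck(path, chunk_size=1 << 20):
    """Time a read-only pass and a read-plus-hash pass over one file.

    If the two times are close, the job is disk-bound (a faster drive helps);
    if hashing adds a lot, the job is CPU-bound (a faster core helps).
    """
    def timed(work):
        start = time.perf_counter()
        with open(path, "rb") as f:
            while chunk := f.read(chunk_size):
                work(chunk)
        return time.perf_counter() - start

    read_only = timed(lambda chunk: None)
    sha256 = hashlib.sha256()
    read_and_hash = timed(sha256.update)
    return read_only, read_and_hash
```

The same pattern works with any per-chunk processing step swapped in for the hash.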

 
Posted : 18/02/2016 2:48 am
SilesianMan
(@silesianman)
Posts: 15
Active Member
Topic starter
 

Which tool did you do your testing with?
I suspect that with EnCase 7, an SSD for your cache will speed up your processing much more than an SSD for your image files. The same goes for X-Ways: if your case file (and hash database) is on an SSD, then your processing speed will increase.
However, most software will max out your CPU long before it hits the limit of your I/O speed. Look at IEF or a similar tool that uses all cores on a system and see if there's much difference in processing speed with image files on SSD, although again, having the output on an SSD will probably help.

Yes, I did use EnCase 7 (7.10.05).

60GB image file

Chosen options:
Recovery folders, file signature analysis, expand compound files, hash analysis, find internet artifacts, system info parser, Windows event log parser and Windows artifact parser

Case 1

Image file stored on SATA HDD, cache files stored on SATA SSD

Processing time: 18 min 35 sec

Case 2

Image file stored on SATA SSD, cache files stored on SATA HDD

Processing time: 7 min 52 sec
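For what it's worth, those two times work out to roughly a 2.4x speedup:

```python
# Processing times from the two EnCase runs above
hdd_seconds = 18 * 60 + 35   # image on HDD, cache on SSD
ssd_seconds = 7 * 60 + 52    # image on SSD, cache on HDD

speedup = hdd_seconds / ssd_seconds
print(f"Image on SSD was {speedup:.2f}x faster")  # 2.36x
```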

For now, I am waiting for large Samsung 850 EVO SATA SSDs and will be running bigger tests, including processing times in EnCase Forensic, X-Ways, FTK and IEF (I know IEF processing is mainly CPU-based, but I am just curious about the results), and will present them here as well.

 
Posted : 19/02/2016 2:34 pm
(@jcadden)
Posts: 3
New Member
 

Case 1

Image file stored on SATA HDD, cache files stored on SATA SSD

Processing time: 18 min 35 sec

Case 2

Image file stored on SATA SSD, cache files stored on SATA HDD

Processing time: 7 min 52 sec

It would be very interesting to see what the same case file on the same machine does with two SSDs being used.

 
Posted : 19/02/2016 8:56 pm
SilesianMan
(@silesianman)
Posts: 15
Active Member
Topic starter
 

Case 1

Image file stored on SATA HDD, cache files stored on SATA SSD

Processing time: 18 min 35 sec

Case 2

Image file stored on SATA SSD, cache files stored on SATA HDD

Processing time: 7 min 52 sec

It would be very interesting to see what the same case file on the same machine does with two SSDs being used.

I will take that into account when testing.

 
Posted : 22/02/2016 1:22 am
Passmark
(@passmark)
Posts: 376
Reputable Member
 

By coincidence, I did some testing on OSForensics last week, with the aim of coming up with better system requirements recommendations.

The conclusion was that many tasks are either single-threaded and/or disk-bound, meaning that for most tasks the disk system (even an SSD) was unable to provide enough data to keep multiple CPU cores busy.

This makes perfect sense. While SSDs claim speeds of around 500MB/sec, the reality is that they don't get anywhere near those speeds when transferring small data blocks at low queue depths. Speeds of 100MB/sec are more typical. Traditional HDDs are even worse: even a small amount of disk seeking and you are down to maybe 20MB/sec.

On the other hand, a single CPU core can read and process about 10GB/sec from RAM (depending on the processing required), more than an order of magnitude higher than the disk system.

The conclusion was that people should spend more money on faster storage (SATA SSDs and PCIe SSDs) and less money on their CPU. And when picking a CPU, you should favour a small number of fast CPU cores (e.g. 4 cores at 3.9GHz) rather than a large number of slow cores (e.g. 32 cores at 2.4GHz).

If anyone is interested, the resource usage study is here.

 
Posted : 22/02/2016 5:26 am