
Best HBA/RAID card for bootable SSD array


5 replies to this topic

#1 shoek


    Member

  • Member
  • 29 posts

Posted 17 February 2014 - 09:13 AM

Hi,
I'm building a dual-Xeon IVB-E machine and want to boot off a RAID-0 array of four 512GB SSDs (Samsung 840 Pro).
The mobo has the Intel C602/X79 chipset, so there aren't enough Intel SATA 6Gb/s ports. I'm looking for an HBA/RAID card that would be best for this use case.

LSI 9361/9341 - ready for 12Gb/s SAS down the road; may be overkill, and early reviews on Newegg aren't that great
Areca 1882i - I love Areca for RAID-5, so this seems the safe choice
Adaptec 7805 - haven't used an Adaptec in a decade; not sure what to think

Do any of these have a BIOS like Intel's RAID, where Windows can be installed to the array as a boot drive without F6-loaded drivers? I'm thinking no...

What other cards should I be considering?

TIA,
-Steve


#2 Kevin OBrien


    StorageReview Editor

  • Admin
  • 1,426 posts

Posted 17 February 2014 - 11:27 AM

What is your budget, and will this be limited to SATA/SAS 6Gb/s drives only? Also, any way to convince you not to go 4-drive RAID 0 for your boot partition? SSDs are reliable, but we still encounter glitches with them occasionally. Spreading that risk across four consumer SSDs (even very good ones such as the SSD 840 Pro) is asking for trouble.
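To put the striping risk in rough numbers: in RAID 0 any single drive failure loses the whole array, so the array only survives if every member survives. A minimal sketch of the arithmetic, with a purely illustrative per-drive failure rate (the 2% figure is an assumption, not a measured 840 Pro number):

```python
def raid0_failure_prob(per_drive_failure_prob: float, drives: int) -> float:
    """Probability the array fails over some period, given an
    independent per-drive failure probability for the same period.
    RAID 0 survives only if all members survive."""
    survive = (1.0 - per_drive_failure_prob) ** drives
    return 1.0 - survive

# Assuming (purely for illustration) a 2% failure rate per SSD:
p1 = raid0_failure_prob(0.02, 1)   # single drive
p4 = raid0_failure_prob(0.02, 4)   # 4-drive stripe
print(f"single drive: {p1:.1%}, 4-drive RAID 0: {p4:.1%}")
```

Whatever the real per-drive rate is, a 4-drive stripe multiplies the exposure to nearly 4x for small probabilities.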


#3 shoek


    Member

  • Member
  • 29 posts

Posted 17 February 2014 - 11:45 AM

Hi Kevin - thanks for the reply.


My budget is up to ~$750 for the controller, but of course I'd love to hear that I could max out the throughput of this array with something cheaper. I'd also like to be able to move to the next generation of SAS (12Gb/s) when it becomes more mainstream, which is why I was thinking about the LSI 93x1 models.


I've been running a 2-, 3-, and now 4-drive (SATA 3Gb/s) array on my current Core i7-920/X58 system using Intel chipset RAID for almost 5 years and have not had a single drive failure. Perhaps this has made me overconfident, but I'm comfortable with the risk.


-Steve
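On the "something cheaper" question, a back-of-the-envelope bandwidth budget is useful. Assuming the ~540 MB/s rated sequential read of a 512GB 840 Pro (a spec-sheet figure; sustained real-world numbers will be lower), the stripe tops out well under what a modern x8 controller can move:

```python
# Rough bandwidth budget for the proposed 4 x SSD 840 Pro stripe.
DRIVES = 4
PER_DRIVE_MBPS = 540           # rated sequential read, MB/s (spec sheet)
array_mbps = DRIVES * PER_DRIVE_MBPS

# A PCIe 2.0 x8 link offers roughly 4000 MB/s of effective host
# bandwidth, so even a previous-generation 6Gb/s HBA has headroom.
print(f"aggregate: ~{array_mbps} MB/s")
```

In other words, a 12Gb/s controller buys future-proofing, not extra throughput for these particular drives.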



#4 Brian


    SR Admin

  • Admin
  • 5,213 posts

Posted 17 February 2014 - 11:47 AM

What's the use case? I agree that a 4-drive RAID 0 carries far more risk than the benefits justify in most cases. Understanding how you're using the drives would be helpful.


Brian

Publisher- StorageReview.com
Twitter - @StorageReview


#5 shoek


    Member

  • Member
  • 29 posts

Posted 17 February 2014 - 11:51 AM

I'm a software developer working on large projects in Visual Studio. We've experimented with putting tools/OS on a single SSD and code on a RAID-0 array, and build times are not as fast as having everything on the same RAID-0 array. We're starting to see that matched when building on Macs using Boot Camp, taking advantage of their PCIe SSD interface for the boot drive and a Thunderbolt RAID-0 array for the code, but I'm not aware of similar technology on the PC.


#6 Kevin OBrien


    StorageReview Editor

  • Admin
  • 1,426 posts

Posted 17 February 2014 - 12:24 PM

Interesting... have you watched perfmon closely to see which files are seeing the most I/O thrash on the boot drive? If you could pinpoint those and move them over to the scratch-space array, you'd gain some safety on the boot side while keeping those performance wins.
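One way to act on that: export per-file I/O samples from a tracing tool (perfmon logs, Process Monitor CSV, etc.) and tally the hottest paths. A minimal sketch of just the aggregation step, assuming hypothetical (path, bytes) records rather than any specific tool's output format:

```python
from collections import Counter

def hottest_files(samples, top_n=5):
    """Given (path, bytes_transferred) records exported from a tracing
    tool, return the top_n paths ranked by total bytes moved."""
    totals = Counter()
    for path, nbytes in samples:
        totals[path] += nbytes
    return totals.most_common(top_n)

# Hypothetical sample data, paths invented for illustration:
samples = [
    (r"C:\proj\obj\big.pch", 8_000_000),
    (r"C:\Windows\Temp\vc.tmp", 2_500_000),
    (r"C:\proj\obj\big.pch", 6_000_000),
]
# big.pch totals 14,000,000 bytes and tops the ranking
print(hottest_files(samples, top_n=2))
```

Files that dominate the ranking are the candidates to redirect onto the scratch array.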




