It seems to work on the old card but fails with the new one. I have never tried port multipliers in combination with SAS cards. The script can be called with a --nagios parameter. My setup is 2x1TB drives as RAID1 for the system (in fact three RAID1 arrays with different partitions) and 4x2TB as RAID5 for data. For reference, the maximum practical throughputs per port I assumed have been computed with these formulas:
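The formulas themselves did not survive in this copy. A plausible reconstruction, assuming they were the usual 8b/10b-encoding calculation that applies to SATA/SAS links up to 6 Gbps (10 bits on the wire per 8 bits of data):

```python
def practical_throughput_mb_s(line_rate_gbps: float) -> float:
    """Approximate usable throughput of one SATA/SAS port.

    SATA and SAS links up to 6 Gbps use 8b/10b encoding, so only
    80% of the raw line rate carries data (before protocol overhead).
    """
    usable_bits_per_s = line_rate_gbps * 1e9 * 8 / 10
    return usable_bits_per_s / 8 / 1e6  # bits -> bytes -> MB/s

for rate in (1.5, 3.0, 6.0):
    print(f"{rate} Gbps link ~= {practical_throughput_mb_s(rate):.0f} MB/s per port")
# 1.5 Gbps -> 150 MB/s, 3 Gbps -> 300 MB/s, 6 Gbps -> 600 MB/s
```

These are per-port ceilings; real arrays land below them once controller and protocol overhead are counted.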


It doesn’t quite answer my particular question. I don’t know the market of internal drive bays that well, but I would love to buy this external one for one of my next builds. I found this just after I had done my own bit of research.

If you care about performance, avoid port multipliers (PMs), indeed. I find this test gives me a useful idea of the real-world performance of the array under load. I used the latest BIOS and drivers for each card. The following numbers are with the drives in a Sans Digital TR4M with its port multiplier. Write cache should be enabled ONLY if you have a battery pack on your controller.
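The exact test procedure isn't shown here; as one minimal sketch of measuring sequential write throughput (the path and sizes are arbitrary choices, and `os.fsync` is used so the page cache doesn't inflate the number):

```python
import os
import time

def sequential_write_mb_s(path: str, size_mb: int = 64, block_kb: int = 1024) -> float:
    """Time a sequential write of `size_mb` MiB and return MB/s.

    fsync() forces the data to the device so the figure reflects
    actual storage throughput, not just memory bandwidth.
    """
    block = b"\0" * (block_kb * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb * 1024 // block_kb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())
    elapsed = time.perf_counter() - start
    os.unlink(path)  # clean up the test file
    return size_mb / elapsed
```

For serious numbers a dedicated tool (fio, iometer) on the raw device is still preferable; this only gives a quick filesystem-level figure.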

Great article, I’ve been looking for a summary like this for years. I still want to know whether they support 3TB drives, though.

From 32 to 2 ports: Ideal SATA/SAS Controllers for ZFS & Linux MD RAID

Also, everything works fine on Win7, so the hardware is ok. Please add your card if it worked for you.


I’d also like to see some real-world numbers about SAS expanders. It’s the standard Nagios expected return code. No wait, I think I see what you mean now. I always had the card already in place when installing the OS.
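For reference, the standard Nagios plugin return codes the script follows are fixed by convention; a minimal sketch (the check message here is only illustrative):

```python
import sys

# Standard Nagios plugin exit codes
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

STATE_NAMES = {OK: "OK", WARNING: "WARNING", CRITICAL: "CRITICAL", UNKNOWN: "UNKNOWN"}

def nagios_exit(state: int, message: str) -> None:
    """Print a one-line status and exit with the matching Nagios code."""
    print(f"RAID {STATE_NAMES[state]}: {message}")
    sys.exit(state)
```

Nagios parses the first output line for display and keys its alerting entirely off the exit code, which is why a monitoring script must get these four values right.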

Monitoring hardware RAID: LSI SAS controller, OMSA

This is done all day long by many people in the world, but sometimes cards are bad or going bad, you lose power, etc. Please keep this in mind. Linux as of 5. I am trying to create several multi-terabyte arrays and have run into some problems.
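One quick way to spot trouble in md arrays like these is to parse /proc/mdstat, whose per-array status line shows expected vs. active disks as e.g. `[4/3] [UUU_]`. A minimal sketch (the sample text in the test is illustrative, not from a real machine):

```python
import re

def degraded_md_arrays(mdstat_text: str) -> list:
    """Return names of md arrays reporting fewer active disks than expected.

    /proc/mdstat prints a status like [4/3] [UUU_] under each array:
    the first number is the configured disk count, the second the
    number currently active; an underscore marks a missing disk.
    """
    degraded = []
    current = None
    for line in mdstat_text.splitlines():
        header = re.match(r"^(md\d+)\s*:", line)
        if header:
            current = header.group(1)
        elif current:
            status = re.search(r"\[(\d+)/(\d+)\]", line)
            if status:
                if int(status.group(2)) < int(status.group(1)):
                    degraded.append(current)
                current = None
    return degraded
```

Reading the file with `open("/proc/mdstat").read()` and feeding it to this function gives a list suitable for a cron job or a Nagios check.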

Any wisdom much appreciated. Thanks for keeping this list up to date!


Helped me locate a 2-port controller card for a new ZFS build. If you need RAID 5 support you should look at Linux.

The mv94xx Linux driver version 4. Ideally, I’m actually looking for something to sandwich in as many 3. I downloaded the latest file and it will now identify the M card but will not flash it. They were very fast with their reply. These partitions are used by mdadm directly, without putting a filesystem onto the single disks. This has been the only reported issue thus far.


Linux and Hardware RAID: an administrator’s summary

Do you know where I can get the best price? Tested with Intel Iometer on the raw storage device. I wonder if the extra stuff can be put to use for extra devices on the LSI SAS2008 in a future Linux, for example. I haven’t been able to really test it yet, so please use it carefully and contact me if you have a server on which you are able to do some testing (plug disks, unplug, create hotspare). Driver not included at this point.

The mvsas driver loads, sees it, and all seems right and true in the world. Despite being AHCI-compliant, this series of chips seems unsupported by Solaris (according to reader comments, see below). Put the 6 disks on the P55 chipset’s SATA ports. Adaptec HBA based on the. The controller is well supported and integrated without problems when installing Ubuntu. In a one-card system this should always be zero, the default I put in.