DroboPro Performance Testing Part 2
My original post included a quick and dirty test of raw hard drive performance using HDTach, to give me an idea of what I was working with on my new DroboPro. True to any benchmarking exercise, there are many more metrics that could paint a clearer picture, but at the time I didn't have the luxury of testing them.
After publishing that testing, I felt it was worth going back for more data. It's obvious from the hit count that this is a topic a fair number of people actually find interesting. I'm also in a unique position as a consumer: I can test this unit in a good development environment against some really solid equipment that others can identify with.
So, without further ado… here’s what I did.
New testing utility – the venerable IOmeter, run against the same virtual machine in a vSphere environment.
This time we hit the same three storage devices:
- Direct attached storage: two 72GB 2.5” SAS disks, spinning at 10k rpm, and attached to an HP P400 raid control card with BBU and 512MB cache – configured in RAID 1 (mirrored).
- HP/LeftHand NSM2120 storage modules (two units clustered for availability). Each unit has 12 x 500GB 7200 RPM SATA disks, split into two six-disk RAID 6 sets.
- DroboPro – with 8 x 500GB 7200 RPM SATA disks, configured in single disk failure protection. Firmware 1.4.1.
I ran four tests: a 4k read, a 4k write, a 32k read, and a 32k write.
Each test gets a 5 second ramp-up (especially useful for the throughput and response numbers, so that cache on the devices doesn't skew the results), then runs for 15 seconds.
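For readers who prefer an open tool, the same four access patterns can be expressed as fio job lines. This is a sketch only – my actual testing used IOmeter, and the target device and flags below are my assumptions, not part of the original runs:

```python
# Sketch: the four test patterns above expressed as fio command lines.
# fio is an open-source alternative to IOmeter; /dev/sdb is a hypothetical
# target device and would destroy data if pointed at a live disk.
RAMP_S, RUN_S = 5, 15  # ramp-up and run time from the test setup above

def fio_cmd(name, rw, bs):
    """Build a fio command line for one access pattern."""
    return (f"fio --name={name} --rw={rw} --bs={bs} "
            f"--ramp_time={RAMP_S} --runtime={RUN_S} --time_based "
            f"--direct=1 --filename=/dev/sdb")

tests = [("4k-read",   "randread",  "4k"),
         ("4k-write",  "randwrite", "4k"),
         ("32k-read",  "randread",  "32k"),
         ("32k-write", "randwrite", "32k")]

for t in tests:
    print(fio_cmd(*t))
```

The `--ramp_time` flag mirrors the 5-second warm-up used here, discarding the cache-skewed numbers at the start of each run.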
Test 1 – Megabytes per second
As you can see, the DAS was over the top on reads – but the raid controller actually hurt write performance, dropping it below even the NSM and DroboPro. In this test the DroboPro held its own against the NSM for throughput on all four runs. That impressed me, because up until the end of last year these NSM units were the primary storage for 60 VM servers in our production virtual infrastructure. No, they weren't fast – performance was always an issue – but they ran without fail. So I'm comfortable that a unit which can match them is worthy of a development environment, or even a small production requirement.
Results 2 – IOPS (Input / Output Operations Per Second)
This test gathers the total number of IOPS over the entire test window. The metric is especially interesting to database or Exchange administrators whose users complain of slow access – anything transaction-based will perform based on these numbers. The DAS wipes the board on read performance, then again drops below the other two units on writes. The DroboPro achieved an average of 63% of the NSM's IOPS, ranging from 78% as fast on the 32k reads down to only 34% on the 4k reads.
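It's worth noting that the MB/s and IOPS charts are two views of the same measurement: throughput is just IOPS multiplied by the block size, which is why 4k workloads look so much worse in MB/s terms. A quick sketch with illustrative numbers (not figures from my runs):

```python
def iops_to_mbps(iops, block_kb):
    """Convert an IOPS figure to throughput in MB/s for a given block size."""
    return iops * block_kb / 1024.0  # KB/s -> MB/s

# Illustrative only: the same IOPS figure moves 8x the data at 32k vs 4k.
print(iops_to_mbps(5000, 4))    # 5000 IOPS at 4k  ~= 19.5 MB/s
print(iops_to_mbps(5000, 32))   # 5000 IOPS at 32k ~= 156.3 MB/s
```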
Results 3 – Response Times
Response times are averaged over the entire run, for each device and each test.
The DroboPro was consistently slower than the NSM unit – but not by much – which is impressive considering the cost difference (and annual support renewal costs).
The maximum response graph is also included to show the peak response times, but those are the single highest access times at any point during the tests – the averages are a better representation of overall performance.
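The reason the average response charts track the IOPS charts so closely is Little's law: mean response time is roughly the number of outstanding I/Os divided by the IOPS achieved. A minimal sketch, with made-up numbers for illustration:

```python
def avg_latency_ms(iops, outstanding_ios=1):
    """Little's law: mean service time = concurrency / throughput.

    With a fixed number of I/Os in flight, higher IOPS necessarily
    means lower average latency, and vice versa.
    """
    return 1000.0 * outstanding_ios / iops

# Illustrative: at one outstanding I/O, 2000 IOPS implies 0.5 ms average,
# while a device managing only 250 IOPS averages 4 ms per request.
print(avg_latency_ms(2000))   # 0.5
print(avg_latency_ms(250))    # 4.0
```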
I have users who have local storage requirements of over 2TB and this unit will fit perfectly on their desk. It’ll also take a load off our SAN and the network segments between their desk and our storage networks. It’ll also provide them a considerable amount of space that they can manage themselves.
When a drive dies, I send them a new one – any size will work. It won't have to be an HP 80GB or a Dell-supplied 120GB; just whatever is on sale or sitting in my cabinet of spare parts. If they need more room, I send them something bigger. I can still pull backups of the unit over the network using our Veritas / Symantec Backup Exec DLO, so the data is still safe.
Performance is much less of a concern now that I've seen a more detailed battery of tests. The LeftHand units I tested the Drobo against were capable of hosting 60 virtual machines across 3 ESX hosts in an ESX 3.5 environment, and this round of numbers holds up impressively against them.
The DroboPro is no longer Data Robotics' answer for VMware – even their website has stopped pushing the Pro to VMware users and is now courting them with the DroboElite.
Unfortunately, the Elite is priced considerably higher than the Pro (about 2x) for the same storage capacity. I'd be glad to test one, if Data Robotics would like to provide a unit to beat on for a week or two to see whether the additional cost is worth it. As of this writing, though, I can't get one from our supplier (CDW-G) without hard drives.
Another hitch: remote monitoring.
Until Data Robotics provides a way to receive failure notifications or monitor the DroboPro remotely (via SNMP or the like) without running the Drobo Dashboard application on a workstation, I cannot recommend it for production environments.
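As a crude stopgap (my own workaround, not anything Data Robotics provides), you can at least alert on the unit dropping off the network by probing its iSCSI port from a cron job. This only catches a dead or unreachable unit, not a single failed drive:

```python
import socket

def port_alive(host, port=3260, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout.

    Port 3260 is the standard iSCSI port the DroboPro serves storage on.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical address for the DroboPro's iSCSI interface:
# if not port_alive("192.168.10.50"):
#     ... send an alert email ...
```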
As one user in a forum suggested, you could put a webcam in front of the unit to watch the drive lights and alert you to issues. I know this was in jest, but seriously – why hasn't Data Robotics thought of something like this for their rack mount kit? They'd like us to place this unit somewhere other than our desks – so who wants to play roulette with 3 TB worth of data sitting in a rack somewhere?
I'm testing the Drobo Dashboard service/application in our development environment, which may give me remote monitoring capabilities. Data Robotics also has a DroboElite en route to me for testing; I'll post the raw data and details as soon as I get the unit and put it through its paces.