14 Comments

  1. Aaron said:

    For those who are wondering, it is my opinion that the DroboPro is not quite ready for a production VMware environment. I saw performance anywhere from 5MB/sec to 55MB/sec with no change in configuration while cloning a VM from an EqualLogic to the DroboPro. Since the DroboPro is VMware certified, I had high hopes for an inexpensive and solid iSCSI target when I purchased it. For those who plan to use the DroboPro with the Microsoft iSCSI Software Initiator, I was fairly impressed with the product, both in performance and in ease of configuration.

    In case someone at Data Robotics reads this post, here are my thoughts on enhancing the DroboPro:

    1. Add an option to create a volume without a filesystem type. Although it's not really an issue, formatting the volume NTFS gets confusing when the intent is to add the LUN to VMware and format it with VMFS.

    2. Create a web interface for configuring the DroboPro for those that intend to use the device as an iSCSI target. Keep the Drobo Dashboard for USB/Firewire use.

    3. Enable security features such as ACLs and CHAP.

    4. Provide some method of knowing what the DroboPro is currently doing. I realize that the engineers want the back end of the DroboPro to be locked down; however, a simple status indicator would be nice while we wait 15 minutes for the DroboPro to figure out what it is doing.

    I am actually disappointed that I have to return the DroboPro. I would like to spend some more time troubleshooting; however, work duties make the “calls” and I need a functioning, simple iSCSI target yesterday. For those who are wondering, I re-purposed an older Dell server with some external eSATA storage and installed the open-source OpenFiler product. Performance is pretty good with OpenFiler. If you want more robust SAN capabilities, try OpenSolaris and COMSTAR; just keep in mind that OpenSolaris is picky about hardware, and performance is DIRECTLY affected by the hardware you choose.

    That is it for now…

    -Aaron

    November 9, 2009
    Reply
  2. Jeff said:

    Doing some testing with:

    Dell R610
    Drobo Elite
    Dell Powerconnect 2724

    - jumbo frames enabled end to end (vSwitch, vmnic, vmkernel), on the switch, and on the Drobo Elite (verified with the quick check below)
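
    A quick way to prove jumbo frames actually survive the whole path is a full-size, don't-fragment vmkping from the host. A rough Python wrapper sketch is below; the address is a placeholder for the Drobo's iSCSI IP:

        # Sketch: verify jumbo frames end to end with vmkping. 8972 bytes of
        # payload + 28 bytes of IP/ICMP headers = a full 9000-byte frame; the
        # -d flag forbids fragmentation, so success means every hop passed it.
        import subprocess

        DROBO_IP = "192.168.1.50"   # placeholder: the Drobo Elite's iSCSI address

        def jumbo_ok(ip):
            return subprocess.call(["vmkping", "-d", "-s", "8972", "-c", "3", ip]) == 0

        if __name__ == "__main__":
            print("jumbo OK" if jumbo_ok(DROBO_IP) else "jumbo frames broken somewhere")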

    Doing an upload/download from a Windows 7 workstation to the Datastore on the Drobo yields about 20MB/sec throughput (as reported by Windows). A similar upload/download to a normal file server jumps upwards of 80MB/sec.
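
    For what it's worth, a dumb sequential-write timer like the sketch below (the path is a placeholder for a file on the storage under test) takes Windows' reported number out of the equation:

        # Rough sequential-write throughput test: write 1 GiB in 1 MiB chunks
        # and report MB/sec. Point TEST_FILE at the datastore or share under
        # test; the UNC path below is only an example.
        import os, time

        TEST_FILE = r"\\fileserver\share\throughput.tmp"   # placeholder path
        CHUNK = b"\0" * (1024 * 1024)                      # 1 MiB of zeros
        TOTAL_MB = 1024                                    # 1 GiB total

        start = time.time()
        with open(TEST_FILE, "wb") as f:
            for _ in range(TOTAL_MB):
                f.write(CHUNK)
            f.flush()
            os.fsync(f.fileno())    # flush the OS cache so the clock is honest
        elapsed = time.time() - start
        os.remove(TEST_FILE)
        print("%.1f MB/sec" % (TOTAL_MB / elapsed))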

    I thought performance would be a little better than that…

    January 27, 2010
    Reply
  3. Britton said:

    I can second your results. I’m just finishing my testing of a DroboPro and am VERY disappointed by the VMware performance. Even connected directly to the server, I was only getting at most 6MB/sec. This is unacceptable for a device that costs this much. It’s interesting that a device can be VMware Certified and “Supported” but not actually work.

    January 27, 2010
    Reply
  4. Aaron said:

    In our environment, the DroboPro was going to be used as storage for test/DR virtual machines, temporary storage and the occasional production virtual machine. The performance was so poor that VMware would sometimes fail during a machine migration from one datastore to another. Our replacement for the DroboPro was OpenFiler on some older Dell equipment. Currently we use OpenFiler as an iSCSI target on a PowerEdge 2650 and a PowerEdge 2850, both with eSATA-attached storage. Surprisingly, OpenFiler on these older systems is quite stable, with 60+ days of uptime on one. To top it off, OpenFiler’s iSCSI target (the IET project) is compatible with ESX 3.5 U3, which the DroboPro couldn’t manage.
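
    For anyone curious, exposing a block device through IET really is just a couple of lines of configuration. A sketch that writes a minimal /etc/ietd.conf (the IQN and the LVM device path are made-up examples):

        # Sketch: generate a minimal /etc/ietd.conf for the IET target that
        # OpenFiler uses under the hood. IQN and device path are examples.
        conf_lines = [
            "Target iqn.2010-01.local.openfiler:esx.lun0",
            "    Lun 0 Path=/dev/vg0/vmlun,Type=blockio",
        ]
        with open("/etc/ietd.conf", "w") as f:
            f.write("\n".join(conf_lines) + "\n")
        # then restart the target, e.g. /etc/init.d/iscsi-target restart
        # (the service name varies by distro)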

    January 27, 2010
    Reply
  5. Brian said:

    I have the same problem with the DroboElite; however, it seems to be related to VMFS. If I create a LUN on the Elite, attach it to a Windows VM and format it with NTFS, writes are good. If I put VMFS on a DroboElite LUN, I get failures when trying to migrate my storage, and if I clone a VM to the Drobo, I only get 6 to 9MB/sec inside the VM and during the clone. Drobo support tried to pin the problem on VMware.

    March 9, 2010
    Reply
    • Aaron said:

      Brian,

      I would agree that VMFS and/or the VMware software iSCSI initiator cause issues with the DroboPro (I haven’t tried the DroboElite). The part that I find interesting is that the product is supposedly VMware certified. Although it is great marketing for Drobo, it makes one think: what kind of certification requirements does VMware have? And does Drobo realize that being VMware “certified” yet “incompatible” with VMware gives their marketing plan the reverse effect?

      From my experience, the reality is that there are no inexpensive and stable iSCSI storage devices on the market yet. iSCSI must be too new for all of the quirks to be worked out, especially with VMware and its unique filesystem.

      -Aaron

      March 13, 2010
      Reply
  6. Brad Meyer said:

    Hi All,

    The DroboElite is the right choice for SMB VMware storage. The DroboPro is really designed to connect to a single server only; the DroboElite is our multi-host iSCSI SAN solution.

    As the DroboElite is implemented into more and more complex VMware environments, more and more stress has been placed on the system. We have discovered an issue with our firmware that causes the DroboElite to disconnect from VMware ESX clusters while under heavy load. We have been able to reproduce this corner case in our labs, and we currently have a fix in place. The fix is under internal test and is also being tested by a handful of external customers who have experienced this issue. The testing results have been very positive to this point, and we are projecting a general release of the new DroboElite firmware within the next several weeks, once we complete the specific testing surrounding this issue as well as the general regression testing required for all release firmware.

    Brad Meyer
    Product Manager – DroboElite

    June 4, 2010
    Reply
    • Aaron said:

      Brad,

      When I purchased the DroboPro, the DroboElite was probably still in development. Not to be too contradictory with a product manager at Data Robotics, but if I remember correctly, the DroboPro documentation states which iSCSI TCP port to use for multiple connections. Although the design is spectacular and the management application is neat, the disappointing fact was that the DroboPro was supposedly VMware certified. To be honest, this seems like more of a problem with VMware’s certification program; however, the engineering department at Data Robotics should have spent more time ensuring a solid solution with VMware if they were hoping to produce an exceptional SMB VMware storage solution.

      If I had some money, I would love to try the DroboElite with VMware to see how well it works.

      -Aaron

      June 7, 2010
      Reply
  7. Stephan Fuchs said:

    Hi Aaron,

    I bought the DroboElite two months ago. Performance under VMware with VMFS is quite disappointing (8 Seagate 1.5TB HDDs installed). I also own a QNAP 509 NAS as a file server, and for fun I exported an iSCSI target from it to my ESX host. The performance with only 4 HDDs(!) was better, in both I/Os and MBs per second. It seems that more HDDs won’t give you more performance here. A workmate has a QNAP 859 equipped with 8 HDDs; through iometer he got over 10,000 I/Os per second, while my DroboElite can only serve a maximum of 2,500 I/Os per second…
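
    If you want to sanity-check the iometer numbers without iometer, a crude synchronous random-write tester like the sketch below (the datastore path is a placeholder) puts a floor under what the box can do; expect far lower numbers than iometer's multi-threaded results:

        # Crude synchronous random-write IOPS test: 4 KiB writes at random
        # aligned offsets inside a preallocated file, fsync after every write
        # so each op really hits the storage. This runs at queue depth 1, so
        # it will read much lower than iometer's multi-threaded results.
        import os, random, time

        TEST_FILE = "/vmfs/volumes/drobo-lun/iops.tmp"   # placeholder path
        SIZE = 256 * 1024 * 1024                         # 256 MiB test file
        BLOCK = 4096
        DURATION = 30                                    # seconds

        fd = os.open(TEST_FILE, os.O_RDWR | os.O_CREAT)
        os.ftruncate(fd, SIZE)

        buf = b"\xaa" * BLOCK
        ops = 0
        deadline = time.time() + DURATION
        while time.time() < deadline:
            os.lseek(fd, random.randrange(SIZE // BLOCK) * BLOCK, os.SEEK_SET)
            os.write(fd, buf)
            os.fsync(fd)
            ops += 1

        os.close(fd)
        os.remove(TEST_FILE)
        print("%d write IOPS (sync, queue depth 1)" % (ops // DURATION))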

    Another annoyance is that the two iSCSI ports can’t be bonded together, so when you are backing up your machines through, e.g., Veeam Backup & Replication, the ESX host can lose its connection to the DroboElite. At the moment all of the machines in my lab have been moved to local storage until these issues are fixed; I’m already thinking about selling the Drobo…
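
    Until that's fixed, a dumb watchdog like the sketch below (both addresses are placeholders for the Elite's two iSCSI ports) at least timestamps the hard outages so they can be lined up against the backup window. A ping answer doesn't prove the iSCSI session is up, but a dropped ping pins down the drops:

        # Watchdog sketch (Python 3): ping both DroboElite iSCSI ports once a
        # second and log a timestamp whenever one stops answering.
        import subprocess, time

        PORTS = ["192.168.10.50", "192.168.11.50"]   # placeholder iSCSI IPs

        def alive(ip):
            return subprocess.call(
                ["ping", "-c", "1", "-W", "2", ip],
                stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) == 0

        while True:
            for ip in PORTS:
                if not alive(ip):
                    print("%s unreachable at %s" % (ip, time.strftime("%H:%M:%S")))
            time.sleep(1)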

    Stephan

    June 8, 2010
    Reply
    • Aaron said:

      Stephan,

      That is disappointing. The more spindles (drives) you have in an array, the quicker it should respond to I/O requests, assuming the array is configured correctly, the correct drivers are installed for the controller and the iSCSI target software is working properly. (Back of the envelope: a 7,200 rpm SATA spindle is good for roughly 75–100 random I/Os per second, so eight of them should deliver somewhere around 600–800 IOPS before caching helps.) Data Robotics is likely building its own iSCSI target software, or some variant, while QNAP might be using a well-known iSCSI target such as IET (the target used by OpenFiler).

      Although I think that Data Robotics could make their product faster, I do think that VMware complicates the matter by implementing some unique SCSI calls (or unique methods) that a lot of iSCSI targets have a difficult time processing; this has something to do with VMFS. Try the Microsoft iSCSI software initiator against your DroboElite and I bet the performance will be tenfold. However, that still doesn’t resolve the issue of cheap shared storage for VMware. Are you using the latest ESX or ESXi?
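
      The Windows initiator can even be scripted for a before/after comparison; a sketch like the one below (the portal address and the target IQN are placeholders) logs the Drobo in as a plain Windows disk:

          # Sketch: drive the Microsoft iSCSI Software Initiator from the
          # command line to mount a Drobo LUN as a plain Windows disk.
          # The portal address and target IQN below are placeholders.
          import subprocess

          PORTAL = "192.168.10.50"                               # Drobo iSCSI address
          TARGET = "iqn.2005-06.com.datarobotics:drobopro.lun0"  # example IQN

          def iscsicli(*args):
              subprocess.check_call(["iscsicli"] + list(args))

          iscsicli("QAddTargetPortal", PORTAL)   # register the portal
          iscsicli("ListTargets")                # discover the target IQNs
          iscsicli("QLoginTarget", TARGET)       # log in; disk appears in Disk Management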

      If you are ambitious, take a look at my OpenSolaris iSCSI article and build your own.
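
      The short version of the COMSTAR setup looks roughly like the sketch below (Python wrapping the OpenSolaris commands; the pool name and size are examples, run as root with the storage-server packages installed):

          # Sketch: carve a ZFS volume and export it over iSCSI with COMSTAR.
          # Pool/volume name and size are examples only.
          import subprocess

          ZVOL = "tank/vmlun"                 # example pool/volume
          DEV = "/dev/zvol/rdsk/" + ZVOL

          def run(*cmd):
              return subprocess.check_output(cmd).decode()

          run("zfs", "create", "-V", "100G", ZVOL)                        # backing store
          run("svcadm", "enable", "stmf")                                 # SCSI target framework
          run("svcadm", "enable", "-r", "svc:/network/iscsi/target:default")  # iSCSI port provider
          out = run("sbdadm", "create-lu", DEV)                           # register the LU
          guid = out.strip().splitlines()[-1].split()[0]                  # GUID = 1st field, last line
          run("stmfadm", "add-view", guid)                                # expose the LU to all hosts
          run("itadm", "create-target")                                   # create the iSCSI target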

      -Aaron

      June 8, 2010
      Reply
  8. Stephan Fuchs said:

    Aaron,

    I’m working with vSphere 4. Your OpenSolaris solution sounds interesting, although I’m a little worried about the further development of OpenSolaris since Oracle bought Sun :(

    June 8, 2010
    Reply
  9. said:

    Has anyone tried the latest Drobo products for business, such as the B1200i? These devices look promising; I wonder how they perform with VMware.

    I’d love to try. Want to send me a demo unit, Drobo? I’ll write up an article, good or bad! =)

    http://www.drobo.com/products/drobosanbusiness.php

    February 25, 2011
    Reply
  10. said:

    QNAP performs so well because it uses the LIO iSCSI stack (www.linux-iscsi.org). LIO implements all advanced SCSI commands that are necessary for VMware ESX 4 and vSphere 4 (!) certification, and it implements them very efficiently.

    LIO has become the standard multiprotocol target (iSCSI, FCoE, FC and IB) for Linux with kernel version 2.6.38.

    April 10, 2011
    Reply
  11. I have a Drobo (4-bay) unit with FireWire 800 and USB 2.0 (no iSCSI). I haven’t been able to figure out how to connect the Drobo directly to the ESXi box, and I’m assuming it isn’t possible. Instead, I have it connected to an old workstation running Windows Server, with StarWind SAN presenting the Drobo’s NTFS volumes as iSCSI LUNs, all connected via gigabit LAN. It works, but I’d like to get rid of the Windows box. Any experience with the regular Drobo?
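
    In case it helps anyone with a similar setup, this is the quick check I run to confirm the StarWind box is answering on the standard iSCSI port (TCP 3260) before rescanning adapters in ESXi (the address is an example from my lab):

        # Quick check that the StarWind target is listening on TCP 3260 (the
        # standard iSCSI port) before rescanning storage adapters in ESXi.
        # The address is an example; substitute your own StarWind box.
        import socket

        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(3)
        try:
            s.connect(("192.168.1.20", 3260))
            print("iSCSI port open")
        except socket.error as exc:
            print("cannot reach target: %s" % exc)
        finally:
            s.close()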

    June 8, 2011
    Reply
