As I mentioned in my Lessons Learned on ESX4 Rollout post, we had a pretty serious hiccup with our storage and the ESX systems in December while trying to bring up our ESX4 environment. The primary trouble uncovered was what I’ll call “controller ping-pong”.
An EVA normally has two controllers (maybe more; I’m not primarily a storage guy), and those handle all the requests received through the SAN. For every LUN, one controller is its master. Both controllers can accept requests for the LUN, but only the master actually services the access. If the controller on fabric A is the master but the controller on fabric B is receiving more requests, the EVA eventually swaps ownership of the LUN to the fabric B controller, following wherever the majority of requests are coming from.
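For what it’s worth, you can check which controller currently owns a vdisk from the command line with HP’s SSSU utility instead of clicking through Command View. This is just a sketch: the manager credentials, system name (“EVA01”), and vdisk path are made-up examples, and the exact attribute names in the output vary by EVA firmware.

    SELECT MANAGER localhost USERNAME=admin PASSWORD=secret
    SELECT SYSTEM EVA01
    LS VDISK "\Virtual Disks\esx_lun_01\ACTIVE"

In the LS VDISK output, look for the preferred-path/owning-controller fields to see which controller is mastering the LUN at that moment.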
This behavior only becomes a problem if you have hosts configured to access the LUN over different fabrics. ESX4 is ALUA (asymmetric logical unit access) aware, meaning it should automatically determine the optimal path to each LUN. In the case of an EVA, HP support tells me, the array is supposed to answer an ALUA request for the optimal path by reporting the controller that is the master of the LUN.
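On the ESX4 side you can confirm whether the host actually claimed the EVA LUNs with the ALUA plugin and see which path selection policy it picked. A rough sketch assuming vSphere 4’s esxcli namespace; the naa device ID below is a placeholder for one of your LUNs.

    # Show the SATP claim rules; EVA models should map to an ALUA-aware SATP
    esxcli nmp satp listrules | grep -i alua

    # Check the Storage Array Type (expect VMW_SATP_ALUA) and the
    # Path Selection Policy ESX4 chose for a given LUN
    esxcli nmp device list -d naa.6001438005dee5b70000500000290000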
If you, like us, have an ESX 3.5 cluster with preferred paths set up, proceed with caution. Apparently the ALUA information isn’t shared between clusters, and if your clusters arrive at different optimal paths, you can end up with controller ping-pong: requests go down both fabrics, and as the balance tips toward fabric A and then back toward fabric B, the EVA keeps switching the LUN’s master back and forth.
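The mismatch is easy to spot if you compare the path each cluster is actually using for the same LUN. Another hedged sketch, with placeholder device IDs; on the 3.5 hosts the older esxcfg-mpath tool gives you the equivalent view.

    # On an ESX4 host: list all paths for the LUN and note which one
    # is active, i.e. which fabric/controller that host favors
    esxcli nmp path list -d naa.6001438005dee5b70000500000290000

    # On an ESX 3.5 host: the same information via the legacy tool
    esxcfg-mpath -l

If the ESX4 hosts show an active path on one fabric while the 3.5 cluster’s preferred paths point at the other, you have the ingredients for ping-pong.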
So, while we’re in this migratory state, I think the safest route is to configure the ESX4 hosts to use a preferred path just like the ESX 3.5 cluster nodes. I hate to move away from the default ESX configuration, and this isn’t an official recommendation from HP support, but it certainly makes the most sense to explicitly define the paths being used (except during a failure).
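In practice that means overriding the ALUA default and pinning the path on each ESX4 host, something like the sketch below. The device ID and runtime path name are placeholders, and I’d verify the exact esxcli syntax against your build before trusting it.

    # Switch the LUN from the ALUA-selected policy to a fixed path
    esxcli nmp device setpolicy -d naa.6001438005dee5b70000500000290000 --psp VMW_PSP_FIXED

    # Pin the preferred path to the same fabric the 3.5 cluster uses
    esxcli nmp fixed setpreferred -d naa.6001438005dee5b70000500000290000 -p vmhba2:C0:T0:L10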
I’m posting this because I feel like there have to be other HP StorageWorks customers in the same situation, or who have experienced something similar. I would love to hear from you…