July 25th, 2018 13:00

VPLEX: LUN numbers in storage groups

I have a VPLEX Metro between two sites and a distributed virtual volume that I'm supposed to present to storage views at both sites.

Looking at the existing distributed virtual volumes that my predecessors have presented at both sites, I see that each volume always occupies the same LUN number in all storage views: for example, if a certain distributed volume is LUN 2 in the storage views at the prod location, it is also LUN 2 in the storage views at the DR location. Is this a requirement for distributed volumes?
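(For context, I've been comparing the existing LUN assignments by long-listing the storage views from the VPlexcli; the view names below are just placeholders for my environment.)

    # Long-list a storage view; the virtual-volumes field shows each volume
    # as a (LUN, volume-name, VPD83T3 id, capacity) tuple
    ll /clusters/cluster-1/exports/storage-views/prod_esx_view
    ll /clusters/cluster-2/exports/storage-views/dr_esx_view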

I ask because the storage views where I'm supposed to present this new distributed volume don't have the same number of LUNs. The storage views at the prod location are up to 41 LUNs (numbers 0 through 40), so the next LUN I add will be number 41. The storage views at the DR location have only 39 LUNs, so the next LUN will be number 39, and I'll have a discrepancy in LUN numbering for this distributed volume.

If it is necessary for the distributed volume to have the same LUN number in all storage views where it's presented, I can skip over two LUN numbers at the DR location and assign number 41 to this volume so it's the same in all storage views at both sites. In that case, are there any unforeseen consequences from skipping over some numbers when assigning LUN numbers in a storage view?
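For what it's worth, this is roughly how I was planning to pin the LUN number when adding the volume (VPlexcli syntax from memory, and the view/volume names are just examples, so please correct me if the options differ on your GeoSynchrony release):

    # Add the distributed volume to each site's view with an explicit LUN number,
    # using the (LUN,volume) form instead of letting VPLEX pick the next free number
    export storage-view addvirtualvolume --view prod_esx_view --virtual-volumes (41,my_dd_vol)
    export storage-view addvirtualvolume --view dr_esx_view --virtual-volumes (41,my_dd_vol)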


July 26th, 2018 08:00

I believe it makes your life a lot easier if they are the same, and there is no issue with skipping numbers. In an ESX environment, for example, you would know which volumes make up the distributed volume from the vCenter GUI if the LUN numbers match. If they are different, you will need to look at the host LUN ID in ESX and then go to the VPLEX storage view and match it. As far as it being a requirement, that depends on whether the host uses the LUN ID when mounting; I don't know of any that do (perhaps AIX).
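For example (command names from memory, and the view name is just a placeholder), you can read the LUN ID and the naa device identifier from the path listing on the ESXi host, then list the storage view on the VPLEX side; the VPD83T3 identifier in the view corresponds to the naa identifier seen by the host, so that is what ties the two together when the LUN numbers differ:

    # On the ESXi host: each path entry includes the device's naa identifier and its LUN number
    esxcli storage core path list | grep -E "Device:|LUN:"

    # On the VPLEX: the view listing shows (LUN, volume, VPD83T3 id, size) for each volume;
    # match the VPD83T3 value against the naa identifier reported by the host
    ll /clusters/cluster-1/exports/storage-views/prod_esx_view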

I suppose you could test it with a new empty LUN.
