Welcome to the 2nd iteration of my home lab! I’ve been working over the last few weeks to get this post going so I could show you what I’ve been working on for quite some time. One of my awesome customers was gracious enough to sell me some of their decommissioned equipment incredibly cheap. This is what I wanted to have in the first go around of my home lab but I couldn’t find the right gear for the right price. Anyway, on to the fun nerdy stuff!
I’d been working on my old home lab since January of last year, and it finally hit the skids when vSphere 6.7 came out. Of course I immediately tried to install it on my Dell R710 and got the following message:

VMware has added functionality that requires newer processors, and that completely negates the use of some older hardware. I’ve read that there are a bunch of people upgrading the processors in these servers so vSphere 6.7 will install, but I had a better opportunity to go with newer generation gear. For a full list of unsupported processors, check out the tail end of my vSphere 6.7 article here.
To be fair though, that wasn’t the only reason I needed to make a change. I had also started to experience some storage-related latency issues. My Western Digital NAS that runs iSCSI just isn’t cut out to run more than 5 or 6 VMs without causing latency issues. It does a good job for what it is, but I need more. My need to build and test things is too great for the little WD NAS. It’s still a great NAS and has served me well running Plex, Resilio Sync, iSCSI and file shares without any issues at all. I will continue to use it for my daily home use and as a Veeam backup repository.
As to how I came across the new lab gear: I’ve been working with this customer for many years, and I’ve built and rebuilt multiple datacenters for them in that time. This equipment was relegated to their DR site, and last year we replaced it all with new compute and storage platforms. So it had been sitting there powered off, just taking up space. A few months ago I convinced them to sell it to me at an amazing price, and here we are. They have some empty rack space to use and I have new lab gear!
The Home Lab Checklist
My requirements for a home lab haven’t changed at all other than vSphere version support as you can see below. They may be rather lofty for most people but I deploy vSphere and Horizon on a daily basis. If you consider that then this may make more sense.
- A solution that would be as close to a small customer environment as possible
- Enough RAM scalability to deploy 30-60 VMs
- Enough CPU scalability to deploy 30-60 VMs
- Multiple physical servers to utilize vMotion, HA, FT, etc.
- Ability to run vSphere 6.7 and up
- iSCSI based storage (for more than 5 VMs, lol)
- 1Gb networking (I’m working on a 10Gb upgrade. More detail below)
When this deal came up it ticked every box on the list and then some. I procured the newer hardware and displaced my old lab almost completely, except for the old 24-port switch, which I’m possibly going to replace soon. With the new gear I can run just about everything I need to, including VDI desktops, management components, NSX and even vSAN if I buy a few extra disks. It’s perfect!
VILab 2.0 Components
- Internet – 300 Mbps down, 20 Mbps Up
- Linksys EA9300 AC4000 Tri-Band Wireless Router
- Western Digital DL4100, 16TB (4 x 4TB) – NAS, Plex, Resilio Sync
- Cisco Catalyst 3560G – 24 Port – IP Routing, VLANs for different components, Ports trunked to hosts
- 3 x HP DL380p Gen8s, each with dual Intel Xeon E5-2670 2.60GHz 8-core CPUs, 128GB RAM, and triple quad-port 1Gb NICs (12 ports total) + iLO port
- Nimble CS210 Hybrid Array, 8TB raw, 4TB usable, 160GB flash
- APC 24U Half Rack, AR3104
- Philips Hue Lightstrip Plus
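For a sense of how far three of these hosts stretch against the 30-60 VM target, here’s a quick back-of-the-envelope capacity sketch. The per-VM sizing and the vCPU:pCore overcommit ratio below are illustrative assumptions, not anything measured in my lab:

```python
# Rough cluster capacity math for 3 x DL380p Gen8 hosts
hosts = 3
cores_per_host = 2 * 8          # dual 8-core E5-2670s
ram_gb_per_host = 128

total_cores = hosts * cores_per_host     # 48 physical cores
total_ram_gb = hosts * ram_gb_per_host   # 384 GB RAM

# Assumed average lab VM of 2 vCPU / 6 GB RAM, with a 4:1 vCPU:pCore ratio
vcpu_ratio = 4
vm_vcpus, vm_ram_gb = 2, 6

cpu_limit = total_cores * vcpu_ratio // vm_vcpus   # 96 VMs by CPU
ram_limit = total_ram_gb // vm_ram_gb              # 64 VMs by RAM

print(min(cpu_limit, ram_limit))  # prints 64 -> RAM is the limiting factor
```

Even with conservative assumptions, RAM runs out before CPU does, and both comfortably clear the 30-60 VM requirement.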
I updated my logical lab Visio as well to show the new hardware.
On to the pictures!
So last time I didn’t post any pictures, and I figured I’d pretty it up this time to show you what it looks like all put together. This first one is obviously with the rack door closed. This 24U APC rack was super cheap on Craigslist! It’s funny how you don’t really realize how big the racks and equipment are until they’re sitting in your basement next to your desk.

Here are the shiny bezels of the HP DL380p Gen8 servers and the old blue-logo Nimble CS210! When I first turned all this on it was crazy loud. Unbearably loud, even. After a little work on the servers I was able to kick down the BIOS settings to the point where all the fans are running at 25% and I can’t hear them at all. I am also saving power, if you can believe it. The old lab’s Dell R710s were each using ~170 watts of power. The new servers are running at ~80 watts each, which I was pretty happy about.
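Those wattage numbers translate into real money over a year. Here’s a quick per-server estimate, assuming 24/7 operation and a hypothetical electricity rate of $0.12/kWh (your rate will vary):

```python
# Per-server power savings estimate
# (24/7 duty cycle and the $/kWh rate are assumptions, not my actual bill)
old_watts, new_watts = 170, 80
hours_per_year = 24 * 365

kwh_saved = (old_watts - new_watts) * hours_per_year / 1000  # 788.4 kWh/yr
cost_per_kwh = 0.12  # hypothetical rate in USD

print(f"~{kwh_saved:.0f} kWh/yr, ~${kwh_saved * cost_per_kwh:.0f}/yr per server")
```

At roughly 90 watts saved per server, that’s close to 800 kWh a year each, before even counting the reduced cooling load.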
Quieting down the Nimble was an entirely different story. I ended up using the console cable to boot up each controller individually and then changed the BIOS fan settings as low as I could get them. Those settings only seemed to slow down half of the fans on each controller, so it’s still louder than it could be, but honestly I think the 120mm fans on my home PC are louder, so it’s not a big deal.
I have limited power resources, so as you can see below, I don’t have all the power supplies connected right now. I used white Cat5e for the HP iLO ports, blue Cat5e for the data ports and black for the iSCSI ports. I made most of the cables to the exact lengths needed. I used Velcro instead of zip ties, both where you can see it and where you can’t, to make it easier to change things later if needed.
Here’s the Philips Hue Lightstrip Plus in action! I have these lightstrips and the bulbs all over the house. They’re a little on the expensive side but work flawlessly and are crazy bright. I can control them seamlessly with the app on my phone or the integration with Amazon Alexa. It’s fun saying “Hey Alexa, turn on my lab!” Even if it’s just turning on the LED lights. I’d also like to try some of the deeper and more serious integrations of Amazon Alexa in the future, like this one. Cody over at TheHumbleLab.com is doing some amazing stuff with it.
Upgrading and Testing
The first thing I did with all this gear was update the iLO, BIOS, and other firmware to the latest available versions, and update the firmware on the Nimble array to 5.0.3. I had to migrate everything from my old lab to my new one, which was an interesting process, but I got it all done. I upgraded my existing VCSAs to 6.7. I ran into a major issue with the VCSA upgrade that I’m going to detail in an article in a few days. I’ll post a link here when it’s done.
The next step was to upgrade the ESXi hosts to 6.7 using a new baseline I created after uploading the HPE vSphere 6.7 custom ISO to Update Manager. They upgraded quickly and without any issues. I also installed the latest Nimble Connection Manager plugin on the hosts to get that multipathing goodness. The only issue with Nimble compatibility on vSphere 6.7 is that the VMware Integration plugin isn’t currently compatible with 6.7. That’s not a huge deal, so it’s back to the old manual way of provisioning datastores.
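After a round of upgrades like that, I like to double-check that every host actually landed on the new version. A minimal sketch of that sanity check, parsing `vmware -v` banner strings (the sample banners and hostnames below are illustrative assumptions; in practice you’d collect the real output from each host over SSH):

```python
import re

def esxi_version(banner: str) -> str:
    """Pull the version number out of a `vmware -v` banner string,
    e.g. 'VMware ESXi 6.7.0 build-8169922' -> '6.7.0'.
    The banner format assumed here is the typical one."""
    m = re.search(r"VMware ESXi (\d+\.\d+\.\d+)", banner)
    if not m:
        raise ValueError(f"unrecognized banner: {banner!r}")
    return m.group(1)

# Illustrative sample banners; gather the real ones from your hosts
banners = {
    "esx01": "VMware ESXi 6.7.0 build-8169922",
    "esx02": "VMware ESXi 6.7.0 build-8169922",
    "esx03": "VMware ESXi 6.7.0 build-8169922",
}

for host, banner in banners.items():
    assert esxi_version(banner).startswith("6.7"), f"{host} not upgraded!"
print("all hosts on 6.7")
```

Update Manager reports success on its own, of course, but a thirty-second check like this has caught a host that silently rolled back for me before.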
Once the hosts were all configured and squared away, I rebuilt all the datastores as VMFS 6 datastores to take advantage of Automatic Space Reclamation. I got the nice new cluster running and ready to go, with way more resources than I had available previously.

I did some quick IOMeter testing from a Windows 10 VM and the results look pretty promising. I used a 75% read / 25% write workload for this test and got almost 13K IOPS out of this old Nimble CS210. Sweet!
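To put that headline number in perspective, here’s the simple arithmetic behind the 75/25 mix, plus a throughput estimate at an assumed 4 KB block size (the block size is a guess for illustration, not the actual IOMeter test configuration):

```python
# Break the ~13K IOPS result into its read/write components
total_iops = 13_000           # approximate IOMeter result
read_pct = 0.75               # the 75% read / 25% write mix used in the test

reads = total_iops * read_pct          # 9750 read IOPS
writes = total_iops * (1 - read_pct)   # 3250 write IOPS

block_kb = 4                  # assumed block size for illustration only
mb_per_s = total_iops * block_kb / 1024

print(f"{reads:.0f} reads/s, {writes:.0f} writes/s, ~{mb_per_s:.0f} MB/s at {block_kb}KB")
```

Not bad at all for a hybrid array this old, especially considering the flash tier is only 160GB.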
What does the future hold?
Anyone in technology knows it’s a never-ending battle trying to keep up with current technology and trends in the industry. The lab will always be part of that same battle. Learning about new technology, and having the hardware to learn it on, is the challenge. It’s not cheap, and there are certainly other ways to accomplish the same end result. That being said, it’s also incredibly fun!
I’m currently trying to figure out how to implement, as well as justify, a 10Gb switch and NIC upgrades on all this stuff. I don’t really need it if I’m being honest, but why not! I also have to wonder out loud how long these CPUs are going to sustain me through the next vSphere upgrade lifecycle. Will the next release make them incompatible? I can also see potentially needing more storage eventually. Nimble arrays for home labs at a decent price don’t grow on trees, though.
So that’s my new gear. It’s spinning like a top and upgraded to the latest shiny new versions of everything. Thanks for reading and if you have any questions throw me a comment below!
Nice work. Thank you for the information
I have the same server. About the fans, something strange with ESXi.
I use 6.5 U3 and the fans are at 6, 6, 6, 16, 27, 27%, but if I use Windows Server 2012 they are all at 6%.
Yours are at the same speed?
No, all 6 of the fans run between 25-27%. It’s not loud at all, surprisingly, but it would be nice if they were quieter. I’m curious to know how you got them that low.
Because I don’t have any card in the PCI slots, only the LOM.
Another curiosity: if I boot with Windows Server, all the fans go to 6%.
Only with a Linux-based OS do the last 3 fans ramp up to 27%.
How were you able to upgrade the Nimble? I’ve got a couple on hand that I’d like to repurpose and throw into the lab. LMK!