We have 3 ESXi servers that each have a public IP for manageability; however, for backups we need the servers to also have an internal IP on a different NIC.
However, when we add a new VMkernel network, the original (public IP) network stops connecting, leaving the server reachable only via the newly added LAN network.
Is there a solution we can use so the servers are reachable on both NICs/IPs?
The 3 servers each have the following network configuration:
Interface 1: Dell iDRAC
Interface 2: VMware public management network (public)
Interface 3: VMware private management network (10.0.0.1/24)
Interface 4-5: Double redundant uplink
Interface 6-7: LAN network trunked
You can either use the same vSwitch (with 2 uplinks and explicit load-balancing/failover settings per port group) or two separate vSwitches, each with its own uplink: one for the external and one for the internal management network.
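For the single-vSwitch variant, a minimal sketch of pinning each management port group to its own uplink could look like this (the port group names "Management Network" and Mgmt-Internal and the vmnic numbers are assumptions; substitute whatever matches your setup):

esxcli network vswitch standard portgroup policy failover set -p "Management Network" -a vmnic1 -s vmnic2
esxcli network vswitch standard portgroup policy failover set -p Mgmt-Internal -a vmnic2 -s vmnic1

Each port group actively uses one uplink and keeps the other as standby, so traffic for the two networks normally leaves on different NICs.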
I think you can keep the external management network setup as it is now (same vSwitch, same management port group, the same vmk0 adapter in the default TCP/IP stack). This vmk0 adapter may have an IP configuration like this:
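vmk0: IPv4 198.51.100.10, netmask 255.255.255.0, default gateway 198.51.100.1 in the default TCP/IP stack (these are placeholder addresses standing in for your real public IPs)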
For the internal management network, just create another vSwitch, a new management port group, and a new vmk1 adapter. Imagine you want to use the internal management network like this:
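vmk1: IPv4 10.5.5.5, netmask 255.255.255.0, gateway 10.5.5.1

A rough sketch of creating this with esxcli (vSwitch1, Mgmt-Internal, and vmnic2 are assumed names; use whichever vmnic is your internal-management NIC):

esxcli network vswitch standard add -v vSwitch1
esxcli network vswitch standard uplink add -u vmnic2 -v vSwitch1
esxcli network vswitch standard portgroup add -p Mgmt-Internal -v vSwitch1
esxcli network ip interface add -i vmk1 -p Mgmt-Internal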
Because we cannot have 2 default gateways in the default TCP/IP stack, you can define the gateway directly on vmk1 instead (this is supported in ESXi 6.5):
esxcli network ip interface ipv4 set -g 10.5.5.1 -i vmk1 -t static -I 10.5.5.5 -N 255.255.255.0
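To confirm the override took effect, you can list the VMkernel interfaces (the Gateway column should show 10.5.5.1 for vmk1) and the routing table:

esxcli network ip interface ipv4 get
esxcli network ip route ipv4 list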
Once you do this, I think both the internal and external management networks should work for you. There may be some edge cases with routing where this scheme won't work, but for your use case I think it should be fine.