Hyper-V node blue screens when rebooting a VM
I'm asking here since our MS team can't seem to resolve the problem.
I am responsible for the Linux servers in our organization.
When a VM is rebooted, the Hyper-V node where it lives blue screens. All the VMs on that node are powered off and restarted on one of the active nodes.
The blue screen only happens once: after the node comes back up I can reboot VMs at will and Hyper-V stays happy, until I need to reboot a VM a day or more later. This has been a problem since we moved from VMware to Hyper-V in 2020.
I've been told we have the latest hardware drivers installed (network, etc.). We are running Windows Server 2019, fully patched.
There are other Hyper-V clusters in our organization, but they only host Windows VMs. Our MS team has started blaming the Linux guys for the problem but can't give me any proof or point me to where I can start looking.
On the Linux VMs we have the latest Hyper-V integration modules installed. What else can I investigate to resolve this problem?
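For context, this is roughly how I've been confirming the integration modules on each Linux guest. It's just a quick sketch that lists loaded hv_* modules from /proc/modules; module names vary by distro and kernel, and on newer kernels some of these drivers are built in rather than loaded as modules:

```python
#!/usr/bin/env python3
# Quick check of which Hyper-V (hv_*) integration modules are loaded on a guest.
# Module names differ by distro/kernel; this just lists whatever matches "hv_".
import platform

def loaded_hv_modules():
    modules = []
    with open("/proc/modules") as f:
        for line in f:
            name = line.split()[0]
            if name.startswith("hv_"):
                modules.append(name)
    return modules

if __name__ == "__main__":
    print(f"Kernel: {platform.release()}")
    mods = loaded_hv_modules()
    if mods:
        print("Loaded Hyper-V modules:", ", ".join(sorted(mods)))
    else:
        print("No hv_* modules found (drivers may be built into the kernel).")
```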