Hyper-V R2 (Server 2008 R2) host, Server 2008 guest (not R2), Server 2008 R2 guest
The situation plays out like this:
I create a VM and install Server 2008 R2 Standard.
The external network, connecting me to the internet, works fine and the guest OS automatically has a working network device and connection.
I shut down this VM and leave it off.
I then create another VM, exactly the same, choosing the same external network as above. This time I install Server 2008 Standard (non-R2).
The server does not find its network card, and therefore cannot connect to the internet. When I look at the guest OS, there is a yellow splat in the device manager for the virtual network adapter.
My first question in response to this:
Did you install the Integration Components within the VM?
Here is why:
Server 2008 has the ICs built into the OS (they are essentially a set of device drivers).
However, versioning issues can come into play: the IC versions in the guest and on the host must match for optimum performance.
With the release of R2, the ICs gained backward compatibility, allowing an R2 VM to 'just work' when you install it on a Hyper-V v1 (2008, but not 2008 R2) host.
The reverse is not true: when you run a Server 2008 VM on an R2 host, the VM's built-in ICs are older than the host's ICs, so the ICs in the VM need to be updated.
Using the Hyper-V Manager console, open the console of the VM and choose Action, Insert Integration Services Setup Disk; then run the installer inside the guest and respond to any prompts.
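If you want to confirm the state of the ICs from inside the guest before and after the update, a quick sketch of the checks follows. The registry path is the location commonly reported for the IC version on 2008-era guests, and the `vmic*` service names are the standard Hyper-V guest services; treat both as assumptions and verify on your own build.

```shell
:: Run inside the guest OS (elevated command prompt).

:: Report the installed Integration Components version.
:: Compare this against the host after updating; assumed registry
:: location -- verify on your build.
reg query "HKLM\SOFTWARE\Microsoft\Virtual Machine\Auto" /v IntegrationServicesVersion

:: Confirm the core Hyper-V guest services are installed and running.
:: A missing or stopped service is another sign the ICs need updating.
sc query vmicheartbeat
sc query vmickvpexchange
sc query vmictimesync
```

If the `reg query` fails or the services are absent, the ICs are either missing or too old, which matches the yellow warning on the virtual network adapter described above.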
Thank you “Kelly AZ” for describing this problem so well in the TechNet forum.