I can’t find any documentation for how to enable this feature.
No configuration is needed. NoMachine is designed to automatically select the encoding method that provides the best performance among those available. When supported, the hardware encoder is chosen as the preferred one.
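For completeness, a minimal sketch of how the server-side behavior could be overridden, assuming your version exposes the EnableHardwareEncoding key in node.cfg (path shown is the Linux default; adjust for your platform):

    # /usr/NX/etc/node.cfg
    # 1 = hardware encoding allowed (the default), 0 = software encoding only
    EnableHardwareEncoding 1

After editing the file, restart the server for the change to take effect.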
Does it work with AVC?
It will work with any software or hardware decoder you may have on the client side, including the NoMachine AVC pack.
Does the performance (update rate) improve if I use AVC and a server-side graphics card?
Using the hardware encoder on the server side can improve performance and, most importantly, offloads the CPU. AVC on the client side should not be needed if you are running a Windows or Mac client: NoMachine will use the hardware decoder provided on your client computer anyway.
Lastly, will this work with Kepler-based GPUs on AWS EC2?
Unfortunately, only Maxwell-based (or later) GPUs are supported. The hardware encoder is not used with Kepler or older GPUs due to their insufficient feature level.
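If you want to check which GPU an EC2 instance exposes before relying on hardware encoding, a quick Python sketch along these lines can help; the model-to-architecture mapping below is an illustrative assumption based on common EC2 GPU models, not a NoMachine-provided list:

    import subprocess

    # Query the GPU model name via nvidia-smi (requires NVIDIA drivers installed).
    name = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
        text=True,
    ).strip()

    # Illustrative mapping of EC2 GPU models to NVIDIA architectures;
    # extend as needed for your instance types.
    KEPLER = ("GRID K520", "Tesla K80")  # e.g. g2.* and p2.* instances
    MAXWELL_OR_LATER = ("Tesla M60", "Tesla V100", "Tesla T4", "A10G")

    if any(m in name for m in KEPLER):
        print(f"{name}: Kepler-class GPU, hardware encoding not supported")
    elif any(m in name for m in MAXWELL_OR_LATER):
        print(f"{name}: Maxwell or later, hardware encoding should be available")
    else:
        print(f"{name}: unknown model, check NVIDIA's NVENC support matrix")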