Forum / NoMachine for Linux / 3840×1600 Resolution
Tagged: 4K resolution H264
This topic has 9 replies, 2 voices, and was last updated 5 years ago by fra81.
September 5, 2019 at 07:48 #23532 | avvid (Participant)
Hello,
I’m using NoMachine on Linux with an NVIDIA GPU and VirtualGL enabled.
My monitor is 3840×1600, and whenever I use my maximum resolution with H.264 I get artifacts when scrolling or during any other fast-moving animation. Pictures attached. These all go away at a lower resolution.
Any thoughts?
September 5, 2019 at 09:33 #23541 | fra81 (Moderator)
Hi,
can I assume you already tried the VP8 encoder and that it solved the problem?
Please check if any of the following options solves the problem with H.264 (try them in this order):
1) disable hardware decoding by checking the ‘Disable client side hardware decoding’ option in the session menu, Display settings
2) disable hardware encoding on the server side by adding this line to the /usr/NX/etc/node.cfg file and then restarting the server (a short example follows this list):
EnableHardwareEncoding 0
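For reference, a minimal sketch of applying option 2 from the command line, assuming the default installation path (how you edit the file is up to you):

echo 'EnableHardwareEncoding 0' | sudo tee -a /usr/NX/etc/node.cfg    # append the key to the node configuration
sudo /usr/NX/bin/nxserver --restart                                   # restart the server so the change takes effect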
September 5, 2019 at 14:08 #23542 | avvid (Participant)
Hey,
the results are different than I expected.
You were correct: VP8 worked flawlessly even before I created this post. The downside of VP8 was that it used my client-side GPU, which wasn't desired. So I tried your options in turn:
1) Disable client side hardware decoding: No change
2) Disable Hardware Encoding: Works great!
My fear was (and is) that my client's GPU would now be used instead, but that doesn't seem to be the case at all; maybe my CPU is being used slightly more, but that's it. Also, nxserver no longer appears in the server-side NVIDIA process list like it used to.
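For reference, this is roughly how the server-side GPU process list can be checked (assuming the NVIDIA utilities are installed; the grep pattern is only illustrative):

nvidia-smi                   # lists processes currently using the GPU and their memory usage
nvidia-smi | grep -i nx      # quick check for any NoMachine (nx*) process on the GPU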
So my question is: if my server-side card isn't encoding and my client GPU isn't being used, then what is doing all the work?
Thanks for the help.
September 5, 2019 at 15:11 #23545 | fra81 (Moderator)
Note that the server and client sides can use the GPU or the CPU independently: the server side can use the hardware encoder or a software fallback, and the client side can use the hardware decoder or a software fallback. Both sides can use the hardware (GPU) at the same time, or only one of them, or neither; it depends on hardware capabilities and on the settings you're using. To answer your question: when the GPU is not used (on either side), the CPU does the work.
That said, it seems there is a problem with hardware encoding. Please provide more info so we can investigate further:
– server OS version (as I understand Fedora 29)
– Nvidia card model
– video drivers type and version
Also, server-side logs would be useful. You can gather them as explained in https://www.nomachine.com/AR10K00697 and send them to forum[at]nomachine[dot]com, if you prefer.
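If it helps, the version details can be collected with commands like these (assuming standard Fedora tools and the NVIDIA utilities are present on the server):

cat /etc/fedora-release                                     # server OS version
uname -r                                                    # kernel version
nvidia-smi --query-gpu=name,driver_version --format=csv     # card model and driver version
lspci -k | grep -A 3 -i vga                                 # shows which kernel driver the card is using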
September 6, 2019 at 08:00 #23546 | avvid (Participant)
Got it.
Please see the information below and attached:
– Fedora 29 Kernel 5.2.7
– GeForce GTX 1050
– Nvidia Driver Version: 435.21
– CUDA Version 10.1
September 11, 2019 at 12:41 #23589 | fra81 (Moderator)
Unfortunately we're not able to reproduce the issue with hardware encoding in our labs. Would you test one more thing? That would be to restore the EnableHardwareEncoding key to its original value and change the encoder's rate control mode instead (the EncoderMode key), i.e.:
EnableHardwareEncoding 1
EncoderMode bitrate
September 21, 2019 at 22:37 #23704 | avvid (Participant)
Hello,
I ended up rebuilding the machine and installing slightly different drivers and NoMachine 6.8.1. The problem is no longer present. Thanks for the help.
September 23, 2019 at 09:20 #23710 | fra81 (Moderator)
Hi,
would you tell us exactly which new drivers you installed?
September 24, 2019 at 08:19 #23726 | avvid (Participant)
Sure.
Originally I used the official NVIDIA drivers from nvidia.com.
This time I used a Fedora package from this repository: https://negativo17.org/nvidia-driver/
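For anyone following along, setting up that repository looks roughly like this; the .repo URL and package name below are based on the site's own instructions and should be treated as assumptions (check the linked page for the current ones):

sudo dnf config-manager --add-repo=https://negativo17.org/repos/fedora-nvidia.repo    # add the repository
sudo dnf install nvidia-driver                                                         # install the packaged driver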
September 24, 2019 at 09:55 #23738 | fra81 (Moderator)
So it seems the official ones are affected. Thank you for the info 😉
This topic was marked as solved.