Forum Replies Created
fra81
Moderator
Note that the server and the client can each use the GPU or the CPU independently: the server side can use the hardware encoder or a software fallback; the client side can use the hardware decoder or a software fallback. Both sides can use the hardware (GPU) at the same time, or only one of them, or neither; it depends on hardware capabilities and on the settings you're using. To answer your question: when the GPU is not used (on either side), the CPU does the work.
That said, it seems there is a problem with hardware encoding. Please provide more info so we can investigate further:
– server OS version (Fedora 29, as I understand)
– Nvidia card model
– video drivers type and version
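For example, on Linux these commands usually report the card model and the driver version (a sketch, assuming the common lspci and nvidia-smi tools are available):
lspci | grep -i nvidia
nvidia-smi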
Server side logs would also be useful. You can gather them as explained in https://www.nomachine.com/AR10K00697 and send them to forum[at]nomachine[dot]com, if you prefer.
fra81
Moderator
Hi,
can I assume you already tried the VP8 encoder and that it solved the problem?
Please check if any of the following options solves the problem with H.264 (try them in this order):
1) disable hardware decoding by checking the ‘Disable client side hardware decoding’ option in the session menu, Display settings
2) disable hardware encoding on server side by adding this line to the /usr/NX/etc/node.cfg file (then restart the server):
EnableHardwareEncoding 0
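If you're unsure how to restart the server, on a default Linux installation this usually works (a sketch, assuming the standard /usr/NX path):
sudo /usr/NX/bin/nxserver --restart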
fra81
Moderator
Hi Dario,
generally, reducing image quality doesn't decrease CPU usage, but reducing the frame rate does. You can set any value from 1 to 1000 by manually editing the node.cfg file on the server side. For example, to get 10 frames per second, uncomment and set the following keys:
DisplayServerVideoFrameRate 10
DisplayServerUseVideoFrameRate 1
Another thing you can do is disable the “lossless refinement” algorithm:
DisplayServerExtraOptions "-refinement 0"
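To double-check that the keys are active after editing, something like this can help (a sketch, assuming a default Linux install under /usr/NX):
grep -E "DisplayServer(VideoFrameRate|UseVideoFrameRate|ExtraOptions)" /usr/NX/etc/node.cfg
Lines still starting with '#' are commented out and have no effect.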
fra81
Moderator
Hi Dario,
at the moment NoMachine requires Nvidia Video Codec (NVENC), which is not supported on Jetson.
There is work in progress to leverage OMX encoders and decoders in the future, mainly for mobile and embedded platforms, but I can't give a date yet.
fra81
Moderator
Please send us logs from the host machine as explained in https://www.nomachine.com/AR10K00697.
And client side logs as explained in https://www.nomachine.com/DT10O00163#2.3.
You can also send them to forum[at]nomachine[dot]com.
fra81
Moderator
At the moment, the only way not to use all monitors is to physically unplug them from the server. The artifacts could be due to a bug in the H.264 software encoder at such a high horizontal resolution. Please try a different codec: open Server preferences -> Performance tab, check ‘Use a specific display encoding’ and make sure VP8 is selected, then restart the server and connect again.
fra81
Moderator
Hi Juerg,
this is not a known problem. As a first step, please open Server preferences -> Performance tab on your host machine and unselect ‘Use acceleration for display processing’.
fra81
Moderator
You seem to have two 4K monitors, which together give the total screen size reported in the logs:
Info: Using screen size 7280×1080.
Unfortunately NVENC (Nvidia’s hardware encoder) supports resolutions only up to 4K, so the total width of 7280 pixels exceeds its limit, as you can see here:
https://developer.nvidia.com/nvidia-video-codec-sdk#NVENCFeatures
However this Feature Request, when implemented, will address your case:
https://www.nomachine.com/FR05N03113
For the record, you can see that the hardware encoder is used when there is a single monitor:
Info: Using screen size 3840×1080.
Info: Using Nvidia H.264 hardware encoder.
fra81
Moderator
Hi,
can you please send all the logs? The whole ‘/usr/NX/var/log’ directory, if possible. You can send it to forum[at]nomachine[dot]com.
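If it helps, the whole directory can be packed into a single archive like this (a sketch, assuming a default Linux install):
sudo tar czf nx-logs.tar.gz /usr/NX/var/log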
August 6, 2019 at 17:13 in reply to: Connection from Debian amd64 to i386 makes high load on CPU? #23213
fra81
Moderator
15% CPU usage doesn’t seem excessive if the screen is updating. Are you using the latest NoMachine versions on client and server? Recent versions always ship the H.264 encoder, which minimizes CPU usage. In any case, you can check which codec is in use in the Display settings panel.
fra81
Moderator
Hi,
for the moment NoMachine doesn’t have an option to select a specific GPU; by default it simply iterates through the available cards. However, we’ve created a new feature request that will cover this case:
fra81
Moderator
Does setting the maximum quality, as shown in https://www.nomachine.com/DT07M00087#5.7, make any difference? You can also try checking ‘Disable network-adaptive display quality’ and ‘Disable multi-pass display encoding’ in the same panel.
You may also try to disable UDP. To do so, enter the Edit panel of the connection, click Advanced and uncheck ‘Use UDP communication for multimedia data’.
If nothing works, please send a video recording showing the issue.
fra81
Moderator
Hi,
for a start, you can try disabling hardware decoding by checking the box in the session menu:
https://www.nomachine.com/DT07M00087#5.7
and hardware encoding, by uncommenting and changing the relevant key in the <installation directory>/etc/node.cfg file:
EnableHardwareEncoding 0
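On Linux, for example, the key can be uncommented and set in one step (a sketch, assuming the default /usr/NX path and GNU sed):
sudo sed -i 's/^#\?EnableHardwareEncoding.*/EnableHardwareEncoding 0/' /usr/NX/etc/node.cfg
Then restart the server for the change to take effect.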
fra81
Moderator
Hi gabriele,
can you specify the driver version? If possible, send us the logs as explained in my post above.
And does disabling ‘client side hardware decoding’ fix the problem in your case as well?
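For example, on Linux the driver version is usually reported by this command (a sketch, assuming the Nvidia proprietary driver is loaded):
cat /proc/driver/nvidia/version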
fra81
Moderator
“The problem is that all opened windows on display ‘1’ (programs, terminals, etc.) open on display ‘0’ (I see it in the task manager), except the NoMachine client and server windows, which open normally. So, what am I doing wrong? And how do I fix it?”
I assume that both desktops are owned by the same system user. In that case, which display an application connects to depends on the application itself. What you can do is start the second desktop as a different user. I’m not sure whether you can configure those applications to behave differently.
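As an illustration of how the target display is chosen, X11 applications typically honor the DISPLAY environment variable, so you can force one onto a given display when launching it from a terminal (a sketch; xterm is just an example application):
DISPLAY=:1 xterm &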