Forum Replies Created
fra81
Moderator

With the last update, the H.264 encoder is available by default in all NoMachine products, including the free ones. So we cannot exclude that some new problem could be found when hardware encoding or decoding is in use with specific graphics cards or drivers: as you can imagine, it is not possible to test every combination of hardware, drivers and configurations in our labs.
Can you confirm that you tried checking the ‘Disable client side hardware decoding’ box?
And could you send us the session logs so we can investigate further? You can gather server side logs as explained in https://www.nomachine.com/AR10K00697 and client side logs as in https://www.nomachine.com/DT10O00163#2. You can send the files to forum[at]nomachine[dot]com.
A screenshot showing the issue could also be useful.
fra81
Moderator

Hi.
That's strange. There is no such “error” in the NoMachine code, and I honestly don't understand how it could have been generated or how it could make the session fail.
Could you take a screenshot of that error?
Please also provide more info on the operating system and the NoMachine product installed.
fra81
Moderator

Hi,
I assume you have a Retina display on your MacBook, so it could depend on the scaling settings of the display. Could you attach a screenshot of your Display system preferences?
fra81
Moderator

Hi,
doesn't simply disabling ‘Fit to window’ do what you want? If that's not the case, could you explain further what behaviour you would like to achieve?
March 11, 2019 at 12:10 in reply to: How to improve performance of NoMachine when it is used through Windows Remote Desktop #21708
fra81
Moderator

Yes, indeed.
In general, when streaming a Windows desktop, RDP has access to the internal OS graphics primitives. But this doesn’t occur when streaming the NoMachine session, whose content is a Linux screen that is rendered remotely.
RDP is not able to stream this type of content as efficiently, so it’s not a NoMachine problem. This is a performance issue that must be solved in RDP.
February 27, 2019 at 10:28 in reply to: Black screen on connection to headless CentOS 7 NoMachine #21601
fra81
Moderator

Yes, getting the dongle should be the easiest way to have GPU acceleration, but you are probably good to go with the llvmpipe solution, so I'd suggest waiting for your user 🙂
February 20, 2019 at 11:42 in reply to: Xrandr fails to split the overwide virtual screen to match the physical monitors #21516
fra81
Moderator

Hi,
I think this Feature Request describes what you need:
https://www.nomachine.com/FR12K02799
Check the box to be notified when it is implemented 🙂
February 18, 2019 at 13:04 in reply to: Black screen on connection to headless CentOS 7 NoMachine #21484
fra81
Moderator

Hi,
the fact that the X server starts up doesn't necessarily mean that rendering actually happens when no monitor is connected to the GPU. You could try one of those dongles that simulate a monitor: search for “headless hdmi dongle”.
Or you could give NoMachine's virtual display server one more try. In this regard, the following article should be more appropriate for your issue:
fra81
Moderator

Hi,
this is not expected in general, but consider that when you disconnect without logging out, the desktop environment and applications keep running, doing their work and possibly making requests to the X server (which is built into the nxnode process in this case).
If you feel the CPU usage is still not justified, you can gather more information on what is happening with the following commands:
# strace -f -p <pid>
# top -b -n 1 -H -p <pid>
Where <pid> is the process ID of the nxnode process that is consuming CPU. You can send the outputs of those commands to forum[at]nomachine[dot]com. Please also include the server side logs gathered as explained in https://www.nomachine.com/AR10K00697.
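For example, you could capture both outputs to files like this (just a sketch; it assumes the PID is found with 'ps -ef | grep nxnode' and that the 'timeout' utility is available to stop strace after 30 seconds):

ps -ef | grep nxnode
sudo timeout 30 strace -f -p <pid> -o strace-nxnode.txt
top -b -n 1 -H -p <pid> > top-nxnode.txt

Then attach strace-nxnode.txt and top-nxnode.txt to your email.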
fra81
Moderator

However, for NoMachine got a new error: “NvEncode: ERROR! Error is 15, ‘Invalid version’.”

This will be fixed in the upcoming software update. It is a compatibility problem with the most recent drivers (and the Trouble Report you mention is indeed related). However, this error should not prevent sessions from working correctly.
fra81
Moderator

I was unable to get NoMachine to start my desktop environment. I tried adding unix-xsession-default to AvailableSessionTypes in node.cfg and server.cfg, and also setting DefaultDesktopCommand to “/usr/bin/dbus-launch --exit-with-session /usr/bin/mate-session”, but mate-session couldn't connect to DBus.
It should work out of the box. Please check this article for some common issues with the desktop environment:
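In any case, for reference, the kind of configuration described in the quoted post would look roughly like this in /usr/NX/etc/node.cfg (just a sketch, assuming a MATE desktop; the exact values depend on your setup):

AvailableSessionTypes unix-xsession-default
DefaultDesktopCommand "/usr/bin/dbus-launch --exit-with-session /usr/bin/mate-session"

Remember to restart the server ('sudo /usr/NX/bin/nxserver --restart') after editing the file.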
fra81
Moderator

It could be a problem with slow access to video memory. We have observed such behaviour in the past on headless machines, but it could also depend on the drivers. Please attach the output of glxinfo.
You can also test what happens if you use the virtual framebuffer provided by NoMachine instead of the “physical” output. To try that, turn off the graphical environment on the server ('sudo systemctl stop gdm' or 'sudo systemctl stop lightdm' or whatever command is suitable for the display manager in use) and then restart the NoMachine service ('sudo /usr/NX/bin/nxserver --restart'). NoMachine will then create a virtual framebuffer which you can connect to in order to test performance.
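Condensed into commands, the test would be something like this (a sketch only, assuming GDM is the display manager in use; substitute your own):

sudo systemctl stop gdm
sudo /usr/NX/bin/nxserver --restart

Then reconnect with the NoMachine client: you should get the virtual framebuffer instead of the physical display, and you can compare the responsiveness of the two.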
fra81
Moderator

Hi,
if you are connecting to the “physical” display of your Ubuntu machine, it is a matter of configuring your X.org server. This link has a few hints on how to add virtual monitors: https://askubuntu.com/questions/453109/add-fake-display-when-no-monitor-is-plugged-in. I successfully tested the configuration with the Intel driver (the VirtualHeads option) myself.
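A minimal sketch of that configuration, assuming the Intel driver (the identifier is illustrative and output names vary by system), would be an xorg.conf snippet like:

Section "Device"
    Identifier "intelgpu0"
    Driver "intel"
    Option "VirtualHeads" "1"
EndSection

After restarting the X server, a VIRTUAL1 output should appear in xrandr and can be given a mode and a position with the usual xrandr --newmode / --addmode / --output steps described in that answer.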
If you are using one of the NoMachine products providing virtual desktops, the implementation of this Feature Request should address your case: https://www.nomachine.com/FR12K02799.
fra81
Moderator

Hi!
Let me start by saying that this is strange. Given that server and client should settle, more or less, on a frame rate of 30 fps by default, the added lag should be no more than 1000/30, that is about 33 ms.
Please check the Display settings panel in the session menu. What value is the frame rate set to?
And can you describe exactly how you are measuring this input lag? What procedure and what measurement method? That said, are you connecting to the physical display of the server? Is it a headless machine? I would also check CPU usage on both the client and the server side.
fra81
Moderator

Hello,
please gather logs from the server as explained in https://www.nomachine.com/AR10K00697. If possible, client side logs would also be useful; instructions are here: https://www.nomachine.com/DT10O00163#2. You can send everything to forum[at]nomachine[dot]com.
In the meantime, you could try unchecking the ‘Use acceleration for display processing’ option in the Server preferences -> Performance tab.