Performance over LAN
This topic has 6 replies, 3 voices, and was last updated 3 years, 7 months ago by dark_sylinc.
April 20, 2021 at 09:03 #32970
hellcats (Participant)
A very pleasant use case for NoMachine is using the Mac client to connect to PC and Linux boxes over a gigabit LAN. I create multiple full-screen desktops on the Mac and capture the keyboard and mouse in NoMachine. I set quality to max with a specific frame rate of 50 or 60 Hz. The effect is nearly indistinguishable from directly using the remote computers. You can flick between systems using a 4-finger swipe on the Mac trackpad. Lovely.
But (there’s always a but), the high frame rate slowly degrades after a few minutes of use, settling at around 24-27 FPS (measured using a digital camera). Not bad, but not as great as just after first connecting. Disconnecting and reconnecting restores the high frame rate once again, but only for a few more minutes. Max network traffic is about 40 Mbps when dragging a window rapidly around the screen, so bandwidth isn’t really an issue. I presume that NoMachine is applying some kind of adaptive flow control to smooth out the bandwidth, but on a LAN I really don’t need that. I would really enjoy having the option to disable the adaptive algorithm so that the high frame rate is retained at all times.
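(In case anyone wants to reproduce the traffic measurement: a quick loop like the one below watches the interface byte counters on the Linux server. This is just a sketch; "eno1" is a placeholder, substitute whatever your LAN NIC is actually called.)

    IF=eno1   # placeholder: replace with your actual LAN interface name
    while true; do
      RX1=$(cat /sys/class/net/$IF/statistics/rx_bytes)
      TX1=$(cat /sys/class/net/$IF/statistics/tx_bytes)
      sleep 1
      RX2=$(cat /sys/class/net/$IF/statistics/rx_bytes)
      TX2=$(cat /sys/class/net/$IF/statistics/tx_bytes)
      # bytes per second converted to megabits per second
      echo "rx $(( (RX2 - RX1) * 8 / 1000000 )) Mbit/s  tx $(( (TX2 - TX1) * 8 / 1000000 )) Mbit/s"
    done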
My config:
NoMachine version 7.4.1 on all systems.
- 15″ MacBook Pro with Radeon Pro 560 4 GB (with 3440×1440 external monitor)
- Linux box #1: Ubuntu 20.04; AMD 5800X, NVIDIA 3090, 64 GB RAM
- Linux box #2: Ubuntu 18.04; Intel 6-core, NVIDIA 1080 Ti, 64 GB RAM
- PC: Win 10; Intel 6-core (5820), NVIDIA 980 Ti, 64 GB RAM
April 23, 2021 at 18:06 #33047
fra81 (Moderator)
Hi,
this is strange indeed. Did you have the chance to verify whether this behaviour only occurs when connecting from the Mac? Does using the integrated monitor instead of the external one change anything? And would you be willing to run a debug package to gather more information?
April 26, 2021 at 16:21 #33079
hellcats (Participant)
I performed some more tests this weekend and can replicate the problem from the PC client as well as the Mac client when connecting to a Linux box. I can more easily get it to switch into “slow mode” using the Linux machine with the NVIDIA 3090 than the one with the 1080 Ti (FYI, the 3090 machine is an AMD 5800X and the 1080 Ti machine is an Intel 6-core i7). I also tried using the MacBook’s built-in Retina display and the performance was abysmal.
Another thing I noticed is that the GPU usage on the Linux server reported by nvidia-smi hovers around 1-2% when things are working well, but climbs into the 3-6% range and stays there when things slow down. Disconnecting and reconnecting restores the fast frame rate and lowers GPU usage.
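(In case it helps the investigation, this is the sort of command I can run to log that over time. The query fields are standard nvidia-smi ones, but check nvidia-smi --help-query-gpu on your driver version; the output file name is just an example.)

    nvidia-smi --query-gpu=timestamp,utilization.gpu,utilization.encoder,temperature.gpu,clocks.sm \
               --format=csv -l 1 > nomachine-gpu-log.csv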
I am willing to use a debug build to try to track this issue down. Thanks for looking into it.
April 27, 2021 at 10:25 #33090
dark_sylinc (Participant)
This may be a silly answer, but I noticed some of the rigs you mention (particularly your two Linux boxes) are severely prone to overheating.
Check your sensors to make sure you’re not just being thermally throttled. For your AMD rig on Ubuntu you’ll need a mainline kernel (5.11+), otherwise CPU sensor data will show up as 0°C.
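A quick way to check, assuming lm-sensors is installed and you’re on the NVIDIA boxes (exact sensor labels and query fields can differ by driver version, so treat this as a sketch):

    # CPU temperatures (k10temp on AMD, coretemp on Intel)
    watch -n 1 sensors
    # GPU temperature, clocks, and whether the driver reports any active throttling
    nvidia-smi --query-gpu=temperature.gpu,clocks.sm,clocks_throttle_reasons.active --format=csv -l 1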
If you suspect it’s adaptive flow control, try literally disabling it (Ctrl+Alt+0 -> Display -> Change Settings -> Modify -> Disable Network-adaptive display quality). There are also other options you can try tweaking.
Another thing you can try is toggling HW acceleration server-side.
Cheers,
Note: I don’t work for NoMachine, I’m just another user.
April 27, 2021 at 18:49 #33135
fra81 (Moderator)
You don’t work for NoMachine, but it seems like you do! 😉
I’d try the suggestions dark_sylinc rightly provided, starting with toggling hardware acceleration. This can also be done from the Server settings GUI, in the Performance panel.
May 3, 2021 at 08:41 #33198
hellcats (Participant)
Well, my response didn’t submit successfully.
Basically, I’ve tried changing settings, etc. on both the client and server-side to try to get the highest performance over LAN, and I can get high performance for a little while, but the frame rate always degrades over time to around 20-25 fps. Not unusable, but also not as good as NoMachine is capable of.
I see the problem more easily when connecting to the Linux box with the Nvidia 3090 (both from Mac and PC clients), but I also see it on the other Linux computer as well (with a 1080Ti).
I am willing to run a debug version to try to get to the bottom of this if someone on the development team is interested in having me run some tests.
– hellcats
May 10, 2021 at 10:55 #33327
dark_sylinc (Participant)
> Basically, I’ve tried changing settings, etc. on both the client and server-side to try to get the highest performance over LAN, and I can get high performance for a little while, but the frame rate always degrades over time to around 20-25 fps. Not unusable, but also not as good as NoMachine is capable of.
Did you try to disable HW acceleration on the server as I suggested? Because disabling it should in theory give you slower performance, but in practice enabling it can cause a ton of problems.
If you have it enabled, on the client it should say: Display 1920×1080, codec H.264 NVENC/VAAPI, audio Opus 22 kHz stereo
If you have it disabled, on the client it should say: Display 1920×1080, codec H.264/VAAPI, audio Opus 22 kHz stereo
Another source of slowdown may be Ubuntu’s compositor. I’m not familiar enough with GNOME’s compositor to suggest tweaks for it. On XFCE you’d go to Window Manager Tweaks -> Compositor -> Enable display compositor (turn it off).
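(If it helps, the same XFCE toggle can be flipped from a terminal; the channel and property below are the standard xfwm4 ones:)

    # Turn the XFCE compositor off (set back to true to re-enable)
    xfconf-query -c xfwm4 -p /general/use_compositing -s false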
Try using OBS (Open Broadcaster Software) to record the screen and see if the same slowdown appears after a while (the way OBS and NoMachine capture the display appears to be very similar).
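If you’d rather not install OBS, a rough command-line equivalent is ffmpeg with x11grab, which exercises a similar capture + NVENC path. The capture size, duration and output path below are just examples:

    ffmpeg -f x11grab -framerate 60 -video_size 1920x1080 -i :0.0 \
           -c:v h264_nvenc -t 300 /tmp/capture-test.mp4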
This topic was marked as solved.