Forum / NoMachine for Mac / Mac host HW encode on AMD RX series?
This topic has 14 replies, 2 voices, and was last updated 4 months, 3 weeks ago by Rexiy.
May 24, 2024 at 15:25 #48243 | Rexiy (Participant)
Hello,
I’m using the latest NoMachine version, 8.11.3 (free version), on macOS 14.5 with an AMD RX series video adapter, and I’m unable to activate HW encoding, while the rest of the system can use it without problems (QuickTime export to both H.264 and HEVC, and other video encoders). This is the host side.
The Mac has a display connected; however, in the logs I found a reference that the server runs in headless mode.
The client (remote) side is a Windows 11 machine, and when I connect to the Mac, the connection status shows: SW encoding, H.264 / HW decoding, DXVA.
As I understood from the documentation, HW encoding for AMD adapters on Mac should be supported since the NoMachine 8 release. Or am I mistaken? If not, please help me 🙂
Attachments:
May 27, 2024 at 18:37 #48269 | Britgirl (Keymaster)
Hi, we checked the logs and we see that encoding fails at a very early stage, at a point where it seems unlikely that it can be a NoMachine issue. Can you confirm in any way that the other apps you mention are using HW encoding and not a SW fallback? How have you checked that they are using HW encoding?
May 27, 2024 at 21:00 #48271 | Rexiy (Participant)
Hi,
As I mentioned, one method I used to verify this was through the QuickTime export option for both H264 and HEVC formats. A test video of 5 minutes was converted to H.264 in approximately 3 minutes and to HEVC (H.265) in 5 minutes, with minimal CPU usage and significant GPU load (monitored using the iStat Menus application). A CPU-only conversion would have taken considerably longer. These conversion times are more typical for modern 8-core CPUs that utilize all 16 threads, not for my outdated 4-core Xeon CPU that is 14 years old.
Additionally, I confirmed this with VideoProc Converter, which includes a tool to check if the GPU can be used for video conversion. I have attached a screenshot as evidence.
If you know of a better method to verify this, please inform me, and I will test it myself.
Attachments:
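A more direct probe of the hardware path is possible from the command line. This is a sketch, assuming ffmpeg is installed on the Mac host (e.g. via Homebrew); `h264_videotoolbox` and `-allow_sw` are ffmpeg options, not NoMachine settings, but with the software fallback forbidden a successful run confirms the GPU encode path end to end:

```shell
# Probe whether a VideoToolbox H.264 *hardware* encoder actually works.
# Assumes ffmpeg is installed ("brew install ffmpeg"); -allow_sw 0 forbids
# the software fallback, so success here means the hardware path is good.
if command -v ffmpeg >/dev/null 2>&1; then
    if ffmpeg -hide_banner -loglevel error \
        -f lavfi -i testsrc2=duration=2:size=1280x720:rate=30 \
        -c:v h264_videotoolbox -allow_sw 0 -f null - 2>/dev/null; then
        result="hardware H.264 encode OK"
    else
        result="hardware H.264 encode FAILED"
    fi
else
    result="ffmpeg not found"
fi
echo "$result"
```

On a non-Mac machine, or a Mac without a working hardware encoder, this prints the FAILED line rather than silently falling back to software.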
May 29, 2024 at 14:56 #48303 | Britgirl (Keymaster)
Can we send you a debug version?
May 29, 2024 at 22:38 #48306 | Rexiy (Participant)
Yes, but I currently don’t have access to my email and am unable to edit it in my profile. Would you please share a link to the download?
May 29, 2024 at 22:52 #48307 | Rexiy (Participant)
Sorry, please disregard my previous message. I was able to update my email address and have access to it now.
June 3, 2024 at 20:26 #48374 | Rexiy (Participant)
Hi,
Would you still like to send me the debug version?
June 4, 2024 at 08:37 #48379 | Britgirl (Keymaster)
Yes 🙂 We are preparing it and we will be sending you a link to a package for you to download.
June 5, 2024 at 17:35 #48397 | Britgirl (Keymaster)
Hi, I sent you an email with the link to download the debug package 🙂
June 6, 2024 at 00:02 #48400 | Rexiy (Participant)
June 6, 2024 at 14:29 #48411 | Britgirl (Keymaster)
I’ve sent you a link to a new debug package.
June 6, 2024 at 19:05 #48419 | Rexiy (Participant)
June 14, 2024 at 10:40 #48495 | Rexiy (Participant)
Hi,
here are the logs from the latest package you sent today.
Thank you for not forgetting about me 🙂
Attachments:
June 21, 2024 at 17:25 #48614 | Britgirl (Keymaster)
Hi, sorry for the delay in coming back. It’s not good news. We looked at the last set of debug logs and they still don’t pinpoint the specific problem.

What we do know is that a parameter that NoMachine relies on when performing GPU encoding is not being recognized. The error is “Invalid parameter”, but we’ve not been able to identify which specific parameter despite the high level of logging, and we’ve been through every single parameter we use. These are our standard parameters, and they permit GPU encoding across several dozen combinations of macOS systems and hardware with diverse video cards. It’s possible that this particular encoder doesn’t support one of the features we need, hence the “Invalid parameter” error. As for the other apps which you say are using the GPU, they are most likely using a different set of parameters.

We could of course continue debugging on our own by replicating your set-up to a tee, purchasing the hardware and an AMD RX 560, and disabling the parameters one by one until we find which one is causing the issue. But it would not bring us to a solution, because it would not be possible to change the way NoMachine uses the encoder. So, the workaround is to continue using software encoding (or, if possible, try an alternative GPU unit).
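The "disabling the parameters one by one" idea can be approximated from outside NoMachine. This is only a sketch: the options below are ffmpeg's `h264_videotoolbox` knobs, not NoMachine's internal parameters, and ffmpeg is assumed to be installed on the Mac host. The goal is merely to see which kind of setting the RX card's encoder rejects:

```shell
# Toggle VideoToolbox encoder options one at a time to see which one the
# card rejects. These are ffmpeg options, not NoMachine's parameters.
probe() {
    ffmpeg -hide_banner -loglevel error \
        -f lavfi -i testsrc2=duration=1:size=1280x720:rate=30 \
        -c:v h264_videotoolbox "$@" -f null - >/dev/null 2>&1 \
        && echo "OK:     $*" \
        || echo "FAILED: $*"
}
if command -v ffmpeg >/dev/null 2>&1; then
    probe -allow_sw 0                     # bare hardware encode
    probe -allow_sw 0 -realtime 1         # low-latency mode
    probe -allow_sw 0 -profile:v high     # High profile
    probe -allow_sw 0 -b:v 4M             # explicit bitrate
else
    echo "ffmpeg not found"
fi
```

If the bare run succeeds but one of the decorated runs fails, that narrows down the class of feature the encoder is missing, even though NoMachine's exact parameter set remains unknown.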
June 21, 2024 at 18:07 #48615 | Rexiy (Participant)
Hi, thank you for the comprehensive explanation.
Using an alternative GPU is not a viable solution at the moment, as the one included with the computer is too outdated to support the newer OS version and H.264 hardware encoding. I could consider using an RX 580, which I own and which is architecturally similar but faster; however, it is currently in another country, and it’s improbable that I will be able to retrieve it soon. The plan was to use the computer in headless mode, since only CPU power is needed for the intended tasks. Therefore, I selected the most affordable GPU that seemed to meet my requirements (H.264 support being one of them). The issue is that software encoding is CPU-intensive, which results in laggy streaming and diverts CPU resources from the primary processes.
I could attempt to upgrade the CPU to the fastest one available for this model (or even switch to a dual CPU setup, although this would be an expensive alternative), to see if it improves the situation.
Thank you for your cooperation.
This topic was marked as solved; you can no longer post.