Bug 93885

Summary: radeon: allow the user to set a maximum HDMI pixel clock (in MHz) by a kernel parameter
Product: xorg
Reporter: Elmar Stellnberger <estellnb>
Component: Driver/Radeon
Assignee: Elmar Stellnberger <estellnb>
Status: RESOLVED MOVED
QA Contact: Xorg Project Team <xorg-team>
Severity: enhancement
Priority: medium
CC: sirfixabyte
Version: 7.6 (2010.12)
Hardware: x86-64 (AMD64)
OS: Linux (All)
Attachments:
- patched radeon_dvi_mode_valid + debug/patch trial for TMDS frequency
- journal.log for patch001 radeon.hdmikhz=297000 radeon.hdmimhz=300
- working patch that introduces radeon.hdmimhz

Description Elmar Stellnberger 2016-01-27 14:39:56 UTC
As far as I have seen, radeon sets the pixel clock (in MHz) for the HDMI/DVI output only to the value recommended by the firmware. Since kernel 4.5.0 the nouveau driver has a kernel parameter called hdmimhz to override this setting by hand; as far as I could test it, the results I get that way are just wonderful (see Bug 93405).
  The situation with my '4K ready' XFX Radeon R5 230 card is similar to what it was with the nouveau driver before kernel 4.5.0: the promised 3840x2160@30 mode is not offered by default, and when I try to set this mode manually the screen remains black (no signal). It was exactly like this with my Nvidia GeForce 9600M GT until the hdmimhz kernel parameter appeared. Since then nouveau works even better than officially specified by Nvidia (nobody would have believed that).
  Why isn't there a similar parameter for radeon?

kernel version: 4.5.0-rc1-ARCH #2 SMP PREEMPT
radeon driver (xf86-video-ati/arch): 1:7.6.1-1
Comment 1 Alex Deucher 2016-01-27 14:53:02 UTC
Please attach your xorg log and dmesg output.  The driver will properly filter modes in conjunction with the hw limits of the PLL.  Overriding this is not recommended.  If the clock is within the hw limits and it's not working, it's just a bug that needs to be fixed.
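For reference, that filtering happens in the connector's mode_valid hook. In rough outline (a simplified, generic sketch, not the driver's actual code):

#include <drm/drm_crtc.h>

/* Generic shape of a DRM connector mode_valid hook: any mode whose dot
 * clock (mode->clock, in kHz) exceeds the hw ceiling is rejected before
 * userspace ever sees it. */
static enum drm_mode_status
example_mode_valid(struct drm_connector *connector,
                   struct drm_display_mode *mode)
{
	const int max_tmds_khz = 165000; /* single-link TMDS limit */

	if (mode->clock > max_tmds_khz)
		return MODE_CLOCK_HIGH;
	return MODE_OK;
}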
Comment 3 Alex Deucher 2016-01-27 14:58:57 UTC
(In reply to Ernst Sjöstrand from comment #2)
> I think this is the same value I had problems with in bug 91896
> 
> Seems like it's bumped to a good value every now and then:
> http://cgit.freedesktop.org/~agd5f/linux/commit/?h=drm-next-4.6-wip&id=9368931db826d57b6b88b3145a00276626b48df0
> http://cgit.freedesktop.org/~agd5f/linux/commit/?h=drm-next-4.6-wip&id=80c083c5e4dc35fa37c01f000b1393c51294b9de

Those are only relevant for DP.  The R5 230 only supports 4k on DP, not HDMI.
Comment 4 Elmar Stellnberger 2016-01-27 15:25:12 UTC
 Only with DP? How can that be? I have just had a look at various Radeon R5 230 cards on geizhals.at, and all manufacturers (MSI, Asus, XFX) list at least 625 MHz for the clock and only VGA, DVI and HDMI as outputs. I have never heard of a DisplayPort connector on that type of card.
Comment 5 Roland Scheidegger 2016-01-27 15:44:25 UTC
The chip certainly would support DP, but unfortunately graphics card manufacturers usually do not include it on low-end cards, so that's why (even the next cards up typically do not have it; on the nvidia side it's even worse, fwiw).
And unfortunately your card is listed as HDMI 1.4, but (as is very often the case, and allowed per the HDMI spec) it is one of those old pseudo-HDMI-1.4 implementations that support some HDMI 1.4 features almost no one cares about, yet lack the higher link rates introduced already with HDMI 1.3. Hence the 2560x1600@60Hz and 3840x2160@30Hz modes are not possible.
Comment 6 Alex Deucher 2016-01-27 15:56:07 UTC
(In reply to Elmar Stellnberger from comment #4)
>  Only with DP? How can that be? I have just had a look at various Radeon
> R5 230 cards on geizhals.at, and all manufacturers (MSI, Asus, XFX) list
> at least 625 MHz for the clock and only VGA, DVI and HDMI as outputs. I
> have never heard of a DisplayPort connector on that type of card.

The hw supports DP; it's up to the AIB vendors to decide what configurations they want to produce. The 625 MHz is the 3D engine clock; it has nothing to do with displays.
Comment 7 Elmar Stellnberger 2016-01-27 16:34:01 UTC
  Likewise, for the G96M [GeForce 9600M GT] nobody would have believed that this card can yield 3840x2160, be it at 23Hz or 46Hz interlaced. I am going to provide the logs tomorrow when the computer with the XFX radeon card is free for testing. I just want to say that I still hope for a radeon tuning parameter similar to hdmimhz. The fact that the card was sold with HDMI as 4K ready should be a strong indication that 3840x2160@30/24/23 is possible. If I remember correctly, 3840x2160@30 was initially stated to be officially supported by ATI for the XFX card (though that claim has since been withdrawn). I would even take the risk of testing it should the card not work this way for some reason (an old-HDMI-1.4 incompatibility or the like).
Comment 8 Roland Scheidegger 2016-01-27 16:50:12 UTC
The card has a dual-link DVI port, so it should, in theory, be possible to drive 3840x2160@30Hz through that (dual-link DVI has twice the bandwidth of that HDMI port). For this to work you of course need a monitor with a dual-link DVI input (obviously, passive DVI->HDMI adapters won't help).
That is, however, only a theoretical statement (i.e. such a resolution should not exceed the bandwidth limits) - YMMV.
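To put rough numbers on that (using the standard CEA-861 4k30 timing totals, purely for illustration):

#include <stdio.h>

/* Link-budget check: dual-link DVI doubles the 165 MHz single-link TMDS
 * dot clock, while 3840x2160@30 with CEA-861 timings (4400x2250 total
 * pixels per frame) needs a 297 MHz dot clock. */
int main(void)
{
	const double single_link_mhz = 165.0;
	const double dual_link_mhz = 2.0 * single_link_mhz;       /* 330 MHz */
	const double four_k30_mhz = 4400.0 * 2250.0 * 30.0 / 1e6; /* 297 MHz */

	printf("dual-link budget: %.0f MHz, 4k30 needs: %.0f MHz\n",
	       dual_link_mhz, four_k30_mhz);
	return 0;
}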
Comment 9 Alex Deucher 2016-01-27 16:56:33 UTC
(In reply to Elmar Stellnberger from comment #7)
>   Likewise, for the G96M [GeForce 9600M GT] nobody would have believed
> that this card can yield 3840x2160, be it at 23Hz or 46Hz interlaced. I am
> going to provide the logs tomorrow when the computer with the XFX radeon
> card is free for testing. I just want to say that I still hope for a
> radeon tuning parameter similar to hdmimhz.

That hw was not designed to support 4k over hdmi.  If you want to hack the driver, you are welcome to (take a look at radeon_dvi_mode_valid()), but it's not something I want to enable or support out of the box.  If you break your card or monitor, you get to keep the pieces.
Comment 10 Elmar Stellnberger 2016-01-27 17:07:15 UTC
Hmm; the u2868pqu is not specified to have a dual-link DVI input. However, I have a DeLock DVI-HDMI adapter here and have already tested it with the DVI output of a Fujitsu Celsius mobile H270 equipped with an Nvidia G96GLM [Quadro FX 770M]. It delivered 3840x2160@30 even under Windows 7 without a problem. So if that setup runs beyond the specs, the adapter must be a pretty intelligent one.
Comment 11 Elmar Stellnberger 2016-01-27 17:11:57 UTC
Roland, the nouveau developers have told me that hdmimhz=297 is sufficient for 3840x2160@30; there should be a report about it online. Maybe you are mistaken because dual-link DVI would be equivalent to hdmimhz=330. Moreover, hdmimhz=225 is sufficient for 3840x2160@23, as far as I have tested it.
Comment 12 Roland Scheidegger 2016-01-27 17:27:57 UTC
Ahh, the joys of TMDS overclocking.
Yeah, I suppose overclocking to ~225 MHz or so (which should be sufficient for 24Hz with the right modeline) has some chance of working. It seems extremely unlikely to damage anything, albeit whether it runs stably is an entirely different question; this is 40% above the official limit, after all.
In any case, AMD doesn't want to support it, so you'd have to hack the driver yourself.
Comment 13 Elmar Stellnberger 2016-02-03 14:07:54 UTC
Dear radeon developers;
  Today I tried to hack the radeon kernel module: by changing radeon_dvi_mode_valid to return MODE_OK whenever the clock is below the value I state in a new kernel parameter called radeon.hdmimhz, it was possible to make new graphics modes appear once hdmimhz is high enough. Beyond that, the new parameter has no effect; i.e. my monitor still stays black in the higher modes, while the machine boots with any hdmimhz setting. This is contrary to the behaviour of nouveau.hdmimhz, which blackscreens on boot when the setting is wrong.
  Next I tried to override max_tmds_clock and mode_clock in radeon_get_monitor_bpc, without success. Finally I tried to extend radeon_best_single_encoder to look through all available encoders and pick the one with the highest pixel clock; unfortunately there was only one encoder to choose from.
  When I look at the output of journalctl after enabling drm.debug=1, the driver seems to honor the settings at first but then always falls back to a dot clock of 165 MHz or something very close to it. I have no idea what to do next; further guessing is likely to be of little success either.
  Does anyone here have an idea what should or could be done, or how to resolve the issue? It needs to be possible to actually set the TMDS timing for the card, not just to simulate a higher TMDS limit during graphics mode detection. thx.
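  In outline, the mode_valid part of my hack looks like this (a simplified sketch; the exact trial patch follows as an attachment):

#include <linux/module.h>
#include <drm/drm_crtc.h>	/* header layout differs across kernel versions */

/* New module option radeon.hdmimhz; 0 keeps the stock limits. */
static int radeon_hdmimhz;
module_param_named(hdmimhz, radeon_hdmimhz, int, 0444);
MODULE_PARM_DESC(hdmimhz, "Maximum HDMI/DVI pixel clock in MHz (0 = default)");

/* Early-out added at the top of radeon_dvi_mode_valid(): accept any mode
 * whose dot clock (mode->clock is in kHz) stays below the user's ceiling,
 * bypassing the stock single-link TMDS check. */
static enum drm_mode_status
radeon_dvi_mode_valid(struct drm_connector *connector,
		      struct drm_display_mode *mode)
{
	if (radeon_hdmimhz > 0 && mode->clock <= radeon_hdmimhz * 1000)
		return MODE_OK;
	/* ... the driver's existing PLL/TMDS checks follow here ... */
	return MODE_CLOCK_HIGH;
}

One then boots with e.g. radeon.hdmimhz=297 on the kernel command line, mirroring nouveau.hdmimhz.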
Comment 14 Elmar Stellnberger 2016-02-07 13:11:57 UTC
Created attachment 121571 [details]
patched radeon_dvi_mode_valid + debug/patch trial for TMDS frequency

While the patch in radeon_dvi_mode_valid seems to work (new modes are offered with a high enough radeon.hdmimhz), the patch around radeon_get_monitor_bpc/radeon.hdmikhz seems insufficient to keep the TMDS frequency high enough.
Comment 15 Elmar Stellnberger 2016-02-07 13:19:33 UTC
Created attachment 121572 [details]
journal.log for patch001 radeon.hdmikhz=297000 radeon.hdmimhz=300

... and here comes the corresponding journal.log. If you grep the output of drm.debug=1, you can see that drm_calc_timestamping_constants seems to reduce the frequency to 164250 kHz, which is not sufficient for 3840x2160. Despite debugging and backtracing both function calls (drm_calc_timestamping_constants and radeon_get_monitor_bpc), I have not found out by which functions and in what way they are called. Please help.
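For the backtracing I used instrumentation along these lines (a simplified sketch of the debug code, pasted at the top of the function under investigation):

#include <linux/printk.h>	/* pr_info(), dump_stack() */
#include <linux/bug.h>		/* WARN_ONCE() */

void instrumented_function_example(void)
{
	/* One-shot: logs a message plus a full backtrace to dmesg the first
	 * time this point is reached, revealing the call path. */
	WARN_ONCE(1, "instrumented_function_example reached\n");

	/* Every call: message plus backtrace. */
	pr_info("instrumented_function_example called\n");
	dump_stack();
}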
Comment 16 Elmar Stellnberger 2016-03-01 13:42:15 UTC
  I have now had a look at the manual of my XFX Radeon R5 230 Core 2GB graphics card. It is clearly marketed as 4K ready there, although the card only has a DVI, an HDMI and a VGA port. Any claim that 4K would only be possible over a non-existent DisplayPort therefore needs to be clearly stated as wrong.
  Once again I want to ask whether anyone here would be ready to help me with a patch for the driver, or whether anyone could attempt such a patch, including a radeon.hdmimhz parameter, without me. Why should it not work? It has already proven itself in practice in the nouveau driver.
Comment 17 Alex Deucher 2016-03-01 14:43:30 UTC
(In reply to Elmar Stellnberger from comment #16)
>   I have now had a look at the manual of my XFX Radeon R5 230 Core 2GB
> graphics card. It is clearly marketed as 4K ready there, although the card
> only has a DVI, an HDMI and a VGA port. Any claim that 4K would only be
> possible over a non-existent DisplayPort therefore needs to be clearly
> stated as wrong.

As Roland mentioned, you could run 4k over dual link DVI.

>   Once again I want to ask whether anyone here would be ready to help me
> with a patch for the driver, or whether anyone could attempt such a patch,
> including a radeon.hdmimhz parameter, without me. Why should it not work?
> It has already proven itself in practice in the nouveau driver.

As I said before, the hw was not designed to support it.  You'll need to hack radeon_dvi_mode_valid() to return MODE_OK and radeon_dig_monitor_is_duallink() to return false.
Comment 18 Elmar Stellnberger 2016-03-01 19:39:02 UTC
Unfortunately, making radeon_dig_monitor_is_duallink return true under all or certain circumstances produces distortions in otherwise working modes, while it cannot enable a single mode that would not have worked before. Besides, I strongly doubt that returning true for dual-link would be necessary at all for an hdmimhz of 225 and modes like 3840x2160_23.00. Is there anything else I could try? Or has neXus mobile misled me by selling as 4K ready a card which definitely isn't?
Comment 19 Alex Deucher 2016-03-01 22:26:36 UTC
(In reply to Elmar Stellnberger from comment #18)
> Unfortunately, making radeon_dig_monitor_is_duallink return true under all
> or certain circumstances produces distortions in otherwise working modes,
> while it cannot enable a single mode that would not have worked before.
> Besides, I strongly doubt that returning true for dual-link would be
> necessary at all for an hdmimhz of 225 and modes like 3840x2160_23.00. Is
> there anything else I could try? Or has neXus mobile misled me by selling
> as 4K ready a card which definitely isn't?

Read my comment again.  You need to return FALSE in radeon_dig_monitor_is_duallink().  HDMI is always single link.  You need to return false so the driver attempts to set up the link as single rather than dual.
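In other words, something along these lines (sketch only; the real function derives the answer from the dot clock and connector type):

#include <linux/types.h>

struct drm_encoder;	/* opaque here; the driver has the full definition */

/* HDMI is always single link, so report single link regardless of the
 * pixel clock; the encoder then gets programmed in single-link mode even
 * for dot clocks above the stock 165 MHz TMDS limit. */
bool radeon_dig_monitor_is_duallink(struct drm_encoder *encoder,
				    u32 pixel_clock)
{
	return false;
}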
Comment 20 Elmar Stellnberger 2016-03-02 17:30:18 UTC
Great! Stunning! Now it has worked right from the start. A small change to radeon_dvi_mode_valid and radeon_dig_monitor_is_duallink was fully sufficient to achieve 3840x2160@30 - and the mode is offered automatically with an hdmimhz of 297.
Many thanks for the hint, Alex! I wouldn't have found it entirely on my own.
Are there any chances of getting a well-prepared patch for the radeon.hdmimhz parameter into the final 4.5.0 release?
Comment 21 Roland Scheidegger 2016-03-02 22:53:57 UTC
That's quite amazing that you could overclock a TMDS link from its design target of 165 MHz to 297 MHz. I'd suggest looking at reduced-blanking modes, though; they might be more stable if you run into problems (the signal quality may not be very good, and I wouldn't be surprised if you occasionally saw problems, for instance, when things get a little warm).
Comment 22 Elmar Stellnberger 2016-03-03 13:30:22 UTC
Created attachment 122097 [details]
working patch that introduces radeon.hdmimhz

 Here comes the radeon.hdmimhz patch, fully verified to work at least with the settings used here. No heat issues after an uptime of more than half a day, while the passive cooler has stayed moderately warm (I touched it with my fingers).
Comment 23 Elmar Stellnberger 2016-03-07 09:39:32 UTC
Hi; has anyone had a look at my patch yet? Is it OK as it is, or would you prefer a separate radeon.duallink parameter for disabling the dual-link feature when using radeon.hdmimhz? Have I sufficiently met the kernel coding style? What is the earliest release the patch could be accepted into (4.5.0?)?
Comment 24 Andreaz 2016-11-29 22:51:10 UTC
Hi folks! 

I know this is not a bug but more of a feature request for the radeon drm kernel module, but I didn't find a better place to jump in and I hope you don't mind (or will point me to a better place)...

First of all I want to give big thanks to Elmar for his patch, because it makes my HD6770 run 2560x1440 over (single-link) HDMI without any flaws!

To be more precise, I applied the kernel 4.8 patch from https://bugs.mageia.org/show_bug.cgi?id=19231, because 4.8 is the default kernel for my box at the moment (Arch Linux).

It worked out of the box (radeon.hdmimhz=250); I didn't have to add a custom modeline. The autodetected modeline is equal to what a different computer connected to the same monitor via DisplayPort would select (because it is advertised by the monitor via DDC/EDID):

Modeline "2560x1440"  241.50  2560 2608 2640 2720  1440 1443 1448 1481 +hsync -vsync



What I did not manage to find out, and would like to ask you folks, is: what's the official status of this patch? It hasn't been included in the official kernel sources yet (no hdmimhz in there: https://git.kernel.org/cgit/linux/kernel/git/stable/linux-stable.git/tree/drivers/gpu/drm/radeon/radeon_drv.c?id=refs/tags/v4.8.11).
Comment 25 ich+freedesktop 2018-12-29 22:48:30 UTC
Hi, are there any updates on this?

I own a Radeon HD 6970 and I'd like to be able to run 4k@60Hz with it, either over HDMI or DP.

I tried the patch a year ago, but since it was not included in the official kernel, I stopped using it.
It's just too annoying to compile each kernel update by hand.

Are there any chances to get this into mainline?
Comment 26 sirfixabyte 2018-12-30 15:44:14 UTC
I ask as well: any updates on getting this into the kernel?

Thanks to Elmar Stellnberger's patch, I have been running 4K on a Radeon HD6570 for a month.

It works GREAT: it seems to run cool, and the image is sharp & stable.

I am not using it for gaming.

I am running 3840 x 2160 at a 23 Hz refresh (225 MHz pixel clock) on a Mint 19 system.

I second the statement in comment #25 that it is 'annoying to have to manually compile the kernel' to enable this feature - especially when a 4K pixel-clock patch has been available in the Linux kernel for NVIDIA for years. (It has also been available to Windows users - with no reports of GPU failure due to pixel-clock overclocking that I could find.)
Comment 27 sirfixabyte 2018-12-30 15:51:21 UTC
BTW: The Radeon HD6570 runs 4K MUCH cooler than my Nvidia P361 / GTS240 - so I would prefer to continue using the HD6570.
Comment 28 sirfixabyte 2018-12-31 18:42:43 UTC
Adding details to the above two comments:

For the NVIDIA GTS240 'Tesla' GPU: inxi -s reports 50C shortly after bootup running at 1080p, and 60C after 10 minutes of watching a full-screen YouTube video at 4K (3840x2160 @ 22.73 Hz).

For the Radeon HD6570 'Turks PRO' GPU: inxi -s reports 40C after bootup running 4K (3840x2160 at 30.0 Hz), climbing to 56C after 45 minutes of the same movie on YouTube at full screen. It also drops back to 42.0C within minutes of turning the movie off (while editing this comment in a full-screen window).

Interesting that the HD6570 runs COOLER at a 30 Hz screen refresh rate than the GTS240 does at a 23 Hz refresh.

My best guess is that two or three years' newer die fabrication technology helped reduce power consumption - overall roughly half the watts. (The cooling fan is also half the size on the HD6570.)

My HD5750 'Juniper' GPU running at 4K (3840 x 2160 at 23 Hz) sits at about 46C when NOT running a full-screen movie.
So it runs a few degrees C hotter than the HD6570 while doing similar tasks.

Since the Radeon HD5750 was made at approximately the same time as the Nvidia GTS240, I assume it might run at approximately the same temperature when driving a 4K LCD and a full-screen YouTube video.
Comment 29 Joachim Hoss 2019-02-23 12:46:02 UTC
With the help of Elmar Stellnberger's patch applied to kernel 4.20, I can now run my old Radeon HD 5870 at 3840x1600 resolution, which it does "out of the box" under Windows.
I would also like to recommend this patch for inclusion in the kernel.
Comment 30 Martin Peres 2019-11-19 07:55:37 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/xorg/driver/xf86-video-ati/issues/153.

Use of freedesktop.org services, including Bugzilla, is subject to our Code of Conduct. How we collect and use information is described in our Privacy Policy.