| Summary: | radeon: allow the user to set a maximum HDMI pixel clock (in MHz) by a kernel parameter | | |
|---|---|---|---|
| Product: | xorg | Reporter: | Elmar Stellnberger <estellnb> |
| Component: | Driver/Radeon | Assignee: | Elmar Stellnberger <estellnb> |
| Status: | RESOLVED MOVED | QA Contact: | Xorg Project Team <xorg-team> |
| Severity: | enhancement | | |
| Priority: | medium | CC: | sirfixabyte |
| Version: | 7.6 (2010.12) | | |
| Hardware: | x86-64 (AMD64) | | |
| OS: | Linux (All) | | |
| Attachments: | patched radeon_dvi_mode_valid + debug/patch trial for TMDS frequency (121571); journal.log for patch001 (121572); working patch that introduces radeon.hdmimhz (122097) | | |
Description (Elmar Stellnberger, 2016-01-27 14:39:56 UTC)
Please attach your Xorg log and dmesg output. The driver properly filters modes against the hardware limits of the PLL. Overriding this is not recommended. If the clock is within the hardware limits and it's not working, that is simply a bug that needs to be fixed.

I think this is the same value I had problems with in bug 91896. Seems like it's bumped to a good value every now and then:
http://cgit.freedesktop.org/~agd5f/linux/commit/?h=drm-next-4.6-wip&id=9368931db826d57b6b88b3145a00276626b48df0
http://cgit.freedesktop.org/~agd5f/linux/commit/?h=drm-next-4.6-wip&id=80c083c5e4dc35fa37c01f000b1393c51294b9de

(In reply to Ernst Sjöstrand from comment #2)
> I think this is the same value I had problems with in bug 91896.
> Seems like it's bumped to a good value every now and then:
> http://cgit.freedesktop.org/~agd5f/linux/commit/?h=drm-next-4.6-wip&id=9368931db826d57b6b88b3145a00276626b48df0
> http://cgit.freedesktop.org/~agd5f/linux/commit/?h=drm-next-4.6-wip&id=80c083c5e4dc35fa37c01f000b1393c51294b9de

Those are only relevant for DP. The R5 230 only supports 4k on DP, not HDMI.

Only with DP? How can that be? I have just had a look at various Radeon R5 230 cards via geizhals.at, and all producers (MSI, Asus, XFX) list at least 625 MHz for the clock and just VGA, DVI and HDMI as outputs. I have never heard of a DisplayPort connector for that type of card.

The chip certainly would support DP, but unfortunately graphics card manufacturers usually do not include it on low-end cards, so that's why (even the next cards up typically do not have it; on the nvidia side it's even worse, fwiw). And unfortunately your card is listed as HDMI 1.4, but (as is very often the case, since this is allowed per the HDMI spec) it's one of those old pseudo-HDMI-1.4 implementations which support some HDMI 1.4 features almost no one ever cares about, but not the higher link rates already introduced with HDMI 1.3, so the 2560x1600@60Hz and 3840x2160@30Hz modes are not possible.

(In reply to Elmar Stellnberger from comment #4)
> Only with DP? How can that be? I have just had a look at various Radeon R5
> 230 cards via geizhals.at, and all producers (MSI, Asus, XFX) list at least
> 625 MHz for the clock and just VGA, DVI and HDMI as outputs. I have never
> heard of a DisplayPort connector for that type of card.

The hw supports DP; it's up to the AIB vendors to decide what configurations they want to produce. The 625 MHz is the 3D engine clock; it has nothing to do with displays.

Likewise, for the G96M [GeForce 9600M GT] nobody would have believed that this card can yield 3840x2160, be it at 23 Hz or 46 Hz interlaced. I am going to provide the logs tomorrow when the computer with the XFX Radeon card is free for testing. I just want to say that I still hope for a radeon tuning parameter similar to hdmimhz. The fact that the card was sold with HDMI as 4K-ready should be a strong indication that 3840x2160@30/24/23 is possible. If I remember correctly, 3840x2160@30 was initially stated to be officially supported by ATI for the XFX card (though that claim has been withdrawn by now). I would even take the risk of testing it if the card should not work like this for some reason (an old HDMI 1.4 incompatibility or so).

The card has a dual-link DVI port; it should, in theory, be possible to support 3840x2160@30Hz through that (as dual-link DVI has twice the bandwidth of that HDMI port). For that to work you of course need a monitor which has a dual-link DVI port (obviously, passive DVI-to-HDMI adapters won't help). That is, however, only the theoretical point of view (i.e., such a resolution should not exceed the bandwidth limits); YMMV.

(In reply to Elmar Stellnberger from comment #7)
> Likewise, for the G96M [GeForce 9600M GT] nobody would have believed that
> this card can yield 3840x2160, be it at 23 Hz or 46 Hz interlaced. [...]
> I just want to say that I still hope for a radeon tuning parameter
> similar to hdmimhz.

That hw was not designed to support 4k over hdmi. If you want to hack the driver, you are welcome to (take a look at radeon_dvi_mode_valid()), but it's not something I want to enable or support out of the box. If you break your card or monitor, you get to keep the pieces.

Hmm; the U2868PQU is not said to have a dual-link DVI port. However, I have a DeLock DVI-HDMI adapter here and have already tested it with the DVI output of a Fujitsu Celsius mobile H270 equipped with an Nvidia G96GLM [Quadro FX 770M]. It did deliver 3840x2160@30, even under Windows 7, without a problem. So if that runs beyond the specs, the adapter must be a pretty intelligent one. Roland, the nouveau developers have told me that hdmimhz=297 is sufficient for 3840x2160@30; there should be a report about it online. Maybe you err because dual-link DVI should be equivalent to hdmimhz=330. What's more, hdmimhz=225 is sufficient for 3840x2160@23, as far as I have tested it.

Ahh, the joys of TMDS overclocking. Yeah, I suppose overclocking to ~225 MHz (which should be sufficient for 24 Hz with the right modeline) or so should have some chance of working (it seems extremely unlikely it would damage something, albeit working stably is an entirely different question; this is 40% higher than the official limit, after all). In any case, AMD doesn't want to support it, so you'd have to hack the driver yourself.

Dear radeon developers; today I have tried to give the radeon kernel module a hack: by changing radeon_dvi_mode_valid to return MODE_OK as soon as the clock is below what I state in a new kernel parameter called radeon.hdmimhz, it was possible to make new graphics modes appear whenever hdmimhz is high enough. Beyond that, the new parameter has no effect; i.e., my monitor still stays black on the higher modes, while it boots with any hdmimhz setting. This is contrary to the behaviour of nouveau.hdmimhz, where it blackscreens on boot when the setting is wrong. Next I tried to override max_tmds_clock and mode_clock in radeon_get_monitor_bpc, without success. Finally, I tried to extend radeon_best_single_encoder to look through all available encoders and pick the one with the highest pixel_clock; unfortunately there was only one such encoder to choose from. When I look at the output of journalctl after enabling drm.debug=1, it seems to honor the settings at first but then always goes down to a dot clock of 165 MHz or something very similar. I have no idea what to do, and further guesses may be of little success either. Does anyone here have an idea what should or could be done, or how to resolve the issue? It needs to be possible to actually set the TMDS timing for the card, not just to simulate a higher TMDS limit for graphics mode detection. thx.

Created attachment 121571 [details]
patched radeon_dvi_mode_valid + debug/patch trial for TMDS frequency
While the patch in radeon_dvi_mode_valid seems to work (new modes are offered at a high enough radeon.hdmimhz), the patch around radeon_get_monitor_bpc/radeon.hdmikhz seems insufficient to keep a high enough TMDS frequency.
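For readers who cannot open the attachment, here is a minimal sketch of the mode-validation half of the approach described above. It is an assumption-laden reconstruction, not the attachment itself: the parameter wiring, variable names and the MODE_CLOCK_HIGH fallback are all illustrative.

```c
#include <linux/module.h>
#include <drm/drm_modes.h>

struct drm_connector; /* full definition not needed for this sketch */

/* Hypothetical module parameter, exposed as radeon.hdmimhz on the
 * kernel command line; 0 keeps the driver's stock limits. */
static int radeon_hdmimhz;
module_param_named(hdmimhz, radeon_hdmimhz, int, 0444);
MODULE_PARM_DESC(hdmimhz, "Maximum HDMI/DVI pixel clock in MHz (0 = default)");

static enum drm_mode_status
radeon_dvi_mode_valid(struct drm_connector *connector,
                      struct drm_display_mode *mode)
{
	/* mode->clock is the dot clock in kHz; accept every mode that
	 * fits under the user-supplied cap. */
	if (radeon_hdmimhz > 0 && mode->clock <= radeon_hdmimhz * 1000)
		return MODE_OK;

	/* ... otherwise the stock EDID/PLL checks would run here; they
	 * reject anything above the single-link TMDS limit ... */
	return MODE_CLOCK_HIGH;
}
```

As the report itself notes, this only widens mode detection; the link is still programmed elsewhere, which is why the screen stayed black at first.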
Created attachment 121572 [details]
journal.log for patch001 radeon.hdmikhz=297000 radeon.hdmimhz=300
... and here comes the corresponding journal.log. If you grep the output of drm.debug=1, you can see that drm_calc_timestamping_constants seems to reduce the frequency to 164250 kHz, which is not sufficient for 3840x2160. In spite of debugging and backtracing both function calls (drm_calc_timestamping_constants and radeon_get_monitor_bpc), I have not found out by which function and in what way they are called. Please help.
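The ~165 MHz figure is no accident: single-link TMDS is specified up to 165 MHz, and the stock driver treats any faster mode as dual-link DVI, a path that cannot work on an HDMI connector. A greatly simplified sketch of that decision (the real radeon_dig_monitor_is_duallink() also distinguishes connector types and HDMI monitors; this is not the exact kernel source):

```c
#include <linux/types.h>

/* Simplified: the stock driver drives anything above the 165 MHz
 * single-link TMDS limit as dual-link. On an HDMI connector there is
 * no second link, so such modes never light up. */
static bool duallink_decision_sketch(u32 pixel_clock /* in kHz */)
{
	return pixel_clock > 165000;
}
```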
I have now had a look at the manual of my XFX Radeon R5 230 Core 2GB graphics card. It is clearly marketed there as 4K-ready, even though the card only has a DVI, an HDMI and a VGA port. Any claim that 4K would only be possible over a non-existent DisplayPort thus needs to be clearly stated as wrong. Once again I want to ask whether anyone here would be ready to help me with a patch for the driver, or whether anyone could attempt such a patch, including a radeon.hdmimhz parameter, without me. Why should it not work? It has proven itself in practice for the nouveau driver.

(In reply to Elmar Stellnberger from comment #16)
> I have now had a look at the manual of my XFX Radeon R5 230 Core 2GB
> graphics card. It is clearly marketed there as 4K-ready, even though the
> card only has a DVI, an HDMI and a VGA port. [...]

As Roland mentioned, you could run 4k over dual link DVI.

> Once again I want to ask whether anyone here would be ready to help me
> with a patch for the driver [...]

As I said before, the hw was not designed to support it. You'll need to hack radeon_dvi_mode_valid() to return MODE_OK and radeon_dig_monitor_is_duallink() to return false.

Unfortunately, making radeon_dig_monitor_is_duallink return true under all or certain circumstances produces distortions of otherwise working modes, while it cannot enable a single mode that would not have worked before. Besides, I heavily doubt that returning true for dual-link would be necessary at all for an hdmimhz of 225 and modes like 3840x2160_23.00. Is there anything else I could try? Or did neXus mobile deceive me by claiming a card to be 4K-ready which definitely isn't?

(In reply to Elmar Stellnberger from comment #18)
> Unfortunately, making radeon_dig_monitor_is_duallink return true under all
> or certain circumstances produces distortions of otherwise working modes
> [...]

Read my comment again. You need to return FALSE in radeon_dig_monitor_is_duallink(). HDMI is always single link. You need to return false so the driver attempts to set up the link as single rather than dual.

Great! Stunning! Now it has worked right from the start. A little change to radeon_dvi_mode_valid and radeon_dig_monitor_is_duallink was fully sufficient to achieve 3840x2160@30, and the mode is offered automatically at an hdmimhz of 297. Many thanks for the hint, Alex! I wouldn't have found it entirely on my own. Are there any chances to get a well-prepared patch for the radeon.hdmimhz parameter into the final 4.5.0 release?

That's quite amazing that you could overclock a TMDS link from its design target of 165 MHz to 297 MHz. Still, I'd suggest looking at reduced blanking modes; they might be more stable if you run into problems (the signal quality may not be very good, and I wouldn't be surprised if you sometimes ran into problems, for instance, just because things get a little warm).
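For context on the reduced-blanking suggestion: the required TMDS clock is simply the total (visible plus blanking) pixel count times the refresh rate, so shrinking the blanking intervals lowers the clock for the same visible mode. Assuming CVT reduced blanking (horizontal blanking fixed at 160 pixels, giving an htotal of 4000, and roughly 2190 total lines; the exact line count depends on the CVT calculation), a 3840x2160@30 mode needs about

$$f_{\mathrm{pixel}} = h_{\mathrm{total}} \times v_{\mathrm{total}} \times f_{\mathrm{refresh}} \approx 4000 \times 2190 \times 30\,\mathrm{Hz} \approx 263\,\mathrm{MHz},$$

noticeably less than the 297 MHz of the standard CEA 3840x2160@30 timing, and thus a smaller overclock of the 165 MHz link.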
Created attachment 122097 [details]
working patch that introduces radeon.hdmimhz
Here comes the radeon.hdmimhz patch, fully verified to work at least with the settings here at my place. No heat issues after an uptime of more than half a day, and the passive cooler has stayed only moderately warm (I touched it with my fingers).
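A hedged reconstruction of the other half of the working patch (again inferred from the discussion, not copied from the attachment): while the hdmimhz override is active, the dual-link decision is forced to single link, exactly as Alex advised.

```c
#include <linux/types.h>

struct drm_encoder; /* full definition not needed for this sketch */

extern int radeon_hdmimhz; /* the module parameter sketched earlier */

bool radeon_dig_monitor_is_duallink(struct drm_encoder *encoder,
                                    u32 pixel_clock /* in kHz */)
{
	/* HDMI is always single link; with the override active, never
	 * fall back to the dual-link DVI path. */
	if (radeon_hdmimhz > 0)
		return false;

	/* ... stock behaviour: dual link above the single-link limit ... */
	return pixel_clock > 165000;
}
```

With both pieces applied, booting with radeon.hdmimhz=297 on the kernel command line is what makes the 3840x2160@30 mode both appear and light up, matching the reports above.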
Hi; has anyone had a look at my patch yet? Is it OK like this, or would you prefer a separate radeon.duallink parameter for disabling the dual-link feature when using radeon.hdmimhz? Have I sufficiently met the kernel coding style? What is the soonest release the patch could be accepted for (4.5.0?)?

Hi folks! I know this is not a bug but more of a feature request for the radeon drm kernel module, but I didn't find a better place to jump in and I hope you don't mind (or will point me to a better place)... First of all I want to give big thanks to Elmar for his patch, because it makes my HD6770 run 2560x1440 over (single-link) HDMI without any flaws! To be more precise, I applied the kernel 4.8 patch from https://bugs.mageia.org/show_bug.cgi?id=19231 because 4.8 is the default kernel for my box at the moment (Arch Linux). It worked out of the box (radeon.hdmimhz=250); I didn't have to add a custom modeline, and the autodetected modeline is equal to what a different computer connected to the same monitor over DisplayPort would select (because it's advertised by the monitor via DDC/EDID):

Modeline "2560x1440" 241.50 2560 2608 2640 2720 1440 1443 1448 1481 +hsync -vsync

What I did not manage to find out, and would like to ask you folks: what's the official status of this patch? It hasn't been included in the official kernel sources yet (no hdmimhz in there: https://git.kernel.org/cgit/linux/kernel/git/stable/linux-stable.git/tree/drivers/gpu/drm/radeon/radeon_drv.c?id=refs/tags/v4.8.11).
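As a quick sanity check on that modeline (a worked example, not part of the original report): the last numbers of the horizontal and vertical sections are the totals, and the dot clock divided by their product gives the actual refresh rate, which also shows why radeon.hdmimhz=250 comfortably covers a 241.5 MHz mode:

$$f_{\mathrm{refresh}} = \frac{241.5\,\mathrm{MHz}}{2720 \times 1481} \approx 59.95\,\mathrm{Hz}$$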
Hi, are there any updates on this? I own a Radeon HD 6970 and I'd like to be able to run 4k@60hz with it, either over HDMI or DP. I tried the patch a year ago, but since it was not included in the official kernel, I stopped using it. It's just too annoying to compile each kernel update by hand. Are there any chances of getting this into mainline?

I ask as well: any updates on getting this into the kernel? Thanks to Elmar Stellnberger's patch, I have been running 4K on a Radeon HD6570 for a month. It works GREAT: it seems to run cool, and the image is sharp and stable. I am not using it for gaming. I am running 3840x2160 at a 23 Hz refresh rate (225 MHz pixel clock) on a Mint 19 system. I second the statement in comment #25 that it is 'annoying to have to manually compile the kernel' to enable this feature, especially when a 4K pixel clock patch has been available in the Linux kernel for NVIDIA for years. (Also, it has been available to Windows users, with no reports of GPU failure due to pixel clock overclocking that I could find.) BTW: the Radeon HD6570 runs 4K MUCH cooler than my Nvidia P361 / GTS240, so I would prefer to continue using the HD6570.

Adding details to the above two comments: for the NVIDIA GTS240 'Tesla' GPU, inxi -s reports 50C shortly after bootup running at 1080p, and 60C after 10 minutes of running 4K (3840x2160 @ 22.73 Hz) while watching a full-screen YouTube video. For the Radeon HD6570 'Turks PRO' GPU, inxi -s reports 40C after bootup running 4K (3840x2160 at 30.0 Hz), climbing to 56C after 45 minutes of the same movie on YouTube at full screen. It also drops back to 42.0C within minutes of turning off the movie (while editing this comment in a full-screen window). It is interesting that the HD6570 runs COOLER at a 30 Hz refresh rate than the GTS240 does at 23 Hz. My best guess is that two or three years newer die fabrication technology helped reduce power consumption, to roughly half the watts overall (the cooling fan is also half the size on the HD6570). My HD5750 'Juniper' GPU running 4K (3840x2160 at 23 Hz) sits at about 46C when NOT running a full-screen movie. So it runs a few degrees C hotter than the HD6570 while doing similar tasks. Since the Radeon HD5750 was made at approximately the same time as the Nvidia GTS240, I assume it might run at approximately the same temperature when driving a 4K LCD with a full-screen YouTube video.

With the help of Elmar Stellnberger's patch for kernel 4.20, I can now run my old Radeon HD 5870 at 3840x1600 resolution, which it does "out of the box" under Windows. I would also like to recommend this patch for inclusion in the kernel.

-- GitLab Migration Automatic Message --
This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity. You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/xorg/driver/xf86-video-ati/issues/153.