Bugzilla – Full Text Bug Listing
| Summary: | Now the modesetting driver is not inactive anymore | | |
|---|---|---|---|
| Product: | [openSUSE] openSUSE Distribution | Reporter: | Dr. Werner Fink <werner> |
| Component: | X11 3rd Party Driver | Assignee: | Gfx Bugs <gfx-bugs> |
| Status: | RESOLVED FIXED | QA Contact: | Stefan Dirsch <sndirsch> |
| Severity: | Major | | |
| Priority: | P5 - None | | |
| Version: | Leap 15.2 | | |
| Target Milestone: | --- | | |
| Hardware: | x86-64 | | |
| OS: | SUSE Other | | |
| Whiteboard: | | | |
| Found By: | --- | Services Priority: | |
| Business Priority: | | Blocker: | --- |
| Marketing QA Status: | --- | IT Deployment: | --- |
| Bug Depends on: | 1134123 | | |
| Bug Blocks: | | | |
| Attachments: | /etc/X11/xorg.conf.d/99-local.conf, /etc/X11/xorg.conf.d/99-local.conf | | |
Description
Dr. Werner Fink
2019-06-13 13:14:15 UTC
The kernel command line is BOOT_IMAGE=/vmlinuz-4.12.14-lp151.28.4-default root=UUID=df469636-2ce6-4a8b-a88b-d0ac5cb85660 resume=/dev/nvme0n1p1 splash=silent i915.preliminary_hw_support=1 rcutree.rcu_idle_gp_delay=1 plymouth.enable=0 systemd.show_status=true acpi_os_name=Linux acpi_backlight=vendor "acpi_osi=!Windows 2012" usbcore.autosuspend=-1 acpi=force acpi_sleep=s3_bios scsi_mod.use_blk_mq=1 video=DP-0:1920x1080-32@60 nvidia-drm.modeset=1 nvidia_drm.modeset=1 modeset=1 quiet showopts loglevel=0

I've added
Option "AccelMethod" "none"
for modesetting to disable glamor, and I also commented out the monitor information for nvidia. This results in only one active monitor, eDP-1-1, driven by the modesetting part. Interestingly, with glxinfo I see
> glxinfo | head -n 50
name of display: :0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
server glx extensions:
GLX_ARB_context_flush_control, GLX_ARB_create_context,
GLX_ARB_create_context_no_error, GLX_ARB_create_context_profile,
GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
GLX_ARB_multisample, GLX_EXT_buffer_age,
GLX_EXT_create_context_es2_profile, GLX_EXT_create_context_es_profile,
GLX_EXT_framebuffer_sRGB, GLX_EXT_import_context, GLX_EXT_libglvnd,
GLX_EXT_stereo_tree, GLX_EXT_swap_control, GLX_EXT_swap_control_tear,
GLX_EXT_texture_from_pixmap, GLX_EXT_visual_info, GLX_EXT_visual_rating,
GLX_NV_copy_image, GLX_NV_delay_before_swap, GLX_NV_float_buffer,
GLX_NV_robustness_video_memory_purge, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
GLX_SGI_swap_control, GLX_SGI_video_sync
client glx vendor string: NVIDIA Corporation
client glx version string: 1.4
client glx extensions:
GLX_ARB_context_flush_control, GLX_ARB_create_context,
GLX_ARB_create_context_no_error, GLX_ARB_create_context_profile,
GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_buffer_age,
GLX_EXT_create_context_es2_profile, GLX_EXT_create_context_es_profile,
GLX_EXT_fbconfig_packed_float, GLX_EXT_framebuffer_sRGB,
GLX_EXT_import_context, GLX_EXT_stereo_tree, GLX_EXT_swap_control,
GLX_EXT_swap_control_tear, GLX_EXT_texture_from_pixmap,
GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_NV_copy_buffer,
GLX_NV_copy_image, GLX_NV_delay_before_swap, GLX_NV_float_buffer,
GLX_NV_multisample_coverage, GLX_NV_present_video,
GLX_NV_robustness_video_memory_purge, GLX_NV_swap_group,
GLX_NV_video_capture, GLX_NV_video_out, GLX_SGIX_fbconfig,
GLX_SGIX_pbuffer, GLX_SGI_swap_control, GLX_SGI_video_sync
GLX version: 1.4
GLX extensions:
GLX_ARB_context_flush_control, GLX_ARB_create_context,
GLX_ARB_create_context_no_error, GLX_ARB_create_context_profile,
GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_buffer_age,
GLX_EXT_create_context_es2_profile, GLX_EXT_create_context_es_profile,
GLX_EXT_framebuffer_sRGB, GLX_EXT_import_context, GLX_EXT_stereo_tree,
GLX_EXT_swap_control, GLX_EXT_swap_control_tear,
GLX_EXT_texture_from_pixmap, GLX_EXT_visual_info, GLX_EXT_visual_rating,
GLX_NV_copy_image, GLX_NV_delay_before_swap, GLX_NV_float_buffer,
GLX_NV_robustness_video_memory_purge, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
GLX_SGI_swap_control, GLX_SGI_video_sync
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 960M/PCIe/SSE2
and e.g. glxgears works perfectly ...
Looks like modesetting is using GLX from nvidia, even with:
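For context, the `Option "AccelMethod" "none"` override mentioned above typically sits in a Device section of an xorg.conf.d snippet. A minimal sketch (the Identifier is a hypothetical name, not taken from this report):

```
Section "Device"
    # Hypothetical identifier; use whatever name your layout references
    Identifier "modesetting-gpu"
    Driver     "modesetting"
    # Disable glamor acceleration for the modesetting driver
    Option     "AccelMethod" "none"
EndSection
```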
> xrandr | grep DP
DP-0 disconnected primary (normal left inverted right x axis y axis)
DP-1 disconnected (normal left inverted right x axis y axis)
eDP-1-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 381mm x 214mm
DP-1-2 disconnected (normal left inverted right x axis y axis)
I currently do not see a way to enable nvidia and disable modesetting with e.g. xrandr ... I'd like to know what --setprovideroffloadsink can be used for, and where I can find an example/explanation, as the manual page is rather short here.
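For reference, --setprovideroffloadsink configures the render-offload direction of PRIME, while --setprovideroutputsource configures the reverse (display) direction. A sketch of typical usage against a running X session (the provider names NVIDIA-0 and modesetting match those reported later in this thread; whether they apply depends on the driver setup):

```shell
# List the providers the running X server knows about
xrandr --listproviders

# Render offload: applications started with DRI_PRIME=1
# are rendered on the offload source and displayed via the sink
xrandr --setprovideroffloadsink NVIDIA-0 modesetting
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# Reverse PRIME: modesetting scans out a screen rendered by NVIDIA-0
xrandr --setprovideroutputsource modesetting NVIDIA-0
```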
Using only the modesetting module, without Device and Screen sections, lets nvidia start, but then I see a blank screen with a blinking cursor in the upper left corner, even with an EDID specified. This looks more and more like a bug in the modesetting driver/module: it is required to set up the DP/eDP connected monitor, but since modesetting ignores the `Inactive' entry in the layout as well as the xrandr command line option --setprovideroutputsource to switch to nvidia as provider ... the X server creates a screen doubled in size.

Werner, I'm afraid we'll never support laptops with outputs connected to both GPUs. At least not out-of-the-box. You may find some configuration at the end which works for you, though. OTOH it looks rather good on Optimus systems, where the NVIDIA GPU is only used for rendering, thanks to suse-prime. We ship this since Leap 15.1/sle15-sp1.

(In reply to Stefan Dirsch from comment #4)
> Werner, I'm afraid we'll never support laptops with outputs connected to
> both GPUs. At least not out-of-the-box. You may find some configuration at
> the end, which works for you though.
>
> OTOH it looks rather good on Optimus systems, where the NVIDIA GPU is only
> used for rendering. Thanks to suse-prime. We ship this since Leap
> 15.1/sle15-sp1.

Hmmm ... it had worked since 2014 and now it is gone with Leap 15.1? What is the cause of this change of behaviour? How can I stop the modesetting driver from using eDP-1-1 (which it did not use on Leap 15.0!)? It is a fact that nvidia does not work without the help of the modesetting part, but now modesetting causes an ultra-wide display (1920x2160) on a 1920x1080 monitor :((( The
Option "UseDisplayDevice" "None"
indeed makes only the nvidia GLX be used (see below), but how do I check whether the modesetting driver uses the renderer of the nvidia hardware?
> xrandr
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 16384 x 16384
eDP-1-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 381mm x 214mm
1920x1080_60 60.03*+
1920x1080 60.03 + 60.01 59.97 59.96 60.03 59.93
[..]
360x202 59.51 59.13
320x180 59.84 59.32
DP-1-2 disconnected (normal left inverted right x axis y axis)
HDMI-1-1 disconnected (normal left inverted right x axis y axis)
HDMI-1-2 disconnected (normal left inverted right x axis y axis)
1920x1080_50 49.93
1368x768 59.88
1280x800 59.81
1280x720 59.86
1024x768 60.00
[...]
> glxinfo | head -n 50
name of display: :0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
server glx extensions:
GLX_ARB_context_flush_control, GLX_ARB_create_context,
GLX_ARB_create_context_no_error, GLX_ARB_create_context_profile,
GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
GLX_ARB_multisample, GLX_EXT_buffer_age,
GLX_EXT_create_context_es2_profile, GLX_EXT_create_context_es_profile,
GLX_EXT_framebuffer_sRGB, GLX_EXT_import_context, GLX_EXT_libglvnd,
GLX_EXT_stereo_tree, GLX_EXT_swap_control, GLX_EXT_swap_control_tear,
GLX_EXT_texture_from_pixmap, GLX_EXT_visual_info, GLX_EXT_visual_rating,
GLX_NV_copy_image, GLX_NV_delay_before_swap, GLX_NV_float_buffer,
GLX_NV_robustness_video_memory_purge, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
GLX_SGI_swap_control, GLX_SGI_video_sync
client glx vendor string: NVIDIA Corporation
client glx version string: 1.4
client glx extensions:
GLX_ARB_context_flush_control, GLX_ARB_create_context,
GLX_ARB_create_context_no_error, GLX_ARB_create_context_profile,
GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_buffer_age,
GLX_EXT_create_context_es2_profile, GLX_EXT_create_context_es_profile,
GLX_EXT_fbconfig_packed_float, GLX_EXT_framebuffer_sRGB,
GLX_EXT_import_context, GLX_EXT_stereo_tree, GLX_EXT_swap_control,
GLX_EXT_swap_control_tear, GLX_EXT_texture_from_pixmap,
GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_NV_copy_buffer,
GLX_NV_copy_image, GLX_NV_delay_before_swap, GLX_NV_float_buffer,
GLX_NV_multisample_coverage, GLX_NV_present_video,
GLX_NV_robustness_video_memory_purge, GLX_NV_swap_group,
GLX_NV_video_capture, GLX_NV_video_out, GLX_SGIX_fbconfig,
GLX_SGIX_pbuffer, GLX_SGI_swap_control, GLX_SGI_video_sync
GLX version: 1.4
GLX extensions:
GLX_ARB_context_flush_control, GLX_ARB_create_context,
GLX_ARB_create_context_no_error, GLX_ARB_create_context_profile,
GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_buffer_age,
GLX_EXT_create_context_es2_profile, GLX_EXT_create_context_es_profile,
GLX_EXT_framebuffer_sRGB, GLX_EXT_import_context, GLX_EXT_stereo_tree,
GLX_EXT_swap_control, GLX_EXT_swap_control_tear,
GLX_EXT_texture_from_pixmap, GLX_EXT_visual_info, GLX_EXT_visual_rating,
GLX_NV_copy_image, GLX_NV_delay_before_swap, GLX_NV_float_buffer,
GLX_NV_robustness_video_memory_purge, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
GLX_SGI_swap_control, GLX_SGI_video_sync
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 960M/PCIe/SSE2
OpenGL core profile version string: 4.5.0 NVIDIA 390.116
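The `Option "UseDisplayDevice" "None"` mentioned above belongs in the nvidia Device (or Screen) section, which tells the NVIDIA driver to start without driving any display device itself. A minimal sketch (the Identifier is a hypothetical name; a BusID matching the actual hardware may also be needed):

```
Section "Device"
    # Hypothetical identifier for the discrete GPU
    Identifier "nvidia-gpu"
    Driver     "nvidia"
    # Render only; do not attempt to drive a display output directly
    Option     "UseDisplayDevice" "None"
EndSection
```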
Documentation about PRIME is not really perfect, to say the least. :-( Best I could find is

https://wiki.archlinux.org/index.php/PRIME#PRIME_GPU_offloading

xrandr --listproviders

and

"glxinfo | grep OpenGL renderer"

may help you here ...

(In reply to Stefan Dirsch from comment #7)
> Documentation about PRIME is not really perfect, to say the least. :-( Best
> I could find is
>
> https://wiki.archlinux.org/index.php/PRIME#PRIME_GPU_offloading
>
> xrandr --listproviders
>
> and
>
> "glxinfo | grep OpenGL renderer"
>
> may help you here ...

Here we are:

/local/werner> xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x244; cap: 0x1 (Source Output); crtcs: 0; outputs: 0; associated providers: 1; name: NVIDIA-0
Provider 1: id: 0x46; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 4; associated providers: 1; name: modesetting
    output eDP-1-1
    output DP-1-2
    output HDMI-1-1
    output HDMI-1-2
/local/werner> glxinfo | grep "OpenGL renderer"
OpenGL renderer string: GeForce GTX 960M/PCIe/SSE2

It seems that the NVIDIA GLX is used even with the modesetting driver. Besides this, from the actual HTML doc I see that offloading seems to be the default, as well as

/local/werner> xrandr --setprovideroffloadsink NVIDIA-0 modesetting
X Error of failed request: BadValue (integer parameter out of range for operation)
  Major opcode of failed request: 140 (RANDR)
  Minor opcode of failed request: 34 (RRSetProviderOffloadSink)
  Value in failed request: 0x244
  Serial number of failed request: 16
  Current serial number in output stream: 17

We indeed have some magic in our Xserver in order to autoconfigure the output sinks for Optimus laptops ...

--> n_xserver-optimus-autoconfig-hack.patch (package xorg-x11-server)

You can switch between using NVIDIA's and Mesa's GLX/libGL (Xserver module/OpenGL lib) on Leap 15.1 via prime-select. Then restart the Xserver, of course.

Any improvements with Leap 15.2-Beta or Tumbleweed?

Created attachment 846986 [details]
/etc/X11/xorg.conf.d/99-local.conf
Works flawlessly now.

With Leap 15.2, no Optimus and my own configuration as shown in the attachment.

Thanks for the (positive) feedback!
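For reference, the prime-select switch Stefan mentions (from the suse-prime package) is used roughly like this; a sketch, assuming suse-prime is installed, and the X server must be restarted (typically by logging out) for the change to take effect:

```shell
# Use Mesa's GLX/libGL with the integrated GPU
sudo prime-select intel

# Use NVIDIA's GLX/libGL with the discrete GPU
sudo prime-select nvidia

# Show which GPU is currently selected
prime-select get-current
```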