Created attachment 244369 [details]
dmesg for server in question

After updating all packages on an existing (working) 13-stable system, startx fails with "Cannot run in framebuffer mode. Please specify busIDs for all framebuffer devices". Rolling back just xorg-server to the previously installed version (21.1.8,1) from /var/cache/pkg fixes the problem.

I'm not absolutely certain, but I suspect https://reviews.freebsd.org/D39886 may be the culprit, as it appears to be the only non-trivial change that could affect my (amdgpu) configuration between the working and failing xorg-server versions.

Attaching dmesg and Xorg.0.log for the working and failing cases.
Created attachment 244370 [details] Xorg.0.log for failing case
Created attachment 244371 [details] Xorg.0.log for working case
Looks like in the failing case xf86-video-amdgpu is used (or an attempt is made to use it). What happens if you remove the package?
Removing the package didn't help: with the package removed, the rolled back version of xorg-server still worked fine, while the updated version failed with the same error, although with slightly different logfile contents (attached).
Created attachment 244405 [details] Xorg.0.log for failing case, with xf86-video-amdgpu removed
Mhm, looks like you have an ASpeed BMC in there:

[101050.850] (--) PCI: (6@0:0:0) 1a03:2000:15d9:1b8e rev 82, Mem @ 0xa4000000/16777216, 0xa5000000/262144, I/O @ 0x00000000/128, BIOS @ 0x????????/65536

Maybe something changed in Xorg autoconfiguration and it tries to use it. Can you test with a configuration file? Just generate one and remove the ASpeed video card from it.
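For concreteness, a sketch of what the edited config might keep after the ASpeed Device/Screen entries are removed, so only the amdgpu card remains. The BusID value below is a placeholder, not taken from this system; the real address can be read from the "(--) PCI:" lines in Xorg.0.log:

```
# Hypothetical sketch of the surviving Device section in xorg.conf.
# "PCI:99:0:0" is a placeholder BusID; substitute the amdgpu card's
# actual PCI address from the (--) PCI: lines in Xorg.0.log.
Section "Device"
        Identifier  "Card0"
        Driver      "amdgpu"
        BusID       "PCI:99:0:0"
EndSection
```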
Created attachment 244566 [details] Generated xorg.conf with working (older) xorg-server package
Created attachment 244567 [details] Generated xorg.conf with newer (failing) xorg-server package
Created attachment 244568 [details] xorg.conf.failing, but edited to remove ASpeed card
Created attachment 244569 [details] Xorg.0.log for failed X server start with xorg.conf.edited
Unfortunately generating and editing the config file didn't fix anything. I've attached the auto-generated config files for both the working and failing xorg-server packages, although they're identical.

As expected, the unedited auto-generated config file worked (X started successfully) with the older package but not the newer package. After editing the config file, X still failed to start with the newer package; logfile attached. Based on the logfile, it looks as though the newer xorg-server package is still trying to use the ASpeed card even though its Card0 entry has been removed from the config file.

Maybe a little bit more info about my server is in order here: I have an older Samsung monitor with both VGA and DVI inputs. The VGA input is connected to the ASpeed BMC, because the motherboard only produces video output on the BMC during early UEFI init, and it's useful as a fallback in case there's a problem with the amdgpu kld. The DVI input is hooked to the R550; I have amdgpu.ko configured to auto-load at boot, which places the console in framebuffer mode on the monitor's DVI input.

It looks as though maybe the older xorg-server was able to auto-detect that the framebuffer was running on the R550 and use that for video output, while the newer package doesn't seem to do that. And as you might expect, if I don't load amdgpu the new xorg-server package works fine on the VGA input, but of course with crappy resolution.
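For context, the boot-time auto-load described above is typically configured on FreeBSD along these lines (a sketch; the exact file contents on this machine are not shown in the thread):

```
# /etc/rc.conf (sketch): load the amdgpu KMS driver at boot, which
# switches the console to framebuffer mode on the amdgpu (DVI) output.
kld_list="amdgpu"
```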
Austin can you help here ? I'm not familiar with xorg-server internals. Thanks,
Can you please confirm whether that commit actually introduces the regression? I agree it's the most likely culprit, but it would be good to know for sure. FWIW, I had tested that on an amdgpu laptop in AMD-only mode and hadn't had issues, though that doesn't rule out a problem here.
Well, I didn't specifically do an A/B test of that change, but I'm pretty sure it was the culprit, because I just now managed to get things working with the newer xorg-server package by reading through https://badland.io/prime-configuration.md and creating /usr/local/etc/X11/xorg.conf.d/20-amdgpu.conf:

Section "OutputClass"
        Identifier "AMD"
        MatchDriver "amdgpu"
        Driver "modesetting"
        Option "PrimaryGPU" "yes"
EndSection

So it seems as though xorg-server sees my configuration as a PRIME configuration with the BMC as the "iGPU" unless it's explicitly told to use the "dGPU" as the primary output?
Weird, that file should have been installed by xf86-video-amdgpu, albeit in the /usr/local/share/X11/xorg.conf.d directory? For me I see the following from my resident ports builds:

ashafer@mick:src/nvidia-drm % cat /usr/ports/x11-drivers/xf86-video-amdgpu/work/stage/usr/local/share/X11/xorg.conf.d/10-amdgpu.conf
Section "OutputClass"
        Identifier "AMDgpu"
        MatchDriver "amdgpu"
        Driver "amdgpu"
EndSection

ashafer@mick:src/nvidia-drm % cat /usr/ports/x11-drivers/xf86-video-amdgpu/work/stage/usr/local/share/X11/xorg.conf.d/20-amdgpu.conf
Section "OutputClass"
        Identifier "AMD"
        MatchDriver "amdgpu"
        Driver "amdgpu"
        Option "PrimaryGPU" "yes"
EndSection

Interesting that it builds two, one with the PrimaryGPU option. We only add 10-amdgpu.conf to PLIST_FILES, so it's the only one installed.

I'm not really familiar with your exact machine, and X's autoconfig behavior is still confusing to me. To me it looks like it sees a bunch of the integrated VGA/etc. entries in the PCI list. Due to the new probing code enabled by D39886 it can't pick any of them to be the primary, so one needs to be marked PrimaryGPU in the OutputClass configuration.

I would try unplugging the VGA connector to the ASpeed card? Maybe X is seeing that both GPUs display to the same output and it decides it's a PRIME setup? Not totally sure there.

Does that amdgpu config file *have* to have PrimaryGPU? I assume it does, just wanting to confirm. I'd also check that the one installed in /usr/local/share/X11/xorg.conf.d is there too.
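If shipping the PrimaryGPU variant turns out to be the right fix, a hypothetical ports change (not an actual committed patch) might simply add the already-staged file to the packing list in x11-drivers/xf86-video-amdgpu's Makefile:

```
# Hypothetical sketch: also package the PrimaryGPU OutputClass snippet
# that the build already stages (currently only 10-amdgpu.conf is listed).
PLIST_FILES+=	share/X11/xorg.conf.d/20-amdgpu.conf
```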
*** Bug 275501 has been marked as a duplicate of this bug. ***