I just got a second-hand laptop. It came with Windows 10 installed; one hard disk and two NVMe SSDs were detected. These two SSDs are:

- THNSN51T02DUK NVMe TOSHIBA Bus nbr 1, Target Id 0, LUN 0
- THNSN51T02DUK NVMe TOSHIBA Bus nbr 3, Target Id 0, LUN 0

I then replaced Windows with FreeBSD -CURRENT, and these NVMe drives cannot be used:

# grep -i nvme /var/run/dmesg.boot
nvme0: <Generic NVMe Device> mem 0xdd600000-0xdd603fff irq 16 at device 0.0 on pci3
nvme1: <Generic NVMe Device> mem 0xdd400000-0xdd403fff irq 16 at device 0.0 on pci6

# cat /boot/loader.conf
kern.geom.label.disk_ident.enable="0"
kern.geom.label.gptid.enable="0"
cryptodev_load="YES"
zfs_load="YES"
nvme_load="YES"
nvd_load="YES"

# nvmecontrol devlist
nvme0: IDENTIFY (06) sqid:0 cid:0 nsid:0 cdw10:00000001 cdw11:00000000
nvme0: ABORTED - BY REQUEST (00/07) crd:0 m:0 dnr:0 sqid:0 cid:0 cdw0:0
nvme1: IDENTIFY (06) sqid:0 cid:0 nsid:0 cdw10:00000001 cdw11:00000000
nvme1: ABORTED - BY REQUEST (00/07) crd:0 m:0 dnr:0 sqid:0 cid:0 cdw0:0

# ls -l /dev/nvme*
crw------- 1 root wheel 0x36 Mar 22 18:49 /dev/nvme0
crw------- 1 root wheel 0x37 Mar 22 18:49 /dev/nvme1

# ls -l /dev/nda*
ls: /dev/nda*: No such file or directory

Full dmesg and devinfo outputs are on https://wiki.freebsd.org/Laptops/Dell_Alienware_17R4.

Any tips to make them usable?
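For reference, a couple of generic diagnostics that may help narrow down where the attach fails (just a sketch; the device names come from the dmesg lines above):

# Show the NVMe controllers and their PCIe capability registers
# (link status, ASPM, etc.)
pciconf -lvc | grep -A 20 nvme

# Ask one controller to identify itself directly; this may fail with
# the same ABORTED status seen in the devlist output above
nvmecontrol identify nvme0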
I have tried many things: BIOS tweaking, with or without hw.nvme.use_nvd, with vmd_load, etc., but no satisfactory result so far.

Note: initially a device /dev/ntfs was shown and locked one NVMe. It was caused by KDE, and the device was freed when I booted the machine without starting KDE.

This is very strange: at some point /dev/nvd0 and /dev/nvd1 were there, nvmecontrol identify was OK for both devices, gpart destroy of the Windows partitions, gpart create and gpart add -t freebsd-zfs were all successful, and I created two zpools on them. But they are not persistent! After a reboot, I sometimes see only one of them, and sometimes none…

Remark 1: when I can see a zpool, it is always the same one. And yes, when the machine ran Windows, both SSDs were always active.

Remark 2: it seems that the chances of getting at least one NVMe are greater after a cold boot, and the chances of seeing none of them are greater after a reboot.
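For the record, the tunables mentioned above translate to /boot/loader.conf entries like these (shown only to illustrate what was toggled between boots, not a recommended configuration):

# Use the CAM-based nda(4) driver instead of nvd(4)
hw.nvme.use_nvd=0

# Load the Intel Volume Management Device driver, in case the SSDs
# sit behind a VMD domain
vmd_load="YES"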
Created attachment 241246 [details]
Verbose dmesg after a successful boot

After this boot, the two NVMe drives were usable.
Created attachment 241247 [details]
Verbose dmesg after an unsuccessful boot

After this boot, none of the NVMe drives were shown. Same configuration as for the previous file.
(In reply to Thierry Thomas from comment #1)

For completeness, here is the /boot/loader.conf giving the best results (as described in comment #1):

kern.geom.label.disk_ident.enable="0"
kern.geom.label.gptid.enable="0"
cryptodev_load="YES"
zfs_load="YES"
nvme_load="YES"
nvd_load="YES"
#vmd_load="YES"
hw.vga.textmode=1
nvidia-modeset="YES"
# No success with nda(4)
#hw.nvme.use_nvd=0
# See PR 264172
#hw.pci.enable_pcie_hp=1
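In case it helps, a quick way to check after each boot which block driver (if any) picked up the namespaces; this is only a verification step, not part of the configuration:

# nvd(4) exposes namespaces as /dev/nvdN, nda(4) as /dev/ndaN;
# which one is used is controlled by the hw.nvme.use_nvd tunable
ls -l /dev/nvd* /dev/nda*

# The raw controllers, when attached
nvmecontrol devlist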
(In reply to Thierry Thomas from comment #4)

I'm not sure, but since you're not using `nda(4)`, can you comment out `nvme_load="YES"` in your /boot/loader.conf? I have an Intel Optane M.2 SSD using the nvd(4) driver and it works great (on 13.1).
(In reply to Zhenlei Huang from comment #5)

You are right: this is not necessary, because the nvme module is already in the kernel. However, the result is the same with or without it.
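For anyone who wants to double-check, the drivers compiled into the kernel can be listed with kldstat (a verification hint, nothing more):

# nvme/nvd appear here as built-in modules of /boot/kernel/kernel
kldstat -v | grep -i nvme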
Marcus Oliveira reported a similar problem in https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=262969#c15

But the question remains: with the same PSU, why are these NVMe drives usable on Windows and not on FreeBSD?
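One experiment that might help separate a driver issue from a platform/power issue (only a suggestion, not something already tried above): on a boot where the controllers attach but do not answer, reset them from userland and retry:

# Ask the nvme(4) driver to reset the controller, then re-check
nvmecontrol reset nvme0
nvmecontrol identify nvme0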
I might add that the problem also arises on Linux... I've tried every distribution and hypervisor you can think of. It only works on Windows.

Marcus Oliveira