ASRock Forum > Technical Support > Intel Motherboards

NVMe RAID0 Speed decreased

Souths1der (Newbie)
Posted: 15 Aug 2016 at 8:37am
I'm in Lemont.

Yes, Intel, hardware level RAID array.

Yes F6->IRST Driver.

I'm using version 14.6 since that's the version I had handy.  I haven't downloaded the latest 14.8 yet.

Not only did my array survive the BIOS update, it survived the transplant to the new board, although I did have the performance degradation. It's possible I did not lose my array because on the new board with the 1.50 BIOS I had not yet configured the array; I didn't do the RAID configuration until after the update to 2.60, so the array never had to go through a BIOS update.

I did see the three drives listed in the Windows IRST program.

Update: I blew away the array, rebuilt it, and reinstalled the OS (still with 14.6). I have not installed the Windows IRST program yet, or done any other Windows-based array optimizations. I ran AS SSD and scored 3979. So it seems my speed is back up there, which also shows that something did happen to the array after all, just not a total loss, I suppose. I have no idea what that something could be, though.
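As a sanity check on the numbers in this thread, the scores work out like this (a quick sketch; 4104, 1470, and 3979 are the AS SSD totals quoted above):

```python
# AS SSD total scores quoted in this thread
baseline = 4104   # old board, fresh Win10 install (8/6/16)
degraded = 1470   # new board, after the OS transplant (8/13/16)
rebuilt  = 3979   # new board, array rebuilt + fresh OS install

def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

print(f"transplant vs baseline: {pct_change(baseline, degraded):+.1f}%")  # about -64%
print(f"rebuilt vs baseline:    {pct_change(baseline, rebuilt):+.1f}%")   # about -3%
```

So the rebuilt array landed only about 3% under the original result, while the transplanted install was down roughly 64%.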
parsec (Moderator)
Posted: 14 Aug 2016 at 9:58pm
Originally posted by Souths1der:

Thanks guys.  I was leaning toward my gut reaction to rebuild the OS because of the "hidden gems" you are alluding to.  I will probably go ahead and do that.  Thanks for easing my mind.

I don't have all the answers to your questions, but here goes with what I know:

I know for sure I was on UEFI 2.60 on the old board, and I upgraded the replacement board from UEFI 1.50 to that same version before ever booting into Windows.

The NVMe drives were installed prior to board installation.  However, my RAID0 array was not destroyed even though the settings weren't correct at first.  After making the changes to activate IRST in the UEFI, rebooting, and re-entering the BIOS, my array was recognized, and obviously I can boot into Windows, so the data was safe.

I do not have screenshots of the benchmark runs, I'm really bad about forgetting screenshots of things.  I tend to get focused on the results, and forget all the routine tasks from time to time.

The two runs were a week apart.  The 4104 was on 8/6/16, on a fresh Win10 install after switching to the B BIOS for good.  Temperature issues popped up over the next couple days, RMA was Wednesday evening.  New board came Thursday afternoon (WOW Amazon).  1470 run was yesterday, 8/13/16.

No overclocks were applied to either of these runs.


Your RAID 0 array, is that an Intel IRST RAID 0 array, or a Windows RAID array? I talked about IRST in my last post, and you did not correct me, so I just want to be 100% clear. It seems to me it is an IRST RAID 0 array.

It seems that you updated the UEFI on the new board from 1.50 to 2.60 with your 950 Pros in the board, is that correct? If so, that is the first time I've heard of a PCIe NVMe RAID 0 array surviving a UEFI update while mounted in the M.2 slots. I'm not saying it isn't true, and I'm glad your RAID 0 volume survived, but that is not what I and others have experienced. The question becomes: why did yours survive a UEFI update? Given your performance issue, maybe you experienced some damage to the RAID 0 volume.

To review, the failure of a PCIe NVMe RAID 0 array happens after a UEFI clear and after a UEFI update. It is not known why that happens (well, I don't know and have never seen an explanation, though I have some suspicions). It happens on a RAID 0 volume of 950 Pros whether it is an OS volume or not. I've tested that several times, and other 950 Pro RAID users in this forum verified they experienced it themselves. I've never tested this with the only other NVMe SSDs usually used in PCs, the Intel 750 and now the OCZ RD400; there are other Intel NVMe SSD models. Just an FYI... no, a warning for you, since losing your OS after clearing or updating your board's UEFI is not good, and is not what happens with IRST RAID and SATA SSDs.

Questions: which version of IRST did you install when you installed Windows? Or did you not install the "F6" IRST RAID driver? Did you install the Intel Rapid Storage Technology driver and utility ver:14.8.0.1042 software from your board's download page? If so, when you run the Windows IRST program, you should see your three 950s in that software.

We have yet to see any bad 950 Pros, so I highly doubt your SSDs are the problem. Also, I have not heard of or had any problems with the M.2 slots. Of course, what could be worse than the SATA data connection?

It looks like you live fairly close to me, Romeoville, is that right? I live in Bartlett, so really not that far.
Souths1der (Newbie)
Posted: 14 Aug 2016 at 7:07pm
Thanks guys.  I was leaning toward my gut reaction to rebuild the OS because of the "hidden gems" you are alluding to.  I will probably go ahead and do that.  Thanks for easing my mind.

I don't have all the answers to your questions, but here goes with what I know:

I know for sure I was on UEFI 2.60 on the old board, and I upgraded the replacement board from UEFI 1.50 to that same version before ever booting into Windows.

The NVMe drives were installed prior to board installation.  However, my RAID0 array was not destroyed even though the settings weren't correct at first.  After making the changes to activate IRST in the UEFI, rebooting, and re-entering the BIOS, my array was recognized, and obviously I can boot into Windows, so the data was safe.

I do not have screenshots of the benchmark runs, I'm really bad about forgetting screenshots of things.  I tend to get focused on the results, and forget all the routine tasks from time to time.

The two runs were a week apart.  The 4104 was on 8/6/16, on a fresh Win10 install after switching to the B BIOS for good.  Temperature issues popped up over the next couple days, RMA was Wednesday evening.  New board came Thursday afternoon (WOW Amazon).  1470 run was yesterday, 8/13/16.

No overclocks were applied to either of these runs.
Xaltar (Moderator)
Posted: 14 Aug 2016 at 4:38pm
Originally posted by parsec:


Another situation with a single 950 Pro was solved with a new Windows 10 installation. This apparently happened when a 950 Pro was connected to the motherboard and used with an existing Windows installation. The reason for the performance loss was never identified, and RAID was not being used or enabled.

[...]

One of the prices of being a moderator...

Just to elaborate a little on that: even a different BIOS version between the two boards could cause OS inconsistencies. It is one thing to update the BIOS, and another to swap to a board with a different BIOS; there are numerous hidden values in any UEFI that are particular to a given board, and Windows 10 in particular seems to be sensitive to some of these. When you set up Windows on a system, the installation process makes changes to the UEFI (the hidden values I alluded to above); if the OS was not installed on this board, there may be entries on the new one that differ from the old.

If you have a spare drive knocking about, you could try installing Windows 10 on it with the RAID array disconnected, and see if that resolves the issue (after hooking the RAID array up again). It may require another RAID array to properly set the values, though. Worth a shot, considering how quick an OS install is these days.

"Price of being a Moderator" LOL
I hear that.
parsec (Moderator)
Posted: 14 Aug 2016 at 12:44pm
Originally posted by Souths1der:

A week ago I received my first Z170 OC Formula.  I set it up with 3 x Samsung 950 Pros in RAID0 as my boot/only drive.  I ended up having two issues (BIOS A was corrupt and core temp readings were off).  I did a little bit of troubleshooting, but finally figured that since I purchased from Amazon, I would just exchange it for a new one.  I got the new board and put all the hardware into it, including placing all 3 NVMe drives in the same slots as on the previous board.  I set all the same BIOS settings and booted into Windows 10 with no issues.  BIOS A is fine and temperatures are reading properly.  It was all rather easy.

However, I ran an AS SSD benchmark on my RAID0, and the score dropped from 4104 on the old board to 1470 on the new one.  My gut is telling me to blow the array away, rebuild it, and reinstall all the software.  Has anybody else run into an issue like this, or have any ideas on what I could possibly be missing?


We have seen an occasional issue with performance dropping with 950 Pros, both single SSDs and RAID 0 arrays. The main one we see is a performance drop after waking from Windows Sleep. That remains a mystery, although it may be related to certain registers in the CPU, used by DRAM memory, whose contents are not maintained when resuming from Sleep. I don't see that as related to your problem, since a simple reboot fixes that issue.

Another situation with a single 950 Pro was solved with a new Windows 10 installation. This apparently happened when a 950 Pro was connected to the motherboard and used with an existing Windows installation. The reason for the performance loss was never identified, and RAID was not being used or enabled.

Since the Samsung Magician software cannot deal with RAID, you cannot get any information about each of the 950 Pros, like this, right:

[screenshot: Samsung Magician drive information]
You're familiar with the Z170 OC Formula and its switches, like the Slow Mode switch for the CPU that is set to On from the factory. I'm not aware of another switch or UEFI setting that would affect the speed of your RAID 0 array. I imagine you have the PCIe Remapping options set in Storage Configuration (or whatever they are called).

You would think moving an OS installation to the identical model mother board would not cause any issues, but who knows what underlying things in the Windows Registry, for example, might be causing some weird problem.

When was the last time you ran AS SSD on the RAID 0 array in the old board? How long was it between the AS SSD result of 4104 and getting the new board? I'm wondering if something happened to your RAID array before you moved to the new board. You mentioned you thought you had a corrupt main UEFI; I wonder if that is related to this situation.

I'm curious about what caused the difference in the AS SSD results between the two boards. Different overclocks, memory speeds, chipset and CPU power-saving options, and Windows Power Plan settings, alone or combined, should not cause that much of a performance difference. I'd love to see screenshots of both AS SSD results, if possible.

Did you have different UEFI versions on the old and new boards? Or is that information lost forever... Any chance your new board has the 2.50 UEFI with the "Update NTFS module" change? I don't know if that makes a difference or not, just an observation.

RAID with NVMe PCIe SSDs is still very new, IRST version 14 is the first to give us RAID with NVMe SSDs. It's a miracle it works IMO, and thanks to Intel (and ASRock) for giving us this capability. But compared to IRST RAID with SATA drives, IRST RAID with PCIe NVMe SSDs is what I tend to call "fragile".

For example, are you aware that simply clearing the UEFI with the board's jumper or Clear CMOS button will ruin the RAID 0 array if you then simply start the PC into the UEFI? One 950 Pro RAID 0 user in this forum literally physically removes his 950 Pros from his board before clearing the UEFI, and puts them back only after re-establishing all the required UEFI settings. That is currently the only way we know of to prevent the failure of a RAID array of NVMe SSDs after a UEFI/CMOS clear. This is also true when a UEFI update is applied to the board, since that sets all the UEFI options to their default values.

Bottom line: I've got nothing I can tell you that will fix it. I wish I did. I stopped using RAID 0 with my 950 Pros because I clear and update my Z170 Extreme7+ board's UEFI all the time. One of the prices of being a moderator...
Souths1der (Newbie)
Posted: 14 Aug 2016 at 8:23am
A week ago I received my first Z170 OC Formula.  I set it up with 3 x Samsung 950 Pros in RAID0 as my boot/only drive.  I ended up having two issues (BIOS A was corrupt and core temp readings were off).  I did a little bit of troubleshooting, but finally figured that since I purchased from Amazon, I would just exchange it for a new one.  I got the new board and put all the hardware into it, including placing all 3 NVMe drives in the same slots as on the previous board.  I set all the same BIOS settings and booted into Windows 10 with no issues.  BIOS A is fine and temperatures are reading properly.  It was all rather easy.

However, I ran an AS SSD benchmark on my RAID0, and the score dropped from 4104 on the old board to 1470 on the new one.  My gut is telling me to blow the array away, rebuild it, and reinstall all the software.  Has anybody else run into an issue like this, or have any ideas on what I could possibly be missing?
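Since RAID 0 striping comes up throughout this thread: for anyone wondering why a 3 x 950 Pro stripe set benchmarks so much faster than a single drive, here is a minimal sketch of the standard RAID 0 address mapping. The 128 KB stripe size is only an illustrative assumption; IRST lets you choose the stripe size when you create the array.

```python
def raid0_map(offset, stripe_size, n_members):
    """Map a byte offset in a RAID 0 volume to (member index, member offset).

    Stripes are laid out round-robin across the member drives, which is
    why sequential throughput scales with the member count in benchmarks
    like AS SSD.
    """
    stripe = offset // stripe_size        # which stripe the offset falls in
    member = stripe % n_members           # round-robin member selection
    member_stripe = stripe // n_members   # stripe index within that member
    return member, member_stripe * stripe_size + offset % stripe_size

# Three drives, hypothetical 128 KB stripes:
STRIPE = 128 * 1024
print(raid0_map(0, STRIPE, 3))                # first stripe lands on drive 0
print(raid0_map(STRIPE, STRIPE, 3))           # second stripe on drive 1
print(raid0_map(3 * STRIPE + 5, STRIPE, 3))   # fourth stripe wraps back to drive 0
```

This round-robin layout is also why a RAID 0 volume offers no redundancy: every member holds a third of each large file, so losing the array metadata (as described above after a UEFI clear) makes all the data unreadable.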
Forum Software by Web Wiz Forums® version 12.04
Copyright ©2001-2021 Web Wiz Ltd.
