
RAID Setup Menu

parsec (Moderator)
Posted: 31 Dec 2016 at 2:12am
Originally posted by TwymanD:

Thanks, eComposer. I can see in the video the menu I am missing in my BIOS options... sucks. I feel ASRock just needs to do a firmware update for the X99 Taichi. I built my entire computer around this motherboard. First I had to send back the i7-6800 because it doesn't have 40 lanes, meaning it could not even see both PCIe NVMe SSDs. Now that I have the i7-6850, I find out I can't even RAID it. FML. I am now forced to decide:
1. Send the ASRock X99 Taichi motherboard and i7-6850 LGA2011 CPU back, and reorder (AGAIN).
2. See if I can get ASRock to update the firmware to enable the M.2 slots to RAID 0.
3. Figure out if a software RAID is an option.
4. Live with it, which sucks the most, as I paid for a high-performance rig. Just knowing my two M.2s aren't in RAID 0 will fester and build in my mind as hatred for this situation and ASRock/Newegg.


Sorry to say, we can't do everything we'd like to do on every PC platform, particularly with very new types of hardware like NVMe SSDs. We cannot ignore the details of the hardware specifications, like the number of PCIe lanes available, or the features the chipset provides and supports. If we don't closely check every detail of a board's specifications, and the specs of the processors we can use, we are at fault when the hardware does not provide the features we want.

The features a board supports are not controlled by the motherboard manufacturer. All features are provided by the CPU and chipset used on a board, plus any required software, by the manufacturer of the CPU and chipset; that is Intel in this case. A motherboard manufacturer cannot change the features and limitations of the CPU and chipset. Motherboard manufacturers do not write the OS drivers or Option ROMs included in a board's firmware; that is strictly done by the CPU and chipset manufacturer. Assuming anything about these things is a mistake, and hating ASRock and Newegg for it does not make sense.

The ONLY chance, which I highly doubt is even possible, is if Intel, and ONLY Intel, found a way to provide IRST RAID 0 for PCIe NVMe SSDs on the X99 chipset. Given the differences between the X99 and Z170 chipsets, that will never happen, IMO. I suggest you don't hope for or plan on that happening.

The Z170 and the other Intel 100 series chipsets that support RAID are the first and ONLY platforms that can use PCIe NVMe SSDs in IRST RAID. NVMe is a new storage protocol, different from SATA, with its own driver and with storage controllers that are part of the SSD itself. SATA controllers are part of the board's chipset, not part of a SATA drive. How Intel managed to get their IRST RAID driver to work with NVMe SSDs is a miracle, but there is much more involved than that software.

Don't expect a different motherboard manufacturer to provide RAID with PCIe NVMe SSDs on their X99 boards; they can't. If the hardware/chipset does not provide a feature, it cannot be added later.

Sorry to say, software RAID in Windows is normally not an option for a boot drive. I have NO IDEA if the following would work, but you could try to clone an existing OS installation onto a Windows software RAID array. But if that worked, we'd know about it by now, and I've never heard that it does.

Have you ever used a RAID 0 array of SATA SSDs? I have many times, and I've had a RAID 0 array of 950 Pros. Why people think RAID 0 will deliver magical, super-speed results, I don't understand; in reality, it doesn't. IF you are working with very large (100 MB to 1 GB+) files and folders all the time, then a RAID 0 array can read and write them in half the time of a single drive. Otherwise, for booting an OS, there is no difference in speed. Plus, the way the Intel IRST driver works with PCIe NVMe SSDs now, you will NOT get twice the large-file sequential read speed. The best we saw with two 950 Pros was about 3,200 MB/s. I don't expect the new 960 SSDs will do any better, again limited by the RAID software for some reason.
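If it helps to see why, here is a rough back-of-the-envelope model (an illustrative Python sketch; the per-drive throughput figures are round numbers I'm assuming, not measurements):

# Illustrative model of RAID 0 scaling, not a benchmark.
SEQ_MBPS = 2000        # assumed per-drive large-file sequential read, MB/s
RAND_4K_MBPS = 55      # assumed per-drive 4K random read at low queue depth, MB/s

def large_file_seconds(size_mb, drives):
    # A large file spans many stripes, so all member drives stream in parallel.
    return size_mb / (SEQ_MBPS * drives)

def small_file_seconds(size_kb, drives):
    # A 4-16 KB file fits inside a single stripe, so only ONE drive services
    # it; 'drives' is deliberately ignored because adding drives doesn't help.
    return (size_kb / 1024) / RAND_4K_MBPS

for drives in (1, 2):
    print(f"{drives} drive(s): 1 GB file in {large_file_seconds(1024, drives):.2f} s, "
          f"4 KB file in {small_file_seconds(4, drives) * 1000:.3f} ms")

The large transfer halves with a second drive; the small read doesn't move at all, and small reads are most of what an OS boot and everyday work look like.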
Xaltar (Moderator)
Posted: 31 Dec 2016 at 2:44am
Software RAID cannot be bootable, period. The array is only populated once the OS loads, so during boot the drives are seen as individual drives. I spent a lot of time and effort some years back trying to get a software RAID array to be bootable, and had someone rather rudely inform me of this fact, LOL. Nonetheless, the information was useful, and I was grateful they stopped me from wasting more time on the impossible.

The only way an X99 board will support NVMe RAID is if a 3rd party controller is added to the board directly by the manufacturer, or as an add-in card. I have yet to see any such controllers or add-in cards on the market; I'm not saying they don't exist, I just have not seen any (searched to no avail). I don't doubt that add-in cards with this functionality will likely appear at some later date. Then you may well be able to indulge your RAID 0 NVMe itch, but for now it is not only rather pointless but also only possible on 100 series based boards. Take a look online for RAID 0 NVMe performance results; you will quickly see that the performance is not worth the cost of the two (or more) drives.

Parsec is our resident storage guru and if he says the performance is not worth it, I believe him.
eComposer (Newbie)
Posted: 31 Dec 2016 at 6:47pm
Originally posted by parsec:


No idea which UEFI/BIOS version you have in your board, although they should all work. Newer versions are better; you should use the latest version. With earlier UEFI versions, any time you cleared the UEFI/BIOS, the RAID 0 array of PCIe NVMe SSDs would be lost, as I learned the hard way. That has since been fixed, and a new Intel IRST Option ROM has been added.

Install the two 960 EVOs. Since the M.2 slots share resources with the SATA ports, be sure you don't have any SATA drives in the SATA ports that are shared with the M.2 slots. Also, as with ANY Windows installation, do NOT have any other drives connected (powered up) in the PC.

Since I don't know what options you have set in the UEFI, clear it with the Clear CMOS jumper on the board. DO NOT SKIP THIS STEP!

Go into the UEFI/BIOS, and go to the Storage Configuration screen. Set the SATA Mode Selection to RAID. You MUST MAINTAIN this setting at all times!

If you have an earlier UEFI version, at this point you must Save and Exit the UEFI and go right back in again to continue. If you have a later UEFI version, the options I'll discuss next may already be shown; if not, Save and Exit the UEFI and go right back in again to the Storage Configuration screen. Don't skip the Save and Exit; you'll be sorry if you do.

Find the Launch Storage OpROM Policy option and set it to UEFI Only. If you don't find it in Storage Configuration, you'll need to go to the Boot screen, find the CSM option at the very bottom, and set the Launch Storage OpROM Policy option there to UEFI Only.

You should find two options called PCIe Remapping, one for each SSD in the M.2 slots that you want to use for the RAID array. Set both of those options to Enabled.

Once again, Save and Exit the UEFI, and go right back in again. Go to the Advanced screen, and at the bottom you should find an entry for Intel Rapid Storage Technology. Use that to create the RAID 0 array; I assume you know how to do that. Be sure to choose the 128K stripe size for the RAID 0 array, since that will give you the best performance. That completes the creation of the RAID 0 array. If you don't follow this procedure, you won't have a working RAID 0 array of your 960 EVOs.

Next, which IRST driver are you loading during the Windows 10 installation? Only one is correct: the SATA Floppy Image ver:15.2.0.1020 driver on your board's download page. Why they still use that terminology, floppy, I don't know, but that is the driver file you need. Of course you must unzip the download file and copy the f6flpy-x64 folder to a USB flash drive. During the Win 10 installation, connect both USB flash drives to USB ports on the board's IO panel ONLY. Also, do NOT remove either USB flash drive from the PC until your Windows 10 installation has booted completely to the Desktop for the first time.

Another detail, about your Win 10 installation USB flash drive: in the boot order in the UEFI, you MUST select the entry for the USB flash drive that reads "UEFI: <flash drive name>". That is a required step. Be sure that entry is first in the boot order, and Save and Exit to start the Win 10 installation.

You seem to know how the Win 10 installation goes: choose Custom, and you won't see the RAID 0 array until after you load the driver from the f6flpy-x64 folder on the second USB flash drive. If the RAID 0 array does not appear after loading the driver, something was not done correctly.

Not quite done yet. Back from the Load Driver screen, with the RAID 0 array now shown, you must format it. Find the New button and click it; a message about creating multiple partitions will appear. Just click OK, and the installer will format it correctly. Continue with the installation from there.

That's it.

Performance of a storage device is not captured by one speed spec. For example, the 960 EVO "sequential read" speed is ~3,200 MB/s, but that is only for large files, over 128 KB, and it varies with the size of the file. No SSD reads all data/files at 3,200 MB/s, or at its maximum sequential read speed spec. Small 4 KB to 16 KB files will be read at ~55 MB/s, which is called the "4K random read" speed. That is normal for any drive; the small-file read speed is much lower. One side effect of a RAID 0 array is some loss of 4K read speed performance, not a lot, but it will be ~50 MB/s instead of ~55 MB/s. That is normal and happens with any type of drive in a RAID 0 array.

Loading/booting an OS involves reading many…

Thanks Parsec,

Interestingly, I had followed all the steps listed above (except I used a 64 KB stripe per TweakTown, and did not clear the CMOS).

I'm not sure what is going wrong; it just won't offer the RAID 0 array as an option to install Windows to...

From what you're saying, though, RAID 0 has performance penalties compared to just using a single SSD.

I'm mixing music with multiple channels (often 50+, depending on the use of "stems"), and each has multiple plugins running (including many high-end emulations and samples, effects, amp simulators, console emulations, etc.), and I/O is key to avoiding dropouts, distortion, and other issues detracting from the real-time audio output.

Would you suggest just using a single PCIe 3.0 x4 SSD as the boot drive vs RAID 0, since I/O is the objective here to support the best mixing conditions I can achieve? (Essentially looking for the best streaming config available.)

I thought RAID 0 using the M.2 PCIe 3.0 x4 slots, leveraging NVMe via the ASRock Z170 Extreme7+, was supposed to deliver the best I/O, but your comment seems to indicate the reverse (that is, non-RAID over RAID 0).

FYI: I'm using the ASRock Thunderbolt 2 card with a high-end Thunderbolt audio interface to minimize latency (especially when recording multiple tracks in real time and monitoring vs mixing).

RAID 0 or not RAID? This would be helpful to know, since I've held off a full update to build a new config from scratch, aiming to reinstall everything on RAID 0 (with frequent backup capabilities set up to guard against RAID 0 failure). Maybe avoiding RAID 0 is the better option, then?
eComposer (Newbie)
Posted: 31 Dec 2016 at 7:16pm
I'm finding multiple technical analyses showing that RAID 0 should deliver considerable performance advantages over non-RAID, as outlined in the following publication:


Check out the whole article and the attached charts on that site; there seem to me to be distinct advantages to RAID 0 vs a single equivalent SSD, unless I'm not understanding this correctly.

Note: The I/O to the CPU, and in my case the external audio interface, will be a factor, not to mention RAM performance, but given we're just talking about the SSD equation, I'm not following why a single SSD vs RAID 0, in the configs we're discussing, would deliver better I/O for my scenario.

Interested to see more on this, as this forum seems to be turning all that I've read and understood on its head. :)
eComposer (Newbie)
Posted: 01 Jan 2017 at 9:10am
Hmmm, this is one of several interesting conclusions drawn by TweakTown:


While this isn't an exact test of the 960 EVO PCIe 3.0 x4 SSD, it's certainly an indication of potential performance, with RAID 0 outperforming a single drive.

I'm still not clear, though, on the other choke points for I/O, such as overall bus limitations, CPU capacity, RAM, and how the variety of audio software programs behave in the complex situations that real-time mixing throws up, in terms of performance needs and actual real-world outcomes.

Still, I have a strong suspicion that combining Thunderbolt throughput (to a Thunderbolt audio interface) with fast M.2 PCIe RAID arrays and an overclocked CPU and RAM on the Z170 Extreme7+ is going to outperform most other PC-based approaches at this point in time, specifically for what I'm doing: mixing multiple audio channels via DAWs, with all the processing associated with audio plugins (compressors, EQ, console emulations, amp simulations, etc.). It's just taking time to work this out, so any perspectives on this would be most welcome! :)
parsec (Moderator)
Posted: 01 Jan 2017 at 11:29am
Originally posted by eComposer: (full post quoted above)


That's strange that the RAID 0 array can't be seen by the Win 10 installer. I went through the procedure again, and I don't think I left anything out. The RAID 0 array won't be shown as a drive until after the IRST "F6" driver is installed. Can you do a refresh in the main Custom installation screen, as in look for drives again?

I did not mention that you must format the RAID 0 array after the F6 driver is loaded. All you do is click the New button, and the installer will format the RAID array correctly, as GPT with all the required partitions.

Also, do not remove the USB drive with the F6 driver from the PC until the Windows installation is complete. It's been a while since I've used a RAID 0 array of 950 Pros, but I know it works. The 960 should be no different from a 950 in RAID for a Windows installation.

When the Z170 Extreme7+ board was first released, we had a thread in this forum about creating and using 950 Pros in RAID 0 arrays. Several forum members and I worked out the details ourselves. One guy had three 950 Pros in RAID 0. At that time, with the very first IRST driver (14.0...) that supported NVMe SSDs in RAID on Z170 boards, the difference between the benchmark results of two vs three SSDs was minimal, at best 500 MB/s more sequential read speed. That guy was disappointed, but we never could improve the results. Also, anything less than a 64K stripe size would produce terrible benchmark results with 950 Pros in RAID 0. At that time, with IRST version 14, we all agreed the 128K stripe size was the best. If that has changed with newer versions of IRST, great.
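For anyone wondering what the stripe size actually does: it just determines how byte ranges map onto the member drives, round-robin. A minimal sketch (illustrative Python; the 128 KB stripe and 2-drive array are assumptions for the example):

STRIPE = 128 * 1024    # assumed 128 KB stripe size
DRIVES = 2             # assumed 2-drive RAID 0 array

def drive_for_offset(byte_offset):
    # RAID 0 lays stripes out round-robin: stripe 0 on drive 0,
    # stripe 1 on drive 1, stripe 2 on drive 0, and so on.
    return (byte_offset // STRIPE) % DRIVES

# A 64 KB read fits inside one stripe, so a single drive services it...
print(drive_for_offset(0), drive_for_offset(64 * 1024))     # 0 0
# ...while a 1 MB read spans 8 stripes and keeps both drives busy.
print({drive_for_offset(i * STRIPE) for i in range(8)})     # {0, 1}

That is also part of why a too-small stripe can hurt: every medium-sized IO gets chopped into many tiny per-drive requests.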

Personally, I always configure a full UEFI-booting installation, meaning CSM set to Disabled. The only catch is that your video source must be GOP compatible (GOP being a UEFI boot protocol). Older video cards (pre-NVIDIA 700 series) may not be GOP compatible without a VBIOS update. No idea about ATI/AMD video cards. EVGA 600 series cards needed a VBIOS update to be GOP compatible, but it worked. Intel integrated graphics has been GOP compatible since Sandy Bridge.

Regarding the articles about how fast and great RAID 0 arrays of NVMe SSDs are: By all means, be my guest and use them! The only way to really know what they are like is to have and use them.

I'll make one comment about the articles, the graphs in particular. Yes, you can see the clear difference in the benchmark tests with the RAID 0 arrays, with their many hundreds of thousands of IOPS. But check the axis of the graphs labeled Queue Depth (QD). QD is the number of outstanding IO requests waiting to be serviced by the drive or RAID array. NVMe SSDs have even better high-QD performance, and better 4K random read performance, than SATA SSDs.

It is well known that in home PC usage, since even a single SSD is so fast, the number of outstanding IO requests is rarely, if ever, four (a QD of 4, or QD = 4). That statistic was established with SATA SSDs.

Notice in the test graphs that the maximum is QD = 32 for IOPS, and QD = 16 for latency. Unless you are hosting a website on your PC, or running database queries against millions of data records, you'll never be doing IO at even QD = 4. In short, yes, the performance potential is there, but most people never use it. I can't predict what benefit you will get from your usage case, but do you think you will ever use the ~200,000 IOPS (IO operations per second) of a single NVMe SSD? Do we need the 400,000+ IOPS of the RAID 0 array?
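To put a number on that, Little's law relates the three quantities: average queue depth = IO rate x latency. A quick sketch (illustrative Python; the demand and latency figures are assumed round numbers):

# Little's law: QD = IOPS x latency (latency in seconds).
iops_demand = 2_000     # assumed IOs/s a busy desktop workload actually issues
latency_s = 100e-6      # assumed ~100 microseconds per 4K NVMe read

print(f"Average queue depth: {iops_demand * latency_s:.2f}")    # ~0.20
# To sustain QD = 32 at that latency, you would have to keep issuing:
print(f"IOPS needed for QD = 32: {32 / latency_s:,.0f}")        # 320,000

So the QD = 32 numbers in the charts imply a workload issuing hundreds of thousands of requests per second, which home use simply doesn't generate.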

I'm also very certain that a RAID 0 array of NVMe SSDs will not boot Windows faster than a single identical NVMe SSD. The same is true of SATA SSDs in RAID 0. From a cold start/boot, run Task Manager and click the Startup tab; check the Last BIOS Time at the top right.
eComposer (Newbie)
Posted: 01 Jan 2017 at 6:41pm
Originally posted by parsec: (full post quoted above)

Interesting.

Startup speed is really not an issue for me; marginal improvements there don't matter.

The key objective is seamless mixing with heavy sound production workloads:

The killer there is that music is sequential, and failure basically comes down to the "slowest ship", particularly in recording, although mixing around 100 separate tracks, each with multiple plugins, eats up CPU, RAM, and especially I/O from storage.

(Examples: VU meter emulations, compressors, console emulations, EQ emulations, amp sims/samples, and multiple applications like Kontakt, BFD 3, Cinesamples, CineStrings, and all the other sample apps, plus multiple channel bus setups and a whole cadre of master channel processors, all running at the same time.)

Hard drives were the Achilles heel of mixing large pieces in real time. SSDs helped to ameliorate this, and the ability to utilize Thunderbolt in the Intel space helped a lot too.

Hence it may well be that 400,000+ IOPS will help avoid dropouts, clicks, distortion, and specific plugins failing or crashing... The challenge is that ALL the key areas (CPU, RAM, storage, and audio interface I/O) must never introduce a real-time delay at any point. That kills the flow, and can ruin live performances that are being recorded.
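As a rough sanity check on the raw streaming side (an illustrative Python sketch; the track count and audio format are assumptions based on the session sizes I mentioned):

# Rough estimate of sustained streaming bandwidth for a large session.
tracks = 100              # assumed simultaneous mono tracks
sample_rate = 96_000      # assumed 96 kHz sample rate
bytes_per_sample = 3      # assumed 24-bit audio

mb_per_s = tracks * sample_rate * bytes_per_sample / 1_000_000
print(f"~{mb_per_s:.0f} MB/s sustained")   # ~29 MB/s
# Even with heavy sample libraries streaming on top of this, it is a small
# fraction of one NVMe SSD's sequential throughput; consistency and latency,
# not peak MB/s, look like the real constraints.

So on paper even a single NVMe drive has throughput headroom; the question is whether RAID 0 helps with the worst-case hiccups.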

As you can imagine, the variables are very wide-ranging, so my aim is to minimize latency and optimize I/O in all aspects, aiming for seamless playback/mixing/mastering/recording.

Hope that makes sense.  

FYI: Still no joy on the RAID 0. Something is still impeding it, and I honestly don't know what it is yet...

Greatly appreciate your input; it's good to know the history and get a feel for what others have discovered so far. I do want to try the RAID, then load it with my most demanding projects; that will tell us a lot about performance.

Postscript: Personally, I think this is a very exciting time, where you can have very similar sound-production power to what not long ago would have cost hundreds of thousands, if not millions, of dollars in physical hardware, for a tiny fraction of those costs. :) I'm sure this translates to other areas at a similar level too. ASRock has done some amazing development in this space, and I'm glad I've been using their motherboards for many years now!
parsec (Moderator)
Posted: 01 Jan 2017 at 8:01pm
So you can create the RAID 0 array with the Intel utility in the UEFI/BIOS, and you can see it in the Intel utility if you restart the PC, but nothing at all in the Windows installation process? I forget, are you using Win 7?

I can't believe Samsung changed something with the 960 series... so I get to buy two 960s to check this?! I doubt that is true.

Another reason I am (or was) less than thrilled about RAID 0 arrays of NVMe SSDs is that they tended to be... delicate, as I termed it. What I mean is, if you had one created with Windows installed on it, and you simply cleared the UEFI/BIOS, the RAID 0 array would fail on the following restart of the PC. That would never happen with SATA drive RAID arrays. It seems to have been fixed with a UEFI update, by leaving the PCIe Remapping options alone during a UEFI/BIOS clear.

If you have a Windows installation on another drive, try creating the RAID 0 array of 960s in the UEFI, and then boot from the other OS. In Windows, check if Disk Management sees the RAID 0 array and lets you format it.

Don't forget the SATA ports are shared with the M.2 slots. If you have SATA drives connected to the shared ports, the M.2 SSDs won't work correctly.


eComposer (Newbie)
Posted: 02 Jan 2017 at 6:32pm
Originally posted by parsec: (full post quoted above)

Hello parsec,

Good news: I finally got the RAID 0 to work. I essentially stripped the whole PC to components and rebuilt it (except the CPU). I also cleared the CMOS and connected only the 960 EVO SSDs. Then I followed the same procedure you outlined, which I'd done for the past couple of days over and over and over... LOL.

No idea what caused the problems, but hopefully taking everything back to square one fixed whatever it was.

Re the operating system: I'm running Windows 10 64-bit.

Re the SATA ports and the M.2 slots: I'm fully aware of this, and have been careful not to "double book"; my build plan took this into account. Incidentally, I use M.2 slots 1 and 3, given the GPU covers slot 2, and I thought it would make sense to minimize heat, given both SSDs may get hot if I drive them hard enough. Heat and noise are major challenges for sound recording. :)

Anyway, thanks for the input. Having all the steps laid out by you was helpful, in that no one else had mentioned clearing the CMOS, which makes a lot of sense.

ASRock should surely provide this wisdom in the documentation; it would have been so much easier having all the steps you laid out provided as a matter of course. Although I suppose it's only the enthusiasts and "professionals" who want this kind of firepower who would be looking for this info. Still, I'm guessing that as time moves forward, this kind of approach will become increasingly popular once users wake up to the benefits, and as the price point for these kinds of SSDs becomes more accessible.
parsec (Moderator)
Posted: 03 Jan 2017 at 1:09am
Originally posted by eComposer: (full post quoted above)


I'm glad you have it working now. I have a feeling that either clearing the UEFI/BIOS and then doing the UEFI option configuration, or setting the options and applying them via Save and Exit, made the difference.

Regarding documenting the procedure for creating the RAID 0 array of NVMe SSDs, did you happen to see this:

http://asrock.pc.cdn.bitgravity.com/Manual/RAID/X99%20Taichi/English.pdf

While it might not have every last detail about the Windows installation, it has all of the RAID 0 configuration information.

IMO, Samsung and Microsoft, the manufacturers of the SSD and the OS software, should provide information about installing Windows on their products in various scenarios. I don't think it is fair to place all the responsibility for that on the motherboard manufacturer.


