
RAID Setup Menu

Brenner (Newbie)
Joined: 16 Jan 2017 | Location: Michigan, USA
Posted: 16 Jan 2017 at 9:35pm
Hi, thanks for your help...

I created the RAID drive using the BIOS. There was only one way to do it that I could see.

I used a bootable USB flash drive with a Windows ISO on it. It was listed as the default boot device.

I used the Custom Windows installation.

I used the IRST 15.2 F6/"SATA Floppy Image" driver from my board's download page.

In the Windows installation I tried using the format command, and then partitioning the drive. Windows did not proceed automatically. It gave me a message telling me that it was unable to use the RAID drive as a bootable drive due to hardware limitations, and it was unable to partition the drive.

Everything worked fine right up until the point where I needed Windows to install itself on the RAID drive.

parsec (Moderator)
Joined: 04 May 2015 | Location: USA
Posted: 16 Jan 2017 at 11:27am
Originally posted by Brenner:

I have an ASRock Z170 OC Formula motherboard that I'm trying to set up with a pair of Samsung 960 EVO SSDs in a RAID 0 configuration.

I have followed all of the instructions on this forum and I am able to get all the way to the point where the Windows setup process recognizes my RAID drive, but at that point Windows gives me a message that it is unable to partition the drive as a boot drive, and I'm stuck.

This is extremely frustrating!

Can anyone offer me some advice as to what I do now?


Yes, we need to discover what you've missed or if a mistake was made. Or do you think there is something broken or not working correctly?

Virtually no information provided, so let the questions begin...

What utility or feature did you use to create the RAID 0 array?

Installing Windows (which version?) from a USB flash drive? If so, what was the entry in the boot order for the USB flash drive installation media?

Are you doing a Custom Windows installation?

Did you install the IRST 15.2 F6/"SATA Floppy Image" driver from your board's download page during the Windows installation? If you did not use version 15.2, what version did you use?

Did you attempt to format the RAID 0 array in Windows? Or was the formatting attempted automatically during the Windows installation?

I have a RAID 0 array of 960 EVOs as the Windows volume on a similar ASRock board, so it is possible to do.
Brenner (Newbie)
Joined: 16 Jan 2017 | Location: Michigan, USA
Posted: 16 Jan 2017 at 9:47am
I have an ASRock Z170 OC Formula motherboard that I'm trying to set up with a pair of Samsung 960 EVO SSDs in a RAID 0 configuration.

I have followed all of the instructions on this forum and I am able to get all the way to the point where the Windows setup process recognizes my RAID drive, but at that point Windows gives me a message that it is unable to partition the drive as a boot drive, and I'm stuck.

This is extremely frustrating!

Can anyone offer me some advice as to what I do now?
eComposer (Newbie)
Joined: 30 Dec 2016 | Location: LA
Posted: 03 Jan 2017 at 4:16pm
Originally posted by parsec:


I'm glad you have it working now. I have a feeling either clearing the UEFI/BIOS and then doing the UEFI option configuration, or setting them and applying them via Save and Exit, made the difference.

Regarding documenting the procedure for creating the RAID 0 array of NVMe SSDs, did you happen to see this:

http://asrock.pc.cdn.bitgravity.com/Manual/RAID/X99%20Taichi/English.pdf

While it might not have every last detail about the Windows installation, it has all of the RAID 0 configuration information.

IMO, Samsung and Microsoft, the manufacturers of the SSD and the OS software, should provide information about installing Windows on their products in various scenarios. I don't think it is fair to place all the responsibility for that on the motherboard manufacturer.



Parsec:

Noted, and partially agreed that Microsoft, Intel and Samsung should also provide detailed information on the Windows installation steps, but surely it would be easy to incorporate this in the ASRock documentation.

Interestingly, I think it was the doc in the link, or one like it, that I used for the UEFI steps to set up the array; that part was not a problem.  The step involving the right Intel RAID driver, and how to load it correctly, was unclear, and frankly yours and a couple of other sources were the only ones that spelled this out clearly.
Also, your advice about not swapping between RAID and AHCI was beneficial, as was the tip to clear the CMOS; surely including that cannot be that arduous for ASRock?  It just helps to make things clear.  Otherwise it's the wild west, as it is now, with multiple docs out there saying very different things, many of them misleading...  Just saying.  :)

Anyway, thanks for the pointers, I'm up and running, so no more frustration on that front at least!  :)
parsec (Moderator)
Joined: 04 May 2015 | Location: USA
Posted: 03 Jan 2017 at 1:09am
Originally posted by eComposer:

Hello parsec,

Good news - finally got the RAID 0 to work.  Essentially stripped the whole PC down to components and rebuilt it (except the CPU).  Also cleared the CMOS, and only connected the 960 EVO SSDs.  Then followed the same procedure you outlined that I'd done for the past couple of days over and over and over... LOL.

No idea what caused the challenges, but hopefully taking everything back to square one fixed whatever it was.

Re operating system, I'm running Windows 10 64-bit.

Re the SATA ports and the M.2 slots: fully aware of this, and I have been careful not to "double book" them.  My build plan took this into account.  Incidentally, I use M.2 slots 1 and 3, given the GPU covers slot 2, and thought it would make sense to minimize heat, given both may get hot if I drive them hard enough.  Heat and noise are major challenges for sound recording.  :)

Anyway, thanks for the input.  Having all the steps laid out by you was helpful; no one else commented about clearing the CMOS, which makes a lot of sense.

ASRock should surely provide this wisdom in the documentation.  It would have been so much easier having all the steps you laid out provided as a matter of course.  Although I suppose it's only the enthusiasts and "professionals" who want this kind of firepower that would be looking for this info.  Still, I'm guessing that as time moves forward, this kind of approach will become increasingly popular, once users wake up and realize the benefits, and as the price point for these kinds of SSDs becomes more accessible.


I'm glad you have it working now. I have a feeling either clearing the UEFI/BIOS and then doing the UEFI option configuration, or setting them and applying them via Save and Exit, made the difference.

Regarding documenting the procedure for creating the RAID 0 array of NVMe SSDs, did you happen to see this:

http://asrock.pc.cdn.bitgravity.com/Manual/RAID/X99%20Taichi/English.pdf

While it might not have every last detail about the Windows installation, it has all of the RAID 0 configuration information.

IMO, Samsung and Microsoft, the manufacturers of the SSD and the OS software, should provide information about installing Windows on their products in various scenarios. I don't think it is fair to place all the responsibility for that on the motherboard manufacturer.



eComposer (Newbie)
Joined: 30 Dec 2016 | Location: LA
Posted: 02 Jan 2017 at 6:32pm
Originally posted by parsec:

So you can create the RAID 0 array with the Intel utility in the UEFI/BIOS, and you can see it in the Intel utility if you restart the PC, but nothing at all in the Windows installation process? I forget, are you using Win 7?

I can't believe Samsung changed something with the 960 series... so I get to buy two 960s to check this?! Angry  I doubt that is true.

Another reason I am (was) less than thrilled about RAID 0 arrays of NVMe SSDs is that they tended to be... delicate, as I termed it. What I mean is, if you had one created with Windows installed on it, and you simply cleared the UEFI/BIOS, the RAID 0 array would fail on the following restart of the PC. That would never happen with SATA drive RAID arrays. That seems to have been fixed with an update to the UEFI, which leaves the PCIe Remapping options alone during a UEFI/BIOS clear.

If you have a Windows installation on another drive, try creating the RAID 0 array of 960s in the UEFI, and then boot from the other OS. In Windows, check if Disk Management sees the RAID 0 array and lets you format it.

Don't forget the SATA ports are shared with the M.2 slots. If you have SATA drives connected to the shared ports, the M.2 SSDs won't work correctly.



Hello parsec,

Good news - finally got the RAID 0 to work.  Essentially stripped the whole PC to components and rebuilt (except CPU).  Also cleared cmos, and only connected the 960 Evo SSDs.  The followed the same procedure as you outlined that I'd done for the past couple of days over and over and over... LOL.

No idea what caused the challenges, but hopefully taking everything to square one fixed whaterver it was.

Re operating system, I'm running Windows 10 64 bit.

Re the Sata ports and the M 2 slots, fully aware of this, and have been careful not to "double book".  My build plan took this into account.  Incidentally I use the 1 and 3 M 2 slots, given the GPU covers the "2" slot, and thought it would make sense to minimize heat given both may get hot if I drive them hard enough.  Heat and noise are major challenges for sound recording.  :)

Anyway, thanks for the input.  Having all the steps laid out by you was helpful in that no one else commented about clearing the CMOS, which makes a lot of sense.

ASRock should surely provide this wisdom in the documentation.  It would have been so much easier just having all the steps you laid out provided as a matter of course.  Although I suppose it's only the enthusiasts and "professionals" who want this kind of firepower that would be looking for this info.  Still, I'm guessing as time moves forwards, that using this kind of approach will become increasingly popular once users wake up and realize the benefits, and also as the pricepoint for these kinds of SSDs becomes more accessible.  
parsec (Moderator)
Joined: 04 May 2015 | Location: USA
Posted: 01 Jan 2017 at 8:01pm
So you can create the RAID 0 array with the Intel utility in the UEFI/BIOS, and you can see it in the Intel utility if you restart the PC, but nothing at all in the Windows installation process? I forget, are you using Win 7?

I can't believe Samsung changed something with the 960 series... so I get to buy two 960s to check this?! Angry  I doubt that is true.

Another reason I am (was) less than thrilled about RAID 0 arrays of NVMe SSDs is that they tended to be... delicate, as I termed it. What I mean is, if you had one created with Windows installed on it, and you simply cleared the UEFI/BIOS, the RAID 0 array would fail on the following restart of the PC. That would never happen with SATA drive RAID arrays. That seems to have been fixed with an update to the UEFI, which leaves the PCIe Remapping options alone during a UEFI/BIOS clear.

If you have a Windows installation on another drive, try creating the RAID 0 array of 960s in the UEFI, and then boot from the other OS. In Windows, check if Disk Management sees the RAID 0 array and lets you format it.

Don't forget the SATA ports are shared with the M.2 slots. If you have SATA drives connected to the shared ports, the M.2 SSDs won't work correctly.
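A quick way to run that Disk Management check from a script is the minimal sketch below (an illustration added here, not parsec's procedure; it assumes Python 3 on Windows 10, where the wmic tool is available). It lists the physical disks Windows sees:

    import subprocess

    # List the physical disks Windows sees. With IRST RAID mode active, a
    # working RAID 0 volume typically appears as ONE combined drive (an
    # "Intel Raid 0 Volume" model string) rather than two separate 960 EVOs.
    result = subprocess.run(
        ["wmic", "diskdrive", "get", "Model,Size,Status"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)

If the two SSDs still show up individually here, the array was not assembled, and Disk Management won't be able to format it either.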


eComposer (Newbie)
Joined: 30 Dec 2016 | Location: LA
Posted: 01 Jan 2017 at 6:41pm
Originally posted by parsec:

Originally posted by eComposer:

Thanks Parsec,

Interestingly, I had followed all the steps listed above (except I used a 64KB stripe per Tweaktown, and did not clear the CMOS).

I'm not sure what is going wrong, it just won't give the RAID 0 array as an option to install Windows to...

From what you're saying though, RAID 0 has performance penalties compared to just using a single SSD.

I'm mixing music with multiple channels (often 50+ depending on using "stems"), and each has multiple plugins running (including many high end emulations and samples etc, effects, amp simulators, console emulations etc etc), and I/O is key to avoid dropouts, distortion and other issues detracting from the real time audio output.

Would you suggest just using a single PCIe 3.0 x4 SSD as the boot drive vs RAID 0, as I/O is the objective here to support the best mixing conditions I can achieve?  (Essentially looking for the best streaming config available.)

I thought RAID 0 using the M.2 PCIe 3.0 x4 slots, leveraging NVMe via the ASRock Z170 Extreme7+, was supposed to deliver the best I/O, but your comment seems to indicate the reverse (that is, non-RAID vs RAID 0).

FYI:  I'm using the ASRock Thunderbolt 2 card with a high-end Thunderbolt audio interface to minimize latency etc (especially when recording multiple tracks "real time" and monitoring vs mixing).

RAID 0 or not RAID? - This would be helpful to know, since I've held off a full update to build a new config from scratch, aiming to reinstall everything on RAID 0 (with frequent backups set up to guard against RAID 0 failure).  Maybe avoiding RAID 0 is the better option then?


That's strange the RAID 0 array can't be seen by the Win 10 installer. I went through the procedure again, and I don't think I left anything out. The RAID 0 array won't be shown as a drive until after the IRST "F6" driver is installed. Can you do a "refresh" in the main Custom installation screen, as in look for drives again?

I did not mention that you must format the RAID 0 array after the F6 driver is loaded. All you do is click the New button and the installer will format the RAID array correctly, as GPT and all required partitions.

Also do not remove the USB drive with the F6 driver from the PC until the Windows installation is complete. It's been a while since I've used a RAID 0 array of 950 Pros. But I know it works. The 960 should be no different than a 950 in RAID for a Windows installation.

When the Z170 Extreme7+ board was first released, we had a thread in this forum about creating and using 950 Pros in RAID 0 arrays. Several forum members and I worked out the details ourselves. One guy had three 950 Pros in RAID 0. At that time, with the very first IRST driver (14.0...) that supported NVMe SSDs in RAID with Z170 boards, the difference between the benchmark results of two vs three SSDs was minimal, at best 500MB/s faster for sequential read speed. That guy was disappointed, but we never could improve the results. Also, anything less than a 64K stripe size would result in terrible benchmark results with 950 Pros in RAID 0. At that time with IRST version 14, we all agreed the 128K stripe size was the best. If that has changed with newer versions of IRST, great.

Personally, I always configure a full UEFI booting installation, meaning CSM set to Disabled. The only problem with that is your video source must be GOP compatible, a UEFI booting protocol. Older video cards (pre-Nvidia 700 series) may not be GOP compatible without a VBIOS update. No idea about ATI/AMD video cards. EVGA 600 series cards needed a VBIOS update to be GOP compatible, but it worked. Intel integrated graphics is GOP compatible since Sandy Bridge.

Regarding the articles about how fast and great RAID 0 arrays of NVMe SSDs are: By all means, be my guest and use them! The only way to really know what they are like is to have and use them.

I'll make one comment about the articles, the graphs in particular. Yes, you can see the clear difference in the benchmark tests with the RAID 0 arrays, with their multi-hundreds of thousands of IOPS. But check the axis of the graphs labeled Queue Depth (QD). QD is the number of outstanding IO requests waiting to be serviced by the drive or RAID array. NVMe SSDs have even better high-QD performance, and better 4K random read performance, than SATA SSDs.

It is well known that in home PC usage, since even a single SSD is so fast, the number of outstanding IO requests is rarely, if ever, four (a queue depth of 4, or QD = 4). That statistic was gathered with SATA SSDs.

Notice in the test graphs, the maximum QD=32 for IOPS, and for latency, QD=16. Unless you are hosting a website on your PC, or running database queries against millions of data records, you'll never be doing IO at even QD=4. In short, yes the performance potential is there, but most people never use it. I can't predict what benefits you will get from your usage case, but do you think you will ever use the ~200,000 IOPS (IO operations per second) of a single NVMe SSD? Do we need the 400,000+ IOPS of the RAID 0 array?

I'm also very certain that a RAID 0 array of NVMe SSDs will not boot Windows faster than a single identical NVMe SSD. The same is true of SATA SSDs in RAID 0. From a cold start/boot, run Task Manager and click the Startup tab. Check the Last BIOS Time at the top right.

Interesting.

Startup speed to me is really not an issue, marginal improvements really don't matter.

The key objective is seamless mixing with heavy sound production workloads:

The killer there is that music is sequential, and basically failure depends on the "slowest ship" - particularly in recording, although mixing around 100 separate tracks, each with multiple plugins, eats up CPU, RAM, and especially storage I/O.

(Examples:  VU meter emulations, compressors, console emulations, EQ emulations, amp sims/samples etc, and multiple applications like Kontakt, BFD 3, Cinesamples, Cinestrings, and all the other sample apps, plus multiple channel bus setups, and a whole cadre of master channel processors, all running at the same time.)

Hard drives were the Achilles' heel of mixing large pieces in real time.  SSDs helped to ameliorate this, and the capability to utilize Thunderbolt in the Intel space helped a lot too.

Hence it may well be that 400,000+ IOPS will help avoid dropouts, clicks, distortion, and specific plugins failing or crashing...  The challenge is that ALL key areas (CPU, RAM, storage, and audio interface I/O) must never stall in real time; a delay kills the flow, and can ruin live performances that are being recorded.

As you can imagine, the variables are very wide-ranging, so my aim is to minimize latency and optimize I/O in all respects, aiming for seamless playback/mixing/mastering/recording.

Hope that makes sense.  
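To put a rough number on that streaming load, here is a back-of-the-envelope sketch (the figures are assumptions for illustration: 100 simultaneous tracks at 96kHz/24-bit, not eComposer's exact session specs):

    # Assumed session: 100 simultaneous tracks of 96kHz, 24-bit PCM audio.
    TRACKS = 100
    SAMPLE_RATE = 96_000      # samples per second, per track
    BYTES_PER_SAMPLE = 3      # 24 bits = 3 bytes

    bytes_per_sec = TRACKS * SAMPLE_RATE * BYTES_PER_SAMPLE
    print(f"{bytes_per_sec / 1e6:.1f} MB/s sustained")   # ~28.8 MB/s

At under 30 MB/s sustained, the raw streaming bandwidth is a small fraction of what even a single NVMe SSD delivers, which fits parsec's point below that the huge IOPS numbers of a RAID 0 array mostly go unused in this kind of workload; consistency of latency matters more than peak throughput.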

FYI:  Still no joy for the RAID 0.  Something is still impeding this, and I honestly don't know what it is yet...  

Greatly appreciate your input; good to know the history and get a feel for what others have discovered so far.  I do want to try the RAID, then load it with my most demanding projects; that will tell us a lot about performance.

Postscript:  Personally, I think this is a very exciting time, where you can effectively have powers of sound production that not long ago would have cost hundreds of thousands, if not millions, of dollars in physical hardware, achieved now at a tiny fraction of those costs.  :)  I'm sure this translates to other areas at a similar level too - ASRock has done some amazing development in this space, and I'm glad I've been using their motherboards for many years now!
parsec (Moderator)
Joined: 04 May 2015 | Location: USA
Posted: 01 Jan 2017 at 11:29am
Originally posted by eComposer:

Thanks Parsec,

Interestingly, I had followed all the steps listed above (except I used a 64KB stripe per Tweaktown, and did not clear the CMOS).

I'm not sure what is going wrong, it just won't give the RAID 0 array as an option to install Windows to...

From what you're saying though, RAID 0 has performance penalties compared to just using a single SSD.

I'm mixing music with multiple channels (often 50+ depending on using "stems"), and each has multiple plugins running (including many high end emulations and samples etc, effects, amp simulators, console emulations etc etc), and I/O is key to avoid dropouts, distortion and other issues detracting from the real time audio output.

Would you suggest just using a single PCIe 3.0 x4 SSD as the boot drive vs RAID 0, as I/O is the objective here to support the best mixing conditions I can achieve?  (Essentially looking for the best streaming config available.)

I thought RAID 0 using the M.2 PCIe 3.0 x4 slots, leveraging NVMe via the ASRock Z170 Extreme7+, was supposed to deliver the best I/O, but your comment seems to indicate the reverse (that is, non-RAID vs RAID 0).

FYI:  I'm using the ASRock Thunderbolt 2 card with a high-end Thunderbolt audio interface to minimize latency etc (especially when recording multiple tracks "real time" and monitoring vs mixing).

RAID 0 or not RAID? - This would be helpful to know, since I've held off a full update to build a new config from scratch, aiming to reinstall everything on RAID 0 (with frequent backups set up to guard against RAID 0 failure).  Maybe avoiding RAID 0 is the better option then?


That's strange the RAID 0 array can't be seen by the Win 10 installer. I went through the procedure again, and I don't think I left anything out. The RAID 0 array won't be shown as a drive until after the IRST "F6" driver is installed. Can you do a "refresh" in the main Custom installation screen, as in look for drives again?

I did not mention that you must format the RAID 0 array after the F6 driver is loaded. All you do is click the New button and the installer will format the RAID array correctly, as GPT and all required partitions.

Also do not remove the USB drive with the F6 driver from the PC until the Windows installation is complete. It's been a while since I've used a RAID 0 array of 950 Pros. But I know it works. The 960 should be no different than a 950 in RAID for a Windows installation.

When the Z170 Extreme7+ board was first released, we had a thread in this forum about creating and using 950 Pros in RAID 0 arrays. Several forum members and I worked out the details ourselves. One guy had three 950 Pros in RAID 0. At that time, with the very first IRST driver (14.0...) that supported NVMe SSDs in RAID with Z170 boards, the difference between the benchmark results of two vs three SSDs was minimal, at best 500MB/s faster for sequential read speed. That guy was disappointed, but we never could improve the results. Also, anything less than a 64K stripe size would result in terrible benchmark results with 950 Pros in RAID 0. At that time with IRST version 14, we all agreed the 128K stripe size was the best. If that has changed with newer versions of IRST, great.
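To make the stripe-size discussion concrete, here is a small illustrative sketch (generic RAID 0 arithmetic, not IRST's actual internal implementation) of how a logical byte offset maps onto the member drives:

    def raid0_location(offset, stripe_size=128 * 1024, drives=2):
        """Map a logical byte offset to (member drive, offset on that drive)."""
        stripe_index = offset // stripe_size
        drive = stripe_index % drives
        # Full rows of stripes already laid down on this drive, plus the
        # position inside the current stripe.
        local = (stripe_index // drives) * stripe_size + offset % stripe_size
        return drive, local

    # A 4KB read always lands inside a single stripe (only one drive busy),
    # while a large sequential read alternates stripes across both SSDs.
    print(raid0_location(0))           # (0, 0)
    print(raid0_location(128 * 1024))  # (1, 0)
    print(raid0_location(256 * 1024))  # (0, 131072)

This is one way to see why small random reads gain little from RAID 0 (they never span drives), while large sequential transfers can keep both SSDs busy at once.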

Personally, I always configure a full UEFI booting installation, meaning CSM set to Disabled. The only problem with that is your video source must be GOP compatible, a UEFI booting protocol. Older video cards (pre-Nvidia 700 series) may not be GOP compatible without a VBIOS update. No idea about ATI/AMD video cards. EVGA 600 series cards needed a VBIOS update to be GOP compatible, but it worked. Intel integrated graphics is GOP compatible since Sandy Bridge.

Regarding the articles about how fast and great RAID 0 arrays of NVMe SSDs are: By all means, be my guest and use them! The only way to really know what they are like is to have and use them.

I'll make one comment about the articles, the graphs in particular. Yes, you can see the clear difference in the benchmark tests with the RAID 0 arrays, with their multi-hundreds of thousands of IOPS. But check the axis of the graphs labeled Queue Depth (QD). QD is the number of outstanding IO requests waiting to be serviced by the drive or RAID array. NVMe SSDs have even better high-QD performance, and better 4K random read performance, than SATA SSDs.

It is well known that in home PC usage, since even a single SSD is so fast, the number of outstanding IO requests is rarely, if ever, four (a queue depth of 4, or QD = 4). That statistic was gathered with SATA SSDs.

Notice in the test graphs, the maximum QD=32 for IOPS, and for latency, QD=16. Unless you are hosting a website on your PC, or running database queries against millions of data records, you'll never be doing IO at even QD=4. In short, yes the performance potential is there, but most people never use it. I can't predict what benefits you will get from your usage case, but do you think you will ever use the ~200,000 IOPS (IO operations per second) of a single NVMe SSD? Do we need the 400,000+ IOPS of the RAID 0 array?
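As a rough illustration of what those QD figures mean, here is a toy sketch (the file path is a placeholder, and it reads through the OS cache, which real benchmark tools such as CrystalDiskMark deliberately bypass, so treat the numbers as illustrative only). It keeps N reads in flight and reports approximate IOPS:

    import os, random, time
    from concurrent.futures import ThreadPoolExecutor

    PATH = "testfile.bin"      # placeholder: a large existing file on the array
    BLOCK = 4096               # 4K reads, as in the benchmark graphs
    READS_PER_WORKER = 2000

    def worker(size):
        # Each worker keeps one read outstanding, so N workers approximate QD=N.
        with open(PATH, "rb", buffering=0) as f:
            for _ in range(READS_PER_WORKER):
                f.seek(random.randrange(0, size - BLOCK))
                f.read(BLOCK)

    size = os.path.getsize(PATH)
    for qd in (1, 4, 32):
        start = time.perf_counter()
        with ThreadPoolExecutor(max_workers=qd) as pool:
            for _ in range(qd):
                pool.submit(worker, size)
        elapsed = time.perf_counter() - start
        print(f"QD={qd:>2}: about {qd * READS_PER_WORKER / elapsed:,.0f} IOPS")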

I'm also very certain that a RAID 0 array of NVMe SSDs will not boot Windows faster than a single identical NVMe SSD. The same is true of SATA SSDs in RAID 0. From a cold start/boot, run Task Manager and click the Startup tab. Check the Last BIOS Time at the top right.
eComposer (Newbie)
Joined: 30 Dec 2016 | Location: LA
Posted: 01 Jan 2017 at 9:10am
Hmmm, this is one of several interesting conclusions drawn by Tweaktown:

[quoted Tweaktown benchmark conclusion not preserved]

While this isn't an exact test of the 960 EVO PCIe 3.0 x4 SSD, it's certainly an indication of potential performance, with RAID 0 outperforming a single drive.
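For anyone who wants a rough sanity check of that claim on their own array versus a single drive, here is a minimal sequential-read sketch (the path is a placeholder; OS caching will inflate repeat runs, which proper benchmark tools avoid):

    import time

    PATH = "bigfile.bin"    # placeholder: a multi-GB file on the volume under test
    CHUNK = 1024 * 1024     # 1 MiB sequential reads

    total = 0
    start = time.perf_counter()
    with open(PATH, "rb", buffering=0) as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            total += len(chunk)
    elapsed = time.perf_counter() - start
    print(f"{total / elapsed / 1e6:,.0f} MB/s sequential read")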

I'm still not clear yet, though, on other choke points for I/O, such as overall bus limitations, CPU capacity, RAM, and how the variety of audio software programs performs, in the complex situations that real-time mixing throws up, in terms of performance needs and actual real-world outcomes.

Still, I have a strong suspicion that combining Thunderbolt throughput (to a Thunderbolt audio interface) with fast M.2 PCIe RAID arrays and an overclocked CPU and RAM on the Z170 Extreme7+ is going to outperform most other PC-based approaches at this point in time - specifically for what I'm doing with mixing multiple audio channels via DAWs and all the processing associated with audio plugins (compressors, EQ, console emulations, amp simulations etc etc).  It's just taking time to work this out, so any perspectives on this would be most welcome!  :)