| High VRAM Allocation on Ryzen APUs for Machine Learning
 
 Printed From: ASRock.com
 Category:  Media & User's Review
 Forum Name:  AMD Motherboards
 Forum Description:  ASRock AMD Motherboards
 URL: https://forum.asrock.com/forum_posts.asp?TID=54722
 Printed Date: 01 Nov 2025 at 7:02am
 Software Version: Web Wiz Forums 12.04 - http://www.webwizforums.com
 
 
Topic: High VRAM Allocation on Ryzen APUs for Machine Learning
Posted By: AsRockML
Subject: High VRAM Allocation on Ryzen APUs for Machine Learning
 Date Posted: 07 Aug 2024 at 11:26am
 
 
        
          | Hello ASRock Community, 
I'm currently exploring the potential of using Ryzen APUs (e.g., the 5000 through 8000 series) for machine learning applications. Specifically, I am interested in configuring the VRAM allocation in the BIOS to support more than 16 GB of VRAM. This capability would be beneficial for loading large models and datasets.
 
I have seen various discussions about increasing VRAM allocation using the "UMA Frame Buffer Size" setting in the BIOS. However, detailed user reports of allocations higher than 16 GB are sparse.
 
 My questions are:
 
1.     Are there any documented cases where users have successfully set the VRAM allocation above 16 GB (e.g., to 64 GB) on Ryzen APUs?
 2.     Has this led to successful system boot and stable operation under such configurations?
 3.     Are there specific BIOS settings or configurations that need to be adjusted to achieve this?
 
 Are there specific models of motherboard that are more flexible in this regard? The motherboards of particular interest for this inquiry include:
 
- ASRock X670E Steel Legend
- ASRock B650E Steel Legend
- ASRock B650 Pro RS
 
 The goal is to understand the feasibility and potential performance benefits of such configurations for machine learning tasks.
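To put the 16 GB figure in context, a common rule of thumb for LLM memory needs is parameters × bits-per-weight ÷ 8, plus some headroom for the KV cache and activations. A minimal sketch (the 20% overhead factor is an assumption, not a measured value):

```python
def model_memory_gb(params_billion, bits_per_weight, overhead=0.2):
    """Rough memory estimate for an LLM: weight bytes plus ~20% headroom
    for KV cache and activations (the overhead factor is a guess)."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 70B model at 4-bit quantization needs on the order of:
print(round(model_memory_gb(70, 4)))  # 42 (GB) -- well above a 16 GB frame buffer
```

By the same estimate, even an 8B model at 16-bit precision needs roughly 19 GB, which is why frame-buffer allocations above 16 GB are interesting for this use case.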
 
 Any insights or experiences from the community or ASRock support team would be greatly appreciated.
 
 Thank you!
 
 |  
 
 Replies:
 Posted By: Skybuck
 Date Posted: 17 Oct 2024 at 8:14pm
 
 
        
| All I can tell you is that I can run 70B models with Ollama on my 7950X with 128 GB of RAM.
I do have an RTX 4070 Ti installed, with 12 GB of VRAM.

However, I believe Ollama simply uses the main system (DDR5) RAM for a model that large.

The computation most likely happens on the CPU cores, so the integrated GPU plays no role...
 |  
 
 |