
High VRAM Allocation on Ryzen APUs for Machine Learning

AsRockML
Newbie

Joined: 07 Aug 2024
Status: Offline
Points: 20
    Posted: 07 Aug 2024 at 11:26am
Hello ASRock Community,

I'm currently exploring the potential of using Ryzen APUs (e.g., the 5000 through 8000 series) for machine learning applications. Specifically, I am interested in configuring the VRAM allocation settings in the BIOS to support more than 16 GB of VRAM. This capability would be beneficial for loading large models and datasets.

I have seen various discussions about increasing VRAM allocation using the "UMA Frame Buffer Size" setting in the BIOS. However, detailed user experiences about achieving allocations higher than 16 GB are sparse.
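If anyone has pushed the setting higher, a minimal way to confirm what the integrated GPU actually exposes would be something like the sketch below (assuming a ROCm/HIP build of PyTorch, where the torch.cuda API is backed by HIP on AMD hardware, and that the APU is device 0):

    # Minimal check of how much memory the iGPU exposes after changing
    # the UMA Frame Buffer Size in the BIOS.
    # Assumes a ROCm/HIP build of PyTorch.
    import torch

    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)
        print(f"Device: {props.name}")
        print(f"VRAM:   {props.total_memory / 2**30:.1f} GiB")
    else:
        print("No ROCm/HIP device visible to PyTorch")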

My questions are:

     1.     Are there any documented cases where users have successfully set the VRAM allocation above 16 GB on Ryzen APUs, e.g., to 64 GB?
     2.     Has this led to successful system boot and stable operation under such configurations?
     3.     Are there specific BIOS settings or configurations that need to be adjusted to achieve this?

Are there specific motherboard models that are more flexible in this regard? The motherboards of particular interest for this inquiry include:

     •  ASRock X670E Steel Legend
     •  ASRock B650E Steel Legend
     •  ASRock B650 Pro RS

The goal is to understand the feasibility and potential performance benefits of such configurations for machine learning tasks.

Any insights or experiences from the community or ASRock support team would be greatly appreciated.

Thank you!
Skybuck
Senior Member

Joined: 18 Apr 2023
Status: Offline
Points: 1215

Posted: 17 Oct 2024 at 8:14pm
All I can tell you is that I can run 70B models with ollama on my 7950X with 128 GB of RAM.

I do have an RTX 4070 Ti installed, with 12 GB of VRAM.

However, I believe ollama will simply use the PC's main DDR5 system RAM.

The computation most likely happens on the CPU cores, so the integrated GPU plays no role.
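For a rough sense of why that works, here is a back-of-the-envelope sketch (illustrative numbers only; real GGUF quantizations use mixed bit widths and add KV-cache and runtime overhead on top of the weights):

    # Rough weight-memory estimate for a 70B-parameter model at common precisions.
    # Illustrative only; actual ollama quantizations and overheads vary.
    PARAMS = 70e9

    bytes_per_param = {
        "fp16": 2.0,
        "q8_0": 1.0,
        "q4":   0.5,
    }

    for fmt, b in bytes_per_param.items():
        print(f"{fmt:>5}: ~{PARAMS * b / 2**30:.0f} GiB of weights")

    # q4 comes out around 33 GiB: comfortably inside 128 GB of system RAM,
    # but far beyond a 12 GB GPU, so most layers end up running on the CPU.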