GpuRamDrive: GPU VRAM as RAMdisk: Game changer!?

Logic
Level 5
Posts: 47
Joined: Mon Oct 29, 2018 2:12 pm

Re: GpuRamDrive: GPU VRAM as RAMdisk: Game changer!?

Post by Logic »

tverweij wrote: Thu Mar 02, 2023 11:40 am Personally, I think this request is a waste of resources.

If you have a GPU with lots of RAM (meaning an expensive one), you either use it or you don't.
If you use it, PrimoCache can't.
If you don't use it: sell it and buy RAM with the money you get for it - PrimoCache can then use that RAM.
What absolute crap!

If all I wanted to do was play games, I'd buy a bucking X Fox!

I game, but that is less than 30% of the system's uptime.
The rest of the time the GPU and its VRAM are sitting idle.

i.e.:
At least 70% of the time, that 8 GB of VRAM is sitting idle. (At boot, it's ALWAYS idle.)
Those who only want to game sometimes can't rent and install a gucking frafics card for an hour or three on a daily basis.
Therefore, there are a huge number of systems where the VRAM is sitting idle most of the time...

Are you perhaps under the impression that devs (like the excellent one/s at Romex) are incapable of writing software that senses when the GPU RAM is required elsewhere and stops or reduces its use as a write-through cache?
Logic
Level 5
Posts: 47
Joined: Mon Oct 29, 2018 2:12 pm

Re: GpuRamDrive: GPU VRAM as RAMdisk: Game changer!?

Post by Logic »

Nick7 wrote: Sat Apr 01, 2023 9:26 am No matter what they allow, standard RAM is faster and cheaper.
With PCIe 5.0 x16 you can have a theoretical 64 GB/s.
But compared to standard RAM, it would still have much higher latency due to the PCIe bus.

Please do not waste time where it's not needed....
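
(Side note: the 64 GB/s figure quoted above is about right as a raw ceiling. A quick back-of-the-envelope check using nominal PCIe 5.0 numbers; this is illustrative arithmetic only, not a measurement:)

Code:

#include <cstdio>

int main() {
    // PCIe 5.0: 32 GT/s per lane, 128b/130b line encoding, x16 link.
    double gt_per_lane = 32.0;            // giga-transfers per second
    double encoding    = 128.0 / 130.0;   // usable fraction of the raw bit rate
    int    lanes       = 16;

    // Each transfer carries one bit, so bytes/s per lane = GT/s / 8 * encoding.
    double gb_per_s = gt_per_lane / 8.0 * encoding * lanes;
    printf("~%.0f GB/s raw, before packet/protocol overhead\n", gb_per_s); // ~63 GB/s
    return 0;
}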
In 2021 around 49.15 million graphics cards were sold.
In 2022 around 37.86 million graphics cards were sold.
https://www.tomshardware.com/news/nvidi ... n-2022-jpr

Now who goes looking for an uplift in I/O performance?
Those with the latest NVMe SSD, or those with spinning rust?

Those who just want to game buy Xboxes etc.
Those who want to use their computers normally and game occasionally buy these cards.
Therefore:
There are a huge number of systems where ~8 GB or more of VRAM is sitting idle ~70% of the time. (100% of the time at boot.)

If YOU don't want to waste YOUR time on this subject: feel free to skip this thread completely.
gpuuser
Level 1
Posts: 2
Joined: Sun Dec 03, 2023 5:16 pm

Re: GpuRamDrive: GPU VRAM as RAMdisk: Game changer!?

Post by gpuuser »

Communication with graphics cards has some overhead, around 20-50 microseconds, due to synchronization between the host environment (i.e. C++, CPU) and the device environment (C, GPU) when using sync calls in OpenCL/CUDA/etc. That allows roughly 20,000-50,000 operations per second from a single thread. 20,000 x 4 kB read operations per second (without any extra latency from the API, Windows, etc.) would give a maximum of about 80 MB/s from a single thread. This is not so good. But it's also not so bad compared to a Samsung 970 Evo, which only manages around 50 MB/s of 4 kB reads from a single thread. And with multi-threaded access, you could have 32 GB/s from the main graphics card and 8 GB/s from a second card.
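
For anyone who wants to measure that sync overhead on their own hardware, here is a minimal CUDA sketch (my own illustration, not GpuRamDrive or PrimoCache code; it assumes the CUDA runtime is installed, and the 4 kB block size and iteration count are arbitrary choices):

Code:

// Measure the round-trip cost of synchronous 4 kB reads from GPU VRAM.
#include <cstdio>
#include <chrono>
#include <cuda_runtime.h>

int main() {
    const size_t block = 4 * 1024;   // one 4 kB "sector"
    const int iters = 10000;

    void* dev = nullptr;
    void* host = nullptr;
    cudaMalloc(&dev, block);         // backing store in VRAM
    cudaMallocHost(&host, block);    // pinned host buffer for DMA

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i) {
        // Each call synchronizes with the device, so we pay the full
        // host<->GPU round trip every time, like a RAM disk servicing
        // one 4 kB request at a time would.
        cudaMemcpy(host, dev, block, cudaMemcpyDeviceToHost);
    }
    auto t1 = std::chrono::steady_clock::now();

    double secs = std::chrono::duration<double>(t1 - t0).count();
    printf("avg latency: %.1f us, throughput: %.1f MB/s\n",
           1e6 * secs / iters, block * iters / secs / 1e6);

    cudaFreeHost(host);
    cudaFree(dev);
    return 0;
}

On typical hardware this lands somewhere in the tens of microseconds per call, which is where the ~80 MB/s single-thread figure above comes from.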

Pros:

- longer SSD life, because you can use the graphics card's VRAM as often as you like and it won't wear out the way an SSD does

- more system RAM left for the OS = stays responsive even while gaming

Cons:

- You need an extra graphics card while gaming (which some GPGPU programmers, miners and benchmarkers already have)

- PCIe has limited bandwidth, especially in x4 slots (gen 4 x4 is only ~8 GB/s)

Nevertheless, I'd like a cache hierarchy like this:

L0: direct-mapped cache with 4/8-byte access resolution within each 4 kB cluster [1 GB]; low hit ratio, but low hit latency due to simple indexing

L1: normal LRU/LFU combo of PrimoCache [10 GB]

L2: two graphics cards [20 GB]; high latency, but higher bandwidth than a single SSD due to the combined bandwidth of two PCIe bridges

L3: SSD [100 GB], faster than HDD

HDD as the backing-store
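
If it helps to visualize, here is a hypothetical C++ sketch of that read path (the CacheTier interface, tier order and promotion policy are all invented for illustration; this is not how PrimoCache is actually implemented):

Code:

#include <cstdint>
#include <memory>
#include <optional>
#include <vector>

struct Block { std::vector<uint8_t> data; };

struct CacheTier {
    virtual ~CacheTier() = default;
    virtual std::optional<Block> lookup(uint64_t lba) = 0;  // hit or miss
    virtual void fill(uint64_t lba, const Block& b) = 0;    // promote on a miss
};

// Read path: probe L0 -> L1 -> L2 (VRAM) -> L3 (SSD), then fall back to the HDD.
Block read_block(uint64_t lba,
                 std::vector<std::unique_ptr<CacheTier>>& tiers,
                 Block (*read_from_hdd)(uint64_t)) {
    for (size_t i = 0; i < tiers.size(); ++i) {
        if (auto hit = tiers[i]->lookup(lba)) {
            for (size_t j = 0; j < i; ++j) tiers[j]->fill(lba, *hit);  // promote upward
            return *hit;
        }
    }
    Block b = read_from_hdd(lba);             // backing store (the HDD)
    for (auto& t : tiers) t->fill(lba, b);    // populate every tier on the way back
    return b;
}

The point is just that each slower tier only sees the misses of the tiers above it, so the high-latency VRAM tier would only be consulted for reads the RAM tiers miss.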