avoiding cache pollution from backup

tobidelbruck
Level 1
Posts: 1
Joined: Wed Dec 19, 2012 9:08 am

avoiding cache pollution from backup

Post by tobidelbruck »

FancyCache is really useful and highly effective for continuous interactive sessions.
It would be great if the cache could be protected from pollution.
I notice that in the morning, when I return after a nightly backup, the cache seems to have been polluted: programs, the Windows Start menu, etc. take a long time to load. Could this be the result of the nightly backups, or of the Dropbox/Google Drive/SugarSync daemons?
Is there anything that can be done about this?
I'm using a read-only disk cache with a 2 GB L1 RAM cache and a 15 GB L2 SSD cache.
Thanks
Violator
Level 5
Posts: 48
Joined: Mon Jan 16, 2012 11:13 pm

Re: avoiding cache pollution from backup

Post by Violator »

I would say you have some software that conflicts with the caching; I've never had such issues with Acronis + LiveDrive backup, SkyDrive sync, Dropbox, Firefox Sync, etc.
Did you check your sleep states, and whether your SSDs or something else went idle overnight?
Manny
Level 6
Posts: 62
Joined: Tue Nov 13, 2012 11:42 pm

Re: avoiding cache pollution from backup

Post by Manny »

All of the software you listed adds its own interceptors to file-system actions; that is how it knows a file has changed and needs to be synced. These hooks attach to all running programs and services, even to each other. Sometimes that causes lags where the HDD and CPU are idle but everything freezes anyway. If that is your case, I would suggest closing, disabling, or uninstalling them one by one to see which one is responsible.
Performance and activity monitors, as well as antivirus software, do the same, so check those too. For example, I used the AnVir program and it was doing exactly that.
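
For what it's worth, the change-detection side of these tools doesn't have to be anything exotic. Even a crude polling loop like the toy sketch below (plain Python, purely hypothetical, nothing like the filter drivers real sync clients install) has to re-read metadata across the whole watched tree on every pass, and that kind of repeated background I/O is exactly what can keep the disks and the cache busy overnight:

[code]
# Toy illustration only: a polling watcher that notices changed files by
# comparing modification times. Real sync clients hook the file system
# instead of polling like this, but the effect on background I/O is similar.
import os
import time

def snapshot(root):
    """Map each file path under root to its last-modified time."""
    mtimes = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mtimes[path] = os.stat(path).st_mtime
            except OSError:
                pass  # file vanished between walk and stat
    return mtimes

def watch(root, interval=5.0):
    """Print files that appear or change; every pass re-reads metadata for
    the whole tree, which is the background activity discussed above."""
    before = snapshot(root)
    while True:
        time.sleep(interval)
        after = snapshot(root)
        for path, mtime in after.items():
            if before.get(path) != mtime:
                print("changed:", path)
        before = after

if __name__ == "__main__":
    watch(".")  # watch the current directory
[/code]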
dustyny
Level 8
Posts: 118
Joined: Sun Sep 02, 2012 12:54 am

Re: avoiding cache pollution from backup

Post by dustyny »

Occam's razor: the simplest answer is usually the right one. When you read data off the disk it ends up in your caches (L1, L2). If you are backing up your entire disk, all of that data is being read and it replaces whatever is already there. So your cache isn't being polluted, it's being replaced. Sorry to say, but I don't think there is a workaround for this; it would require some clever coding on Romex's part to ignore data that is being read by a backup or scanned by an antivirus. If we had a CLI interface for FancyCache you could probably script pausing the cache and then run the backup/virus scan (or similar).
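
To put that in concrete terms, here is a toy model (plain Python, a textbook LRU cache, nothing to do with FancyCache's actual code) of what a full-disk read does to a cache that is much smaller than the disk:

[code]
# Toy model only -- not FancyCache's real algorithm. A textbook LRU cache
# being swept by a backup-style sequential read of a disk far larger than
# the cache: every previously "hot" block ends up evicted.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()          # block id -> cached data

    def access(self, block):
        if block in self.blocks:
            self.blocks.move_to_end(block)   # mark as most recently used
            return
        if len(self.blocks) >= self.capacity:
            self.blocks.popitem(last=False)  # evict the least recently used
        self.blocks[block] = True

cache = LRUCache(capacity=200)               # pretend 200-block cache

# Daytime: a small working set read over and over (programs, Start menu...).
for _ in range(10):
    for i in range(50):
        cache.access(("hot", i))

# Nightly backup: one sequential pass over 5,000 blocks -- far more than fit.
for i in range(5_000):
    cache.access(("backup", i))

survivors = sum(1 for block in cache.blocks if block[0] == "hot")
print(f"{survivors} of 50 hot blocks left after the backup")  # prints 0
[/code]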
Manny
Level 6
Posts: 62
Joined: Tue Nov 13, 2012 11:42 pm

Re: avoiding cache pollution from backup

Post by Manny »

dustyny wrote: Occam's razor: the simplest answer is usually the right one. When you read data off the disk it ends up in your caches (L1, L2). If you are backing up your entire disk, all of that data is being read and it replaces whatever is already there. So your cache isn't being polluted, it's being replaced. Sorry to say, but I don't think there is a workaround for this; it would require some clever coding on Romex's part to ignore data that is being read by a backup or scanned by an antivirus. If we had a CLI interface for FancyCache you could probably script pausing the cache and then run the backup/virus scan (or similar).
Doesn't LFU-R mean that data read once would not replace data that has been read ten times? If it doesn't work like that, then it looks like an issue and should be posted in the bugs section.
dustyny
Level 8
Posts: 118
Joined: Sun Sep 02, 2012 12:54 am

Re: avoiding cache pollution from backup

Post by dustyny »

tobidelbruck - If you don't mind a manual process, you might want to try pausing caching before you start your backup. It's supposed to maintain your cache until you turn it back on again.

LFU (Least Frequently Used): counts how often a data block is needed. Those that are used least often are discarded first.
LRU (Least Recently Used): discards the least recently used data first.

It doesn't matter which one you use: recently accessed data always takes precedence, and old data is expired according to whichever algorithm you pick. So when a user runs an antivirus scan or a backup on a drive that FC is caching, FC will keep trying to add that data to the L1 and L2 caches. Chances are the data will exceed the size of the cache, so existing entries have to be expired, either by how rarely they are used (LFU) or by how long it has been since they were last accessed (LRU, with the oldest dropped first). The cache ends up filled with the files opened by the backup/virus scan, and the data the user actually wants cached gets pushed out.
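
For comparison with the LRU toy model earlier in the thread, here is the same backup-style sweep run through a textbook LFU cache. This is only the textbook policy from the definitions above; whether FancyCache's LFU-R ages or resets its counters is an assumption here, not something confirmed in this thread. Under strict LFU the backup blocks (each read once) end up evicting one another rather than the frequently read working set, which is what Manny's LFU-R question was getting at; how closely the real implementation matches that under a nightly sweep is the open question.

[code]
# Toy model again: a textbook LFU cache under the same workload as the LRU
# sketch above. Whether FancyCache's LFU-R matches this textbook behaviour
# (e.g. whether it ages its counters) is an assumption, not confirmed here.
from collections import Counter

CAPACITY = 200

def run_lfu(accesses):
    cache, counts = set(), Counter()
    for block in accesses:
        counts[block] += 1
        if block not in cache:
            if len(cache) >= CAPACITY:
                # evict the cached block with the lowest access count
                victim = min(cache, key=lambda b: counts[b])
                cache.remove(victim)
            cache.add(block)
    return cache

hot = [("hot", i) for i in range(50)]            # working set, read 10 times
sweep = [("backup", i) for i in range(5_000)]    # nightly full-disk pass

cached = run_lfu(hot * 10 + sweep)
survivors = sum(1 for block in hot if block in cached)
print(f"{survivors} of {len(hot)} hot blocks survive")  # strict LFU keeps all 50
[/code]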