There has not been much work on bounding software code cache sizes. Most systems size their caches generously and assume the limit will never be reached; their only form of cache management is flushing the entire cache when it fills. These systems have typically focused on small benchmark suites. For larger and more varied applications, adapting the code cache size to the application's code usage is desirable, and since this is a software cache, such flexibility is possible. However, we do not have the luxury of profiling or examining the application beforehand. We need an incremental, reactive algorithm that enlarges the cache as needed. Until we are sure the cache needs to be enlarged, we want to be able to carry on without flushing the entire cache every time it fills up -- we would like fine-grained cache management. Furthermore, as we will see later, the cache consistency events present in modern, dynamic applications require fine-grained cache management for efficiency.
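To make the idea concrete, the following is a rough, hypothetical sketch of such an incremental, reactive policy; every name, constant, and the specific growth heuristic here (growing when recently evicted code keeps being regenerated, evicting single fragments FIFO-style otherwise) are illustrative assumptions, not the actual algorithm described in this thesis.

```c
#include <assert.h>
#include <stddef.h>

/* Illustrative sketch only: a cache measured in fragment counts, with a
 * hypothetical "regeneration" heuristic for deciding when to grow. */
#define INIT_CAPACITY 4
#define REGEN_THRESHOLD 2 /* regenerations that suggest the working set no longer fits */

typedef struct {
    size_t capacity;    /* maximum fragments the cache can hold */
    size_t used;        /* fragments currently resident */
    size_t regenerated; /* fragments re-built after having been evicted */
} code_cache_t;

/* Insert one fragment; was_evicted_before flags a fragment that was
 * previously in the cache and is now being rebuilt after eviction.
 * Returns 1 if the cache was enlarged, 0 otherwise. */
int cache_insert(code_cache_t *c, int was_evicted_before) {
    int grew = 0;
    if (was_evicted_before)
        c->regenerated++;
    if (c->used == c->capacity) {
        if (c->regenerated >= REGEN_THRESHOLD) {
            /* Evicted code keeps coming back: the cache is too small,
             * so enlarge it rather than thrash. */
            c->capacity *= 2;
            c->regenerated = 0;
            grew = 1;
        } else {
            /* Fine-grained management: evict a single old fragment
             * (FIFO in a real cache) instead of flushing everything. */
            c->used--;
        }
    }
    c->used++;
    return grew;
}
```

The point of the sketch is the reactive structure: the cache never pays for a full flush, and it only commits memory to growth once eviction demonstrably causes rework.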
Copyright © 2004 Derek Bruening