Merged
2 changes: 1 addition & 1 deletion BENCHMARK.md
@@ -203,7 +203,7 @@ Multi-level caching provides the most benefit when:

## Performance Tips

-1. **Tune Memory Strategies:** Adjust `MemoryPercentageLimitStrategy` threshold based on your application's memory profile
+1. **Tune Memory Strategies:** Adjust `RamPercentageLimitStrategy` threshold based on your application's memory profile
2. **Choose Appropriate TTL:** Set cache TTL values that balance freshness and hit rate
3. **Monitor Cache Metrics:** Track hit rates to optimize cache configuration
4. **Size Your Cache:** Use Benchmark 4 to estimate memory requirements
47 changes: 31 additions & 16 deletions README.md
@@ -68,12 +68,12 @@ import {
CacheService,
MemoryCacheLevel,
FirstExpiringMemoryPolicy,
-MemoryPercentageLimitStrategy
+RamPercentageLimitStrategy
} from 'cacheforge';

// Create memory cache with eviction policy and strategy
const memoryCache = new MemoryCacheLevel({
-memoryStrategies: [new MemoryPercentageLimitStrategy(80)], // Trigger at 80% memory
+memoryStrategies: [new RamPercentageLimitStrategy(80)], // Trigger at 80% memory
evictionPolicy: new FirstExpiringMemoryPolicy()
});

@@ -103,11 +103,11 @@ At the top (CacheService), fallbacks are handled. However the added layers do no
Fast, in-memory caching using a Map and min-heap for efficient expiration tracking.

```typescript
-import { MemoryCacheLevel, FirstExpiringMemoryPolicy, MemoryPercentageLimitStrategy } from 'cacheforge';
+import { MemoryCacheLevel, FirstExpiringMemoryPolicy, RamPercentageLimitStrategy } from 'cacheforge';

const memoryCache = new MemoryCacheLevel({
memoryStrategies: [
-new MemoryPercentageLimitStrategy(75) // Evict when memory exceeds 75%
+new RamPercentageLimitStrategy(75) // Evict when memory exceeds 75%
],
evictionPolicy: new FirstExpiringMemoryPolicy()
});
@@ -145,14 +145,26 @@ const policy = new FirstExpiringMemoryPolicy();

Strategies check conditions and trigger eviction policies when thresholds are met.

-#### MemoryPercentageLimitStrategy
+#### MemorySizeLimitStrategy (Recommended Default)
+Triggers eviction when the total size of items in the cache exceeds a defined threshold (as a percentage of the Node.js process heap).
+
+This strategy is recommended as the default for most applications, as it provides a more accurate measurement of cache memory usage and helps prevent out-of-memory errors.
+
+```typescript
+import { MemorySizeLimitStrategy } from 'cacheforge';
+
+// Trigger eviction when cache uses 10% or more of Node.js heap
+const strategy = new MemorySizeLimitStrategy(10);
+```
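A minimal sketch of the check such a size-based strategy presumably performs — note this is an assumption about the mechanism, not the library's actual source: compare the cache's tracked byte size against a percentage of the current Node.js heap.

```typescript
// Hypothetical stand-in for MemorySizeLimitStrategy's trigger condition.
// `storeSizeBytes` would come from the cache level's getStoreSize().
function sizeThresholdExceeded(storeSizeBytes: number, thresholdPercent: number): boolean {
  const heapBytes = process.memoryUsage().heapTotal; // current Node.js heap size
  return (storeSizeBytes / heapBytes) * 100 >= thresholdPercent;
}

console.log(sizeThresholdExceeded(0, 10)); // false: an empty cache never triggers
```

Because the check uses the cache's own byte count rather than whole-process or whole-system memory, eviction pressure scales with what the cache actually holds.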

+#### RamPercentageLimitStrategy
Triggers eviction when system memory usage exceeds a percentage threshold.

```typescript
-import { MemoryPercentageLimitStrategy } from 'cacheforge';
+import { RamPercentageLimitStrategy } from 'cacheforge';

// Trigger eviction at 80% memory usage
-const strategy = new MemoryPercentageLimitStrategy(80);
+const strategy = new RamPercentageLimitStrategy(80);
```
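By contrast, a RAM-based check looks at system-wide memory rather than the cache's own footprint. A rough self-contained sketch (again an assumption, not the library's source):

```typescript
import * as os from "node:os";

// Hypothetical stand-in for RamPercentageLimitStrategy's trigger condition:
// compare system-wide memory usage against a percentage threshold.
function ramThresholdExceeded(thresholdPercent: number): boolean {
  const usedPercent = ((os.totalmem() - os.freemem()) / os.totalmem()) * 100;
  return usedPercent >= thresholdPercent;
}

console.log(ramThresholdExceeded(0)); // true: a 0% threshold always triggers
```

Since other processes also consume system RAM, this check can trigger even when the cache itself is small — which is why the size-based strategy above is the recommended default.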

## Usage Guide
@@ -166,13 +178,13 @@ import {
CacheService,
MemoryCacheLevel,
FirstExpiringMemoryPolicy,
-MemoryPercentageLimitStrategy
+RamPercentageLimitStrategy
} from 'cacheforge';

const cache = new CacheService({
levels: [
new MemoryCacheLevel({
-memoryStrategies: [new MemoryPercentageLimitStrategy(80)],
+memoryStrategies: [new RamPercentageLimitStrategy(80)],
evictionPolicy: new FirstExpiringMemoryPolicy()
})
],
@@ -202,12 +214,12 @@ import {
MemoryCacheLevel,
RedisCacheLevel,
FirstExpiringMemoryPolicy,
-MemoryPercentageLimitStrategy
+RamPercentageLimitStrategy
} from 'cacheforge';
import Redis from 'ioredis';

const memoryCache = new MemoryCacheLevel({
-memoryStrategies: [new MemoryPercentageLimitStrategy(75)],
+memoryStrategies: [new RamPercentageLimitStrategy(75)],
evictionPolicy: new FirstExpiringMemoryPolicy()
});

@@ -236,7 +248,7 @@ import Redis from 'ioredis';
const cache = new CacheService({
levels: [
new MemoryCacheLevel({
-memoryStrategies: [new MemoryPercentageLimitStrategy(80)],
+memoryStrategies: [new RamPercentageLimitStrategy(80)],
evictionPolicy: new FirstExpiringMemoryPolicy()
}),
new RedisCacheLevel(new Redis())
@@ -556,8 +568,11 @@ await cache.invalidateKey('user:123');

### 4. Memory Strategy Thresholds

-- Development: 80-90% (more headroom)
-- Production: 70-75% (prevent OOM issues)

+- **Recommended Default:** Use `MemorySizeLimitStrategy` with a threshold of 10-20% of Node.js heap for most production workloads.
+- **RamPercentageLimitStrategy:**
+  - Development: 80-90% (more headroom)
+  - Production: 70-75% (prevent OOM issues)

### 5. Distributed Locking

@@ -608,14 +623,14 @@ Example test using the library:

```typescript
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
-import { CacheService, MemoryCacheLevel, FirstExpiringMemoryPolicy, MemoryPercentageLimitStrategy } from 'cacheforge';
+import { CacheService, MemoryCacheLevel, FirstExpiringMemoryPolicy, RamPercentageLimitStrategy } from 'cacheforge';

describe('Cache Service', () => {
let cache: CacheService;

beforeEach(() => {
const memoryCache = new MemoryCacheLevel({
-memoryStrategies: [new MemoryPercentageLimitStrategy(80)],
+memoryStrategies: [new RamPercentageLimitStrategy(80)],
evictionPolicy: new FirstExpiringMemoryPolicy()
});

4 changes: 2 additions & 2 deletions package-lock.json


2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "cacheforge",
-"version": "1.1.0",
+"version": "1.2.0",
"description": "A multi-level caching library for Node.js applications, supporting in-memory and Redis, and custom cache levels.",
"main": "dist/src/index.js",
"types": "dist/src/index.d.ts",
4 changes: 2 additions & 2 deletions src/cache.service.spec.ts
@@ -13,7 +13,7 @@ import {
type StoredHeapItem,
} from "./levels";
import { FirstExpiringMemoryPolicy } from "./policies/first-expiring-memory.policy";
-import { MemoryPercentageLimitStrategy } from "./strategies/memory-percentage-limit.strategy";
+import { RamPercentageLimitStrategy } from "./strategies/ram-percentage-limit.strategy";

let redisContainer: StartedRedisContainer;
let redisLevel: RedisCacheLevel;
@@ -25,7 +25,7 @@ let faultyFirstLevelVersionedCacheService: CacheService;
let faultyFirstLevelCacheService: CacheService;
let allFaultyLevelsCacheService: CacheService;
let allFaultyLevelsVersionedCacheService: CacheService;
-const memoryStrategy = new MemoryPercentageLimitStrategy<StoredHeapItem>(70);
+const memoryStrategy = new RamPercentageLimitStrategy<StoredHeapItem>(70);
const evictionPolicy = new FirstExpiringMemoryPolicy();
memoryLevel = new MemoryCacheLevel({
memoryStrategies: [memoryStrategy],
6 changes: 6 additions & 0 deletions src/levels/interfaces/in-memory.ts
@@ -12,4 +12,10 @@ export interface InMemory<T> {
* @return Array of items in the heap.
*/
getHeap(): MemoryHeap<T>;

+/**
+ * Get the size of the key-value store in bytes.
+ * @return Size of the store in bytes.
+ */
+getStoreSize(): number;
}
8 changes: 4 additions & 4 deletions src/levels/memory/eviction-manager.spec.ts
@@ -1,7 +1,7 @@
import { beforeEach, describe, expect, it, vi } from "vitest";
import { AbstractMemoryEvictionPolicy } from "../../policies/abstract/abstract-memory-eviction.policy";
import { FirstExpiringMemoryPolicy } from "../../policies/first-expiring-memory.policy";
-import { MemoryPercentageLimitStrategy } from "../../strategies/memory-percentage-limit.strategy";
+import { RamPercentageLimitStrategy } from "../../strategies/ram-percentage-limit.strategy";
import { EvictionManager } from "./eviction-manager";
import { MemoryCacheLevel, type StoredHeapItem } from "./memory.level";
import { triggerMemoryChange } from "./memory-event.manager";
@@ -14,11 +14,11 @@ describe("EvictionManager", () => {
let memoryLevel: MemoryCacheLevel;
let evictionPolicy: FirstExpiringMemoryPolicy;
let memoryWithoutEvictionPolicy: MemoryCacheLevel;
-let memoryStrategy: MemoryPercentageLimitStrategy<StoredHeapItem>;
+let memoryStrategy: RamPercentageLimitStrategy<StoredHeapItem>;

beforeEach(() => {
evictionPolicy = new FirstExpiringMemoryPolicy();
-memoryStrategy = new MemoryPercentageLimitStrategy(0); // Always triggers
+memoryStrategy = new RamPercentageLimitStrategy(0); // Always triggers
memoryLevel = new MemoryCacheLevel({
memoryStrategies: [memoryStrategy],
evictionPolicy,
@@ -73,7 +73,7 @@ describe("EvictionManager", () => {
});

it("does not evict if no strategy triggers", async () => {
-const neverStrategy = new MemoryPercentageLimitStrategy(100); // Never triggers
+const neverStrategy = new RamPercentageLimitStrategy(100); // Never triggers
const neverOptions = {
memoryStrategies: [neverStrategy],
evictionPolicy,
10 changes: 8 additions & 2 deletions src/levels/memory/memory.level.spec.ts
@@ -2,11 +2,11 @@ import { faker, fakerZH_TW } from "@faker-js/faker";
import { afterEach, describe, expect, it, vi } from "vitest";
import { generateJSONData } from "../../../tests/utilities/data.utilities";
import { FirstExpiringMemoryPolicy } from "../../policies";
-import { MemoryPercentageLimitStrategy } from "../../strategies/memory-percentage-limit.strategy";
+import { RamPercentageLimitStrategy } from "../../strategies/ram-percentage-limit.strategy";
import { MemoryCacheLevel, type StoredHeapItem } from "..";

const evictionPolicy = new FirstExpiringMemoryPolicy();
-const strategy = new MemoryPercentageLimitStrategy<StoredHeapItem>(80);
+const strategy = new RamPercentageLimitStrategy<StoredHeapItem>(80);
const cacheEngine = new MemoryCacheLevel({
memoryStrategies: [strategy],
evictionPolicy: evictionPolicy,
@@ -125,6 +125,12 @@ describe("should successfully store data, and retrieve it on demand", async () =
await cacheEngine.mget<number>(["bingo", "bingo1", "bingo2"]),
).toEqual([undefined, undefined, undefined]);
});

+it("should get store size in bytes", () => {
+  const storeSize = cacheEngine.getStoreSize();
+  expect(typeof storeSize).toBe("number");
+  expect(storeSize).toBeGreaterThanOrEqual(0);
+});
});

describe("It should successfully manage the application memory usage", () => {
14 changes: 8 additions & 6 deletions src/levels/memory/memory.level.ts
@@ -3,6 +3,7 @@ import { DEFAULT_TTL } from "../../constants";
import type { AbstractMemoryEvictionPolicy } from "../../policies/abstract/abstract-memory-eviction.policy";
import type { MemoryManagementStrategy } from "../../strategies/interfaces/memory-management-strategy";
import { createCacheHeap } from "../../utils/heap.utils";
+import { serialize } from "../../utils/parsing.utils";
import type { CacheLevel } from "../interfaces/cache-level";
import type { InMemory } from "../interfaces/in-memory";
import type { Purgable } from "../interfaces/purgable";
@@ -26,6 +27,7 @@ export class MemoryCacheLevel
implements CacheLevel, Purgable, InMemory<StoredHeapItem>
{
protected store = new Map<string, StoredItem>();
+protected size = 0;
protected heap = createCacheHeap<StoredHeapItem>((item) => item.expiry);
protected evictionManager: EvictionManager;

@@ -41,17 +43,17 @@
await Promise.all(deletePromises);
}

-private insertHeapItem(item: StoredHeapItem) {
-  this.heap.insert(item);
-}

private updateStore(key: string, item: StoredItem) {
this.store.set(key, item);

-this.insertHeapItem({ ...item, key });
+this.heap.insert({ ...item, key });
+this.size += serialize(item).length;
triggerMemoryChange();
}

+public getStoreSize(): number {
+  return this.size;
+}

async mset<T>(
keys: string[],
values: T[],
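The size bookkeeping in this diff reduces to a small self-contained model — a simplified sketch (the real class also maintains a TTL min-heap and eviction hooks): track a running total of serialized bytes on each write so `getStoreSize()` is O(1).

```typescript
// Simplified model of the change above: a Map plus a running byte counter.
class SizeTrackedStore<T> {
  private store = new Map<string, T>();
  private size = 0; // running total of serialized bytes

  set(key: string, value: T): void {
    this.store.set(key, value);
    this.size += JSON.stringify(value).length; // stand-in for serialize()
  }

  getStoreSize(): number {
    return this.size;
  }
}

const s = new SizeTrackedStore<{ name: string }>();
s.set("user:1", { name: "Ada" });
console.log(s.getStoreSize()); // 14, the length of '{"name":"Ada"}'
```

Note that, like the diff, this sketch only counts insertions; a complete implementation would decrement `size` symmetrically on delete and overwrite paths so the counter stays accurate.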
11 changes: 4 additions & 7 deletions src/levels/redis/redis.level.ts
@@ -3,10 +3,7 @@ import type IoRedis from "ioredis";
import type { Cluster } from "ioredis";
import { DEFAULT_TTL } from "../../constants";
import { parseIfJSON } from "../../utils/cache.utils";
-import {
-  deserializeFromRedis,
-  serializeForRedis,
-} from "../../utils/parsing.utils";
+import { deserialize, serialize } from "../../utils/parsing.utils";
import { generateVersionLookupKey } from "../../utils/version.utils";
import type { CacheLevel } from "../interfaces/cache-level";
import type { Lockable } from "../interfaces/lockable";
@@ -38,7 +35,7 @@ export class RedisCacheLevel implements CacheLevel, Lockable {
for (let i = 0; i < keys.length; i++) {
const key = keys[i];
const value = values[i];
-pipeline.set(key, serializeForRedis(value), "EX", ttl);
+pipeline.set(key, serialize(value), "EX", ttl);
}

await pipeline.exec();
@@ -56,15 +53,15 @@ export class RedisCacheLevel implements CacheLevel, Lockable {
if (cachedValue === null || cachedValue === undefined) {
finalResults.push(undefined as T);
} else {
-finalResults.push(deserializeFromRedis(cachedValue));
+finalResults.push(deserialize(cachedValue));
}
}

return finalResults;
}

async set<T>(key: string, value: T, ttl = DEFAULT_TTL) {
-await this.client.set(key, serializeForRedis(value), "EX", ttl);
+await this.client.set(key, serialize(value), "EX", ttl);

return parseIfJSON(value) as T;
}
11 changes: 5 additions & 6 deletions src/policies/first-expiring-memory.policy.spec.ts
@@ -1,11 +1,11 @@
import { afterEach, describe, expect, it } from "vitest";
import { generateJSONData } from "../../tests/utilities/data.utilities";
import { MemoryCacheLevel } from "../levels";
-import { MemoryPercentageLimitStrategy } from "../strategies/memory-percentage-limit.strategy";
+import { RamPercentageLimitStrategy } from "../strategies/ram-percentage-limit.strategy";
import { FirstExpiringMemoryPolicy } from "./first-expiring-memory.policy";

const policy = new FirstExpiringMemoryPolicy();
-const strategy = new MemoryPercentageLimitStrategy(80);
+const strategy = new RamPercentageLimitStrategy(80);
const cacheEngine = new MemoryCacheLevel({
memoryStrategies: [strategy],
evictionPolicy: policy,
@@ -33,10 +33,9 @@ describe("First Expiring Memory Policy", () => {

const policy = new FirstExpiringMemoryPolicy();

-for (let i = 0; i <= 10000; i++) {
-  await policy.evict(cacheEngine);
-}
+await policy.evict(cacheEngine);

-expect(cacheEngine.getHeap().getCount()).toEqual(0);
+// 10% has been removed.
+expect(cacheEngine.getHeap().getCount()).toEqual(900);
});
});
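The updated assertion (900 of 1000 items remaining after a single `evict` call) implies the policy now removes a batch of the earliest-expiring items, roughly 10% per call, instead of one item at a time. A minimal sketch of that behavior, using an array sorted by expiry as a hypothetical stand-in for the min-heap:

```typescript
// Hypothetical batch eviction: drop the earliest-expiring `fraction` of items.
// The array is assumed sorted by expiry ascending, mimicking min-heap order.
function evictBatch<T>(sortedByExpiry: T[], fraction = 0.1): T[] {
  const count = Math.ceil(sortedByExpiry.length * fraction);
  return sortedByExpiry.slice(count); // keep everything after the first `count`
}

const items = Array.from({ length: 1000 }, (_, i) => i);
console.log(evictBatch(items).length); // 900, matching the updated test
```

Batch eviction amortizes the cost of repeated strategy checks: one trigger frees a meaningful amount of memory rather than a single entry.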
3 changes: 2 additions & 1 deletion src/strategies/index.ts
@@ -1 +1,2 @@
-export * from "./memory-percentage-limit.strategy";
+export * from "./memory-size-limit.strategy";
+export * from "./ram-percentage-limit.strategy";
13 changes: 0 additions & 13 deletions src/strategies/memory-percentage-limit.strategy.ts

This file was deleted.
