
Implement caching strategy in collections using TTL or LRU pattern #973

@sridmad

Description


Proposal

Implement a robust caching strategy (such as TTL or LRU caching) in Service Fabric Explorer's collection management. This would reduce API round-trips, improve UI responsiveness, and decouple UI refresh intervals from network latency.

Rationale

  • Collection models currently retrieve fresh data from the API on every refresh or update, causing avoidable latency and unnecessary API load
  • Introducing a smart caching layer will enable faster navigation and more responsive interfaces

Example Approach

export abstract class CachedCollection<T> extends DataModelCollectionBase<T> {
    private cache = new Map<string, { data: T; timestamp: number }>();
    private readonly TTL = 30000; // cache lifetime in milliseconds (30 seconds)

    // Returns the cached entry if it exists and has not expired; otherwise
    // returns null, signalling the caller to fetch fresh data from the API.
    protected getFromCache(id: string): T | null {
        const cached = this.cache.get(id);
        if (cached && Date.now() - cached.timestamp < this.TTL) {
            return cached.data;
        }
        return null;
    }

    // Stores (or refreshes) an entry after a successful API fetch.
    protected setInCache(id: string, data: T): void {
        this.cache.set(id, { data, timestamp: Date.now() });
    }
}
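For the LRU variant, a bounded cache can be built on a plain `Map`, which preserves insertion order. This is only a minimal sketch; the class and method names are illustrative and not part of any existing Service Fabric Explorer API.

```typescript
// Minimal LRU cache sketch. A Map iterates keys in insertion order, so the
// first key is always the least recently used once we re-insert on access.
class LruCache<T> {
    private cache = new Map<string, T>();

    constructor(private readonly maxSize: number = 100) {}

    get(key: string): T | undefined {
        const value = this.cache.get(key);
        if (value !== undefined) {
            // Re-insert to mark this entry as most recently used.
            this.cache.delete(key);
            this.cache.set(key, value);
        }
        return value;
    }

    set(key: string, value: T): void {
        if (this.cache.has(key)) {
            this.cache.delete(key);
        } else if (this.cache.size >= this.maxSize) {
            // Evict the least recently used entry (first key in the Map).
            const oldest = this.cache.keys().next().value;
            if (oldest !== undefined) {
                this.cache.delete(oldest);
            }
        }
        this.cache.set(key, value);
    }
}
```

Unlike the TTL approach, this bounds memory use directly, which may matter for collections that grow with cluster size.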

Tasks

  • Identify collection types that would benefit from caching
  • Choose and implement a caching strategy (TTL and/or LRU)
  • Add cache invalidation and stale data handling
  • Add/adjust tests to verify caching and invalidation logic
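For the invalidation task, one possible shape is a TTL cache that drops stale entries on read and also exposes explicit invalidation for callers that know the underlying data changed (e.g. after a mutating request). This is a hedged sketch; the `TtlCache` name and its methods are hypothetical, not existing SFX code.

```typescript
// Illustrative TTL cache with explicit invalidation hooks.
class TtlCache<T> {
    private cache = new Map<string, { data: T; timestamp: number }>();

    constructor(private readonly ttlMs: number = 30000) {}

    set(id: string, data: T): void {
        this.cache.set(id, { data, timestamp: Date.now() });
    }

    get(id: string): T | null {
        const entry = this.cache.get(id);
        if (!entry) {
            return null;
        }
        if (Date.now() - entry.timestamp >= this.ttlMs) {
            // Entry is stale: drop it so the caller falls back to the API.
            this.cache.delete(id);
            return null;
        }
        return entry.data;
    }

    // Explicit invalidation for when the caller knows the data changed.
    invalidate(id: string): void {
        this.cache.delete(id);
    }

    invalidateAll(): void {
        this.cache.clear();
    }
}
```

Tests for the invalidation logic would then assert that `get` returns null both after the TTL elapses and after an explicit `invalidate` call.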

Benefits

  • Faster navigation and data rendering
  • Fewer redundant network requests
  • Improved perceived and actual performance

Labels: performance, architecture, code quality
