
Cache Services Overview

DreamFactory supports multiple caching backends to improve API performance. Caching keeps frequently accessed data in a fast store (memory for Redis and Memcached, disk or SQL for the built-in drivers), reducing database load and improving response times for read-heavy workloads.


Supported Cache Services

| Service | Type | Use Case | Documentation |
|---|---|---|---|
| Redis | In-memory store | Production caching, sessions, queues | Redis Guide |
| Memcached | In-memory cache | Simple key-value caching | Memcached Guide |
| Local | File-based | Development, single-instance deployments | Built-in |
| File | File-based | Development, disk-based persistence | Built-in |
| Database | SQL-backed | Simple deployments without external cache | Built-in |

Cache Use Cases in DreamFactory

API Response Caching

Cache API responses to reduce database queries:

  • Per-service caching: Enable caching on individual database or file services
  • TTL configuration: Set expiration times based on data volatility
  • Cache invalidation: Automatic clearing on write operations
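The read path behind per-service caching can be sketched as a small TTL cache. This is an illustrative Python sketch of the concept, not DreamFactory's internal implementation:

```python
import time

class TTLCache:
    """Minimal in-memory cache with per-entry TTL (illustrative sketch)."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None  # never cached
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # TTL elapsed: treat as a miss
            return None
        return value

    def set(self, key, value, ttl):
        """Store a value that expires ttl seconds from now."""
        self._store[key] = (value, time.monotonic() + ttl)

    def invalidate(self, key):
        """Drop an entry, e.g. after a write modifies the underlying data."""
        self._store.pop(key, None)
```

Expired entries simply read as misses, which is why TTL choice (covered under Performance Best Practices below) drives the freshness/performance trade-off.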

Session Storage

Store user sessions in a distributed cache:

  • Scalability: Share sessions across multiple DreamFactory instances
  • Performance: Faster session lookups than database
  • Persistence: Redis can persist sessions across restarts

Rate Limiting

Track API request counts per user or role:

  • Distributed counting: Works across load-balanced instances
  • Atomic operations: Accurate request tracking
  • Automatic expiry: Counters reset after time windows
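The counting pattern described above is what Redis provides via atomic `INCR` plus `EXPIRE`. A minimal fixed-window sketch in pure Python (`FixedWindowLimiter` is a hypothetical name, not DreamFactory's actual limiter):

```python
import time
from collections import defaultdict

class FixedWindowLimiter:
    """Fixed-window rate limiter, mirroring the Redis INCR + EXPIRE pattern."""

    def __init__(self, limit, window_seconds):
        self.limit = limit
        self.window = window_seconds
        self._counts = defaultdict(int)  # (identity, window_index) -> count

    def allow(self, identity, now=None):
        """Return True if this request is within the limit for the current window."""
        now = time.time() if now is None else now
        bucket = (identity, int(now // self.window))  # counter keyed by time window
        self._counts[bucket] += 1                     # Redis would do an atomic INCR here
        return self._counts[bucket] <= self.limit
```

In a real deployment the counter lives in the shared cache backend, so every load-balanced instance increments the same value; the window index in the key is what makes counters "reset" automatically.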

Choosing a Cache Backend

Redis vs Memcached

| Feature | Redis | Memcached |
|---|---|---|
| Data structures | Strings, lists, sets, hashes, sorted sets | Strings only |
| Persistence | Optional disk persistence | Memory only |
| Replication | Primary-replica, clustering | None (client-side) |
| Pub/Sub | Yes | No |
| Lua scripting | Yes | No |
| Memory efficiency | Good | Excellent for simple data |
| Max value size | 512 MB | 1 MB (default) |
| Use case | Feature-rich, sessions, queues | Simple caching |

Recommendation

  • Production: Use Redis for its flexibility, persistence options, and support for sessions
  • Development: Use Local cache for simplicity
  • High-volume simple caching: Use Memcached for maximum memory efficiency

Cache Architecture

┌────────────┐       ┌──────────────┐       ┌────────────┐
│ API Client │ ────▶ │ DreamFactory │ ────▶ │  Database  │
│            │ ◀──── │              │ ◀──── │            │
└────────────┘       └──────┬───────┘       └────────────┘
                            │ Cache check
                            ▼
                 ┌─────────────────────┐
                 │    Cache Backend    │
                 │  (Redis/Memcached)  │
                 └─────────────────────┘

Cache Flow

  1. Request arrives at DreamFactory
  2. Cache check: If cached response exists and is valid, return it
  3. Cache miss: Query the database/service
  4. Store in cache: Save response with configured TTL
  5. Return response to client
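The five steps above amount to the cache-aside pattern. A minimal sketch (`handle_request` is a hypothetical helper, assuming a dict-backed cache):

```python
import time

_cache = {}  # key -> (response, expires_at)

def handle_request(key, fetch_from_db, ttl=300):
    """Cache-aside flow: check the cache, fall back to the database, store, return."""
    entry = _cache.get(key)
    if entry is not None and entry[1] > time.monotonic():
        return entry[0]                                   # 2. cache hit: still valid
    response = fetch_from_db(key)                         # 3. cache miss: query backend
    _cache[key] = (response, time.monotonic() + ttl)      # 4. store with configured TTL
    return response                                       # 5. return to client
```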

Cache Invalidation

DreamFactory automatically invalidates cache entries when:

  • POST/PUT/PATCH/DELETE operations modify data
  • TTL expires based on configured duration
  • Manual flush via admin API
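The first rule, invalidating on write, can be sketched as follows (a hypothetical helper; DreamFactory performs this internally when a write operation succeeds):

```python
def write_and_invalidate(cache, backend, key, value):
    """On a write, update the backend first, then drop the stale cache entry."""
    backend[key] = value      # commit the write to the source of truth
    cache.pop(key, None)      # next read repopulates from the fresh backend value
```

Dropping the entry rather than rewriting it keeps the cache and backend from diverging if the cached representation differs from the raw stored value.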

Configuring Cache Services

System-Wide Cache

Configure the default cache backend in DreamFactory's environment:

CACHE_DRIVER=redis
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=your_password

Per-Service Caching

Enable caching on individual services:

  1. Navigate to the service configuration
  2. Turn on the Cache Enabled option
  3. Set Cache TTL in seconds
  4. Save the service

Cache API Endpoints

DreamFactory exposes a Cache API for direct cache operations:

| Method | Endpoint | Description |
|---|---|---|
| GET | /api/v2/{cache_service}/{key} | Get cached value |
| POST | /api/v2/{cache_service} | Store value in cache |
| DELETE | /api/v2/{cache_service}/{key} | Delete cached value |
| DELETE | /api/v2/{cache_service} | Flush all cached values |

Example: Store a Value

curl -X POST "https://example.com/api/v2/cache" \
  -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "key": "user:123:profile",
    "value": {"name": "John", "email": "john@example.com"},
    "ttl": 3600
  }'

Example: Retrieve a Value

curl -X GET "https://example.com/api/v2/cache/user:123:profile" \
  -H "X-DreamFactory-API-Key: YOUR_API_KEY"
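The two curl calls can be mirrored programmatically with Python's standard library. `build_store_request` and `build_get_request` are hypothetical helper names; the host, service name (`cache`), and API key placeholder are carried over from the examples above:

```python
import json
from urllib import request

BASE = "https://example.com/api/v2/cache"  # placeholder host and service name
API_KEY = "YOUR_API_KEY"                   # placeholder API key

def build_store_request(key, value, ttl):
    """Build the POST request equivalent to the store example above."""
    body = json.dumps({"key": key, "value": value, "ttl": ttl}).encode()
    return request.Request(
        BASE,
        data=body,
        method="POST",
        headers={
            "X-DreamFactory-API-Key": API_KEY,
            "Content-Type": "application/json",
        },
    )

def build_get_request(key):
    """Build the GET request for a cached value."""
    return request.Request(
        f"{BASE}/{key}",
        method="GET",
        headers={"X-DreamFactory-API-Key": API_KEY},
    )
```

Send either request with `urllib.request.urlopen(...)` and decode the JSON body of the response.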

Performance Best Practices

TTL Configuration

| Data Type | Recommended TTL | Rationale |
|---|---|---|
| Static reference data | 24 hours | Rarely changes |
| User profiles | 5-15 minutes | Balances freshness and performance |
| Real-time data | 30-60 seconds | Frequently updated |
| Session data | 30 minutes - 2 hours | Matches session duration |

Cache Key Design

Use structured, predictable key patterns:

{resource_type}:{id}:{sub_resource}

Examples:

  • user:123:profile
  • product:456:inventory
  • order:789:items
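A small helper can keep keys consistent with this pattern (`cache_key` is a hypothetical function name):

```python
def cache_key(resource_type, resource_id, sub_resource=None):
    """Build a structured cache key: {resource_type}:{id}[:{sub_resource}]."""
    parts = [resource_type, str(resource_id)]
    if sub_resource:
        parts.append(sub_resource)
    return ":".join(parts)
```

Centralizing key construction means reads, writes, and invalidation all agree on the same key for a given resource.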

Memory Management

  • Monitor memory usage to prevent eviction
  • Set appropriate TTLs to expire stale data
  • Use Redis maxmemory-policy for graceful eviction
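For the last point, the relevant redis.conf fragment might look like this (the values are illustrative; size the limit to your instance):

```conf
# Cap Redis memory usage and evict least-recently-used keys
# instead of failing writes when the limit is reached.
maxmemory 256mb
maxmemory-policy allkeys-lru
```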

Common Errors

| Error Code | Message | Cause | Solution |
|---|---|---|---|
| 400 | Bad Request | Invalid key or value format | Check request payload |
| 401 | Unauthorized | Missing API key | Add API key header |
| 404 | Not Found | Key does not exist | Handle cache miss in client |
| 503 | Service Unavailable | Cache backend unreachable | Check cache server status |

Next Steps