Redis Caching
Grit provides a Redis-backed caching service with JSON serialization, TTL configuration, pattern-based deletion, and a Gin middleware for automatic HTTP response caching. Speed up expensive queries and reduce database load with a few lines of code.
Cache Service
The cache service at internal/cache/cache.go wraps the go-redis/v9 client with convenience methods for storing and retrieving JSON-serialized values.
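Connect once at startup and reuse the instance everywhere. A minimal sketch; the REDIS_URL environment variable and the fatal logging are illustrative choices, not part of Grit:
// Construct the cache from a Redis URL, e.g. "redis://localhost:6379/0".
cacheService, err := cache.New(os.Getenv("REDIS_URL"))
if err != nil {
    log.Fatalf("failed to connect to Redis: %v", err)
}
defer cacheService.Close()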
// DefaultTTL is the default cache expiration time.
const DefaultTTL = 5 * time.Minute
// Cache provides a Redis-backed caching service.
type Cache struct {
    client *redis.Client
}
// New creates a new Cache instance connected to the given Redis URL.
func New(redisURL string) (*Cache, error)
// Get retrieves a cached value and unmarshals it into dest.
// Returns false if the key does not exist.
func (c *Cache) Get(ctx context.Context, key string, dest interface{}) (bool, error)
// Set stores a value in the cache with the given TTL.
func (c *Cache) Set(ctx context.Context, key string, value interface{}, ttl time.Duration) error
// Delete removes a key from the cache.
func (c *Cache) Delete(ctx context.Context, key string) error
// DeletePattern removes all keys matching a glob pattern.
func (c *Cache) DeletePattern(ctx context.Context, pattern string) error
// Flush clears the entire cache.
func (c *Cache) Flush(ctx context.Context) error
// Client returns the underlying Redis client for advanced operations.
func (c *Cache) Client() *redis.Client
// Close closes the Redis connection.
func (c *Cache) Close() error
JSON Serialization
Values are automatically serialized to JSON when stored and deserialized when retrieved. You can cache any Go struct, slice, map, or primitive type.
// Cache a single struct
user := models.User{ID: 1, Name: "John", Email: "john@example.com"}
err := cache.Set(ctx, "user:1", user, 10*time.Minute)
// Retrieve it
var cachedUser models.User
found, err := cache.Get(ctx, "user:1", &cachedUser)
if found {
    fmt.Println(cachedUser.Name) // "John"
}
// Cache a slice
users := []models.User{...}
err = cache.Set(ctx, "users:page:1", users, 5*time.Minute)
// Retrieve the slice
var cachedUsers []models.User
found, err = cache.Get(ctx, "users:page:1", &cachedUsers)
// Cache a map
stats := map[string]int{"total": 100, "active": 42}
err = cache.Set(ctx, "stats:users", stats, 30*time.Second)
// Cache a simple string
err = cache.Set(ctx, "config:motd", "Welcome!", 24*time.Hour)TTL Configuration
Every call to Set takes an explicit TTL (time-to-live); the DefaultTTL constant provides a 5-minute default. Choose TTL values based on how frequently the data changes and how stale it can acceptably be. One way to keep these choices consistent across handlers is sketched after the table.
| Data Type | Suggested TTL | Reasoning |
|---|---|---|
| Dashboard stats | 30s - 1m | Frequently accessed, changes often |
| List queries | 1m - 5m | Moderate change frequency |
| User profiles | 5m - 15m | Rarely changes, frequently read |
| Configuration | 1h - 24h | Almost never changes |
| External API data | 5m - 1h | Reduce API calls, respect rate limits |
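Naming the durations once and passing them to Set keeps handlers consistent. A small sketch; the constant names and exact durations are illustrative, not part of Grit:
// Illustrative TTL constants mirroring the table above.
const (
    ttlDashboard = 30 * time.Second
    ttlList      = 2 * time.Minute
    ttlProfile   = 10 * time.Minute
    ttlConfig    = 12 * time.Hour
)
// Example: dashboard stats tolerate roughly 30 seconds of staleness.
stats := map[string]int{"total": 100, "active": 42}
err := cacheService.Set(ctx, "stats:dashboard", stats, ttlDashboard)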
CacheResponse Middleware
The CacheResponse middleware at internal/middleware/cache.go automatically caches GET request responses. It hashes the full URL (including query parameters) to generate cache keys, and sets an X-Cache: HIT/MISS header for debugging.
// CacheResponse caches GET request responses in Redis.
// Only caches 200 OK responses. Skips if no cache service available.
func CacheResponse(cacheService *cache.Cache, ttl time.Duration) gin.HandlerFunc
// How it works:
// 1. Generate cache key from URL: sha256(request.URL.String())
// 2. Check cache: if HIT -> return cached response (X-Cache: HIT)
// 3. If MISS -> capture response, serve it, then cache it
// 4. Only cache 200 OK responses with non-empty bodies
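Because the key is simply a SHA-256 hash of the full request URL, you can recompute it outside the middleware, for example to evict one cached response after a write. The sketch below assumes the digest is hex-encoded and used as the Redis key with no prefix; check internal/middleware/cache.go before relying on the exact format.
// Recompute the middleware's cache key for a given URL (assumed format:
// hex-encoded SHA-256 of request.URL.String()) and evict it.
sum := sha256.Sum256([]byte("/api/products?page=1"))
key := hex.EncodeToString(sum[:])
_ = cacheService.Delete(ctx, key)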
Using the Middleware
import "myapp/apps/api/internal/middleware"
// Apply cache middleware to specific routes
api := router.Group("/api")
{
// Cache the products list for 2 minutes
api.GET("/products",
middleware.CacheResponse(cacheService, 2*time.Minute),
productHandler.List,
)
// Cache individual product for 5 minutes
api.GET("/products/:id",
middleware.CacheResponse(cacheService, 5*time.Minute),
productHandler.GetByID,
)
// Do NOT cache mutations
api.POST("/products", productHandler.Create)
api.PUT("/products/:id", productHandler.Update)
api.DELETE("/products/:id", productHandler.Delete)
}Cache Key Patterns
Use consistent key patterns to make cache invalidation predictable. The DeletePattern() method accepts glob patterns, making it easy to clear all keys for a resource.
// Recommended key patterns:
"user:{id}" // Single resource: user:42
"users:page:{page}" // Paginated list: users:page:1
"users:count" // Aggregation
"stats:dashboard" // Dashboard data
"config:{key}" // Configuration values
// Set with pattern-aware keys
cache.Set(ctx, "user:42", user, 10*time.Minute)
cache.Set(ctx, "users:page:1", users, 5*time.Minute)
cache.Set(ctx, "users:page:2", users, 5*time.Minute)
// Delete a single key
cache.Delete(ctx, "user:42")
// Delete all keys matching a pattern
cache.DeletePattern(ctx, "users:*") // Clears all user cache
// Flush the entire cache (use with caution)
cache.Flush(ctx)
When to Use Caching
Not everything should be cached. Here are guidelines for when caching adds value versus when it adds unnecessary complexity.
Good candidates for caching
- ✓ Dashboard statistics (computed aggregations)
- ✓ Public product/content listings
- ✓ Configuration values loaded from database
- ✓ External API responses (weather, exchange rates)
- ✓ User profile data (read-heavy, write-rare)
Not ideal for caching
- ✗ User-specific data that changes per request
- ✗ Real-time data (chat messages, live feeds)
- ✗ Write-heavy endpoints (mutations)
- ✗ Data that must always be fresh (payment status)
- ✗ Authenticated admin panels with small user bases
Full Service Example
Here is a complete example of using the cache in a service layer with read-through caching and cache invalidation on writes.
type ProductService struct {
    DB    *gorm.DB
    Cache *cache.Cache
}
func (s *ProductService) GetByID(ctx context.Context, id uint) (*models.Product, error) {
    key := fmt.Sprintf("product:%d", id)
    // Try cache first
    var product models.Product
    found, err := s.Cache.Get(ctx, key, &product)
    if err == nil && found {
        return &product, nil // Cache HIT
    }
    // Cache MISS -- query database
    if err := s.DB.First(&product, id).Error; err != nil {
        return nil, err
    }
    // Store in cache for next time
    _ = s.Cache.Set(ctx, key, product, 10*time.Minute)
    return &product, nil
}
func (s *ProductService) Update(ctx context.Context, id uint, updates map[string]interface{}) error {
    if err := s.DB.Model(&models.Product{}).Where("id = ?", id).Updates(updates).Error; err != nil {
        return err
    }
    // Invalidate the cached product AND the list cache
    _ = s.Cache.Delete(ctx, fmt.Sprintf("product:%d", id))
    _ = s.Cache.DeletePattern(ctx, "products:*")
    return nil
}
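To round out the example, here is a sketch of a paginated List method (not part of the original service) that populates the products:page:{n} keys which Update clears via DeletePattern("products:*"). The page size of 20 is an arbitrary illustrative choice.
func (s *ProductService) List(ctx context.Context, page int) ([]models.Product, error) {
    key := fmt.Sprintf("products:page:%d", page)
    // Try cache first
    var products []models.Product
    if found, err := s.Cache.Get(ctx, key, &products); err == nil && found {
        return products, nil // Cache HIT
    }
    // Cache MISS -- query database (page size is illustrative)
    const pageSize = 20
    if err := s.DB.Limit(pageSize).Offset((page - 1) * pageSize).Find(&products).Error; err != nil {
        return nil, err
    }
    // Store in cache; Update's DeletePattern("products:*") will invalidate it
    _ = s.Cache.Set(ctx, key, products, 5*time.Minute)
    return products, nil
}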