Performance Optimization

Optimize your Sent integration for high throughput, low latency, and efficient resource usage.

Connection Pooling

Reuse HTTP connections to reduce overhead:

import { Agent } from 'https';

// Keep sockets open between requests instead of opening a
// new TLS connection for every call.
const httpsAgent = new Agent({
  keepAlive: true,
  maxSockets: 50,     // max concurrent sockets per host
  maxFreeSockets: 10, // idle sockets kept open for reuse
  timeout: 60000      // socket inactivity timeout in ms
});

const client = new SentDm({
  httpAgent: httpsAgent
});

Caching Contact Data

Cache contact lookups to reduce API calls:

import NodeCache from 'node-cache';

const contactCache = new NodeCache({ stdTTL: 300 }); // 5 minutes

async function getContactWithCache(contactId: string) {
  // Check cache first
  let contact = contactCache.get(contactId);

  if (!contact) {
    // Fetch from API
    const response = await client.contacts.get(contactId);
    contact = response.data;

    // Store in cache
    contactCache.set(contactId, contact);
  }

  return contact;
}
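If you prefer to avoid an extra dependency, the same TTL behavior can be sketched with a plain Map. The class name and API below are illustrative, not part of node-cache or the Sent SDK:

```typescript
// Minimal TTL cache: entries expire ttlMs milliseconds after being set.
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // lazily evict expired entries on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}

const cache = new TtlCache<string>(300_000); // 5 minutes, like stdTTL: 300
cache.set('contact_1', 'Alice');
```

Lazy eviction keeps the implementation small; a production cache would also bound its size.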

Batching

Send messages in batches for better throughput. The API accepts up to 1000 recipients per request:

const BATCH_SIZE = 1000;

// Promise-based delay helper
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

for (let i = 0; i < recipients.length; i += BATCH_SIZE) {
  const batch = recipients.slice(i, i + BATCH_SIZE);

  await client.messages.send({
    to: batch,
    template: { id: templateId }
  });

  // Small delay between batches to stay under rate limits
  await sleep(100);
}
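The slicing logic above can be factored into a small reusable helper; the function name is ours, not part of the SDK:

```typescript
// Split an array into consecutive chunks of at most `size` elements.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

const batches = chunk(['a', 'b', 'c', 'd', 'e'], 2);
// batches: [['a', 'b'], ['c', 'd'], ['e']]
```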

Async Processing

Use queues for non-blocking message sending:

// Add to queue
await messageQueue.add({
  phoneNumber,
  templateId
}, {
  attempts: 3,
  backoff: 'exponential'
});

// Process in background
messageQueue.process(async (job) => {
  return client.messages.send({
    to: [job.data.phoneNumber],
    template: { id: job.data.templateId }
  });
});
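With `backoff: 'exponential'`, queue libraries in the Bull style roughly double the retry delay on each attempt. The schedule can be sketched as follows; the base delay and cap are illustrative values, not SDK defaults:

```typescript
// Delay before retry `attempt` (0-indexed): baseMs * 2^attempt,
// capped so a long retry chain never waits unboundedly.
function backoffDelay(attempt: number, baseMs = 1000, maxMs = 60_000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

const delays = [0, 1, 2, 3].map((n) => backoffDelay(n));
// delays: [1000, 2000, 4000, 8000]
```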

Monitoring Performance

Track key metrics:

// Using prom-client-style Prometheus metrics, which take a config
// object with a name and a help string.
const metrics = {
  // Request latency
  requestDuration: new Histogram({
    name: 'sent_request_duration_seconds',
    help: 'Duration of Sent API requests in seconds'
  }),

  // Throughput (request rate is derived from this counter at query time)
  requestsTotal: new Counter({
    name: 'sent_requests_total',
    help: 'Total number of Sent API requests'
  }),

  // Cache hit rate
  cacheHits: new Counter({
    name: 'cache_hits_total',
    help: 'Contact cache hits'
  }),
  cacheMisses: new Counter({
    name: 'cache_misses_total',
    help: 'Contact cache misses'
  })
};
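The two cache counters together give you the hit rate. A sketch of the calculation, guarding against division by zero when no requests have been made:

```typescript
// Hit rate = hits / (hits + misses); 0 when there is no traffic yet.
function cacheHitRate(hits: number, misses: number): number {
  const total = hits + misses;
  return total === 0 ? 0 : hits / total;
}

const rate = cacheHitRate(90, 10);
// rate: 0.9
```

A hit rate well below expectations usually means the TTL is too short for your access pattern.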
