How to Send Daily Digest Notifications

Aggregate your application's daily events into a clean summary delivered to Email, Telegram, or Slack every morning. Stay informed without drowning in real-time alerts.

Why Daily Digests Matter

Real-time notifications are essential for urgent events -- a server going down, a critical error, or a high-value order. But not every event needs to interrupt your day. New user signups, daily sales totals, error counts, API usage stats, and inventory changes are all important, but they are better consumed as a summary rather than individual pings. Sending 50 "new user signup" notifications per day creates noise. Sending one digest at 9 AM that says "47 new users signed up yesterday, up 12% from last week" provides actionable insight.

Daily digests complement your real-time alerts by providing the big picture. They help you spot trends, track KPIs, and stay informed about your application's overall health without the constant interruption of individual event notifications. With One-Ping, you can send formatted digests to email for detailed reports, Telegram for a quick mobile glance, and Slack for team-wide visibility.

What you will build: An automated daily digest system that collects the previous day's events and metrics, formats them into a structured summary, and delivers it every morning at your chosen time via One-Ping. Examples include e-commerce sales digests, server health summaries, and application activity reports.

Step-by-Step Setup

Define What to Include in Your Digest

The best digests are focused and scannable. Start by identifying the 5-8 most important metrics for your use case. For an e-commerce store: total orders, revenue, new customers, refunds, top products, and inventory warnings. For a SaaS application: new signups, active users, API calls, error rate, and churn events. For server monitoring: uptime percentage, average response time, peak load, error count, and disk usage. Avoid including everything -- a digest with 50 metrics is just as overwhelming as 50 individual notifications.
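One way to keep the digest focused is to declare the chosen metrics in a small config object that the rest of the script iterates over. This is an illustrative sketch, not a One-Ping API -- the keys and labels below are hypothetical and should be adapted to your own schema:

```javascript
// Illustrative digest config: a short, fixed list of metrics keeps the
// digest scannable. Keys/labels are examples only -- adapt to your schema.
const DIGEST_METRICS = [
  { key: 'total_orders', label: 'Orders', format: (v) => String(v) },
  { key: 'revenue', label: 'Revenue', format: (v) => `$${Number(v).toFixed(2)}` },
  { key: 'unique_customers', label: 'Customers', format: (v) => String(v) },
  { key: 'refunds', label: 'Refunds', format: (v) => String(v) }
];

// Render a stats row into digest lines using the config.
function renderMetrics(stats) {
  return DIGEST_METRICS.map((m) => `${m.label}: ${m.format(stats[m.key])}`);
}
```

Adding or removing a metric is then a one-line config change rather than an edit scattered across query, formatting, and delivery code.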

Build the Data Aggregation Query

Write a database query or series of API calls that collects the data for your digest. Query data from the previous 24 hours (or since the last digest was sent). Calculate aggregates like counts, sums, averages, and percentages. Where possible, include a comparison to the previous period (day, week, or month) to show trends. Store these queries in a dedicated digest script or n8n workflow. See the code examples below for practical implementations.
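The period-over-period comparison is easy to factor into a small helper. A sketch (note that node-postgres returns COUNT and SUM columns as strings, so the inputs are coerced with Number first):

```javascript
// Percent change vs. the previous period, as a one-decimal string.
// Returns 'N/A' when there is no baseline (previous period was zero/empty).
function percentChange(current, previous) {
  const prev = Number(previous); // pg returns numeric columns as strings
  if (!prev) return 'N/A';
  return (((Number(current) - prev) / prev) * 100).toFixed(1);
}
```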

Format the Digest Message

Structure your digest for quick scanning. Use a clear heading with the date, followed by key metrics as bullet points or a table, then any notable events or highlights. For email, use HTML formatting with sections, colors, and charts. For Telegram, use clean text with bold metrics and emoji indicators for up/down trends. For Slack, use Block Kit with sections and fields for a structured layout. One-Ping's metadata field lets you customize the format for each channel.

Schedule the Digest with Cron or n8n

Set up your digest script to run automatically at a consistent time each day. Most teams prefer receiving the digest first thing in the morning -- 8 AM or 9 AM local time. Use a cron job on your server (0 9 * * * /usr/bin/node /path/to/daily-digest.js) or an n8n Schedule Trigger node set to run daily at your preferred time. Make sure to account for time zones if your team is distributed.
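If the server clock is UTC but the team expects a 9 AM local digest, some cron implementations (cronie, for example) support a per-crontab time zone; support varies by distribution, so check your cron's man page before relying on it:

```shell
# cronie: interpret the schedule below in this time zone (support varies by cron implementation)
CRON_TZ=America/New_York
0 9 * * * /usr/bin/node /path/to/daily-digest.js >> /var/log/daily-digest.log 2>&1
```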

Send via One-Ping to Your Preferred Channels

Call the One-Ping API with your formatted digest and the channels you want to deliver to. For most teams, the ideal combination is: Email for a detailed, archivable report that team members can reference later; Slack for immediate team visibility in a #daily-report channel; and Telegram for founders or managers who want a quick mobile summary. Use multi-channel delivery to send to all three with one API call.

Code Examples

E-commerce Daily Sales Digest (Node.js)

const { Pool } = require('pg');
const pool = new Pool();

async function sendDailyDigest() {
  const yesterday = new Date();
  yesterday.setDate(yesterday.getDate() - 1);
  const dateStr = yesterday.toISOString().split('T')[0];

  // Aggregate yesterday's data
  const stats = await pool.query(`
    SELECT
      COUNT(*) as total_orders,
      COALESCE(SUM(total), 0) as revenue,
      COUNT(DISTINCT customer_email) as unique_customers,
      COALESCE(AVG(total), 0) as avg_order_value,
      COUNT(*) FILTER (WHERE status = 'refunded') as refunds
    FROM orders
    WHERE created_at >= $1::date
    AND created_at < ($1::date + interval '1 day')
  `, [dateStr]);

  const prevStats = await pool.query(`
    SELECT COUNT(*) as total_orders, COALESCE(SUM(total), 0) as revenue
    FROM orders
    WHERE created_at >= ($1::date - interval '1 day')
    AND created_at < $1::date
  `, [dateStr]);

  const s = stats.rows[0];
  const prev = prevStats.rows[0];

  // Calculate trends (pg returns COUNT/SUM columns as strings, so coerce with Number)
  const prevOrders = Number(prev.total_orders);
  const prevRevenue = Number(prev.revenue);
  const orderTrend = prevOrders > 0
    ? ((Number(s.total_orders) - prevOrders) / prevOrders * 100).toFixed(1)
    : 'N/A';
  const revTrend = prevRevenue > 0
    ? ((Number(s.revenue) - prevRevenue) / prevRevenue * 100).toFixed(1)
    : 'N/A';

  // Top products
  const topProducts = await pool.query(`
    SELECT product_name, SUM(quantity) as units, SUM(line_total) as revenue
    FROM order_items oi
    JOIN orders o ON o.id = oi.order_id
    WHERE o.created_at >= $1::date AND o.created_at < ($1::date + interval '1 day')
    GROUP BY product_name ORDER BY revenue DESC LIMIT 5
  `, [dateStr]);

  // Build the digest message
  const topList = topProducts.rows
    .map((p, i) => `${i + 1}. ${p.product_name}: ${p.units} units ($${p.revenue})`)
    .join('\n');

  const telegramMessage = `
<b>Daily Sales Digest - ${dateStr}</b>

Orders: <b>${s.total_orders}</b> (${orderTrend}% vs yesterday)
Revenue: <b>$${Number(s.revenue).toFixed(2)}</b> (${revTrend}%)
Customers: ${s.unique_customers}
Avg Order: $${Number(s.avg_order_value).toFixed(2)}
Refunds: ${s.refunds}

<b>Top Products:</b>
${topList}
  `.trim();

  // Send to all channels (global fetch requires Node 18+)
  await fetch('https://api.one-ping.com/send', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.ONEPING_API_KEY}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      message: telegramMessage,
      channels: ['telegram', 'email', 'slack'],
      metadata: {
        email: {
          to: '[email protected]',
          subject: `Daily Sales Report - ${dateStr}: $${Number(s.revenue).toFixed(2)}`,
          html: `<h2>Daily Sales Digest - ${dateStr}</h2>
            <table style="border-collapse:collapse;width:100%">
              <tr><td>Orders</td><td><b>${s.total_orders}</b> (${orderTrend}%)</td></tr>
              <tr><td>Revenue</td><td><b>$${Number(s.revenue).toFixed(2)}</b> (${revTrend}%)</td></tr>
              <tr><td>Customers</td><td>${s.unique_customers}</td></tr>
              <tr><td>Avg Order</td><td>$${Number(s.avg_order_value).toFixed(2)}</td></tr>
              <tr><td>Refunds</td><td>${s.refunds}</td></tr>
            </table>
            <h3>Top Products</h3>
            <ol>${topProducts.rows.map(p =>
              '<li>' + p.product_name + ': ' + p.units + ' units ($' + p.revenue + ')</li>'
            ).join('')}</ol>`
        },
        slack: {
          blocks: [
            { type: 'header', text: { type: 'plain_text', text: `Daily Sales Digest - ${dateStr}` }},
            { type: 'section', fields: [
              { type: 'mrkdwn', text: `*Orders:*\n${s.total_orders} (${orderTrend}%)` },
              { type: 'mrkdwn', text: `*Revenue:*\n$${Number(s.revenue).toFixed(2)} (${revTrend}%)` },
              { type: 'mrkdwn', text: `*Customers:*\n${s.unique_customers}` },
              { type: 'mrkdwn', text: `*Avg Order:*\n$${Number(s.avg_order_value).toFixed(2)}` }
            ]}
          ]
        }
      }
    })
  });

  console.log(`Daily digest sent for ${dateStr}`);
}

sendDailyDigest()
  .catch(console.error)
  .finally(() => pool.end()); // close the pool so the cron process can exit

Schedule this script with cron to run every morning:

# Run daily digest at 9 AM
0 9 * * * /usr/bin/node /path/to/daily-digest.js >> /var/log/daily-digest.log 2>&1

Application Activity Digest (Python)

import requests, os, psycopg2
from datetime import datetime, timedelta

def send_activity_digest():
    conn = psycopg2.connect(os.environ['DATABASE_URL'])
    cur = conn.cursor()
    yesterday = (datetime.now() - timedelta(days=1)).strftime('%Y-%m-%d')

    # Gather stats
    cur.execute("""
        SELECT
            (SELECT COUNT(*) FROM users WHERE created_at::date = %s) as new_users,
            (SELECT COUNT(*) FROM sessions WHERE created_at::date = %s) as sessions,
            (SELECT COUNT(*) FROM api_calls WHERE created_at::date = %s) as api_calls,
            (SELECT COUNT(*) FROM errors WHERE created_at::date = %s) as errors
    """, [yesterday] * 4)

    stats = cur.fetchone()
    new_users, sessions, api_calls, errors = stats

    # Error rate
    error_rate = (errors / max(api_calls, 1) * 100)

    message = f"""Daily Activity Report - {yesterday}

New Users: {new_users}
Sessions: {sessions}
API Calls: {api_calls:,}
Errors: {errors} ({error_rate:.2f}% error rate)"""

    requests.post(
        'https://api.one-ping.com/send',
        headers={'Authorization': f'Bearer {os.environ["ONEPING_API_KEY"]}'},
        timeout=30,  # avoid hanging the cron job if the API is unreachable
        json={
            'message': message,
            'channels': ['telegram', 'slack', 'email'],
            'metadata': {
                'email': {
                    'to': '[email protected]',
                    'subject': f'Daily Activity Report - {yesterday}'
                }
            }
        }
    )

    cur.close()
    conn.close()

send_activity_digest()

n8n Workflow for Daily Digests

For a no-code approach, build an n8n workflow that generates and sends your daily digest. The workflow structure is:

  1. Schedule Trigger -- set to run daily at 9 AM (or your preferred time).
  2. PostgreSQL / MySQL node -- runs your aggregation query to collect yesterday's metrics.
  3. Function node -- formats the data into a readable digest message with trend indicators and highlights.
  4. HTTP Request node -- calls the One-Ping API with the formatted digest and your preferred channels.

n8n makes it easy to add additional data sources (like Google Analytics, Stripe, or external APIs) to enrich your digest without writing complex code. You can also add a branching condition that sends a special alert if any metric exceeds a threshold, combining the digest with real-time alerting.
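The formatting step (node 3) is plain JavaScript. As a sketch, the pure formatter below is the logic you would paste into the Function node; the field names match the SQL example above and are illustrative:

```javascript
// Formatting logic for the n8n Function node (step 3).
// Field names (total_orders, revenue, unique_customers) are illustrative
// and should match your own aggregation query's column aliases.
function formatDigest(row, dateStr) {
  return [
    `Daily Digest - ${dateStr}`,
    '',
    `Orders: ${row.total_orders}`,
    `Revenue: $${Number(row.revenue).toFixed(2)}`, // SQL drivers often return numerics as strings
    `Customers: ${row.unique_customers}`
  ].join('\n');
}

// Inside n8n, the Function node body would be roughly:
//   const message = formatDigest(items[0].json, new Date().toISOString().split('T')[0]);
//   return [{ json: { message } }];
```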

n8n template available: We have a ready-to-import n8n template for daily digests that includes database connection, data aggregation, formatting, and One-Ping delivery. Customize it with your own database queries and metrics.

Digest Types by Use Case

E-commerce Sales Digest

Orders, revenue, top products, new customers, refunds, and inventory warnings. Best sent to the team at 9 AM. See our e-commerce guide for real-time alerts.

Application Health Digest

Error count by type, API response times, uptime percentage, and deployment count. Complements real-time error alerts and monitoring.

User Activity Digest

New signups, active users, feature usage, churn events, and support tickets. Helps product teams track engagement without checking dashboards daily.

Inventory Status Digest

Current stock levels for all products, items approaching low-stock threshold, reorder suggestions, and sales velocity. See our stock alerts guide for urgent notifications.

Formatting Tips for Each Channel

The same digest data should be formatted differently for each channel to maximize readability:

Email: The Detailed Report

Email digests can be rich and comprehensive. Use HTML tables for metrics, inline CSS for styling, and include charts if possible. Structure with clear headings: Key Metrics, Highlights, Alerts, and Trends. Email is the channel where people expect detail, so this is where you can include everything.

Telegram: The Quick Summary

Keep the Telegram digest short and scannable. Use bold for key numbers, line breaks between sections, and simple up/down arrow characters for trends. Aim for a message that can be read in 10 seconds on a phone screen. If someone needs more detail, they can check the email version or open the dashboard.
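The arrow indicators can be produced by a tiny helper. A sketch, assuming the trend value is the string output of a percent-change calculation (e.g. "12.0" or "N/A"):

```javascript
// Prefix a trend percentage string with an up/down/flat arrow for
// quick scanning on a phone screen. Input like "12.0", "-3.1", or "N/A".
function trendArrow(pct) {
  if (pct === 'N/A') return 'N/A';
  const n = Number(pct);
  if (n > 0) return `\u2B06 +${pct}%`; // up arrow
  if (n < 0) return `\u2B07 ${pct}%`;  // down arrow
  return `\u27A1 ${pct}%`;             // flat
}
```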

Slack: The Team Dashboard

Use Slack Block Kit to create a structured, visually clean digest. Section blocks with fields create a two-column layout that works well for metric pairs (Orders | Revenue). Add a divider between sections and use color-coded attachments for alerts (green for positive trends, red for concerning metrics).

Best Practices

Combine real-time and digest: The most effective notification strategy uses real-time alerts for urgent events (server down, critical errors, high-value orders) and daily digests for everything else. This gives your team complete awareness without constant interruptions. Check our multi-channel guide for setting up both.

Ready to set up daily digests?

Start free with 100 messages/month. No credit card required.

Get started free