LightningLogs

API Documentation

API Version: v1
Docs Version: 1.0.0
Last Updated: Jan 2024

Log Management

Ingest, retrieve, and manage your application logs with powerful filtering and search capabilities.

Overview

The log management API allows you to ingest logs from your applications and retrieve them with advanced filtering, search, and pagination capabilities. All logs are automatically associated with your tenant and stored securely.

Log Ingestion

  • Single log ingestion
  • Bulk log ingestion (up to 1,000 logs)
  • Automatic timestamp and tenant association
  • Structured metadata support

Log Retrieval

  • Advanced filtering by level, date, and service
  • Full-text search in log messages
  • Pagination and sorting
  • Real-time statistics

Log Levels

Use appropriate log levels to categorize your logs and enable effective filtering and alerting.

Trace

Fine-grained diagnostic detail for tracing execution flow

Debug

Diagnostic information useful during development

Info

General informational messages about normal operation

Warn

Potentially harmful situations that merit attention

Error

Error events that indicate a failure in the application
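As a sketch, a minimal client-side helper (hypothetical, not part of the LightningLogs API) can enforce a minimum level before logs are sent, keeping low-value trace and debug entries out of production:

```javascript
// Log levels in ascending order of severity.
const LEVELS = ['trace', 'debug', 'info', 'warn', 'error'];

// Returns true when `level` is at or above `minLevel`.
// Unknown levels are rejected so typos surface early.
function shouldLog(level, minLevel) {
  const li = LEVELS.indexOf(level);
  const mi = LEVELS.indexOf(minLevel);
  if (li === -1 || mi === -1) return false;
  return li >= mi;
}
```

Gating on a configurable minimum level also helps control storage costs, since trace and debug volume typically dwarfs everything else.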

Ingest Single Log

Send individual log entries to LightningLogs. This is perfect for low-volume logging or when you need immediate feedback on log ingestion.

Basic Example

import axios from 'axios';

const apiKey = 'YOUR_API_KEY';
const baseURL = 'https://api.lightninglogs.com/api/v1';

// Single log ingestion
const ingestLog = async () => {
  try {
    const response = await axios.post(`${baseURL}/logs`, {
      level: 'info',
      message: 'User login successful',
      metadata: {
        userId: '12345',
        ipAddress: '192.168.1.1'
      }
    }, {
      headers: {
        'Authorization': `ApiKey ${apiKey}`,
        'Content-Type': 'application/json'
      }
    });
    
    console.log('Log ingested:', response.data);
  } catch (error) {
    console.error('Error ingesting log:', error);
  }
};
Example Response

{
  "success": true,
  "message": "Log entry created successfully",
  "id": "507f1f77bcf86cd799439011"
}

Bulk Log Ingestion

Send multiple log entries in a single request for high-volume logging. This is more efficient than individual requests and reduces API overhead.

Bulk Example

import axios from 'axios';

const apiKey = 'YOUR_API_KEY';
const baseURL = 'https://api.lightninglogs.com/api/v1';

// Bulk log ingestion
const ingestBulkLogs = async () => {
  try {
    const response = await axios.post(`${baseURL}/logs/batch`, {
      logs: [
        {
          level: 'info',
          message: 'Request processed successfully',
          metadata: { requestId: 'req_123', duration: 150 }
        },
        {
          level: 'warn',
          message: 'High memory usage detected',
          metadata: { memoryUsage: '85%', threshold: '80%' }
        },
        {
          level: 'error',
          message: 'Payment processing failed',
          metadata: { errorCode: 'INSUFFICIENT_FUNDS', amount: 100.50 }
        }
      ]
    }, {
      headers: {
        'Authorization': `ApiKey ${apiKey}`,
        'Content-Type': 'application/json'
      }
    });
    
    console.log('Bulk logs ingested:', response.data);
  } catch (error) {
    console.error('Error ingesting bulk logs:', error);
  }
};
Example Response

{
  "success": true,
  "message": "Logs ingested successfully",
  "ingested": 3,
  "failed": 0,
  "errors": []
}

💡 Best Practice

Use bulk ingestion for high-volume logging scenarios. The API accepts up to 1,000 logs per request. For even higher volumes, consider batching your logs and sending multiple requests in parallel.
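The batching step described above can be sketched as a small helper (illustrative, not part of any official SDK) that splits a log buffer into API-sized chunks:

```javascript
// Split an array of log entries into chunks of at most `size` items,
// matching the API's 1,000-logs-per-request limit.
function chunkLogs(logs, size = 1000) {
  const batches = [];
  for (let i = 0; i < logs.length; i += size) {
    batches.push(logs.slice(i, i + size));
  }
  return batches;
}
```

Each batch can then be POSTed to `/logs/batch` as shown in the bulk example, optionally in parallel with `Promise.all`.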

Retrieve Logs

Retrieve logs with powerful filtering, search, and pagination capabilities. All queries are scoped to your tenant.

Filtered Retrieval

import axios from 'axios';

const apiKey = 'YOUR_API_KEY';
const baseURL = 'https://api.lightninglogs.com/api/v1';

// Retrieve logs with filtering
const retrieveLogs = async () => {
  try {
    const params = new URLSearchParams({
      level: 'error',
      startDate: '2024-01-15T00:00:00Z',
      endDate: '2024-01-15T23:59:59Z',
      search: 'database',
      page: '1',
      limit: '50'
    });
    
    const response = await axios.get(`${baseURL}/logs?${params}`, {
      headers: {
        'Authorization': `ApiKey ${apiKey}`
      }
    });
    
    console.log('Retrieved logs:', response.data);
  } catch (error) {
    console.error('Error retrieving logs:', error);
  }
};
Example Response

{
  "data": [
    {
      "id": "507f1f77bcf86cd799439011",
      "timestamp": "2024-01-15T10:30:00Z",
      "level": "error",
      "message": "Database connection timed out",
      "metadata": {
        "host": "db-primary",
        "retryCount": 3
      },
      "service": "auth-service"
    }
  ],
  "total": 1250,
  "page": 1,
  "limit": 50,
  "pages": 25
}

Query Parameters

Parameter  Type      Description               Example
level      string    Filter by log level       error
search     string    Search in log messages    database
startDate  ISO 8601  Start date for filtering  2024-01-15T00:00:00Z
endDate    ISO 8601  End date for filtering    2024-01-15T23:59:59Z
page       integer   Page number (1-10,000)    1
limit      integer   Logs per page (1-1,000)   100
service    string    Filter by service name    auth-service

Log Statistics

Get insights into your log volume, distribution by level, and ingestion patterns.

Get Statistics

import axios from 'axios';

const apiKey = 'YOUR_API_KEY';
const baseURL = 'https://api.lightninglogs.com/api/v1';

// Get log statistics
const getLogStats = async () => {
  try {
    const params = new URLSearchParams({
      startDate: '2024-01-01T00:00:00Z',
      endDate: '2024-01-31T23:59:59Z'
    });
    
    const response = await axios.get(`${baseURL}/logs/stats?${params}`, {
      headers: {
        'Authorization': `ApiKey ${apiKey}`
      }
    });
    
    console.log('Log statistics:', response.data);
  } catch (error) {
    console.error('Error getting log stats:', error);
  }
};
Example Response

{
  "totalLogs": 125000,
  "logsByLevel": {
    "trace": 5000,
    "debug": 15000,
    "info": 80000,
    "warn": 15000,
    "error": 10000
  },
  "logsByService": {
    "auth-service": 45000,
    "payment-service": 35000,
    "user-service": 25000,
    "notification-service": 20000
  },
  "ingestionRate": 125.5,
  "timeRange": {
    "start": "2024-01-01T00:00:00Z",
    "end": "2024-01-31T23:59:59Z"
  }
}

Best Practices

Log Structure

  • Use descriptive, actionable log messages
  • Include relevant context in metadata
  • Use consistent log levels across your application
  • Avoid logging sensitive information (passwords, tokens, etc.)
  • Use structured logging for better searchability
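To keep sensitive values out of metadata, a simple redaction pass can run before ingestion. This is a sketch; the field names in `SENSITIVE_KEYS` are assumptions, so adjust the list to your own schema:

```javascript
// Field names commonly considered sensitive; extend as needed.
const SENSITIVE_KEYS = ['password', 'token', 'apiKey', 'secret'];

// Returns a shallow copy of `metadata` with sensitive values masked,
// leaving the original object untouched.
function redactMetadata(metadata) {
  const clean = {};
  for (const [key, value] of Object.entries(metadata)) {
    clean[key] = SENSITIVE_KEYS.includes(key) ? '[REDACTED]' : value;
  }
  return clean;
}
```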

Performance

  • Use bulk ingestion for high-volume logging
  • Implement retry logic with exponential backoff
  • Batch logs before sending to reduce API calls
  • Use appropriate log levels to control storage costs
  • Monitor your ingestion rate and adjust accordingly
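Exponential backoff with jitter can be sketched as follows; the base delay, cap, and attempt count are illustrative defaults, not values mandated by the API:

```javascript
// Delay before the nth retry (0-based): base * 2^attempt, capped,
// plus up to 10% random jitter to avoid thundering-herd retries.
function backoffDelayMs(attempt, baseMs = 250, capMs = 10000) {
  const delay = Math.min(baseMs * 2 ** attempt, capMs);
  return delay + Math.random() * delay * 0.1;
}

// Retries `fn` up to `maxAttempts` times, waiting between attempts.
async function withRetry(fn, maxAttempts = 5) {
  let lastError;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      await new Promise((r) => setTimeout(r, backoffDelayMs(attempt)));
    }
  }
  throw lastError;
}
```

Wrapping the axios calls from the examples above in `withRetry` makes transient failures (e.g. rate limits) recoverable without flooding the API.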

Monitoring

  • Set up alerts for high error rates
  • Monitor log volume trends
  • Use service names to organize logs
  • Regularly review and clean up old logs
  • Set up dashboards for log analytics

Error Handling

Status Code  Error Code            Description
400          VALIDATION_ERROR      Invalid log format or missing required fields
401          AUTHENTICATION_ERROR  Invalid or missing API key
402          QUOTA_EXCEEDED_ERROR  Log volume exceeds plan limits
403          AUTHORIZATION_ERROR   Insufficient permissions for log operations
422          LOG_INGESTION_ERROR   Log ingestion failed due to processing error
429          RATE_LIMIT_ERROR      Too many requests, rate limit exceeded
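Clients can branch on these status codes to decide whether a failed request is worth retrying. The classification below is a sketch based on the table above (treating 429 and server-side 5xx errors as transient is an assumption, not documented API behavior):

```javascript
// Classify an HTTP status: 429 and 5xx are transient and safe to
// retry with backoff; other 4xx codes (400/401/402/403/422) indicate
// a client-side problem that retrying will not fix.
function isRetryable(status) {
  if (status === 429) return true; // rate limited: retry after backoff
  if (status >= 500) return true;  // server error: likely transient
  return false;                    // fix the request or account instead
}
```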