
How to Integrate Silverfin API: Ruby, Python, TypeScript


Step-by-step Silverfin API integration guide with code in Ruby, Python, and TypeScript. Covers OAuth2 auth, tiered sync, and custom financial dashboards.

[Diagram: Silverfin API integration architecture showing OAuth2 flow, sync layer, and period-aware data modeling]

Silverfin’s API opens up real possibilities for building custom applications on top of their accounting platform. Whether you need a client-facing portal, an internal dashboard for your firm, or workflow automation that reacts to financial data changes, getting your Silverfin integration right from the start makes everything else easier.

I have worked with several accounting platform APIs, and the architectural decisions are remarkably consistent regardless of the language or framework you use. This post walks through those decisions using Silverfin as the primary example - from OAuth2 authentication through intelligent syncing to building custom calculations on the financial data you pull in. Every code example is shown in Ruby, Python, and TypeScript so you can follow along in your stack.

What Silverfin’s API Gives You

Silverfin’s API provides access to structured financial data organized around the concepts accountants work with daily: companies, financial periods, account balances, and account mappings. The important characteristic is that this data is period-aware - balances and adjustments are organized by reporting period rather than presented as a single current state.

| Data Type | What You Get | What You Can Build |
|---|---|---|
| Companies | Client and entity records with metadata | Multi-tenant dashboards, client portals |
| Financial periods | Date ranges with open/closed status | Trend analysis, period-over-period reporting |
| Account balances | Per-period values with opening balances | KPI calculations, margin tracking |
| Account mappings | Normalized categories across charts of accounts | Cross-client comparisons, portfolio analytics |

This period-aware structure means the API already gives you the raw material for trend analysis and comparisons without reconstructing the timeline yourself. Each period carries a status indicating whether it is still being worked on or has been finalized, which directly affects how much you can trust the numbers.

One thing worth checking before committing to an architecture: does the API support webhooks or event notifications for your use case? That single detail shapes your entire sync strategy, and it is easy to overlook until you are deep into implementation.

Silverfin OAuth2 Authentication Setup

Silverfin uses OAuth2 with the authorization code grant. The implementation follows the standard RFC 6749 flow in any language, but there are details specific to financial integrations that matter.
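Before the language-specific clients below, it helps to see the two moves the authorization code grant always makes: send the user to a consent screen, then exchange the returned code for tokens. This is a minimal Python sketch of those two steps; the parameter names are standard OAuth2, but the `AUTHORIZE_URL` shown is a placeholder - check Silverfin's developer documentation for the real endpoints.

```python
from urllib.parse import urlencode

# Placeholder endpoint - confirm the real URL in Silverfin's developer docs.
AUTHORIZE_URL = "https://example.getsilverfin.com/oauth/authorize"

def build_authorize_url(client_id: str, redirect_uri: str,
                        scope: str, state: str) -> str:
    """Step 1: redirect the user to the provider's consent screen."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",   # authorization code grant
        "scope": scope,
        "state": state,            # CSRF protection - verify on callback
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"

def build_token_request(client_id: str, client_secret: str,
                        code: str, redirect_uri: str) -> dict:
    """Step 2: exchange the callback's code for an access/refresh pair.
    POST this as a form body to the provider's token endpoint."""
    return {
        "grant_type": "authorization_code",
        "code": code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    }
```

The `state` parameter is easy to skip and regrettable to skip: it is your only defense against login CSRF on the callback.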

Store Tokens Per Connection

A firm might have separate Silverfin environments or need to connect additional accounting platforms alongside Silverfin. Each connection needs its own token pair, and tokens must be encrypted at rest.

class AccountingConnection < ApplicationRecord
  belongs_to :organization

  encrypts :access_token, :refresh_token

  enum :provider, { silverfin: 0, xero: 1 }

  scope :active, -> { where.not(access_token: nil) }
end
from sqlalchemy import Column, Integer, Enum, ForeignKey
from sqlalchemy.orm import relationship
from app.crypto import EncryptedString
import enum

class Provider(enum.Enum):
    silverfin = "silverfin"
    xero = "xero"

class AccountingConnection(Base):
    __tablename__ = "accounting_connections"

    id = Column(Integer, primary_key=True)
    organization_id = Column(Integer, ForeignKey("organizations.id"))
    provider = Column(Enum(Provider), nullable=False)
    access_token = Column(EncryptedString)   # custom type
    refresh_token = Column(EncryptedString)  # custom type

    organization = relationship("Organization")
// Prisma schema defines the model; encryption via middleware
import { PrismaClient } from "@prisma/client";
import { encrypt, decrypt } from "./crypto";

// model AccountingConnection {
//   id             Int      @id @default(autoincrement())
//   organizationId Int
//   provider       Provider  // enum: silverfin, xero
//   accessToken    String?
//   refreshToken   String?
// }

const prisma = new PrismaClient().$extends({
  result: {
    accountingConnection: {
      accessToken: {
        needs: { accessToken: true },
        compute(conn) {
          return conn.accessToken ? decrypt(conn.accessToken) : null;
        },
      },
    },
  },
});

Encrypt tokens at rest regardless of your stack. This is non-negotiable when dealing with financial data.

Handle Token Refresh Transparently

Token refresh is the single most common cause of “the integration stopped working” support tickets. If your HTTP client refreshes an expired access token mid-request and you do not persist the new refresh token immediately, the next refresh attempt fails. The firm has to re-authorize manually. In practice, this destroys trust in the integration.

class SilverfinClient
  def initialize(connection)
    @connection = connection
    @base_url = ENV.fetch("SILVERFIN_API_URL")
  end

  def companies
    get("/companies")
  end

  def period_accounts(company_id, period_id)
    get("/companies/#{company_id}/periods/#{period_id}/accounts")
  end

  private

  def get(path)
    response = request(:get, path)
    JSON.parse(response.body)
  end

  def request(method, path, **options)
    http_client.send(method, "#{@base_url}#{path}", **options)
  rescue Faraday::UnauthorizedError
    # The raise_error middleware turns a 401 into an exception before
    # any status check could run, so rescue it and retry once with
    # freshly refreshed tokens.
    refresh_tokens!
    http_client.send(method, "#{@base_url}#{path}", **options)
  end

  def refresh_tokens!
    new_tokens = perform_token_refresh
    @connection.update!(
      access_token: new_tokens[:access_token],
      refresh_token: new_tokens[:refresh_token]
    )
  end

  def http_client
    @http_client ||= Faraday.new do |f|
      f.request :authorization, "Bearer", -> { @connection.access_token }
      f.response :raise_error
    end
  end
end
import httpx
import os
from dataclasses import dataclass, field

@dataclass
class SilverfinClient:
    connection: AccountingConnection
    base_url: str = field(
        default_factory=lambda: os.environ["SILVERFIN_API_URL"]
    )

    async def companies(self) -> list[dict]:
        return await self._get("/companies")

    async def period_accounts(
        self, company_id: int, period_id: int
    ) -> list[dict]:
        return await self._get(
            f"/companies/{company_id}/periods/{period_id}/accounts"
        )

    async def _get(self, path: str) -> dict | list:
        response = await self._request("GET", path)
        return response.json()

    async def _request(self, method: str, path: str):
        async with httpx.AsyncClient() as client:
            headers = {
                "Authorization": f"Bearer {self.connection.access_token}"
            }
            response = await client.request(
                method, f"{self.base_url}{path}", headers=headers
            )
            if response.status_code == 401:
                await self._refresh_tokens()
                headers["Authorization"] = (
                    f"Bearer {self.connection.access_token}"
                )
                response = await client.request(
                    method, f"{self.base_url}{path}", headers=headers
                )
            response.raise_for_status()
            return response

    async def _refresh_tokens(self):
        new_tokens = await perform_token_refresh(self.connection)
        self.connection.access_token = new_tokens["access_token"]
        self.connection.refresh_token = new_tokens["refresh_token"]
        await save_connection(self.connection)  # persist immediately
class SilverfinClient {
  private connection: AccountingConnection;
  private baseUrl: string;

  constructor(connection: AccountingConnection) {
    this.connection = connection;
    this.baseUrl = process.env.SILVERFIN_API_URL!;
  }

  async companies(): Promise<Company[]> {
    return this.get("/companies");
  }

  async periodAccounts(
    companyId: number, periodId: number
  ): Promise<Account[]> {
    return this.get(
      `/companies/${companyId}/periods/${periodId}/accounts`
    );
  }

  private async get<T>(path: string): Promise<T> {
    const response = await this.request("GET", path);
    return response.json();
  }

  private async request(
    method: string, path: string
  ): Promise<Response> {
    let response = await fetch(`${this.baseUrl}${path}`, {
      method,
      headers: {
        Authorization: `Bearer ${this.connection.accessToken}`,
      },
    });

    if (response.status === 401) {
      await this.refreshTokens();
      response = await fetch(`${this.baseUrl}${path}`, {
        method,
        headers: {
          Authorization: `Bearer ${this.connection.accessToken}`,
        },
      });
    }

    if (!response.ok)
      throw new Error(`Silverfin API error: ${response.status}`);
    return response;
  }

  private async refreshTokens(): Promise<void> {
    const tokens = await performTokenRefresh(this.connection);
    this.connection.accessToken = tokens.accessToken;
    this.connection.refreshToken = tokens.refreshToken;
    await saveConnection(this.connection); // persist immediately
  }
}

Be deliberate about scopes. Request the minimum access you need. Financial data is sensitive, and firms will scrutinize what you are asking for. If you only need read access to accounts and balances, do not request write scopes. You can always prompt for additional scopes later if the product grows into it.

Syncing Financial Data from Silverfin

Sync financial data from Silverfin using a tiered polling cadence that matches accounting workflow rhythms. Poll recently closed periods every 4 hours during active reconciliation, quieter clients daily, and inactive clients every few days.

If webhooks are available, use them as triggers and fetch the full updated resource on notification. If you are polling, the interesting question becomes how intelligently you poll.

The naive approach is a recurring background job that pulls everything for every connected client on a fixed schedule. That works when you have five clients. It falls apart at five hundred. You hit rate limits, waste cycles syncing clients whose data has not changed, and create unnecessary load on both systems.

Tiered Sync Cadence

Accounting data does not change uniformly. A client in the middle of their financial year, months from any reporting deadline, has minimal activity. A client whose period just closed and whose accountant is actively reconciling might have changes every few hours. You can infer which tier a client belongs to from the period dates you already have.

class SyncCadence
  TIERS = {
    active:  4.hours,   # recently closed period
    winding: 1.day,     # 1-3 months past close
    quiet:   3.days     # no recent activity
  }.freeze

  attr_reader :interval

  def initialize(interval:)
    @interval = interval
  end

  def self.for(connection)
    most_recent = connection.financial_periods
                            .order(end_date: :desc).first
    return new(interval: TIERS[:active]) unless most_recent

    days = (Date.current - most_recent.end_date).to_i

    interval = case days
               when 0..30  then TIERS[:active]
               when 31..90 then TIERS[:winding]
               else             TIERS[:quiet]
               end
    new(interval: interval)
  end
end
from datetime import date, timedelta
from dataclasses import dataclass

SYNC_TIERS = {
    "active":  timedelta(hours=4),   # recently closed period
    "winding": timedelta(days=1),    # 1-3 months past close
    "quiet":   timedelta(days=3),    # no recent activity
}

@dataclass
class SyncCadence:
    interval: timedelta

    @classmethod
    def for_connection(cls, connection) -> "SyncCadence":
        most_recent = get_latest_period(connection)
        if not most_recent:
            return cls(interval=SYNC_TIERS["active"])

        days_since = (date.today() - most_recent.end_date).days

        if days_since <= 30:
            tier = "active"
        elif days_since <= 90:
            tier = "winding"
        else:
            tier = "quiet"

        return cls(interval=SYNC_TIERS[tier])
const SYNC_TIERS = {
  active:  4 * 60 * 60_000,       // 4 hours in ms
  winding: 24 * 60 * 60_000,      // 1 day
  quiet:   3 * 24 * 60 * 60_000,  // 3 days
} as const;

function syncIntervalFor(
  connection: AccountingConnection
): number {
  const periods = connection.financialPeriods;
  if (!periods.length) return SYNC_TIERS.active;

  const mostRecent = periods.sort(
    (a, b) => b.endDate.getTime() - a.endDate.getTime()
  )[0];

  const daysSinceClose = Math.floor(
    (Date.now() - mostRecent.endDate.getTime()) / 86_400_000
  );

  if (daysSinceClose <= 30) return SYNC_TIERS.active;
  if (daysSinceClose <= 90) return SYNC_TIERS.winding;
  return SYNC_TIERS.quiet;
}

Schedule these sync jobs through your background processor of choice. If you are using Ruby on Rails, Solid Queue’s recurring jobs work well here. For Python, Celery Beat or APScheduler. For TypeScript, BullMQ or node-cron. The key is combining domain-aware scheduling with concurrency controls to prevent overlapping syncs from competing for the same API quota.

| Sync Strategy | API Efficiency | Complexity | Best For |
|---|---|---|---|
| Fixed interval polling | Low - syncs everything every cycle | Simple | Small integrations, under 50 clients |
| Tiered cadence | High - syncs based on period activity | Moderate | Growing client base, 50-500+ clients |
| Webhook-driven | Highest - only fetches on change | Moderate | Platforms with reliable webhook delivery |
| Hybrid (webhooks + polling fallback) | Highest with resilience | Complex | Mission-critical financial dashboards |

The tiered approach fits most accounting API integration projects well. It scales without over-engineering, and the domain logic is easy for the team to understand because it mirrors how accountants actually work.

Modeling Period-Aware Financial Data

Store Silverfin’s financial data locally using snapshot models that capture per-account, per-period state at each sync. This gives you a fast, API-independent time series for dashboard queries and offline resilience.

The core schema needs three entities: an AccountingCompany (the Silverfin client), a FinancialPeriod (a date range with open/closed status), and an AccountSnapshot (one account’s balance for one period).

The period_status field deserves attention. Silverfin distinguishes between periods that are still being worked on and periods that have been closed and finalized. Track this locally because it affects how much you should trust the numbers. A finalized period is settled. A draft period might change tomorrow. Any dashboard sitting on top of this data needs to show the difference.
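As a concrete reference point, here is a minimal Python sketch of those three entities as dataclasses. The field names are illustrative (they mirror the ones used elsewhere in this post), and a real implementation would live in your ORM of choice:

```python
from dataclasses import dataclass
from datetime import date
from decimal import Decimal

@dataclass
class AccountingCompany:
    id: int
    external_id: str          # Silverfin's company identifier
    name: str

@dataclass
class FinancialPeriod:
    id: int
    accounting_company_id: int
    start_date: date
    end_date: date
    period_status: str        # "draft" or "finalized"

@dataclass
class AccountSnapshot:
    id: int
    accounting_company_id: int
    financial_period_id: int
    account_number: str       # raw account code from the client's chart
    mapping_category: str     # normalized across charts of accounts
    value: Decimal
    synced_at: date
```

Keeping both the raw `account_number` and the normalized `mapping_category` on the snapshot is deliberate: the raw code is what accountants recognize, the mapped category is what your calculations run on.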

For query performance on snapshot data, compound indexes on (accounting_company_id, financial_period_id, mapping_category) make the metric calculations fast even across large portfolios. If you are using PostgreSQL, the compound indexing patterns I have written about apply directly here.

If your Silverfin integration handles large volumes of historical financial data and you are running PostgreSQL, the TimescaleDB hypertable approach maps naturally to period-aware snapshots. Time-series partitioning and columnar compression can significantly reduce storage costs for finalized periods that never change.

Building Custom Dashboards on Silverfin Data

Compute custom financial metrics - gross margin, debtor days, cash runway - from your local Silverfin snapshots using normalized account mappings. This extends Silverfin with bespoke KPIs specific to your firm’s workflow.

The mapping_category field is doing important work here. Different companies use different charts of accounts. Account 4000 might be revenue for one client and cost of sales for another. Silverfin provides account mapping capabilities that normalize this across clients. Land that normalized category alongside the raw account data so your calculations work across any client portfolio.

class FinancialMetrics
  def initialize(company, period)
    @snapshots = company.account_snapshots
                        .where(financial_period: period)
  end

  def gross_margin
    return nil unless revenue.nonzero?
    ((revenue - cost_of_sales) / revenue * 100).round(2)
  end

  def debtor_days(days_in_period)
    return nil unless revenue.nonzero?
    (trade_debtors / revenue * days_in_period).round(1)
  end

  def cash_runway_months
    return nil unless monthly_burn.negative?
    (cash_balance / monthly_burn.abs).round(1)
  end

  private

  def revenue
    @revenue ||= sum_by_mapping("revenue")
  end

  def cost_of_sales
    @cost_of_sales ||= sum_by_mapping("cost_of_sales")
  end

  def trade_debtors
    @trade_debtors ||= sum_by_mapping("trade_debtors")
  end

  def monthly_burn
    @monthly_burn ||= sum_by_mapping("operating_expenses")
  end

  def cash_balance
    @cash_balance ||= sum_by_mapping("cash_and_equivalents")
  end

  def sum_by_mapping(category)
    @snapshots.where(mapping_category: category).sum(:value)
  end
end
from decimal import Decimal

class FinancialMetrics:
    def __init__(self, snapshots: list[AccountSnapshot]):
        self._snapshots = snapshots

    def gross_margin(self) -> Decimal | None:
        rev = self._sum("revenue")
        if not rev:
            return None
        cos = self._sum("cost_of_sales")
        return round((rev - cos) / rev * 100, 2)

    def debtor_days(self, days_in_period: int) -> Decimal | None:
        rev = self._sum("revenue")
        if not rev:
            return None
        debtors = self._sum("trade_debtors")
        return round(debtors / rev * days_in_period, 1)

    def cash_runway_months(self) -> Decimal | None:
        burn = self._sum("operating_expenses")
        if burn >= 0:
            return None
        cash = self._sum("cash_and_equivalents")
        return round(cash / abs(burn), 1)

    def _sum(self, category: str) -> Decimal:
        return sum(
            s.value for s in self._snapshots
            if s.mapping_category == category
        )
class FinancialMetrics {
  private snapshots: AccountSnapshot[];

  constructor(snapshots: AccountSnapshot[]) {
    this.snapshots = snapshots;
  }

  grossMargin(): number | null {
    const rev = this.sumByMapping("revenue");
    if (rev === 0) return null;
    const cos = this.sumByMapping("cost_of_sales");
    return Math.round(((rev - cos) / rev) * 10000) / 100;
  }

  debtorDays(daysInPeriod: number): number | null {
    const rev = this.sumByMapping("revenue");
    if (rev === 0) return null;
    const debtors = this.sumByMapping("trade_debtors");
    return Math.round((debtors / rev) * daysInPeriod * 10) / 10;
  }

  cashRunwayMonths(): number | null {
    const burn = this.sumByMapping("operating_expenses");
    if (burn >= 0) return null;
    const cash = this.sumByMapping("cash_and_equivalents");
    return Math.round((cash / Math.abs(burn)) * 10) / 10;
  }

  private sumByMapping(category: string): number {
    return this.snapshots
      .filter((s) => s.mappingCategory === category)
      .reduce((sum, s) => sum + Number(s.value), 0);
  }
}

Computing metrics for a single company and period is useful. Computing them across periods gives you trend analysis. Computing them across companies gives you portfolio-level insights - which is where the real leverage is for firms managing hundreds of clients.

Silverfin Integration in Production

Rate limits, mixed period states, chart of accounts inconsistency, and multi-entity structures are the four production challenges that catch most Silverfin integrations off guard.

Mixed Period States

When you display a trend over the last twelve months and the most recent period is still in draft, do you include it? My approach: show it but distinguish it visually and exclude it from aggregate calculations like averages. A dashed line or a “provisional” label is enough. The worst outcome is presenting draft numbers with the same confidence as finalized ones. The moment a number changes after someone relied on it, you lose trust permanently.
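That policy is simple to encode. A Python sketch that splits a metric series into settled and provisional points and averages only the settled ones; the `status` values mirror the period modeling above, and the dict shape is illustrative:

```python
from decimal import Decimal

def trend_points(periods: list[dict]) -> dict:
    """Split a metric series into settled and provisional points.
    Each period dict carries 'label', 'value', and 'status'
    ('finalized' or 'draft'). Aggregates use finalized points only."""
    finalized = [p for p in periods if p["status"] == "finalized"]
    draft = [p for p in periods if p["status"] == "draft"]
    avg = (
        sum(p["value"] for p in finalized) / len(finalized)
        if finalized else None
    )
    return {
        "settled": finalized,      # render as a solid line
        "provisional": draft,      # dashed line / "provisional" label
        "average": avg,            # deliberately excludes draft numbers
    }
```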

Chart of Accounts Inconsistency

If you compute metrics across multiple companies, you are implicitly assuming that accounts mapped to “revenue” mean the same thing everywhere. Within a single firm working in one jurisdiction, this usually holds. Across industries, it might not. There is no clean technical solution to this. It is a data quality problem that needs a human review step during onboarding.

Rate Limiting at Scale

Every API has rate limits, and they vary in how they are applied. When syncing hundreds of connections, track your remaining budget per connection and back off gracefully.

class RateLimitTracker
  def self.record(connection, headers)
    remaining = headers["X-RateLimit-Remaining"]&.to_i
    reset_at  = headers["X-RateLimit-Reset"]&.to_i
    return unless remaining && reset_at # skip responses without limit headers

    Rails.cache.write(
      "rate_limit:#{connection.id}",
      { remaining: remaining, reset_at: Time.at(reset_at) },
      expires_in: 1.hour
    )
  end

  def self.can_request?(connection)
    limit = Rails.cache.read("rate_limit:#{connection.id}")
    return true if limit.nil?
    return true if Time.current > limit[:reset_at]
    limit[:remaining] > 0
  end
end
import time
from redis import Redis

redis = Redis()

class RateLimitTracker:
    @staticmethod
    def record(connection_id: int, headers: dict):
        remaining = int(headers.get("X-RateLimit-Remaining", 0))
        reset_at = int(headers.get("X-RateLimit-Reset", 0))

        redis.hset(f"rate_limit:{connection_id}", mapping={
            "remaining": remaining,
            "reset_at": reset_at,
        })
        redis.expire(f"rate_limit:{connection_id}", 3600)

    @staticmethod
    def can_request(connection_id: int) -> bool:
        data = redis.hgetall(f"rate_limit:{connection_id}")
        if not data:
            return True
        if time.time() > int(data[b"reset_at"]):
            return True
        return int(data[b"remaining"]) > 0
const rateLimits = new Map<
  number,
  { remaining: number; resetAt: number }
>();

function recordRateLimit(
  connectionId: number, headers: Headers
): void {
  const remaining = Number(
    headers.get("X-RateLimit-Remaining") ?? 0
  );
  const resetAt = Number(
    headers.get("X-RateLimit-Reset") ?? 0
  );

  rateLimits.set(connectionId, { remaining, resetAt });
  setTimeout(
    () => rateLimits.delete(connectionId), 3_600_000
  );
}

function canRequest(connectionId: number): boolean {
  const limit = rateLimits.get(connectionId);
  if (!limit) return true;
  if (Date.now() / 1000 > limit.resetAt) return true;
  return limit.remaining > 0;
}

Integrate this check into your sync jobs so they reschedule automatically when budget is low rather than failing with 429 errors.
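One way to wire that up is a thin wrapper the worker calls instead of the sync itself. This Python sketch keeps the wrapper framework-agnostic by injecting the tracker check, the sync, and the reschedule as callables; the names and the 15-minute delay are illustrative:

```python
def run_sync_job(connection_id: int, can_request, do_sync,
                 reschedule) -> str:
    """Gate a sync job on remaining rate-limit budget.
    can_request/do_sync/reschedule are injected callables, so the
    same wrapper fits Solid Queue, Celery, or BullMQ workers."""
    if not can_request(connection_id):
        # Budget exhausted: push the job past the reset window
        # instead of burning a request on a guaranteed 429.
        reschedule(connection_id, delay_seconds=15 * 60)
        return "rescheduled"
    do_sync(connection_id)
    return "synced"
```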

Multi-Entity Structures

Some businesses are not a single company but a group - a parent with subsidiaries, a franchise model, or a holding company. Silverfin models each entity separately, but your dashboards might need to consolidate them. Decide early: does your company model support a parent_id for hierarchy, and do your metrics know how to roll up?
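If you do add a parent_id, the rollup itself is a small recursion. A Python sketch for additive metrics like revenue or cash; the argument shapes are illustrative. Note the caveat in the docstring: ratios such as gross margin cannot be averaged across entities - they must be recomputed from rolled-up components:

```python
from decimal import Decimal

def rollup(company_id: int,
           children: dict[int, list[int]],
           values: dict[int, Decimal]) -> Decimal:
    """Sum an additive metric over a company and all descendants.
    `children` maps parent_id -> list of child company ids;
    `values` maps company id -> that company's own metric value.
    Ratios (gross margin, debtor days) must be recomputed from
    rolled-up components, never averaged across entities."""
    total = values.get(company_id, Decimal("0"))
    for child_id in children.get(company_id, []):
        total += rollup(child_id, children, values)
    return total
```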

Limitations and When to Keep It Simple

Not every Silverfin integration needs the full sync-and-snapshot architecture. Here is when each approach makes sense:

| Integration Need | Recommended Approach | Why |
|---|---|---|
| One-off data export | Direct API calls, no local persistence | Adding sync infrastructure for a batch job is over-engineering |
| Simple automation (e.g., alerts on period close) | Webhook listener or lightweight poller | No need for local snapshots if you are just triggering actions |
| Custom dashboard with historical trends | Full sync + snapshot + calculation layer | Fast local queries and offline resilience justify the complexity |
| Cross-client portfolio analytics | Full sync + normalized mappings + aggregations | This is where the architecture earns its keep |

The sync layer and snapshot models add complexity that pays off when you need fast local queries, historical trend data, or cross-client analytics. For simpler use cases, a direct API call without local persistence might be the right choice.

Other trade-offs to consider:

  • Data staleness. Your local snapshots are always slightly behind. For most accounting workflows this is fine - financial data moves on a daily or weekly cadence, not in real-time. But if your product promises “live” data, be honest about the sync delay.
  • Schema drift. APIs evolve. Fields get added, deprecated, or renamed. Build your sync layer to be tolerant of unexpected fields and log warnings rather than crashing on schema changes.
  • Operational overhead. The sync infrastructure needs monitoring. Failed syncs, stale connections, and token refresh errors should surface in your alerting before users notice.
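The schema-drift point is worth a concrete shape: parse defensively by keeping only the fields your sync layer understands and logging anything new, rather than failing the whole sync. A Python sketch with illustrative field names:

```python
import logging

logger = logging.getLogger("silverfin.sync")

# The fields this sync layer actually persists; illustrative names.
KNOWN_FIELDS = {"id", "name", "value", "mapping_category"}

def parse_account(payload: dict) -> dict:
    """Keep the fields the sync layer understands; log (rather than
    crash on) anything unexpected so schema drift surfaces in
    monitoring instead of in a failed sync run."""
    unknown = set(payload) - KNOWN_FIELDS
    if unknown:
        logger.warning("unexpected fields from API: %s", sorted(unknown))
    return {k: payload[k] for k in KNOWN_FIELDS if k in payload}
```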

Key Architecture Decisions

Step back from the specifics and the Silverfin integration follows a clear pattern regardless of your language or framework:

  • Token lifecycle as a first-class concern. Silent failures in token refresh create the worst kind of bugs - intermittent, hard to reproduce, and visible only to end users.
  • Domain-aware sync. Using financial period activity to drive sync cadence applies domain knowledge to an infrastructure problem. More effective than generic retry and backoff patterns.
  • Local snapshots over live queries. Decoupling your UI from API availability means outages upstream do not become outages for your users.
  • Data confidence as a feature. Distinguishing draft from finalized data is the difference between a tool people trust and one they second-guess.

The interesting problems are not in the API integration plumbing. They are in the product decisions you make on top of the data once it is in your hands.


Need help building integrations with accounting platforms? I help teams with API architecture, data sync design, and production deployment for financial data applications.
