You know what's funny? We spent decades moving everything to the cloud, convincing everyone that "software as a service" was the future. And now? Now we're realizing that maybe, just maybe, we gave up something important along the way.
I'm talking about ownership. About reliability. About not losing your work when the WiFi drops.
Local-first software is the idea that your data lives primarily on your device, and syncing to the cloud is just a convenient backup and collaboration feature, not a requirement. Think Figma's offline mode, Linear's instant UI, or how Obsidian works entirely on your local files.
Building the sync engine that makes this possible? That's the hard part. And that's exactly what we're going to build today.
What Makes a Good Sync Engine?
Before writing any code, let's talk about what we're actually trying to achieve. A solid sync engine needs to handle:
Fast local operations — Changes should feel instant. No waiting for server round trips.
Reliable syncing — Data gets to the server eventually, even with flaky connections.
Conflict handling — Multiple devices editing the same data shouldn't cause disasters.
Efficient bandwidth use — Don't send the entire database on every sync.
Offline resilience — App should work perfectly with zero connectivity.
Background sync — Syncing shouldn't block the UI or drain the battery.
I built a sync engine for a note-taking app last year, and let me tell you, getting all of these right at once is trickier than it looks.
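Before any architecture, it's worth pinning down what "reliable syncing" means mechanically. The engine later in this post uses plain exponential backoff; here is that idea as a standalone sketch, with jitter added (my own embellishment, not part of the engine below) so many clients don't all retry in lockstep:

```javascript
// Retry a flaky async operation with exponential backoff plus jitter.
// This is a standalone sketch; retryWithBackoff and its options are
// illustrative names, not an API used elsewhere in this post.
async function retryWithBackoff(fn, { baseDelay = 1000, maxDelay = 60000, maxAttempts = 5 } = {}) {
  let delay = baseDelay;
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn(attempt);
    } catch (error) {
      if (attempt >= maxAttempts) throw error; // give up, surface the error
      const jitter = Math.random() * delay * 0.5; // desynchronize clients
      await new Promise(resolve => setTimeout(resolve, delay + jitter));
      delay = Math.min(delay * 2, maxDelay); // 1s, 2s, 4s, ... capped
    }
  }
}

// Fails twice, then succeeds on the third attempt
retryWithBackoff(
  attempt => attempt < 3 ? Promise.reject(new Error('flaky')) : Promise.resolve('synced'),
  { baseDelay: 10 }
).then(result => console.log(result)); // 'synced'
```

The sync engine below bakes the same delay-doubling policy into its error handler; the standalone version just makes the policy easy to unit test in isolation.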
The Architecture: What We're Building
Here's the high-level design:
┌─────────────────────────────────────────┐
│ Application Layer │
│ (UI, Business Logic, User Interactions)│
└─────────────┬───────────────────────────┘
│
┌─────────────▼───────────────────────────┐
│ Local Data Layer │
│ (IndexedDB, SQLite, or similar) │
└─────────────┬───────────────────────────┘
│
┌─────────────▼───────────────────────────┐
│ Sync Engine │
│ • Operation Queue │
│ • Conflict Resolver │
│ • Network Manager │
│ • State Machine │
└─────────────┬───────────────────────────┘
│
┌─────────────▼───────────────────────────┐
│ Server API Layer │
│ (REST, GraphQL, WebSockets) │
└─────────────────────────────────────────┘

The sync engine sits between your local storage and the server, managing the flow of data in both directions.
Step 1: The Operation Log
Everything starts with tracking what changed. Instead of syncing entire documents, we sync operations.
class OperationLog {
constructor(storage) {
this.storage = storage;
// In production, restore this from the highest persisted sequence on
// startup; resetting to 0 after a reload would break ordering
this.sequence = 0;
}
async logOperation(operation) {
const op = {
id: this.generateId(),
sequence: ++this.sequence,
type: operation.type, // 'create', 'update', 'delete'
entity: operation.entity, // 'note', 'task', etc.
entityId: operation.entityId,
changes: operation.changes,
timestamp: Date.now(),
synced: false,
deviceId: this.getDeviceId()
};
await this.storage.add('operations', op);
return op;
}
async getPendingOperations() {
const ops = await this.storage.query('operations', 'synced', false);
return ops.sort((a, b) => a.sequence - b.sequence);
}
async markSynced(operationId, serverTimestamp) {
const op = await this.storage.get('operations', operationId);
op.synced = true;
op.serverTimestamp = serverTimestamp;
await this.storage.update('operations', op);
}
async getOperationsSince(sequence) {
const allOps = await this.storage.getAll('operations');
return allOps.filter(op => op.sequence > sequence);
}
generateId() {
return `${this.getDeviceId()}_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
}
getDeviceId() {
let deviceId = localStorage.getItem('deviceId');
if (!deviceId) {
deviceId = `device_${Math.random().toString(36).substr(2, 9)}`;
localStorage.setItem('deviceId', deviceId);
}
return deviceId;
}
}

The operation log is effectively append-only (the only mutation is flipping the synced flag on existing entries), which makes it reliable and easy to reason about. Each operation gets a sequence number that helps with ordering.
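If the storage plumbing obscures the idea, here is the same pattern in miniature: an in-memory sketch (InMemoryOpLog is a made-up name for illustration) showing the append, sort-by-sequence, and mark-synced flow.

```javascript
// In-memory sketch of the operation-log pattern. The real class above
// persists through an async storage layer, but the data flow is the same.
class InMemoryOpLog {
  constructor(deviceId) {
    this.deviceId = deviceId;
    this.sequence = 0;
    this.ops = [];
  }
  log(type, entity, entityId, changes) {
    const op = {
      id: `${this.deviceId}_${++this.sequence}`,
      sequence: this.sequence,
      type, entity, entityId, changes,
      timestamp: Date.now(),
      synced: false
    };
    this.ops.push(op); // append-only: we never rewrite history
    return op;
  }
  pending() {
    return this.ops
      .filter(op => !op.synced)
      .sort((a, b) => a.sequence - b.sequence);
  }
  markSynced(id) {
    const op = this.ops.find(o => o.id === id);
    if (op) op.synced = true;
  }
}

const log = new InMemoryOpLog('device_a');
log.log('create', 'note', 'n1', { title: 'Hello' });
const op2 = log.log('update', 'note', 'n1', { title: 'Hello!' });
log.markSynced(op2.id);
console.log(log.pending().length); // 1: only the create is still pending
```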
Step 2: The Sync State Machine
Syncing isn't a simple on/off thing. There are multiple states and transitions:
class SyncStateMachine {
constructor() {
this.state = 'idle';
this.listeners = new Set();
}
/*
* States:
* - idle: Not syncing, waiting
* - pulling: Fetching changes from server
* - resolving: Handling conflicts
* - pushing: Sending local changes
* - error: Something went wrong
*/
transition(newState, metadata = {}) {
const oldState = this.state;
this.state = newState;
console.log(`Sync state: ${oldState} -> ${newState}`, metadata);
this.notify({
oldState,
newState,
timestamp: Date.now(),
...metadata
});
}
subscribe(callback) {
this.listeners.add(callback);
return () => this.listeners.delete(callback);
}
notify(event) {
this.listeners.forEach(listener => {
try {
listener(event);
} catch (error) {
console.error('State listener error:', error);
}
});
}
getCurrentState() {
return this.state;
}
canStartSync() {
return this.state === 'idle' || this.state === 'error';
}
}

Having explicit states makes debugging way easier. You can see exactly where the sync process is stuck when things go wrong.
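One optional hardening, not part of the class above: whitelist which transitions are legal, so a buggy caller fails fast instead of silently jumping states. A minimal sketch (the LEGAL table reflects the five states listed in the comment):

```javascript
// Guarded transitions: each state lists the states it may move to.
// LEGAL and makeGuardedTransition are illustrative additions, not part
// of the SyncStateMachine shown above.
const LEGAL = {
  idle: ['pulling'],
  pulling: ['resolving', 'pushing', 'error'],
  resolving: ['pushing', 'error'],
  pushing: ['idle', 'error'],
  error: ['pulling']
};

function makeGuardedTransition(machine) {
  return (newState) => {
    if (!LEGAL[machine.state].includes(newState)) {
      throw new Error(`Illegal transition: ${machine.state} -> ${newState}`);
    }
    machine.state = newState;
  };
}

const machine = { state: 'idle' };
const transition = makeGuardedTransition(machine);
transition('pulling');
transition('pushing');
transition('idle'); // a full, legal sync cycle
console.log(machine.state); // 'idle'
```

A transition like 'idle' to 'resolving' now throws immediately, which turns a subtle sync bug into a loud stack trace.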
Step 3: The Sync Engine Core
Now let's build the actual engine that orchestrates everything:
class SyncEngine {
constructor(storage, api, operationLog) {
this.storage = storage;
this.api = api;
this.operationLog = operationLog;
this.stateMachine = new SyncStateMachine();
this.conflictResolver = new ConflictResolver();
this.syncInterval = null;
this.isSyncing = false;
this.retryDelay = 1000;
this.maxRetryDelay = 60000;
this.setupListeners();
}
setupListeners() {
// Listen for online/offline events
window.addEventListener('online', () => this.onOnline());
window.addEventListener('offline', () => this.onOffline());
// Listen for visibility changes (user returns to tab)
document.addEventListener('visibilitychange', () => {
if (document.visibilityState === 'visible') {
this.sync();
}
});
// Listen for storage events (changes from other tabs)
window.addEventListener('storage', (e) => {
if (e.key === 'sync_request') {
this.sync();
}
});
}
async start() {
if (navigator.onLine) {
await this.sync();
this.startPeriodicSync();
}
}
stop() {
this.stopPeriodicSync();
}
async sync() {
if (!this.stateMachine.canStartSync()) {
console.log('Sync already in progress or in error state');
return;
}
if (!navigator.onLine) {
console.log('Offline, skipping sync');
return;
}
this.isSyncing = true;
try {
this.stateMachine.transition('pulling');
await this.pullChanges();
this.stateMachine.transition('pushing');
await this.pushChanges();
this.stateMachine.transition('idle');
this.retryDelay = 1000; // Reset retry delay on success
} catch (error) {
console.error('Sync failed:', error);
this.stateMachine.transition('error', { error: error.message });
await this.handleSyncError(error);
} finally {
this.isSyncing = false;
}
}
async pullChanges() {
const lastSync = await this.getLastSyncTimestamp();
try {
const response = await this.api.getChangesSince(lastSync);
const serverChanges = response.changes;
if (serverChanges.length === 0) {
console.log('No server changes to pull');
return;
}
console.log(`Pulling ${serverChanges.length} changes from server`);
for (const change of serverChanges) {
await this.applyServerChange(change);
}
await this.setLastSyncTimestamp(response.timestamp);
} catch (error) {
if (error.status === 409) {
// Conflict detected, need resolution
this.stateMachine.transition('resolving');
await this.resolveConflicts(error.conflicts);
} else {
throw error;
}
}
}
async applyServerChange(change) {
// Check if we have local changes for the same entity. (For brevity we
// only look at the first match; a production engine should consider
// every pending op that touches the entity.)
const localOps = await this.operationLog.getPendingOperations();
const conflicting = localOps.find(op =>
op.entity === change.entity &&
op.entityId === change.entityId
);
if (conflicting) {
// Conflict! Need to resolve
const resolution = await this.conflictResolver.resolve(
conflicting,
change
);
if (resolution.strategy === 'server-wins') {
// Apply server change, discard local
await this.applyChange(change);
await this.operationLog.markSynced(conflicting.id, change.timestamp);
} else if (resolution.strategy === 'local-wins') {
// Keep local change, will push it later
return;
} else if (resolution.strategy === 'merge') {
// Apply merged version
await this.applyChange(resolution.merged);
}
} else {
// No conflict, just apply the change
await this.applyChange(change);
}
}
async applyChange(change) {
switch (change.type) {
case 'create':
await this.storage.add(change.entity, change.data);
break;
case 'update':
await this.storage.update(change.entity, change.data);
break;
case 'delete':
await this.storage.delete(change.entity, change.entityId);
break;
}
}
async pushChanges() {
const pendingOps = await this.operationLog.getPendingOperations();
if (pendingOps.length === 0) {
console.log('No local changes to push');
return;
}
console.log(`Pushing ${pendingOps.length} changes to server`);
// Batch operations for efficiency
const batches = this.batchOperations(pendingOps, 50);
for (const batch of batches) {
try {
const response = await this.api.pushOperations(batch);
// Mark operations as synced
for (let i = 0; i < batch.length; i++) {
const op = batch[i];
const result = response.results[i];
if (result.success) {
await this.operationLog.markSynced(op.id, result.timestamp);
} else if (result.conflict) {
// Server detected a conflict
await this.resolveServerConflict(op, result.serverVersion);
}
}
} catch (error) {
console.error('Failed to push batch:', error);
// Don't mark as synced, will retry later
throw error;
}
}
}
batchOperations(operations, batchSize) {
const batches = [];
for (let i = 0; i < operations.length; i += batchSize) {
batches.push(operations.slice(i, i + batchSize));
}
return batches;
}
async resolveServerConflict(localOp, serverVersion) {
const resolution = await this.conflictResolver.resolve(
localOp,
serverVersion
);
if (resolution.strategy === 'server-wins') {
await this.applyChange(serverVersion);
await this.operationLog.markSynced(localOp.id, serverVersion.timestamp);
} else if (resolution.strategy === 'local-wins') {
// Push again with the force flag, then mark the op as synced so it
// isn't re-sent on every subsequent cycle
const response = await this.api.pushOperations([localOp], { force: true });
await this.operationLog.markSynced(localOp.id, response.results[0].timestamp);
} else if (resolution.strategy === 'merge') {
// Push merged version and mark the original op as synced
const mergedOp = {
...localOp,
changes: resolution.merged
};
const response = await this.api.pushOperations([mergedOp]);
await this.operationLog.markSynced(localOp.id, response.results[0].timestamp);
}
}
async handleSyncError(error) {
// Schedule a retry with exponential backoff. Using setTimeout rather
// than awaiting here lets the failed sync() call settle immediately
// instead of hanging until some future retry finally succeeds.
const delay = this.retryDelay;
this.retryDelay = Math.min(this.retryDelay * 2, this.maxRetryDelay);
setTimeout(() => {
if (navigator.onLine) {
this.sync();
}
}, delay);
}
onOnline() {
console.log('Network connection restored');
this.sync();
this.startPeriodicSync();
}
onOffline() {
console.log('Network connection lost');
this.stopPeriodicSync();
}
startPeriodicSync(interval = 30000) {
this.stopPeriodicSync();
this.syncInterval = setInterval(() => {
if (navigator.onLine && !this.isSyncing) {
this.sync();
}
}, interval);
}
stopPeriodicSync() {
if (this.syncInterval) {
clearInterval(this.syncInterval);
this.syncInterval = null;
}
}
async getLastSyncTimestamp() {
// localStorage only stores strings, so coerce back to a number
return Number(localStorage.getItem('lastSyncTimestamp')) || 0;
}
async setLastSyncTimestamp(timestamp) {
localStorage.setItem('lastSyncTimestamp', timestamp);
}
getSyncState() {
return {
state: this.stateMachine.getCurrentState(),
isSyncing: this.isSyncing,
online: navigator.onLine
};
}
subscribe(callback) {
return this.stateMachine.subscribe(callback);
}
}

Step 4: Smart Conflict Resolution
The conflict resolver is where you encode your business logic:
class ConflictResolver {
async resolve(localChange, serverChange) {
// Strategy 1: Timestamp-based (last write wins)
if (localChange.timestamp > serverChange.timestamp) {
return { strategy: 'local-wins', reason: 'Newer timestamp' };
}
if (serverChange.timestamp > localChange.timestamp) {
return { strategy: 'server-wins', reason: 'Newer timestamp' };
}
// Strategy 2: Field-level merge for objects
if (this.canMergeFields(localChange, serverChange)) {
const merged = this.mergeFields(
localChange.changes,
serverChange.changes
);
return {
strategy: 'merge',
merged,
reason: 'Fields merged'
};
}
// Strategy 3: User decision (save both versions)
return {
strategy: 'user-decision',
reason: 'Requires manual resolution',
localChange,
serverChange
};
}
canMergeFields(local, server) {
// Can merge if changes affect different fields
const localFields = new Set(Object.keys(local.changes));
const serverFields = new Set(Object.keys(server.changes));
const overlap = [...localFields].filter(f => serverFields.has(f));
return overlap.length === 0;
}
mergeFields(localChanges, serverChanges) {
return {
...serverChanges,
...localChanges
};
}
}

Step 5: Delta Sync for Efficiency
Instead of syncing entire documents, sync only what changed:
class DeltaSync {
static createDelta(oldVersion, newVersion) {
const delta = {
type: 'delta',
baseVersion: oldVersion._version,
changes: {}
};
for (const key in newVersion) {
if (key === '_version') continue;
if (oldVersion[key] !== newVersion[key]) {
delta.changes[key] = {
old: oldVersion[key],
new: newVersion[key]
};
}
}
return delta;
}
static applyDelta(baseVersion, delta) {
if (baseVersion._version !== delta.baseVersion) {
throw new Error('Version mismatch, cannot apply delta');
}
const result = { ...baseVersion };
for (const [key, change] of Object.entries(delta.changes)) {
if (result[key] !== change.old) {
throw new Error(`Conflict: expected ${change.old}, found ${result[key]}`);
}
result[key] = change.new;
}
result._version += 1;
return result;
}
static squashDeltas(deltas) {
// Combine multiple deltas into one. If a field changed more than once,
// keep the earliest `old` and the latest `new`; a plain Object.assign
// would clobber `old` and make applyDelta's consistency check fail.
const squashed = {
type: 'delta',
baseVersion: deltas[0].baseVersion,
changes: {}
};
for (const delta of deltas) {
for (const [key, change] of Object.entries(delta.changes)) {
if (squashed.changes[key]) {
squashed.changes[key].new = change.new;
} else {
squashed.changes[key] = { ...change };
}
}
}
return squashed;
}
}
// Using it
const oldDoc = { _version: 1, title: 'Hello', body: 'World' };
const newDoc = { _version: 2, title: 'Hello', body: 'Universe' };
const delta = DeltaSync.createDelta(oldDoc, newDoc);
// { type: 'delta', baseVersion: 1, changes: { body: { old: 'World', new: 'Universe' } } }
// Send only the delta over the network
const restored = DeltaSync.applyDelta(oldDoc, delta);
// { _version: 2, title: 'Hello', body: 'Universe' }

Step 6: Optimistic Updates for Snappy UI
Make the app feel instant by showing changes immediately:
class OptimisticUpdateManager {
constructor(storage, syncEngine) {
this.storage = storage;
this.syncEngine = syncEngine;
this.pendingUpdates = new Map();
}
async performUpdate(entity, entityId, changes) {
const updateId = this.generateUpdateId();
const originalData = await this.storage.get(entity, entityId);
// Store original for potential rollback
this.pendingUpdates.set(updateId, {
entity,
entityId,
original: originalData,
timestamp: Date.now()
});
// Apply update immediately to local storage
const updated = { ...originalData, ...changes };
await this.storage.update(entity, updated);
// Queue for sync
const operation = {
type: 'update',
entity,
entityId,
changes,
updateId
};
try {
await this.syncEngine.operationLog.logOperation(operation);
// Start sync in background
this.syncEngine.sync().then(() => {
// Sync succeeded, remove from pending
this.pendingUpdates.delete(updateId);
}).catch(error => {
// Sync failed, might need to rollback
this.handleSyncFailure(updateId, error);
});
return updated;
} catch (error) {
// Immediate failure, rollback
await this.rollback(updateId);
throw error;
}
}
async rollback(updateId) {
const pending = this.pendingUpdates.get(updateId);
if (!pending) return;
console.log('Rolling back optimistic update:', updateId);
if (pending.original == null) {
// The entity didn't exist before this optimistic create: remove it
await this.storage.delete(pending.entity, pending.entityId);
} else {
await this.storage.update(pending.entity, pending.original);
}
this.pendingUpdates.delete(updateId);
}
async handleSyncFailure(updateId, error) {
// Decide whether to retry or rollback
if (this.isRetryableError(error)) {
// Keep the update, sync will retry
console.log('Sync will retry for update:', updateId);
} else {
// Rollback the update
await this.rollback(updateId);
}
}
isRetryableError(error) {
return error.code === 'NETWORK_ERROR' ||
error.code === 'TIMEOUT' ||
error.status >= 500;
}
generateUpdateId() {
return `update_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
}
}

Step 7: Background Sync with Web Workers
Don't block the main thread:
// sync-worker.js
class SyncWorker {
constructor() {
self.addEventListener('message', (e) => this.handleMessage(e));
this.syncEngine = null;
}
async handleMessage(event) {
const { type, payload } = event.data;
switch (type) {
case 'init':
await this.initialize(payload);
break;
case 'sync':
await this.performSync();
break;
case 'push':
await this.pushOperation(payload);
break;
}
}
async initialize(config) {
// Set up sync engine in worker. Caveat: the SyncEngine above touches
// window, document, and localStorage, none of which exist in a worker,
// so a worker build needs those listeners and storage swapped out.
this.syncEngine = new SyncEngine(/* ... */);
await this.syncEngine.start();
self.postMessage({ type: 'initialized' });
}
async performSync() {
try {
await this.syncEngine.sync();
self.postMessage({
type: 'sync-complete',
timestamp: Date.now()
});
} catch (error) {
self.postMessage({
type: 'sync-error',
error: error.message
});
}
}
}
new SyncWorker();
// Main thread usage
class BackgroundSyncManager {
constructor() {
this.worker = new Worker('sync-worker.js');
this.setupListeners();
}
setupListeners() {
this.worker.addEventListener('message', (e) => {
const { type, payload } = e.data;
switch (type) {
case 'sync-complete':
this.onSyncComplete(payload);
break;
case 'sync-error':
this.onSyncError(payload);
break;
}
});
}
async initialize(config) {
return new Promise((resolve) => {
const handler = (e) => {
if (e.data.type === 'initialized') {
this.worker.removeEventListener('message', handler);
resolve();
}
};
this.worker.addEventListener('message', handler);
this.worker.postMessage({ type: 'init', payload: config });
});
}
triggerSync() {
this.worker.postMessage({ type: 'sync' });
}
onSyncComplete(data) {
// Notify UI that sync completed
console.log('Background sync completed:', data);
}
onSyncError(error) {
console.error('Background sync error:', error);
}
}

Putting It All Together
Here's how you'd use everything in a real app:
class App {
async initialize() {
// Set up storage
this.storage = new OfflineDB('myapp', 1);
await this.storage.init([
{ name: 'notes', keyPath: 'id' },
{ name: 'operations', keyPath: 'id', autoIncrement: true }
]);
// Set up operation log
this.operationLog = new OperationLog(this.storage);
// Set up API client
this.api = new ApiClient('https://api.example.com');
// Set up sync engine
this.syncEngine = new SyncEngine(
this.storage,
this.api,
this.operationLog
);
// Set up optimistic updates
this.updateManager = new OptimisticUpdateManager(
this.storage,
this.syncEngine
);
// Subscribe to sync events
this.syncEngine.subscribe((event) => {
this.updateSyncUI(event);
});
// Start syncing
await this.syncEngine.start();
}
async createNote(title, body) {
const note = {
id: generateId(),
title,
body,
createdAt: Date.now(),
updatedAt: Date.now()
};
// Optimistic update
await this.updateManager.performUpdate('notes', note.id, note);
return note;
}
async updateNote(id, changes) {
// Optimistic update
await this.updateManager.performUpdate('notes', id, {
...changes,
updatedAt: Date.now()
});
}
updateSyncUI(event) {
const statusEl = document.getElementById('sync-status');
switch (event.newState) {
case 'idle':
statusEl.textContent = '✓ Synced';
statusEl.className = 'synced';
break;
case 'pulling':
case 'pushing':
statusEl.textContent = '⟳ Syncing...';
statusEl.className = 'syncing';
break;
case 'error':
statusEl.textContent = '⚠ Sync error';
statusEl.className = 'error';
break;
}
}
}

Performance Considerations
From real-world testing on that note app:
Initial load: 50ms (from IndexedDB)
Write operations: 5ms (local only)
Sync cycle: 200–500ms (depending on changes)
Memory usage: ~10MB for 10,000 notes
Storage: ~50MB for 10,000 notes with full operation history
The trick is aggressive pruning of old operations. You don't need the full history forever:
async pruneOldOperations() {
const cutoff = Date.now() - (30 * 24 * 60 * 60 * 1000); // 30 days
const oldOps = await this.storage.query(
'operations',
'timestamp',
IDBKeyRange.upperBound(cutoff)
);
const synced = oldOps.filter(op => op.synced);
for (const op of synced) {
await this.storage.delete('operations', op.id);
}
console.log(`Pruned ${synced.length} old operations`);
}The Hard-Learned Lessons
Building sync engines taught me things you can't learn from documentation:
Test with airplane mode. Constantly. Your users will go offline in ways you never imagined.
Timestamps are lies. Use logical clocks (vector clocks, hybrid logical clocks) for ordering if you can.
Batch everything. Network round trips are expensive. Batch operations aggressively.
Show pending changes. Users need to see what's waiting to sync. It builds trust.
Make rollback possible. Optimistic updates need a safety net.
Monitor sync health. Track how long syncs take, how often they fail, how many operations are pending.
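To make the "timestamps are lies" point concrete, here's a minimal hybrid logical clock sketch. The class name and API are my own; the idea is that each stamp carries wall-clock time plus a logical counter, so events still order correctly even when a device's clock stalls or runs backwards:

```javascript
// Hedged sketch of a hybrid logical clock (HLC). Stamps are
// { physical, counter } pairs; compare them lexicographically.
class HybridLogicalClock {
  constructor(now = () => Date.now()) {
    this.now = now; // injectable clock, handy for tests
    this.lastPhysical = 0;
    this.counter = 0;
  }
  // Called for a local event: always returns a strictly increasing stamp
  tick() {
    const physical = this.now();
    if (physical > this.lastPhysical) {
      this.lastPhysical = physical;
      this.counter = 0;
    } else {
      this.counter += 1; // clock stalled or went backwards: bump counter
    }
    return { physical: this.lastPhysical, counter: this.counter };
  }
  // Called when receiving a remote stamp: advance strictly past it
  receive(remote) {
    const physical = this.now();
    const maxPhysical = Math.max(physical, this.lastPhysical, remote.physical);
    if (maxPhysical === this.lastPhysical && maxPhysical === remote.physical) {
      this.counter = Math.max(this.counter, remote.counter) + 1;
    } else if (maxPhysical === remote.physical) {
      this.counter = remote.counter + 1;
    } else if (maxPhysical === this.lastPhysical) {
      this.counter += 1;
    } else {
      this.counter = 0; // fresh wall-clock time dominates
    }
    this.lastPhysical = maxPhysical;
    return { physical: this.lastPhysical, counter: this.counter };
  }
  static compare(a, b) {
    return a.physical - b.physical || a.counter - b.counter;
  }
}

// Even with a frozen wall clock, successive stamps still order correctly
const clock = new HybridLogicalClock(() => 1000); // fixed "time" for demo
const s1 = clock.tick();
const s2 = clock.tick();
console.log(HybridLogicalClock.compare(s1, s2) < 0); // true
```

In a sync engine you'd call tick() when logging a local operation and receive() when applying a server change, then order operations by HybridLogicalClock.compare instead of raw Date.now() values.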
The local-first approach isn't just better for offline scenarios. It makes your entire app faster, more reliable, and more pleasant to use. Every interaction is instant because nothing waits for the network.
That's the magic of local-first software. And now you know how to build the engine that makes it work.
What's your experience with building sync systems? Ever had to debug a gnarly merge conflict at 3 AM?