Build a Real-Time Chat with WebRTC Data Channels

Learn how to build peer-to-peer real-time chat applications using WebRTC Data Channels. Discover when to choose P2P over traditional WebSocket solutions.
While I was looking over some chat implementations the other day, I realized how many developers default to WebSockets without considering WebRTC Data Channels. I was once guilty of this myself! Little did I know that for certain use cases, peer-to-peer communication could eliminate server bottlenecks entirely and cut latency dramatically.
Why WebRTC Data Channels Beat Traditional Chat Solutions
When I finally decided to dig into WebRTC Data Channels, I discovered something fascinating. Unlike WebSocket connections that route every message through your server, Data Channels establish direct peer-to-peer connections between browsers. This means your server only facilitates the initial handshake, then gets out of the way.
The ROI here is significant. Your server bandwidth costs plummet because you're not proxying gigabytes of chat messages. Latency drops dramatically since packets travel directly between peers. For applications like gaming, collaborative editing, or video conferencing with side chat, this is wonderful!
But here's the catch—and this is something I cannot stress enough!—Data Channels require more upfront complexity. You need signaling infrastructure, STUN/TURN servers for NAT traversal, and careful state management. In other words, they're not always the right choice.
Understanding WebRTC Data Channels vs Media Streams
Most WebRTC tutorials focus on audio and video streams. Data Channels are the often-overlooked sibling that lets you send arbitrary data between peers. Think of them as bidirectional pipes for any data you want: text, binary files, game state updates, or collaborative document changes.
The key difference is that Data Channels use SCTP (Stream Control Transmission Protocol) over DTLS, providing reliable or unreliable delivery based on your needs. You get to choose! Need guaranteed message delivery for chat? Use reliable mode. Building a game where old state updates don't matter? Go unreliable for lower latency.
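To make that choice concrete, here's a small sketch. The `channelOptions` helper and the `ChannelInit` type are my own (the type mirrors the browser's `RTCDataChannelInit` so the logic can run outside a browser):

```typescript
// ChannelInit mirrors the shape of the browser's RTCDataChannelInit.
type ChannelInit = { ordered: boolean; maxRetransmits?: number };

// Hypothetical helper mapping a delivery mode to channel options.
function channelOptions(mode: 'reliable' | 'lossy'): ChannelInit {
  if (mode === 'reliable') {
    // SCTP retransmits until delivered, and preserves message ordering
    return { ordered: true };
  }
  // Unordered with zero retransmits: stale packets are dropped
  // instead of retried, minimizing latency for game-state updates
  return { ordered: false, maxRetransmits: 0 };
}

// In the browser you'd pass this straight through:
// pc.createDataChannel('game-state', channelOptions('lossy'));
```

Partial reliability also supports `maxPacketLifeTime` (retry for a time window rather than a retry count), but you can only set one of the two on a given channel.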

Setting Up the Signaling Server with WebSockets
Before peers can connect directly, they need to exchange connection information. This is where signaling comes in. Luckily we can use a simple WebSocket server for this. Here's a minimal signaling server I built using Node.js:
```typescript
import { WebSocketServer, WebSocket } from 'ws';

const wss = new WebSocketServer({ port: 8080 });
const clients = new Map<string, WebSocket>();

wss.on('connection', (ws) => {
  let clientId: string | undefined;

  ws.on('message', (data) => {
    const message = JSON.parse(data.toString());

    switch (message.type) {
      case 'register':
        clientId = message.id;
        clients.set(clientId!, ws);
        console.log(`Client registered: ${clientId}`);
        break;

      case 'offer':
      case 'answer':
      case 'ice-candidate': {
        const targetClient = clients.get(message.target);
        if (targetClient && targetClient.readyState === WebSocket.OPEN) {
          targetClient.send(JSON.stringify({
            type: message.type,
            from: clientId,
            data: message.data
          }));
        }
        break;
      }
    }
  });

  ws.on('close', () => {
    if (clientId) {
      clients.delete(clientId);
      console.log(`Client disconnected: ${clientId}`);
    }
  });
});

console.log('Signaling server running on ws://localhost:8080');
```

This server does three things: registers clients, forwards WebRTC offers/answers, and relays ICE candidates. That's all you need for signaling! The actual chat data will bypass this server entirely once peers connect.
Establishing Peer-to-Peer Connections with RTCPeerConnection
Now for the meat of the implementation. When I came across the RTCPeerConnection API, I was intimidated by its complexity. But once you break it down, it follows a logical flow: create connection → create data channel → exchange offers/answers → exchange ICE candidates → profit!
Here's the client-side code that brings it all together:
```typescript
class WebRTCChat {
  private pc: RTCPeerConnection;
  private dataChannel: RTCDataChannel | null = null;
  private ws: WebSocket;
  private localId: string;
  private remoteId: string = '';
  private onMessageCallback: (message: string) => void;

  constructor(signalingUrl: string, onMessage: (message: string) => void) {
    this.localId = Math.random().toString(36).substring(7);
    this.onMessageCallback = onMessage;

    // Configure ICE servers (STUN for NAT traversal)
    this.pc = new RTCPeerConnection({
      iceServers: [
        { urls: 'stun:stun.l.google.com:19302' }
      ]
    });

    // Set up WebSocket for signaling
    this.ws = new WebSocket(signalingUrl);
    this.ws.onopen = () => {
      this.ws.send(JSON.stringify({ type: 'register', id: this.localId }));
    };
    this.ws.onmessage = (event) => {
      this.handleSignalingMessage(JSON.parse(event.data));
    };

    // Forward our ICE candidates to the remote peer via signaling
    this.pc.onicecandidate = (event) => {
      if (event.candidate) {
        this.ws.send(JSON.stringify({
          type: 'ice-candidate',
          target: this.remoteId,
          data: event.candidate
        }));
      }
    };

    // Handle the incoming data channel (fires on the answering peer)
    this.pc.ondatachannel = (event) => {
      this.setupDataChannel(event.channel);
    };
  }

  async createOffer(remoteId: string) {
    this.remoteId = remoteId;

    // Create the data channel (only the offerer does this)
    this.dataChannel = this.pc.createDataChannel('chat', {
      ordered: true // Reliable, ordered delivery for chat messages
    });
    this.setupDataChannel(this.dataChannel);

    const offer = await this.pc.createOffer();
    await this.pc.setLocalDescription(offer);

    this.ws.send(JSON.stringify({
      type: 'offer',
      target: remoteId,
      data: offer
    }));
  }

  private async handleSignalingMessage(message: any) {
    switch (message.type) {
      case 'offer': {
        this.remoteId = message.from;
        await this.pc.setRemoteDescription(message.data);
        const answer = await this.pc.createAnswer();
        await this.pc.setLocalDescription(answer);
        this.ws.send(JSON.stringify({
          type: 'answer',
          target: message.from,
          data: answer
        }));
        break;
      }
      case 'answer':
        await this.pc.setRemoteDescription(message.data);
        break;
      case 'ice-candidate':
        await this.pc.addIceCandidate(message.data);
        break;
    }
  }

  private setupDataChannel(channel: RTCDataChannel) {
    this.dataChannel = channel;
    channel.onopen = () => {
      console.log('Data channel opened!');
    };
    channel.onmessage = (event) => {
      this.onMessageCallback(event.data);
    };
    channel.onerror = (error) => {
      console.error('Data channel error:', error);
    };
  }

  sendMessage(message: string) {
    if (this.dataChannel && this.dataChannel.readyState === 'open') {
      this.dataChannel.send(message);
    }
  }

  close() {
    this.dataChannel?.close();
    this.pc.close();
    this.ws.close();
  }
}
```

I was once guilty of forgetting that only the peer creating the offer should call createDataChannel(). The answering peer receives the channel through the ondatachannel event. This tripped me up for hours when I was learning!

Building the Data Channel Chat Interface
Using our WebRTC client is straightforward. Here's how you'd wire it up to a simple HTML interface:
```typescript
const chat = new WebRTCChat('ws://localhost:8080', (message) => {
  const messagesDiv = document.getElementById('messages');
  const messageEl = document.createElement('div');
  messageEl.textContent = message;
  messagesDiv?.appendChild(messageEl);
});

document.getElementById('connect')?.addEventListener('click', () => {
  const remoteId = (document.getElementById('remote-id') as HTMLInputElement).value;
  chat.createOffer(remoteId);
});

document.getElementById('send')?.addEventListener('click', () => {
  const input = document.getElementById('message-input') as HTMLInputElement;
  chat.sendMessage(input.value);
  input.value = '';
});
```

The beauty here is that once the connection is established, messages flow directly between browsers. Your server isn't touching the chat data at all!
Handling Connection States and Error Recovery
In production, you need robust connection state handling. I realized this the hard way when testing on flaky mobile networks. The RTCPeerConnection has multiple states to monitor: new, connecting, connected, disconnected, failed, and closed.
You want to listen to the connectionstatechange event and handle disconnections gracefully. When a connection fails, attempt to reconnect using ICE restart. This involves creating a new offer with the iceRestart option set to true.
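As a sketch, the state-to-action decision might look like this. The `reactionFor` helper and its action names are my own, not part of the WebRTC API; you'd call it from your `connectionstatechange` handler and wire the actual reconnect through your signaling layer:

```typescript
// Pure decision logic: map an RTCPeerConnection connectionState to an action.
function reactionFor(state: string): 'wait' | 'ice-restart' | 'teardown' {
  switch (state) {
    case 'failed':
      // Recover with an ICE restart:
      // pc.createOffer({ iceRestart: true }) → setLocalDescription → re-signal
      return 'ice-restart';
    case 'closed':
      return 'teardown';
    default:
      // 'disconnected' is often transient on flaky networks; give it time
      return 'wait';
  }
}
```

Keeping the decision pure like this makes the reconnect policy easy to unit-test without a browser.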
For the data channel specifically, watch the readyState property. Only send messages when it's open. Buffer messages during reconnection attempts, or show users a "reconnecting" indicator.
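A minimal outbound buffer might look like the sketch below. `BufferedSender` and `ChannelLike` are my own names; the channel-like shape just mirrors the `readyState`/`send` surface of `RTCDataChannel` so the logic is testable anywhere:

```typescript
// Anything with readyState and send(), e.g. an RTCDataChannel.
type ChannelLike = { readyState: string; send: (msg: string) => void };

class BufferedSender {
  private queue: string[] = [];

  constructor(private channel: ChannelLike) {}

  send(message: string) {
    if (this.channel.readyState === 'open') {
      this.channel.send(message);
    } else {
      // Hold the message until the channel (re)opens
      this.queue.push(message);
    }
  }

  // Call from channel.onopen to drain anything queued while disconnected.
  flush() {
    while (this.queue.length > 0 && this.channel.readyState === 'open') {
      this.channel.send(this.queue.shift()!);
    }
  }
}
```

One caveat: an unbounded queue can grow during long outages, so in practice you'd cap its length or expire old messages.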
Data Channels vs WebSockets: When to Use Each
Let's talk about the elephant in the room. When should you actually use Data Channels over WebSockets?
Use Data Channels when you need true peer-to-peer communication with minimal latency. This is wonderful for gaming, real-time collaboration tools, video conferencing side chat, or file transfers between users. Your server only handles signaling, dramatically reducing infrastructure costs at scale.
Stick with WebSockets when you need a centralized server to process, store, or broadcast messages to many clients. WebSockets shine for traditional chat apps where message history matters, or when you need server-side validation and filtering.
I cannot stress this enough! Data Channels require both peers to be online simultaneously. For asynchronous communication where users might be offline, WebSockets with server-side storage is the pragmatic choice.
Production Considerations: TURN Servers and Scalability
Here's something I came across that surprised me: STUN servers work for about 80% of connections, but strict corporate firewalls or symmetric NATs block direct P2P. You need TURN servers as a fallback, which relay traffic through your infrastructure.
TURN servers aren't free to operate—they consume bandwidth. Services like Twilio, Xirsys, or self-hosted Coturn are your options. Factor this into your architecture costs.
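A configuration with a TURN fallback might look like this sketch. The hostname and credentials are placeholders, not a real service:

```typescript
// Placeholder TURN entry: turn.example.com and the credentials are not real.
const iceConfig = {
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },  // try a direct connection first
    {
      urls: 'turn:turn.example.com:3478',      // relay fallback when P2P fails
      username: 'demo-user',
      credential: 'demo-secret'
    }
  ]
  // To verify the TURN path during testing, add: iceTransportPolicy: 'relay'
};

// In the browser: new RTCPeerConnection(iceConfig);
```

Forcing `iceTransportPolicy: 'relay'` in a test build is a handy way to confirm your TURN server actually works before real users hit the 20% case.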
For scalability, remember that signaling remains centralized even though data flows peer-to-peer. Design your signaling server to handle connection bursts efficiently. Consider using Redis for client tracking across multiple server instances.
WebRTC Data Channels also work beautifully in mesh networks where multiple peers connect to each other. But watch out—mesh topologies don't scale beyond 4-6 peers due to bandwidth constraints. For larger groups, you need SFU (Selective Forwarding Unit) architectures.
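The scaling limit follows from simple arithmetic: a full mesh of n peers needs n·(n−1)/2 connections, and each peer uploads its data n−1 times. A quick sketch:

```typescript
// Total peer-to-peer links in a full mesh of n peers: n * (n - 1) / 2.
function meshLinks(peers: number): number {
  return (peers * (peers - 1)) / 2;
}

// meshLinks(4) → 6 links; meshLinks(10) → 45 links
```

At 10 peers you're already maintaining 45 connections across the group, with each participant uploading every message 9 times, which is why an SFU becomes the practical answer.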
And that concludes this post! I hope you found it valuable, and keep an eye out for more in the future!