DynamoDB Streams For Real-Time Stock Trading Data Synchronization
Using DynamoDB Streams in real-world stock trading applications.
Most applications built today demand near-instant data synchronization across distributed systems.
Whether it’s propagating user activity to analytics pipelines or updating downstream caches, real-time event propagation is a core requirement.
DynamoDB Streams provides a powerful mechanism to achieve this by capturing item-level changes in DynamoDB tables and making them available for processing in real time.
In this article, we’ll explore how DynamoDB Streams can be applied to a real-world financial trading platform.
We’ll dive into stream processing patterns, discuss the challenge of handling duplicate records, and show how materialized views can be built to support low-latency queries.
Real-World Case: Financial Trading Platform
Imagine a financial trading platform that processes buy and sell orders.
The system must store trades reliably while ensuring that multiple downstream services, such as risk management, analytics dashboards, and compliance monitoring, receive consistent updates in real time.
For example:
When a new trade is inserted, the risk management system must recalculate exposure.
The analytics dashboard needs to update aggregated metrics like total traded volume.
The compliance system requires a real-time audit trail.
Here, DynamoDB serves as the source of truth for all trades.
But instead of forcing every downstream service to query the table repeatedly, we can use DynamoDB Streams to push changes as they happen.
DynamoDB Streams Basics
DynamoDB Streams captures a time-ordered sequence of item-level changes (inserts, updates, deletes).
Each change is represented as a stream record that includes the item’s primary key, the type of modification, and the new/old values depending on your configuration.
For example, enabling NEW_AND_OLD_IMAGES on the Trades table ensures that whenever a trade item is updated, both the before and after states of the item are available in the stream record.
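For reference, here is a minimal sketch of enabling the stream with boto3 in Python (assuming the table is named Trades):

import boto3

dynamodb = boto3.client("dynamodb")

# Turn on the stream so every write emits both the old and new item images.
dynamodb.update_table(
    TableName="Trades",
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)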
Example stream record:
{
  "eventID": "1",
  "eventName": "INSERT",
  "dynamodb": {
    "Keys": { "tradeId": { "S": "T123" } },
    "NewImage": {
      "tradeId": { "S": "T123" },
      "symbol": { "S": "AAPL" },
      "price": { "N": "185.50" },
      "quantity": { "N": "100" },
      "timestamp": { "S": "2025-08-22T13:00:00Z" }
    }
  }
}
This record can then be consumed by AWS Lambda, Kinesis, or other processing services.
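As a rough illustration, a Lambda function subscribed to the stream might unpack each record like this (process_trade is a hypothetical placeholder for your downstream logic):

# Minimal sketch of a Lambda handler wired to the Trades stream.
def lambda_handler(event, context):
    for record in event["Records"]:
        if record["eventName"] == "INSERT":
            new_image = record["dynamodb"]["NewImage"]
            trade_id = new_image["tradeId"]["S"]
            price = float(new_image["price"]["N"])
            quantity = int(new_image["quantity"]["N"])
            # process_trade is a hypothetical hook for risk/analytics/compliance.
            process_trade(trade_id, price, quantity)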
Data Model for Trades

The core DynamoDB table (“Trades”) could be modelled as follows:
Partition key: tradeId (unique ID per trade)
Attributes: symbol, price, quantity, traderId, timestamp
Example item:
{
  "tradeId": "T123",
  "symbol": "AAPL",
  "price": 185.50,
  "quantity": 100,
  "traderId": "U456",
  "timestamp": "2025-08-22T13:00:00Z"
}
This table captures executed trades, and downstream services rely on DynamoDB Streams to react whenever an insert or update occurs.
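Writing a trade is a plain put_item; here is a sketch with boto3, using the attribute names from the example item above:

from decimal import Decimal

import boto3

trades = boto3.resource("dynamodb").Table("Trades")

# Insert an executed trade; the stream emits an INSERT record in response.
trades.put_item(
    Item={
        "tradeId": "T123",
        "symbol": "AAPL",
        "price": Decimal("185.50"),  # DynamoDB numbers map to Decimal in Python
        "quantity": 100,
        "traderId": "U456",
        "timestamp": "2025-08-22T13:00:00Z",
    }
)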
Stream Processing Patterns
1. Event Notification Pattern
Lambda functions can subscribe to the stream and push notifications to other services.
For instance, every new trade could be published to an SNS topic, ensuring that risk management, compliance, and monitoring systems receive an immediate notification.
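A minimal sketch of such a notifier, assuming a pre-created SNS topic (the ARN below is a placeholder):

import json

import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:trade-events"  # placeholder ARN

def lambda_handler(event, context):
    for record in event["Records"]:
        if record["eventName"] == "INSERT":
            new_image = record["dynamodb"]["NewImage"]
            # Fan the new trade out to every subscribed system at once.
            sns.publish(
                TopicArn=TOPIC_ARN,
                Subject=f"New trade {new_image['tradeId']['S']}",
                Message=json.dumps(new_image),
            )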
2. Change Propagation Pattern
Changes can be replicated into other data stores optimized for specific workloads (a sketch of the S3 case follows this list). For example:
Send trades to Amazon Redshift for analytical queries.
Push aggregated trades into Elasticsearch/OpenSearch for text-based queries.
Synchronize trades with S3 for compliance archiving.
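Taking the S3 archiving case as an example, a stream consumer could write one object per change event (the bucket name is a placeholder):

import json

import boto3

s3 = boto3.client("s3")
BUCKET = "trade-compliance-archive"  # placeholder bucket name

def lambda_handler(event, context):
    for record in event["Records"]:
        image = record["dynamodb"].get("NewImage")
        if image:
            trade_id = image["tradeId"]["S"]
            # Keying objects by eventID keeps the archive append-only per change.
            s3.put_object(
                Bucket=BUCKET,
                Key=f"trades/{trade_id}/{record['eventID']}.json",
                Body=json.dumps(image),
            )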
3. Materialized Views Pattern
Streams can be used to maintain precomputed views for fast queries.
For instance, instead of recalculating the total traded volume per stock symbol on every query, a “SymbolStats” table can be updated in real time whenever a new trade arrives.
Building Materialized Views
Let’s consider the “SymbolStats” table:
Partition key: symbol
Attributes: totalVolume, latestPrice, lastUpdated
When a new trade is added for AAPL, the stream processor updates the materialized view:
{
  "symbol": "AAPL",
  "totalVolume": 12500,
  "latestPrice": 185.50,
  "lastUpdated": "2025-08-22T13:00:00Z"
}
This allows the analytics dashboard to query the current volume and price for a symbol directly, without scanning the entire Trades table.
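A sketch of the stream processor that folds each INSERT into SymbolStats (update_item makes the increment atomic):

from decimal import Decimal

import boto3

stats = boto3.resource("dynamodb").Table("SymbolStats")

def apply_trade(new_image):
    # Fold one INSERT stream record into the per-symbol materialized view.
    stats.update_item(
        Key={"symbol": new_image["symbol"]["S"]},
        # ADD creates totalVolume on first use, then increments it atomically.
        UpdateExpression="ADD totalVolume :qty SET latestPrice = :price, lastUpdated = :ts",
        ExpressionAttributeValues={
            ":qty": int(new_image["quantity"]["N"]),
            ":price": Decimal(new_image["price"]["N"]),
            ":ts": new_image["timestamp"]["S"],
        },
    )

Note that the ADD increment is not idempotent on its own, which is exactly the duplicate problem the next section addresses.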
Handling Duplicate Records
One common challenge with DynamoDB Streams is at-least-once processing: the stream itself contains each change exactly once, but consumers such as Lambda may receive a record more than once, for example when a batch is retried after a failure.
This is harmless for idempotent operations but can cause errors when blindly incrementing counters or writing non-idempotent data.
Some strategies for handling duplicates:
Idempotent Writes: Use conditional updates in DynamoDB. For example, when processing a trade with tradeId = T123, ensure updates only occur if they haven’t been applied already (e.g. ConditionExpression: “attribute_not_exists(tradeId)”).
Deduplication Keys: Maintain a “ProcessedEvents” table keyed by eventID. Before applying a change, check if the event item already exists.
Deterministic Updates: Instead of incrementing counters, recompute values from item attributes in the stream record (e.g., set latestPrice to NewImage.price rather than adding blindly).
In the trading example, these safeguards ensure that exposure calculations and compliance logs are never double-counted; a sketch of the deduplication-key approach follows.
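A minimal sketch of the deduplication-key strategy, using a conditional put against the ProcessedEvents table:

import boto3
from botocore.exceptions import ClientError

processed = boto3.resource("dynamodb").Table("ProcessedEvents")

def seen_before(event_id):
    # Record the eventID exactly once; a second attempt fails the condition.
    try:
        processed.put_item(
            Item={"eventID": event_id},
            ConditionExpression="attribute_not_exists(eventID)",
        )
        return False  # first delivery of this event
    except ClientError as e:
        if e.response["Error"]["Code"] == "ConditionalCheckFailedException":
            return True  # duplicate delivery; skip the record
        raise

Calling seen_before(record["eventID"]) at the top of the handler loop lets the processor skip any record it has already applied.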
Conclusion
DynamoDB Streams provides a robust foundation for building real-time, event-driven architectures.
In a financial trading platform, streams keep risk management, analytics, and compliance in sync in near real time, with the change-capture plumbing managed for you behind the scenes.
Streams also let us maintain materialized views that serve statistics and analytics dashboards quickly and efficiently.
👋 My name is Uriel Bitton and I hope you learned something in this edition of Excelling With DynamoDB.
📅 If you're looking for help with DynamoDB, let's have a quick chat.
🙌 I hope to see you in next week's edition!