Technical Overview
This document explains the technical architecture for Sortment, designed to help understand how data flows across the platform.
Sortment uses a modular, scalable, and warehouse-native architecture that enables effective omnichannel messaging. This setup ensures that marketing and data teams can efficiently manage user events, campaigns, and messaging workflows while leveraging clean, actionable data directly from the data warehouse.
Data Warehouse (e.g., Snowflake, BigQuery)
Bucket Storage (e.g., S3, Cloud Storage)
Segmentation Engine
Journey Orchestrator
Events Consumer (Kafka-based)
Notifications Layer (Email, WhatsApp, SMS)
Data Sources (Apps, Websites, Segment, Rudderstack)
Campaign Automation and Analytics
The data flow within the Sortment architecture can be broken into four main phases:
Segmentation Engine:
Teams can create user segments directly on the Data Warehouse using a visual builder or AI, based on event properties and user profiles. The engine converts these filters into SQL queries.
Segmentation results can be fed into downstream systems for campaign execution.
Audience creation supports SQL-based queries and CSV imports for flexibility.
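To illustrate the idea of converting visual-builder filters into warehouse queries, here is a minimal sketch. The table name, column names, and filter shape are illustrative assumptions, not Sortment's actual schema or compiler:

```python
# Hypothetical sketch: compiling visual-builder filters into a warehouse
# SQL query. Schema and filter format are assumptions for illustration.

def filters_to_sql(table, filters):
    """Compile (column, operator, value) filters into a SELECT query."""
    allowed_ops = {"=", "!=", ">", "<", ">=", "<="}
    clauses = []
    for column, op, value in filters:
        if op not in allowed_ops:
            raise ValueError(f"unsupported operator: {op}")
        clauses.append(f"{column} {op} '{value}'")
    where = " AND ".join(clauses) or "TRUE"
    return f"SELECT user_id FROM {table} WHERE {where}"

sql = filters_to_sql(
    "events.purchases",
    [("country", "=", "US"), ("total_spend", ">", "100")],
)
print(sql)
# SELECT user_id FROM events.purchases WHERE country = 'US' AND total_spend > '100'
```

The resulting query runs entirely inside the warehouse, which is what keeps segmentation warehouse-native: no user data leaves the warehouse to compute the audience.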
Campaign Automation:
Marketers can use a user-friendly campaign and journey builder to trigger automated messages, reducing reliance on engineering teams.
Journey Orchestrator:
Accesses user profiles stored in Bucket Storage, which are refreshed through regular syncs from the Data Warehouse.
Consumes journey-related events from the Events Consumer.
Orchestrates complex user journeys based on real-time and historical data.
Outcome: Audiences are segmented, and dynamic journeys are triggered based on user behaviors and profiles.
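A journey can be thought of as a sequence of steps that each user advances through as events arrive. The sketch below shows that idea with hypothetical step names; Sortment's actual orchestrator model is richer (branches, waits, real-time event conditions):

```python
# Hypothetical sketch of journey orchestration: a journey is an ordered
# list of steps, and each incoming user event advances that user to the
# next step. Step names are illustrative assumptions.

JOURNEY = ["send_welcome_email", "wait_for_open", "send_offer_sms"]

class JourneyState:
    def __init__(self):
        self.positions = {}  # user_id -> index of next step

    def advance(self, user_id):
        """Move the user to their next step and return its name."""
        i = self.positions.get(user_id, 0)
        if i >= len(JOURNEY):
            return None  # journey finished for this user
        self.positions[user_id] = i + 1
        return JOURNEY[i]

state = JourneyState()
print(state.advance("u1"))  # send_welcome_email
print(state.advance("u1"))  # wait_for_open
```

In the real system, the "advance" signal comes from journey-related events consumed off the Events Consumer, and step execution is handed off to the Notifications Layer.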
Data Sources: Events are generated from multiple sources, including:
Applications (mobile or web)
Websites
Third-party CDPs (Segment, Rudderstack)
Event Delivery: These real-time events are sent to the Events Consumer, which is built on a Kafka-based infrastructure to handle high-throughput data streams.
Outcome: Events are ingested in real time and persisted for campaign conversion tracking, journey triggers, and journey filters.
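For a sense of what flows through this pipeline, here is a sketch of the kind of event envelope a Kafka-based consumer might ingest. The field names are assumptions for illustration; Sortment's actual event schema may differ:

```python
# Hypothetical sketch of a tracking-event envelope, serialized as JSON
# the way a producer might publish it onto the event stream. Field names
# are illustrative, not Sortment's documented schema.
import json
import time
import uuid

def make_event(user_id, name, properties):
    """Serialize a tracking event as a JSON message for the event stream."""
    return json.dumps({
        "event_id": str(uuid.uuid4()),  # de-duplication key
        "user_id": user_id,
        "name": name,
        "properties": properties,
        "timestamp": time.time(),
    })

msg = make_event("u42", "purchase_completed", {"amount": 129.0})
print(json.loads(msg)["name"])  # purchase_completed
```

Whatever the exact schema, a stable event ID and timestamp are what let the downstream consumer de-duplicate high-throughput streams and order them for journey triggers.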
Events Consumer:
Persists all campaign-related events into Bucket Storage (e.g., Amazon S3, Cloud Storage).
Syncs events to the Data Warehouse for analytical querying.
Profiles Sync:
User profiles and event data are regularly synced between the Data Warehouse and Bucket Storage.
Real-Time Capabilities:
Events can be used to trigger real-time campaigns with sub-second latency via APIs.
Outcome: Data is stored back in the warehouse, available for querying, and ready for real-time campaign triggering.
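Real-time triggering via APIs typically amounts to an authenticated HTTP call per user event. The sketch below assembles such a request; the endpoint path, header names, and payload fields are assumptions, not Sortment's documented API:

```python
# Hypothetical sketch: assembling the HTTP request a client might POST to
# trigger a real-time campaign. URL, headers, and body fields are
# illustrative assumptions, not a documented API.
import json

def build_trigger_request(campaign_id, user_id, api_key):
    """Assemble an HTTP request for triggering a campaign for one user."""
    return {
        "method": "POST",
        "url": f"https://api.example.com/v1/campaigns/{campaign_id}/trigger",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"user_id": user_id}),
    }

req = build_trigger_request("welcome_flow", "u42", "API_KEY")
print(req["url"])
```

Keeping the per-call payload down to identifiers (rather than full profiles) is part of what makes sub-second trigger latency feasible: the profile data is already synced and waiting downstream.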
Campaign Event Delivery:
The Events Consumer sends campaign events to the Notifications Layer.
Notifications Layer:
Distributes messages across multiple channels:
Email
WhatsApp
SMS
Feedback Loop:
Events generated from campaign engagement (e.g., opens, clicks, purchases) are sent back as real-time events to the Events Consumer, and then to the Data Warehouse for further processing and analysis.
Outcome: Marketing messages are delivered across preferred channels with real-time triggers and automated workflows.
Direct integration with data warehouses (e.g., Snowflake, BigQuery) ensures real-time access to clean, actionable data without creating data copies.
Enables teams to work with accurate and up-to-date data while maintaining security and compliance.
Supports real-time message triggering with sub-second latency via APIs.
High-throughput Kafka-based ingestion ensures scalability and accuracy for large event volumes.
SQL-based and CSV-supported audience creation enables advanced segmentation for campaigns.
Segmentation engine allows flexible, self-serve queries for building target audiences.
Intuitive journey builder to automate customer experiences across channels.
Reduces dependency on engineering teams for campaign execution.
Provides self-serve insights for measuring campaign performance directly from warehouse data.
Allows for experimentation (A/B testing) to improve campaign strategies.
Warehouse-native architecture eliminates data duplication.
Granular access control, secure connections, and SOC 2 certification ensure privacy and compliance.
Sortment is designed to integrate seamlessly with existing client data ecosystems. Below are key integration points:
Data Ingestion:
Connectors for popular CDPs like Segment and Rudderstack
APIs for direct event streaming
Data Warehouse Integration:
Support for Snowflake, BigQuery, Redshift
Storage Compatibility:
Cloud providers like AWS S3 and GCP Cloud Storage
Notification Platforms:
Email providers (e.g., SendGrid, SES)
SMS and WhatsApp providers
For further details or queries, please reach out to us.