Your Black Friday dashboard shows 1,247 orders in the last hour. By the time that number refreshes, you've already lost sales to a stockout you didn't see coming, missed a payment gateway error affecting checkout, and failed to spot the trending product that should be front and center on your homepage.
Real-time analytics changes this. Instead of discovering problems in yesterday's reports, you watch transactions flow in as they happen, catch inventory issues before stockouts, and respond to performance degradation while customers are still on your site. This post builds a complete Black Friday analytics system using Deephaven — streaming sales data, live inventory tracking, conversion funnels, and performance monitoring, all updating continuously.
The Black Friday data deluge
During peak shopping events, e-commerce systems generate massive data streams:
- Transactions: Orders, payments, returns flowing in every second.
- Inventory: Stock levels changing with every purchase and restock.
- User behavior: Clicks, searches, cart additions, abandonments.
- Performance: Page load times, API response times, error rates.
- Marketing: Campaign performance, conversion funnels, attribution.
Traditional analytics tools can't keep up. By the time your hourly report runs, you've already missed the trending product that should've been featured on your homepage, the payment gateway issue that cost you thousands in abandoned carts, and the inventory stockout that sent customers to your competitors.
Let's build better.
Note
This post is designed for data engineers, technical teams, and developers who build analytics systems for e-commerce operations. If you're a business stakeholder or sales manager, this shows you what's possible — your technical team will implement these dashboards, and you'll use the insights they provide.
What you'll build
By the end of this post, you'll have a complete real-time analytics system that tracks:
- Sales velocity — Orders and revenue per second with rolling 10-second windows.
- Inventory alerts — Automatic detection of low stock and stockouts before they happen.
- Conversion funnels — View → Cart → Checkout → Purchase with live conversion rates.
- Cart abandonment — Real-time tracking of abandoned carts and recovery opportunities.
- Performance monitoring — Page load times and error rates across all endpoints.
- Pricing analysis — Discount effectiveness and revenue optimization.
- Customer segmentation — VIP vs. new vs. single-purchase customer behavior.
- Geographic insights — Regional performance and shipping implications.
- Marketing attribution — Channel-by-channel ROI and ROAS tracking.
- Live dashboards — Interactive visualizations updating in real-time.
All processing live data as it streams in. No batch jobs, no refresh delays, no stale reports.
Simulating Black Friday traffic
First, let's generate realistic Black Friday e-commerce data to work with. This simulates the kind of high-velocity data stream you'd see during peak shopping. In production, this data would flow in from your order management system, web analytics platform (like Google Analytics), and inventory database.
We'll use Deephaven's time_table to generate streaming data and update_by operations for real-time aggregations.
What we're creating:
- Transaction stream updating every 100ms (10 transactions per second).
- Product categories, pricing, and payment methods.
- Geographic customer data and shipping information.
- Simulated per-order revenue that varies by category, discount, and quantity — from a few dollars up to several hundred per order.
This gives us a realistic Black Friday scenario to analyze:
Note
Heap memory requirements: Running the full simulation with all tables requires sufficient heap memory. If you encounter out-of-memory errors, increase your JVM heap:
docker run --rm --name deephaven -p 10000:10000 \
--env START_OPTS="-Xmx8g -Xms4g" \
ghcr.io/deephaven/server:latest
This allocates 8GB maximum heap and 4GB initial heap. Adjust as needed based on your system's available RAM.
from deephaven import time_table, empty_table
from deephaven.updateby import rolling_avg_tick, rolling_sum_tick, cum_sum, ema_tick, delta
from deephaven.agg import sum_, avg, count_, last, first, min_, max_
import deephaven.plot.express as dx
# Simulate Black Friday transaction stream (peak traffic!)
transactions = time_table("PT00:00:00.100", start_time="2025-11-29T00:00:00 ET").update([
"TransactionID = (long)ii",
"Timestamp_ms = Timestamp",
# Simulate product categories with varying popularity (weighted distribution)
"CategoryRandom = randomInt(0, 100)",
"Category = (String)(CategoryRandom < 25 ? `Electronics` : CategoryRandom < 45 ? `Clothing` : CategoryRandom < 60 ? `Home & Garden` : CategoryRandom < 70 ? `Toys` : CategoryRandom < 80 ? `Beauty` : CategoryRandom < 88 ? `Sports` : CategoryRandom < 95 ? `Books` : `Food`)",
# Simulate products within categories
"ProductID = (String)(Category + `_` + (ii % 25))",
# Black Friday pricing - more discounts early in the day, with category-specific price ranges
"BasePrice = (double)(Category.equals(`Electronics`) ? randomDouble(100.0, 800.0) : Category.equals(`Clothing`) ? randomDouble(20.0, 150.0) : Category.equals(`Home & Garden`) ? randomDouble(30.0, 200.0) : Category.equals(`Toys`) ? randomDouble(10.0, 100.0) : Category.equals(`Beauty`) ? randomDouble(15.0, 80.0) : Category.equals(`Sports`) ? randomDouble(25.0, 300.0) : Category.equals(`Books`) ? randomDouble(10.0, 40.0) : randomDouble(5.0, 50.0))",
"OriginalPrice = BasePrice",
"DiscountPct = (ii < 36000 ? randomDouble(0.20, 0.60) : randomDouble(0.10, 0.40))", # Higher discounts early
"SalePrice = OriginalPrice * (1.0 - DiscountPct)",
"Quantity = (int)randomInt(1, 5)",
"Revenue = SalePrice * Quantity",
# Customer behavior
"CustomerID = (long)randomInt(1, 50000)",
"IsNewCustomer = randomDouble(0, 1) > 0.7", # 30% new customers on Black Friday
"PaymentMethod = (String)(ii % 4 == 0 ? `Credit Card` : ii % 4 == 1 ? `PayPal` : ii % 4 == 2 ? `Debit Card` : `Buy Now Pay Later`)",
# Transaction status
"Status = (String)(randomDouble(0, 1) > 0.05 ? `Completed` : `Failed`)", # 5% failure rate
])
# Filter to completed transactions for revenue calculations
completed_sales = transactions.where("Status == `Completed`")
The analytics suite
Real-time sales velocity
Now let's track how fast sales are happening. This is crucial during Black Friday — you need to know immediately if sales are spiking (time to promote that product more!) or slowing down (check for website issues or payment problems).
What this dashboard shows:
- Last 10 seconds: Rolling count of orders happening right now (are we at peak traffic?).
- Total cumulative revenue: Running total since Black Friday started (how close are we to our $1M goal?).
- Orders per second: Velocity metric (processing 50 orders/sec vs. our 100/sec capacity).
- Average order value: Real-time AOV tracking (did that discount campaign lower or raise it?).
Here's how to calculate these metrics as data streams in:
# Sales per second
sales_velocity = completed_sales.update([
"OrderCount = 1",
]).update_by([
rolling_sum_tick(cols=["Sales_Last10s = Revenue"], rev_ticks=100, fwd_ticks=0), # 100 ticks * 100ms = 10s
rolling_sum_tick(cols=["Orders_Last10s = OrderCount"], rev_ticks=100, fwd_ticks=0),
cum_sum(cols=["TotalRevenue = Revenue", "TotalOrders = OrderCount"]),
])
# Current metrics
current_metrics = sales_velocity.tail(1).view([
"Timestamp_ms",
"Sales_Last10s",
"Orders_Last10s",
"TotalRevenue",
"TotalOrders",
])
# Sales by category with velocity
category_velocity = completed_sales.update_by([
rolling_sum_tick(cols=["CategorySales_10s = Revenue"], rev_ticks=100, fwd_ticks=0), # tick-based window, ~10s of overall traffic
cum_sum(cols=["CategoryTotal = Revenue"]),
], by=["Category"])
# Top performing categories
top_categories = category_velocity.last_by(["Category"]).sort_descending(["CategoryTotal"])
Inventory crisis detection
Nothing kills Black Friday momentum like "OUT OF STOCK" messages. Let's build a real-time inventory tracker that alerts you before you hit zero, so you can pull products from the homepage or activate backup suppliers.
What this monitors:
- Current stock levels: Live count as sales deplete inventory.
- Stock percentage: How much is left (75% remaining vs. 5% remaining).
- Status flags: OK / LOW STOCK / OUT OF STOCK labels.
- Critical alerts: Products below reorder point (20% remaining).
- Trending products: Which items are flying off the digital shelves.
This gives your ops team time to react instead of discovering stockouts from angry customer emails:
# Simulate starting inventory
starting_inventory = empty_table(200).update([
"ProductID = (String)((i % 8 == 0 ? `Electronics` : i % 8 == 1 ? `Clothing` : i % 8 == 2 ? `Home & Garden` : i % 8 == 3 ? `Toys` : i % 8 == 4 ? `Beauty` : i % 8 == 5 ? `Sports` : i % 8 == 6 ? `Books` : `Food`) + `_` + (i % 25))",
"InitialStock = (int)randomInt(50, 500)",
"ReorderPoint = (int)(InitialStock * 0.20)", # Reorder at 20% remaining
])
# Calculate inventory depletion in real-time
inventory_usage = completed_sales.view([
"ProductID",
"Quantity",
]).update_by([
cum_sum(cols=["TotalSold = Quantity"]),
], by=["ProductID"])
# Current inventory levels
current_inventory = starting_inventory.natural_join(
inventory_usage.last_by(["ProductID"]),
on=["ProductID"],
joins=["TotalSold"]
).update([
"TotalSold = isNull(TotalSold) ? 0 : TotalSold",
"CurrentStock = (int)Math.max(0, InitialStock - TotalSold)",
"StockPct = (double)CurrentStock / (double)InitialStock * 100.0",
"Status = (String)(CurrentStock == 0 ? `OUT OF STOCK` : CurrentStock <= ReorderPoint ? `LOW STOCK` : `OK`)",
"UnitsRemaining = CurrentStock",
])
# Critical alerts: items that need attention
stock_alerts = current_inventory.where("Status != `OK`").sort(["CurrentStock"])
# Trending items (selling fast)
trending_products = inventory_usage.last_by(["ProductID"]).where("TotalSold > 20").sort_descending(["TotalSold"])
Conversion funnel analysis
Understanding where customers drop off in the purchase journey is critical during high-traffic events. Tracking the funnel in real-time enables rapid identification of conversion bottlenecks.
The customer journey we're tracking:
- Page View: Customer lands on a product page (100% starting point).
- Add to Cart: Product interest converts to cart addition (typically ~30% conversion).
- Checkout Started: Customer proceeds to payment (typically ~60% of cart additions).
- Purchase Complete: Transaction completed (typically ~80% of checkouts).
Diagnostic value: Sudden drops in conversion rates signal potential issues — payment gateway problems, unexpected costs at checkout, or performance degradation requiring immediate attention.
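As a quick sanity check on those typical rates: the overall conversion is just the product of the per-stage rates. Here's a plain-Python sketch using the illustrative figures above (not measured values):

```python
# Typical per-stage conversion rates from the funnel above (illustrative)
view_to_cart = 0.30          # ~30% of page views add to cart
cart_to_checkout = 0.60      # ~60% of carts start checkout
checkout_to_purchase = 0.80  # ~80% of checkouts complete

# Overall conversion is the product of the stage rates
overall = view_to_cart * cart_to_checkout * checkout_to_purchase
print(f"Overall view-to-purchase conversion: {overall:.1%}")  # 14.4%

# Out of 10,000 page views, expect roughly:
views = 10_000
carts = views * view_to_cart                  # ~3,000 carts
checkouts = carts * cart_to_checkout          # ~1,800 checkouts
purchases = checkouts * checkout_to_purchase  # ~1,440 purchases
```

This is why a small drop at any single stage compounds: shaving the checkout completion rate from 80% to 70% cuts overall conversion by an eighth.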
Here's how to build the funnel tracker:
# Simulate browsing activity (more views than purchases)
browsing_activity = time_table("PT00:00:00.050", start_time="2025-11-29T00:00:00 ET").update([
"EventID = (long)ii",
"EventTime = Timestamp",
"CustomerID = (long)randomInt(1, 50000)",
"ProductID = (String)((ii % 8 == 0 ? `Electronics` : ii % 8 == 1 ? `Clothing` : ii % 8 == 2 ? `Home & Garden` : ii % 8 == 3 ? `Toys` : ii % 8 == 4 ? `Beauty` : ii % 8 == 5 ? `Sports` : ii % 8 == 6 ? `Books` : `Food`) + `_` + (ii % 25))",
"EventType = (String)(ii % 10 < 7 ? `View` : ii % 10 < 9 ? `AddToCart` : `Purchase`)",
])
# Funnel metrics
funnel_counts = browsing_activity.update([
"EventCount = 1",
]).update_by([
cum_sum(cols=["TotalEvents = EventCount"]),
], by=["EventType"])
# Calculate conversion rates
funnel_summary = funnel_counts.last_by(["EventType"])
# Conversion rate over time (rolling window)
conversion_tracking = browsing_activity.update([
"IsView = EventType.equals(`View`) ? 1 : 0",
"IsCart = EventType.equals(`AddToCart`) ? 1 : 0",
"IsPurchase = EventType.equals(`Purchase`) ? 1 : 0",
]).update_by([
rolling_sum_tick(cols=["Views_1000 = IsView"], rev_ticks=1000, fwd_ticks=0),
rolling_sum_tick(cols=["Carts_1000 = IsCart"], rev_ticks=1000, fwd_ticks=0),
rolling_sum_tick(cols=["Purchases_1000 = IsPurchase"], rev_ticks=1000, fwd_ticks=0),
]).update([
"ViewToCart = Views_1000 > 0 ? (Carts_1000 / (double)Views_1000) * 100.0 : 0.0",
"CartToPurchase = Carts_1000 > 0 ? (Purchases_1000 / (double)Carts_1000) * 100.0 : 0.0",
"OverallConversion = Views_1000 > 0 ? (Purchases_1000 / (double)Views_1000) * 100.0 : 0.0",
])
Cart abandonment tracking
Cart abandonment rates spike during Black Friday as customers comparison shop across multiple sites. Real-time tracking identifies abandonment patterns and enables recovery strategies like targeted emails or checkout optimization.
What we're detecting:
- Cart creation: Customer starts shopping.
- Items added: They're building a cart (cart value grows).
- Checkout started: They're serious about buying.
- Abandoned: No purchase after checkout start.
- Average cart value: Potential revenue at risk.
Actionable insights: High abandonment rates can indicate checkout friction, pricing concerns, or technical issues. Real-time monitoring enables immediate investigation and response.
Here's the tracking system:
# Simulate cart activity - simplified to reduce memory usage
cart_events = time_table("PT00:00:00.500", start_time="2025-11-29T00:00:00 ET").update([
"CartID = (long)ii",
"CustomerID = (long)randomInt(1, 10000)",
"CartValue = randomDouble(20.0, 300.0)",
"IsAbandoned = randomDouble(0, 1) > 0.65", # ~35% abandonment rate
])
# Abandonment metrics
abandonment_summary = cart_events.update([
"AbandonedInt = IsAbandoned ? 1 : 0",
]).agg_by([
count_("TotalCarts"),
sum_("AbandonedCarts = AbandonedInt"),
avg("AvgCartValue = CartValue"),
]).update([
"CompletedCarts = TotalCarts - AbandonedCarts",
"AbandonmentRate = (double)AbandonedCarts / (double)TotalCarts * 100.0",
])
So far we've covered: Real-time sales tracking, inventory monitoring, conversion funnels, and cart abandonment. Now let's add performance monitoring, pricing analysis, customer segmentation, and visualization to complete your Black Friday command center.
Website performance monitoring
Your site going slow on Black Friday? That's lost sales. Every extra second of load time costs you conversions. Let's monitor performance in real-time so you can scale servers, fix slow API calls, or disable that heavy animation before customers bounce.
What we're watching:
- Page load times: Homepage, product pages, checkout (target: under 2 seconds).
- API response times: Payment gateway, inventory lookup, user auth.
- Error rates: 4xx errors (broken links), 5xx errors (server crashes).
- Slowest endpoints: Which pages need immediate optimization.
Critical thresholds: If average load time hits 3+ seconds or error rate goes above 1%, you're in trouble. This dashboard tells you exactly which page or API is the culprit.
Here's the performance tracker:
# Simulate performance metrics
performance_data = time_table("PT00:00:01", start_time="2025-11-29T00:00:00 ET").update([
"Timestamp_perf = Timestamp",
"Endpoint = (String)(ii % 5 == 0 ? `/product` : ii % 5 == 1 ? `/cart` : ii % 5 == 2 ? `/checkout` : ii % 5 == 3 ? `/search` : `/home`)",
# Simulate load degradation during peak hours
"LoadTime_ms = randomDouble(50.0, 500.0) + (ii < 10800 ? randomDouble(0, 200.0) : 0)", # Slower during first 3 hours
"StatusCode = (int)(randomDouble(0, 1) > 0.02 ? 200 : (randomDouble(0, 1) > 0.5 ? 500 : 503))",
"IsError = StatusCode >= 400",
])
# Performance metrics by endpoint
endpoint_performance = performance_data.update([
"ErrorInt = IsError ? 1 : 0",
"RequestInt = 1",
]).update_by([
rolling_avg_tick(cols=["AvgLoadTime_100 = LoadTime_ms"], rev_ticks=100, fwd_ticks=0),
rolling_sum_tick(cols=["Errors_100 = ErrorInt", "Requests_100 = RequestInt"], rev_ticks=100, fwd_ticks=0),
], by=["Endpoint"]).update([
"ErrorRate = Requests_100 > 0 ? (Errors_100 / (double)Requests_100) * 100.0 : 0.0",
"PerformanceStatus = (String)(AvgLoadTime_100 > 300.0 ? `SLOW` : AvgLoadTime_100 > 150.0 ? `WARNING` : `GOOD`)",
])
# Critical performance issues
performance_alerts = endpoint_performance.last_by(["Endpoint"]).where("PerformanceStatus != `GOOD` || ErrorRate > 1.0")
Dynamic pricing insights
Discount strategy optimization requires understanding the relationship between discount depth and sales volume. Analyzing which discount tiers drive revenue helps balance volume and margin.
What this reveals:
- Discount effectiveness: Comparative sales volume across discount tiers.
- Revenue per discount tier: Total revenue generated at each discount level.
- Volume vs. margin: Trade-offs between order count and profit per order.
- Optimal pricing: Discount levels that maximize total revenue.
Analysis approach: Compare order volume and revenue across discount tiers to identify inefficiencies in pricing strategy.
Here's the discount analyzer:
# Discount effectiveness
discount_analysis = completed_sales.update([
"DiscountTier = (String)(DiscountPct >= 0.50 ? `50%+ Off` : DiscountPct >= 0.30 ? `30-50% Off` : DiscountPct >= 0.20 ? `20-30% Off` : `Under 20% Off`)",
]).agg_by([
count_("OrderCount"),
sum_("TotalRevenue = Revenue"),
avg("AvgOrderValue = Revenue"),
avg("AvgDiscount = DiscountPct"),
], by=["Category", "DiscountTier"])
# Revenue per discount tier
discount_performance = discount_analysis.update([
"RevenuePerOrder = TotalRevenue / (double)OrderCount",
]).sort_descending(["TotalRevenue"])
Customer segmentation
Customer behavior varies significantly during Black Friday. High-spending customers have different needs than single-purchase shoppers, and segmenting them in real-time enables targeted responses.
Customer segments we're tracking:
- High-Value Customers: Multiple orders, high total spend ($500+).
- New Customers: First-time buyers during the event.
- Single-Purchase Customers: One-time low-value transactions.
- Average order value per segment: Spending patterns by customer type.
Application: Segment-specific metrics enable prioritized resource allocation and targeted interventions based on customer value and behavior patterns.
Here's the segmentation engine:
# Customer lifetime value (CLV) during event
customer_metrics = completed_sales.agg_by([
count_("OrderCount"),
sum_("TotalSpent = Revenue"),
avg("AvgOrderValue = Revenue"),
first("FirstPurchaseCategory = Category"),
], by=["CustomerID", "IsNewCustomer"]).update([
"CustomerTier = (String)(TotalSpent > 500.0 ? `VIP` : TotalSpent > 200.0 ? `High Value` : TotalSpent > 100.0 ? `Medium Value` : `Low Value`)",
])
# New vs returning customer performance
customer_cohort = customer_metrics.agg_by([
count_("CustomerCount"),
sum_("CohortRevenue = TotalSpent"),
avg("AvgCLV = TotalSpent"),
], by=["IsNewCustomer"]).update([
"CohortType = (String)(IsNewCustomer ? `New Customers` : `Returning Customers`)",
])
# High-value customer analysis
vip_customers = customer_metrics.where("CustomerTier == `VIP`").sort_descending(["TotalSpent"])
Geographic performance
Where are your sales coming from? If California is crushing it but Texas is lagging, maybe your shipping times to Texas are too long, or you need region-specific promotions. Geographic data helps you optimize marketing spend and logistics.
Regional insights:
- Sales by state/region: Which areas are driving revenue.
- Orders per region: Volume distribution (East Coast vs. West Coast).
- Average order value by region: Are coastal customers spending more?
- Shipping implications: High volume in specific regions means warehouse proximity matters.
Use case: If you see massive orders from NYC but slow shipping times there, consider expedited shipping promotions for that region. Or if Seattle sales are low, investigate if your local competitors are running better deals.
Here's the geographic breakdown:
# Add geographic data
geo_sales = completed_sales.update([
"Region = (String)(ii % 4 == 0 ? `Northeast` : ii % 4 == 1 ? `Southeast` : ii % 4 == 2 ? `Midwest` : `West`)",
"State = (String)(Region.equals(`Northeast`) ? (ii % 2 == 0 ? `NY` : `MA`) : Region.equals(`Southeast`) ? (ii % 2 == 0 ? `FL` : `TX`) : Region.equals(`Midwest`) ? (ii % 2 == 0 ? `IL` : `OH`) : (ii % 2 == 0 ? `CA` : `WA`))",
])
# Regional performance
regional_metrics = geo_sales.agg_by([
count_("Orders"),
sum_("Revenue = Revenue"),
avg("AvgOrderValue = Revenue"),
], by=["Region"]).update([
"RevenuePerOrder = Revenue / (double)Orders",
]).sort_descending(["Revenue"])
# State-level breakdown
state_performance = geo_sales.agg_by([
count_("Orders"),
sum_("Revenue = Revenue"),
], by=["State", "Region"]).sort_descending(["Revenue"])
Peak hour analysis
When does the Black Friday tsunami hit? Is it midnight (online shoppers refreshing at 12:01am), morning (8-10am coffee + shopping), or evening (6-8pm after work)? Knowing your peak hours helps you staff support teams and ensure server capacity.
Time-based patterns:
- Hourly sales volume: Which hours drive the most revenue?
- Orders per hour: Traffic spikes throughout the day.
- Average order value by time: Do late-night shoppers spend more or less?
- Capacity planning: Ensure your servers can handle 3pm surge.
Strategic value: If you know 9pm-11pm is your peak window, schedule your best flash deals then. If 2am is dead, that's when you run database maintenance.
Here's the hourly breakdown:
# Hour-by-hour breakdown
hourly_sales = completed_sales.update([
"Hour = (int)((TransactionID / 36000) % 24)", # Approximate hour
]).agg_by([
count_("Orders"),
sum_("Revenue = Revenue"),
avg("AvgOrderValue = Revenue"),
], by=["Hour"]).update([
"OrdersPerMinute = Orders / 60.0",
]).sort(["Hour"])
# Identify peak hours
peak_hours = hourly_sales.where("Orders > 1000").sort_descending(["Orders"])
Marketing attribution
You're spending thousands on Black Friday ads across email, social media, paid search, and affiliates. But which channels actually deliver ROI? Real-time attribution shows you where to double down and where to cut spending during the event, not weeks later.
Channel performance metrics:
- Revenue by channel: Email vs. Social Media vs. Paid Search
- Orders per channel: Volume from each traffic source.
- ROAS (Return on Ad Spend): Revenue / Cost for each channel.
- Average order value by channel: Which brings higher-value customers?
Real-time decisions: If paid search is generating $3 in revenue for every $1 spent (3x ROAS), increase your bids. If social media ads are underperforming (0.5x ROAS), pause them and reallocate budget.
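That reallocation rule can be sketched as a simple threshold check. Plain Python with hypothetical spend and revenue figures (in practice these would come from your ad platform and the channel metrics table; the 2.0x/1.0x cutoffs are illustrative):

```python
# Hypothetical per-channel ad spend and attributed revenue (illustrative)
channels = {
    "Paid Search":  {"spend": 10_000.0, "revenue": 30_000.0},
    "Social Media": {"spend": 8_000.0,  "revenue": 4_000.0},
    "Email":        {"spend": 2_000.0,  "revenue": 9_000.0},
}

def roas_decision(spend: float, revenue: float) -> tuple[float, str]:
    """Return (ROAS, action) for one channel: scale winners, pause losers."""
    roas = revenue / spend
    if roas >= 2.0:
        return roas, "increase bids"
    if roas < 1.0:
        return roas, "pause and reallocate"
    return roas, "hold"

for name, c in channels.items():
    roas, action = roas_decision(c["spend"], c["revenue"])
    print(f"{name}: {roas:.1f}x ROAS -> {action}")
```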
Here's the attribution tracker:
# Simulate traffic sources
attributed_sales = completed_sales.update([
"Channel = (String)(ii % 6 == 0 ? `Email` : ii % 6 == 1 ? `Social Media` : ii % 6 == 2 ? `Paid Search` : ii % 6 == 3 ? `Organic Search` : ii % 6 == 4 ? `Direct` : `Affiliate`)",
"Campaign = (String)(Channel + `_BlackFriday2025`)",
])
# Channel performance
channel_metrics = attributed_sales.agg_by([
count_("Orders"),
sum_("Revenue = Revenue"),
avg("AvgOrderValue = Revenue"),
], by=["Channel"]).update([
"RevenuePerOrder = Revenue / (double)Orders",
"ROAS = Revenue / (Orders * 5.0)", # Assuming $5 cost per order
]).sort_descending(["Revenue"])
# Best performing campaigns
top_campaigns = attributed_sales.agg_by([
count_("Orders"),
sum_("Revenue = Revenue"),
], by=["Campaign", "Channel"]).sort_descending(["Revenue"])
Real-time alerts
Set up automated alerts for critical conditions:
# Alert conditions
alerts = empty_table(1).update([
"AlertType = `System Check`",
"Message = `Monitoring active`",
])
In production, you'd check conditions and generate alerts:
- Inventory below reorder point.
- Error rate above threshold.
- Page load time degradation.
- Sudden drop in conversion rate.
- Payment gateway failures.
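As a framework-agnostic sketch, the alert logic boils down to comparing live metrics against thresholds. In Deephaven you'd express these as where() filters on the live tables (as with stock_alerts and performance_alerts above); the thresholds and field names here are illustrative:

```python
# Sketch of the alert checks above (thresholds and metric names are
# illustrative, not from a specific system).

def evaluate_alerts(metrics: dict) -> list[str]:
    """Return an alert message for each metric past its threshold."""
    alerts = []
    if metrics["current_stock"] <= metrics["reorder_point"]:
        alerts.append("LOW STOCK: at or below reorder point")
    if metrics["error_rate_pct"] > 1.0:
        alerts.append("ERRORS: error rate above 1%")
    if metrics["avg_load_time_ms"] > 3000.0:
        alerts.append("SLOW: average load time above 3s")
    if metrics["conversion_pct"] < metrics["baseline_conversion_pct"] * 0.5:
        alerts.append("CONVERSION: rate dropped >50% vs baseline")
    return alerts

# Example snapshot: low stock and elevated errors should both fire
snapshot = {
    "current_stock": 12, "reorder_point": 40,
    "error_rate_pct": 2.4, "avg_load_time_ms": 850.0,
    "conversion_pct": 14.0, "baseline_conversion_pct": 15.0,
}
for msg in evaluate_alerts(snapshot):
    print(msg)
```

In production you'd route these messages to Slack, PagerDuty, or email rather than printing them.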
Building your dashboard
Now comes the visual payoff — turning these data streams into actionable dashboards. Deephaven offers two approaches: programmatic visualization with Python or no-code chart building through the UI.
Using the Chart Builder UI
The easiest way to visualize your data is through Deephaven's built-in Chart Builder. Right-click any table in the UI and select Create Chart:
Sales velocity line chart:
- Right-click the sales_velocity table → Create Chart.
- Choose "Line" chart type.
- X-axis: Timestamp_ms.
- Y-axis: Sales_Last10s.
- Add a second series for TotalRevenue to track cumulative sales.
Top categories bar chart:
- Right-click the top_categories table → Create Chart.
- Choose "Bar" chart type.
- X-axis: Category.
- Y-axis: CategoryTotal.
- Enable "Color by Category" for visual distinction.
Inventory status scatter plot:
- Right-click the current_inventory table → Create Chart.
- Choose "Scatter" chart type.
- X-axis: StockPct (percentage remaining).
- Y-axis: TotalSold.
- Color by: Status (red for OUT OF STOCK, yellow for LOW STOCK, green for OK).
- Size by: InitialStock (larger bubbles = higher-volume products).
Performance monitoring:
- Right-click the endpoint_performance table → Create Chart.
- Choose "Line" chart type.
- X-axis: Timestamp_perf.
- Y-axis: AvgLoadTime_100.
- Split by: Endpoint (creates separate lines per endpoint).
Programmatic dashboards with deephaven.ui
For automated dashboard generation or custom layouts, use deephaven.ui to build interactive control panels. Here's a complete Black Friday monitoring dashboard:
from deephaven import ui
import deephaven.plot.express as dx
# Create charts for embedding in dashboard
sales_velocity_chart = dx.line(
sales_velocity.tail(500),
x="Timestamp_ms",
y=["Sales_Last10s", "TotalRevenue"],
title="Sales Velocity: Last 10s Rolling vs Total Revenue"
)
categories_revenue_chart = dx.bar(
top_categories.head(10),
x="Category",
y="CategoryTotal",
title="Top 10 Categories by Revenue"
)
# Build dashboard using row, column, and panel layout
dashboard = ui.dashboard([
ui.row(
[
ui.column(
[
ui.panel(sales_velocity_chart, title="Sales Velocity Over Time"),
ui.panel(sales_velocity.tail(100), title="Recent Sales"),
]
),
ui.column(
[
ui.panel(categories_revenue_chart, title="Top Categories"),
ui.panel(top_categories.head(10), title="Category Details"),
]
),
],
height=60,
),
ui.row(
[
ui.column(
[
ui.panel(stock_alerts.head(20), title="Inventory Alerts"),
]
),
ui.column(
[
ui.panel(conversion_tracking.tail(50), title="Conversion Funnel"),
]
),
],
height=40,
),
])
This creates a responsive dashboard with:
- Real-time tables that update automatically as data streams in.
- Embedded charts showing sales velocity and category performance.
- Multi-column layout for efficient space usage.
- Scrollable views showing the most recent or critical data.
You can extend this with filters, buttons, and interactive controls using other ui components like ui.picker, ui.button, and ui.text_field.
Taking it to production
What you would've seen on Black Friday
With this dashboard running during the actual event, you would have:
Caught inventory issues early:
- Detected trending items before stockouts.
- Triggered reorders automatically.
- Redirected customers to alternative products.
Optimized performance in real-time:
- Identified slow endpoints during traffic spikes.
- Scaled resources before customers noticed.
- Reduced error rates by catching issues instantly.
Maximized revenue:
- Featured trending products on homepage.
- Adjusted discounts based on velocity.
- Targeted high-value customers with special offers.
Improved conversion:
- Identified bottlenecks in checkout flow.
- Sent abandoned cart recovery emails immediately.
- Optimized user experience based on behavior.
Made smarter decisions:
- Reallocated ad spend to best-performing channels.
- Extended promotions on slow-moving categories.
- Prepared for next-day traffic patterns.
Beyond Black Friday
These patterns apply to any high-stakes e-commerce scenario:
Cyber Monday: Same framework, different products. Product launches: Monitor adoption and detect issues immediately. Flash sales: Track velocity and prevent overselling. Prime Day: Multi-category coordination and inventory management. Holiday shopping: Extended period monitoring with trend detection.
The real-time advantage
Traditional analytics:
- Lag time: Hours or days before you see problems.
- Batch processing: Can't respond to changing conditions.
- Static reports: Outdated by the time you view them.
- Limited granularity: Miss short-term trends and issues.
With Deephaven:
- Instant visibility: See every transaction as it happens.
- Automatic updates: All metrics refresh continuously.
- Real-time alerts: Catch issues before they impact revenue.
- Unlimited detail: Drill down to individual transactions.
Implementation in your stack
To build this for your Black Friday 2026:
- Data ingestion: Connect your order management, web analytics, and inventory systems using Kafka or other data sources.
- Schema mapping: Map your actual fields to the table structures using table operations.
- Alert logic: Define thresholds for your business.
- Dashboard creation: Build visualizations for your team.
- Load testing: Ensure system handles peak traffic.
- Team training: Make sure everyone knows how to use it.
The code patterns shown here scale from thousands to millions of transactions per second. The same queries work whether you're processing 100 orders per minute or 10,000.
Ready for next year?
Start building now. Test with historical data. Refine your metrics. Train your team. When Black Friday 2026 arrives, you'll have the command center that turns chaos into opportunity.
The best time to build real-time analytics was before last Black Friday. The second best time is now.