● Index 00 — Manifest
Engineering systems that compound over time.
I'm Daryl Zhong — a builder working at the seams of AI, distributed architecture, big data, retail intelligence, and quantitative investment. I design platforms that turn messy reality into compounding signal.
01 / Expertise
Five domains, one throughline — turning systems into leverage.
-
01
AI & Applied Intelligence
Production LLM systems, retrieval pipelines, evaluation harnesses, and agentic workflows. Comfortable from token economics up to org-wide AI strategy.
- RAG / Eval
- Agents
- Multimodal
- Inference cost
-
02
Distributed Architecture
Designing platforms that survive growth — service boundaries, event flows, storage layouts, failure modes. Pragmatic about complexity, ruthless about clarity.
- Event-driven
- CQRS
- Schema evolution
- SRE
-
03
Big Data Platforms
Lakehouse, streaming, batch, governance. Treating data as a product, not a dump — with contracts, observability, and lineage built in.
- Spark / Flink
- Iceberg
- Lineage
- Quality
-
04
Retail Intelligence
Years inside retail — supply, pricing, assortment, store operations, loyalty. I read the P&L before the schema, and I know where the noise lives.
- Pricing
- Assortment
- Forecast
- Ops
-
05
Quantitative Investment
Research infrastructure, alpha lifecycle, factor frameworks, execution-aware backtests. Engineering discipline meets market reality.
- Factor research
- Backtest
- Risk
- Execution
02 / Selected Work
A handful of systems I'm proud of — names redacted where required.
-
CASE 01
Real-time pricing engine for a top-3 retailer
Sub-second decisioning across millions of SKUs. Streaming feature store, guardrail policies, shadow rollout. Cut markdown leakage by double digits.
-
CASE 02
LLM platform for an enterprise data org
Multi-tenant inference, RAG pipelines, eval gates, audit-grade logging. Replaced a sprawl of POCs with one paved road.
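An eval gate in this spirit is small: a release proceeds only if every metric from an eval run clears its threshold. The metric names and limits below are placeholders, not the platform's real config.

```python
# Quality scores must meet a floor; latency-style metrics must stay under a ceiling.
THRESHOLDS = {"faithfulness": 0.90, "answer_relevance": 0.85, "latency_p95_s": 2.0}

def gate(metrics: dict[str, float]) -> tuple[bool, list[str]]:
    failures = []
    for name, limit in THRESHOLDS.items():
        value = metrics.get(name)
        if value is None:
            failures.append(f"{name}: missing")
        elif name.startswith("latency"):
            if value > limit:          # latency: lower is better
                failures.append(f"{name}: {value} > {limit}")
        elif value < limit:            # quality scores: higher is better
            failures.append(f"{name}: {value} < {limit}")
    return (not failures, failures)

ok, why = gate({"faithfulness": 0.93, "answer_relevance": 0.81, "latency_p95_s": 1.4})
# blocked: answer_relevance is below its floor
```

The point of the gate is that it fails closed: a missing metric blocks the deploy the same way a bad one does.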
-
CASE 03
Lakehouse migration · 8 PB, zero downtime
From legacy warehouse to Iceberg-on-object-store with backwards-compatible contracts. Halved infra spend, tripled query concurrency.
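A toy version of the backwards-compatible contract check: a new schema may add optional fields, but may not drop or retype anything consumers depend on. Field names and rules here are illustrative, not the migration's actual tooling.

```python
def compatible(old: dict[str, str], new: dict[str, str]) -> list[str]:
    """Return the list of breaking changes between two schema snapshots."""
    breaks = []
    for field, ftype in old.items():
        if field not in new:
            breaks.append(f"dropped: {field}")
        elif new[field] != ftype:
            breaks.append(f"retyped: {field} {ftype} -> {new[field]}")
    return breaks  # empty list == safe to publish

# Adding an optional field is fine; removing or retyping one is not.
assert compatible({"sku": "string", "qty": "int"},
                  {"sku": "string", "qty": "int", "note": "string"}) == []
```

Run as a CI gate on every contract change, a check like this is what makes "zero downtime" possible: producers can evolve, consumers never break.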
-
CASE 04
Quant research platform · factor-to-prod
Notebook → backtest → paper trade → live, all under one lineage. Researchers ship 3× faster; risk reads from the same source of truth.
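"Execution-aware" reduces to two habits that fit in one function: lag the signal one period so nothing trades on information it could not have had, and charge costs on turnover. A minimal sketch under those assumptions (the cost figure and weighting scheme are illustrative):

```python
import numpy as np

def backtest(signal: np.ndarray, returns: np.ndarray, cost: float = 0.0005) -> np.ndarray:
    # signal, returns: (T, N) arrays of factor scores and per-period asset returns.
    ranks = signal.argsort(axis=1).argsort(axis=1)           # cross-sectional rank
    w = ranks - ranks.mean(axis=1, keepdims=True)            # dollar-neutral weights
    w = w / np.abs(w).sum(axis=1, keepdims=True)             # unit gross exposure
    w_lag = np.vstack([np.zeros((1, w.shape[1])), w[:-1]])   # trade with a 1-period lag
    zeros = np.zeros((1, w.shape[1]))
    turnover = np.abs(np.diff(w_lag, axis=0, prepend=zeros)).sum(axis=1)
    return (w_lag * returns).sum(axis=1) - cost * turnover   # net P&L per period
```

The production version adds borrow constraints, impact models, and lineage, but the lag-then-charge shape is the invariant that keeps research and live trading honest with each other.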
-
CASE 05
Demand forecast suite · 2,400 stores
Hierarchical forecasting with promo + weather + cannibalization signals. Shrunk forecast error band by ~22% on the hardest categories.
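The "hierarchical" part has a simple core: reconcile the levels so a region's number is exactly the sum of its stores. A bottom-up sketch, with a made-up two-region hierarchy standing in for the real 2,400-store tree:

```python
def bottom_up(store_fc: dict[str, float], regions: dict[str, list[str]]) -> dict[str, float]:
    # store_fc: per-store forecasts; regions: region name -> member stores.
    out = dict(store_fc)
    for region, stores in regions.items():
        out[region] = sum(store_fc[s] for s in stores)   # region = sum of its stores
    out["total"] = sum(store_fc.values())                # chain total = sum of all stores
    return out

fc = bottom_up({"s1": 120.0, "s2": 80.0, "s3": 50.0},
               {"north": ["s1", "s2"], "south": ["s3"]})
# fc["north"] == 200.0, fc["total"] == 250.0
```

Bottom-up is the simplest reconciliation choice; the suite itself can weigh levels differently, but every level still has to tell one consistent story.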
03 / Approach
How I work — seven commitments I make to a system.
- i. Read the business before the schema. The P&L is the real spec; everything else is implementation.
- ii. Make the boring decisions on purpose. Boring tech, boring boundaries, boring deploys — boring is what scales.
- iii. Optimize for what's reversible. Cheap to undo beats clever and locked in.
- iv. Treat data as a product. Contracts, owners, SLAs. Dumps end up as debts.
- v. Ship the smallest honest version. Real users beat stakeholder hypotheticals.
- vi. Instrument before you optimize. Measurement first, opinions second, code third.
- vii. Prefer compounding over heroics. A system that improves weekly beats a launch that's perfect once.
04 / Stack
Tools I reach for — chosen, not collected.
// Languages
- Python
- TypeScript
- Go
- Rust
- SQL
- Java / Scala
// AI / ML
- PyTorch
- JAX
- vLLM / TGI
- LangGraph
- Ragas
- Weights & Biases
// Data
- Spark
- Flink
- Iceberg
- DuckDB
- Kafka
- Airflow / Dagster
// Infra
- Kubernetes
- Terraform
- AWS · GCP
- ClickHouse
- Postgres
- Redis
// Quant
- Pandas / Polars
- NumPy
- QuantLib
- Vectorbt
- Backtrader
- Optuna
// Practice
- RFCs & ADRs
- Trunk-based
- Observability-first
- Incident review
- Cost as a feature
- Mentorship
05 / Contact