Deep dive
System logic and property design philosophy
This section explains why the data model is intentionally minimal, how primitives are chosen, and why derived interpretations are computed instead of persisted.
Brief
This property model is intentionally minimal. Rather than persisting every possible interpretation of buyer behavior, it focuses on a small set of primitive signals that reliably drive downstream decisions. Derived insights such as velocity, timing, or behavioral nuance are computed dynamically and not stored as long-lived system state. This reduces semantic drift, avoids conflicting interpretations across teams, and keeps the system trustworthy as it scales.
The goal is not expressiveness for its own sake, but decision clarity with minimal operational debt.
Core Property Model
When designing custom properties in HubSpot, I intentionally constrained the model to three foundational dimensions:
- Engagement
- Intent (high vs. low)
- Priority
The guiding principle was restraint.
While there are nearly infinite ways to label, score, and categorize customer data, not all variables justify becoming persistent system state. Data exists to change decisions. If a variable does not reliably alter a downstream decision, yet introduces implementation, maintenance, or interpretive cost for humans or systems, it fails a basic ROI test and should not be persisted.
Persist primitives. Compute interpretations at query or workflow time.
This principle becomes increasingly important as data is consumed across multiple teams, where signals must remain interpretable, stable, and trustworthy outside their original context.
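As a minimal sketch of this principle, the example below persists only raw engagement events and computes velocity on demand. All names here (`EngagementEvent`, `velocity`) are hypothetical illustrations, not actual HubSpot objects:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Persisted primitive: a raw engagement event, not an interpretation.
@dataclass
class EngagementEvent:
    contact_id: str
    occurred_at: datetime

def velocity(events, contact_id, window=timedelta(days=7), now=None):
    """Derived interpretation: computed at query or workflow time, never stored."""
    now = now or datetime.utcnow()
    recent = [e for e in events
              if e.contact_id == contact_id and now - e.occurred_at <= window]
    return len(recent) / window.days  # interactions per day over the window
```

Because the derivation lives in one function rather than in a stored field, changing the window or the formula never leaves stale values behind in the CRM.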
Why Not Include More Variables?
Many commonly proposed properties are not fundamentally new signals, but derivatives or alternate representations of existing primitives. Persisting these derivatives often increases complexity without delivering proportional decision value.
Example: Velocity
Velocity is typically defined as the rate of engagement over time, for example, multiple interactions occurring within a short window that suggest heightened interest.
While velocity can provide incremental insight, it is fundamentally derived from engagement plus time. Persisting it as canonical state introduces several structural questions:
- When does velocity update?
- What happens when activity slows or stops?
- Which time window is authoritative?
- What if different systems calculate it differently?
- Which version becomes the source of truth?
These questions compound system complexity and increase the risk of semantic drift.
That said, velocity can be operationally useful, particularly for identifying moments of elevated buyer attention when outreach may be more effective. In practice, however, its value lies in temporarily adjusting priority or triggering actions, not in maintaining a long-lived velocity field as trusted system state.
This highlights an important distinction between decision triggers and stored truths.
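A decision trigger of this kind might be sketched as follows. The function names, the 1–5 priority scale, and the threshold are all illustrative assumptions:

```python
def effective_priority(base_priority, recent_interaction_count, threshold=3):
    """Velocity acts as a transient trigger: it elevates priority at
    workflow time, but the boost is never written back as system state."""
    if recent_interaction_count >= threshold:
        return min(base_priority + 1, 5)  # assumed 1-5 priority scale, capped
    return base_priority
```

When activity slows, the next evaluation simply returns the base priority; no stored field has to be decayed, reset, or reconciled.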
Strategic Timing Signals
Signals tied to specific moments, such as holidays, promotions, contract cycles, or seasonal buying patterns, can be powerful within narrowly defined contexts.
However, when designing an industry- and product-agnostic foundation, these signals lack stable semantics. A high-priority window can mean very different things in:
- A holiday-driven B2C commerce model
- A B2B subscription renewal cycle
- A long-cycle enterprise sales motion
Because meaning varies so widely, these signals are better treated as contextual overlays rather than persistent properties. In most cases, they belong in campaign logic, workflow conditions, or temporary scoring adjustments, not in the core data model.
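A contextual overlay of this kind can be expressed as a workflow-time condition rather than a persisted property. This is a hypothetical sketch; the function name and 30-day default are assumptions:

```python
from datetime import date

def in_renewal_window(contract_end, today, days_before=30):
    """Timing signal evaluated inline in workflow or campaign logic,
    not persisted as a property on the record."""
    return 0 <= (contract_end - today).days <= days_before
```

The condition carries its context with it: a B2B renewal workflow and a holiday campaign can each define their own window without fighting over a shared stored field.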
Personality and Behavioral Modeling
Behavioral or personality-based segmentation, such as research depth, price sensitivity, or tolerance for outreach, could meaningfully affect how engagement correlates with likelihood to purchase.
In theory, this modeling is powerful.
In practice, it introduces two major challenges:
- Data availability and reliability: these traits are difficult to infer accurately without extensive behavioral history and long observation windows.
- Interpretive ambiguity: the same engagement pattern may imply high intent for one persona and low intent for another. Without strong grounding, the signal becomes speculative rather than operational.
Because these models are assumption-heavy and context-dependent, they are best treated as human judgment aids or experimental analytics, not as universally trusted system state.
The RevOps Lens: Signal Consistency at Scale
From a RevOps perspective, the most important requirement of any shared signal is semantic consistency across teams.
Marketing, Sales, Customer Success, and Finance all consume the same data, often through dashboards that present signals confidently and without caveats. As a result, RevOps tends to favor primitive, stable signals whose meaning does not drift depending on audience or use case.
A useful internal test becomes:
“If this value changes, what does the system actually do differently?”
If the honest answer is:
- It depends
- Sales decides
- Marketing experiments
- We tailor messaging
then the variable is not an operational signal. It is a human judgment aid.
Human judgment aids are valuable, but when they are persisted as authoritative system state, they introduce risk:
- Stale interpretations
- Silent semantic drift
- Conflicting assumptions across teams
- Disputes when multiple departments rely on the same number for different decisions
These risks compound over time as systems scale and organizational trust in dashboards increases.
Conclusion: Why This Matters
A minimal, primitive data model is not a limitation. It is a governance strategy.
By persisting only signals that:
- Have stable meaning
- Drive explicit downstream decisions
- Remain valid across departments
and computing interpretations dynamically at query or workflow time, the system remains flexible without becoming ambiguous.
In practice, this approach reduces analytical debt, prevents cross-team confusion, and allows Marketing Ops to move quickly without breaking downstream reporting or trust. It aligns with RevOps-grade thinking while remaining grounded in day-to-day operational execution.
The system optimizes for trust, consistency, and long-term interpretability over short-term expressiveness or local optimization.