The 0xyi trust engine produces service recommendations, trust scores, pricing guidance, and Sybil resistance from a single data structure and a single algorithm. The core insight: the best predictor of where money will flow next is where money is flowing now.
The system maintains a real-time graph of money flow rates between agents and services. Every weight and direction in this graph is determined by observed financial behavior: who pays whom, how much, how often.
**Agents.** Entities with wallets (and optionally ERC-8004 identities) that make economic decisions. An agent can be a buyer, a seller, or both simultaneously. 0xyi itself is an agent node in the graph.
**Services.** Nexuses where value exchange occurs. Services receive investment from buyers (purchases) and sellers (upkeep), and distribute surplus.
Every edge is directed and weighted by a rate measured in USDC per second. This rate is maintained as an exponentially decaying average that updates in real time on every transaction.
| Edge | From | To | Meaning |
|---|---|---|---|
| Purchase | Buyer (agent) | Service | Rate at which this buyer spends on this service |
| Upkeep | Seller (agent) | Service | Rate at which the seller invests in maintaining this service |
| Surplus to seller | Service | Seller (agent) | Rate at which surplus flows back to the service operator |
| Surplus to 0xyi | Service | 0xyi (agent) | Rate at which 0xyi's fee share flows from this service |
| Serving cost | 0xyi (agent) | Agent | Rate at which 0xyi invests resources in serving this agent |
| Stake | Agent | 0xyi (agent) | Rate of staking payments to the marketplace |
Each edge stores exactly three values:
```
rate:   float64  -- current estimated rate in USDC/second
t_last: float64  -- timestamp of last update (seconds since epoch)
decay:  float64  -- decay constant λ (per second)
```
On each transaction of amount a USDC at time t on edge (i, j):
```
elapsed = t - t_last
rate    = rate × exp(-decay × elapsed) + a × decay
t_last  = t
```
Between transactions, the rate decays exponentially. Each transaction adds a bump proportional to the amount and the decay constant. A half-life of 7 days means an edge with no new transactions loses half its rate per week.
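The decay arithmetic above can be sketched as follows; `Edge`, `record`, and `current_rate` are illustrative names, and the 7-day half-life from the text is converted to a per-second constant via λ = ln 2 / half_life:

```python
import math

HALF_LIFE_S = 7 * 24 * 3600        # 7-day half-life, as in the text
DECAY = math.log(2) / HALF_LIFE_S  # decay constant λ in 1/second

class Edge:
    """One directed edge: the three stored values from the spec."""
    def __init__(self, decay=DECAY):
        self.rate = 0.0    # current estimated rate, USDC/second
        self.t_last = 0.0  # timestamp of last update, seconds since epoch
        self.decay = decay

    def record(self, amount, t):
        """Apply one transaction of `amount` USDC at time `t`."""
        elapsed = t - self.t_last
        self.rate = self.rate * math.exp(-self.decay * elapsed) + amount * self.decay
        self.t_last = t

    def current_rate(self, t):
        """Rate decayed out to time `t`, with no new transaction."""
        return self.rate * math.exp(-self.decay * (t - self.t_last))
```

With these constants, an untouched edge loses half its rate every 7 days, and a single 100 USDC purchase bumps the rate by 100 × λ, roughly 0.0001 USDC/s.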
The system measures financial flows and derives everything from that. Money is expensive to fake, impossible to manufacture from nothing, and reveals true preferences through behavior.
Every edge trends toward zero without new transactions. Inactive services disappear from recommendations naturally, with zero manual intervention.
**Sellers.** The backbone of the signal network. Money flows in from their buyers and out to their suppliers. The walk passes through these nodes in transit, carrying information across the economy. Influence is proportional to throughput.
**Services.** Destinations in the graph. The walk reaches them and mostly terminates. They are what gets recommended — the endpoints agents discover through 0xyi.
**Buyers.** Origins. They start walks. Their outgoing purchase edges are raw signal about what they value. Their contribution to the broader graph comes through their fee edge to 0xyi.
**0xyi.** The global bridge. Receives surplus fee edges from every active transaction, has outgoing serving-cost edges to every agent. Provides background connectivity across the entire marketplace, weighted by activity level.
Given a buyer agent A and a candidate service X, predict the rate at which A will spend money on X.
At any node i, the probability that the next dollar leaving i goes to node j is proportional to that edge's rate:

P(i → j) = λ_ij / Σ_k λ_ik
This follows from modeling outgoing edges as competing Poisson processes — the next event occurs on whichever edge fires first, with probability proportional to its rate.
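A minimal sketch of this next-hop rule, assuming outgoing edge rates are held in a dict (illustrative):

```python
def next_hop_probs(out_rates):
    """P(next dollar i -> j) = λ_ij / Σ_k λ_ik.

    `out_rates` maps destination node -> current edge rate (USDC/s).
    The form follows from competing exponential clocks: the edge whose
    event fires first wins, with probability proportional to its rate."""
    total = sum(out_rates.values())
    return {j: r / total for j, r in out_rates.items()}
```

For example, rates of 3.0 and 1.0 USDC/s give next-hop probabilities of 0.75 and 0.25.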
The graph with rates defines a continuous-time Markov chain. The generator matrix Q has entries:
```
q_ij = λ_ij         for i ≠ j    (rate of flow from i to j)
q_ii = -Σ_k λ_ik                 (negative total outgoing rate)
```
The probability of being at node j at time T, starting from buyer A:

p_j(T) = [exp(QT)]_{A,j}
Each buyer has a natural prediction horizon determined by their own spending velocity:

T = 1 / Σ_k λ_{A,k}
This is the expected time until A's next transaction. Active buyers (high total outgoing rate) get a short T — focused predictions from their immediate neighborhood. Occasional buyers (low total outgoing rate) get a long T — broader exploration through the graph, including more signal from the 0xyi global bridge.
This eliminates the time horizon as a free parameter. It is determined entirely by the buyer's behavior.
For buyer A and service X:

predicted_spend_rate(A, X) = p_X(T) × Σ_k λ_{A,k}

The walk probability at X, multiplied by A's total outgoing flow rate, gives the predicted rate in USDC/second.
The matrix exponential is computed via a truncated power series:

exp(QT) ≈ Σ_{k=0..K} (QT)^k / k!
Truncated at K terms (K ≈ 4–6). Each term corresponds to paths of that hop count. The computation is sparse — starting from a single buyer, only nodes within K hops are touched. For a single buyer's recommendation query, this is a local sparse matrix-vector operation.
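The truncated series can be sketched with a dense toy matrix; the production version would be a sparse, local computation, and `walk_probabilities` is an illustrative name:

```python
import numpy as np

def walk_probabilities(rates, start, T, K=5):
    """Row `start` of exp(QT), approximated by sum over k <= K of (QT)^k / k!.

    `rates` is a dense n×n array of edge rates λ_ij with zero diagonal."""
    n = rates.shape[0]
    Q = rates.astype(float).copy()
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(Q, -Q.sum(axis=1))  # q_ii = -Σ_k λ_ik
    p = np.zeros(n)
    p[start] = 1.0                       # walk starts at the buyer
    term = p.copy()
    for k in range(1, K + 1):
        term = (term @ Q) * (T / k)      # k-th series term: paths of k hops
        p = p + term
    return p
```

Because every row of Q sums to zero, the truncated vector still sums to exactly 1; individual entries can go slightly negative when QT is large relative to K.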
Trust is the stability of a prediction under perturbation. A service whose predicted spend rate holds steady when graph edges are randomly removed is trustworthy — its prediction is supported by many independent paths. Trust is always relative to a specific buyer-service pair and a specific moment in time.
For prediction predicted_spend_rate(A, X):
```
run N times (N = 30–50):
    independently drop each edge with probability p (p = 0.05–0.10)
    compute the walk from A; record the predicted rate at X

mean_rate   = mean(predicted_rates)
std_rate    = std(predicted_rates)
trust(A, X) = 1 - min(1, std_rate / mean_rate)
```
The coefficient of variation (std / mean) measures relative prediction instability. A trust score of 0.95 means the prediction varies by only 5% under random edge removal. A trust score of 0.3 means it varies by 70% — the prediction is dominated by a few critical paths.
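The subsample loop can be sketched as below; `predict` stands in for the truncated-walk prediction and is passed as a callable, so the sketch is independent of the walk implementation. Population std is used; the text does not specify which estimator:

```python
import random
import statistics

def trust(predict, edges, N=30, p_drop=0.05, seed=0):
    """Perturbation trust for one (buyer, service) pair.

    `predict` maps a list of surviving edges to a predicted spend rate;
    each of the N subsamples drops every edge independently with
    probability p_drop."""
    rng = random.Random(seed)
    preds = []
    for _ in range(N):
        kept = [e for e in edges if rng.random() >= p_drop]
        preds.append(predict(kept))
    mean_rate = statistics.mean(preds)
    if mean_rate == 0:
        return 0.0  # no predicted flow at all: nothing to trust
    std_rate = statistics.pstdev(preds)
    return 1.0 - min(1.0, std_rate / mean_rate)
```

A prediction that is identical on every subsample scores 1.0; one dominated by a single bridge edge swings wildly across subsamples and scores near 0.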
A Sybil cluster connects to the legitimate graph through a small number of bridge edges. Dropping any bridge edge dramatically changes predictions for services inside the cluster — producing low trust scores as a direct consequence of graph topology.
A service reachable through many independent buyer-seller paths has low sensitivity. Each dropped edge is compensated by alternative paths. The trust score reflects genuine redundancy of economic evidence.
The coefficient of variation is dimensionless. A prediction of $0.001/hr that is stable is just as trustworthy as a prediction of $10/hr that is stable.
For real-time queries, precompute trust for popular (buyer, service) pairs and cache. Recompute when the underlying rates have changed by more than a threshold. For cold queries, compute inline — at N = 30 subsamples and K = 5 hop truncation on a sparse local neighborhood, this remains sub-100ms.
Each service has a single price visible to all buyers. The seller sets a minimum price — their cost to serve one request. 0xyi finds the price p* that maximizes the total profit rate across all buyers:

p* = argmax_{p ≥ min_price} (p - min_price) × total_purchase_rate(p)

where total_purchase_rate(p) is the aggregate predicted purchase rate at price p.
The seller is incentivized to report their minimum honestly. Setting it too high raises the floor, pushing optimal prices higher and losing buyers. Setting it too low means the seller loses money on every transaction. The profit-maximizing strategy is to report truthfully and let 0xyi's demand-curve estimation find the optimal price.
Price discovery is a nonstationary bandit problem, one instance per service. The system begins with a conservative prior — approximately 5% above min_price — ensuring early transactions flow freely while gathering signal. The exploration rate decays per-service as the estimate stabilizes, but always remains positive to track nonstationarity.
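The text does not pin down the bandit algorithm, so the following epsilon-greedy sketch only illustrates the stated shape: a prior about 5% above `min_price`, exploration that decays per service but never reaches zero, and an exponentially averaged profit estimate to track nonstationarity. All class and parameter names are illustrative:

```python
import random

class PriceBandit:
    """Illustrative per-service price search over a small price grid."""
    def __init__(self, min_price, n_arms=8, eps0=0.3, eps_floor=0.05, seed=0):
        self.min_price = min_price
        # grid starting ~5% above the seller's minimum (conservative prior)
        self.prices = [min_price * (1.05 + 0.15 * i) for i in range(n_arms)]
        self.profit_rate = [0.0] * n_arms  # decayed profit-per-quote estimate
        self.t = 0
        self.eps0, self.eps_floor = eps0, eps_floor
        self.rng = random.Random(seed)

    def quote(self):
        """Pick an arm: explore with decaying-but-floored probability."""
        self.t += 1
        eps = max(self.eps_floor, self.eps0 / (1 + 0.01 * self.t))
        if self.rng.random() < eps:
            arm = self.rng.randrange(len(self.prices))
        else:
            arm = max(range(len(self.prices)), key=lambda i: self.profit_rate[i])
        return arm, self.prices[arm]

    def update(self, arm, sold):
        """Exponential average, so stale demand evidence fades over time."""
        profit = (self.prices[arm] - self.min_price) if sold else 0.0
        self.profit_rate[arm] += 0.2 * (profit - self.profit_rate[arm])
```

The exploration floor (`eps_floor`) is what lets the estimate track a demand curve that shifts after convergence.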
The seller always receives at least their minimum price, plus the majority of the surplus. 0xyi receives a discoverable percentage. Sellers earn more on 0xyi than selling direct whenever 0xyi finds buyers willing to pay above the minimum.
0xyi's fee percentage is discoverable by the same bandit mechanism used for service pricing: the fee% that maximizes 0xyi's total profit rate across the marketplace. The fee% also controls the information architecture: higher fee means more money flows through the 0xyi hub node, strengthening global signal relative to local neighborhood signal. Lower fee means the walk stays in local value chains. The business model parameter and the recommendation tuning parameter are the same number.
Sellers must stake an amount that caps their transaction volume over the dispute window. A Sybil operator who wants to push fake volume through the graph needs to stake proportionally. Inflating edge rates requires proportional capital lockup.
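The cap is simple arithmetic; the 7-day dispute window and the 1:1 stake-to-volume ratio below are assumptions for illustration:

```python
DISPUTE_WINDOW_S = 7 * 24 * 3600  # assumed 7-day dispute window

def required_stake(volume_rate_usdc_per_s, window_s=DISPUTE_WINDOW_S):
    """Stake needed to sustain a given flow rate, assuming the cap is
    1 USDC of window volume per 1 USDC staked (ratio is illustrative)."""
    return volume_rate_usdc_per_s * window_s
```

Pushing fake volume at 0.01 USDC/s through a 7-day window would lock up roughly 6,048 USDC of stake under these assumptions.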
Sybil clusters that circulate money internally create isolated subgraphs with high internal rates but few external connections. The perturbation-based trust computation catches this: dropping any bridge edge dramatically changes predictions inside the cluster, producing low trust scores. Zero explicit Sybil detection logic required.
These layers are complementary. Staking makes fake volume expensive. Perturbation trust makes fake volume detectable even when someone pays for it.
New services are reachable through two paths from their first moment in the graph: through their seller's existing edges (if the seller already participates in the marketplace), and through the 0xyi hub (which connects to every active agent via serving-cost edges). As real transactions accumulate, direct buyer → service edges form and gradually dominate the prediction.
Degrading services see purchase rates drop in real time. The exponentially decaying rates immediately reflect current behavior — recommendations shift as the economy shifts. Services that stop receiving transactions decay toward zero and vanish from recommendations with zero manual intervention.
0xyi earns more when it routes buyers to high-surplus services. Sellers earn more when 0xyi finds buyers willing to pay above minimum. Buyers get better predictions as more transactions flow through the system. Every participant benefits from honest economic activity.
The rate graph is the moat. Every transaction through 0xyi adds rate to edges. The rates are the raw material for predictions. A competitor starting from zero has an empty graph — zero rates, zero predictions, zero value. The data compounds: more transactions → better predictions → more agents routing through 0xyi → more transactions.
The walk algorithm is simple; the graph it operates on is proprietary.
Input:
buyer_wallet: string -- identifies the querying agent
category: string (optional) -- filter by service category
max_price: float (optional) -- maximum price in USDC
max_latency: int (optional) -- maximum p95 latency in ms
min_trust: float (optional) -- minimum trust score (0-1)
limit: int (default 5) -- number of results
Processing:
1. Filter services by category, price, latency constraints
2. For each candidate service X:
a. Compute predicted_spend_rate(buyer, X) via truncated walk
b. Compute trust(buyer, X) via subset sampling
3. Filter by min_trust if specified
4. Rank by predicted_spend_rate descending
5. Return top results
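Steps 1–5 can be sketched as a single function; the `Service` fields and the `spend_rate`/`trust` callables are illustrative stand-ins for the real graph queries:

```python
from dataclasses import dataclass

@dataclass
class Service:
    service_id: str
    category: str
    price: float          # current price, USDC
    p95_latency_ms: int

def recommend(buyer_wallet, services, spend_rate, trust,
              category=None, max_price=None, max_latency=None,
              min_trust=None, limit=5):
    """Steps 1-5: filter by hard constraints, score, trust-floor, rank."""
    # 1. Hard constraints first, so the walk runs on fewer candidates.
    cands = [s for s in services
             if (category is None or s.category == category)
             and (max_price is None or s.price <= max_price)
             and (max_latency is None or s.p95_latency_ms <= max_latency)]
    # 2. Score each candidate with the walk prediction and trust.
    scored = [(s, spend_rate(buyer_wallet, s.service_id),
               trust(buyer_wallet, s.service_id)) for s in cands]
    # 3. Apply the trust floor if the caller gave one.
    if min_trust is not None:
        scored = [row for row in scored if row[2] >= min_trust]
    # 4-5. Rank by predicted spend rate, return the top `limit`.
    scored.sort(key=lambda row: row[1], reverse=True)
    return scored[:limit]
```

Each returned row is a `(service, predicted_spend_rate, trust)` triple, so callers can display the score alongside the recommendation.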
When a buyer routes a transaction through 0xyi:
1. The buyer sends a request with payload and service_id.
2. 0xyi proxies the x402 payment at the current optimal price.
3. 0xyi measures the outcome: HTTP status, latency, schema validity.
4. On success, all graph edges update: purchase, upkeep, surplus, serving cost.
5. On failure (service error, timeout, schema mismatch), the purchase edge receives a zero-amount update (the edge decays naturally) and surplus is withheld, but the 0xyi → buyer serving-cost edge still updates, since 0xyi spent compute regardless.
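A sketch of this settlement step, assuming edge objects expose a `record(amount, t)` update (illustrative). The surplus split shown (seller receives amount minus the 0xyi fee) and the omission of the upkeep edge are simplifications:

```python
def settle(edges, outcome_ok, amount, fee_pct, serving_cost, t):
    """Edge updates for one routed transaction.

    `edges` maps edge names to objects exposing record(amount, t)."""
    if outcome_ok:
        fee = amount * fee_pct
        edges["purchase"].record(amount, t)              # buyer -> service
        edges["surplus_seller"].record(amount - fee, t)  # service -> seller
        edges["surplus_0xyi"].record(fee, t)             # service -> 0xyi
    else:
        # Failure: zero-amount update so the purchase edge just decays;
        # surplus edges are left untouched (withheld).
        edges["purchase"].record(0.0, t)
    # 0xyi spent compute either way.
    edges["serving_cost"].record(serving_cost, t)        # 0xyi -> buyer
```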
0xyi is the commerce layer the agent economy is missing. We're looking for early partners, builders, and Conway automatons ready to be first movers.