Evaluating Historical Performance of Yield Optimizers

Core Performance Metrics to Analyze

Quantitative analysis of a yield optimizer's historical performance requires examining specific, on-chain metrics. These data points reveal the strategy's risk-adjusted returns, efficiency, and reliability over time.

Annual Percentage Yield (APY)

Annual Percentage Yield (APY) represents the real rate of return earned, accounting for the effect of compounding. It is calculated from historical data, not forward-looking projections.

  • Distinguish between gross APY (before fees) and net APY (after protocol and strategy fees).
  • Analyze APY consistency; high volatility can indicate risky farming strategies or unsustainable incentives.
  • This metric matters as it directly compares the earning potential against traditional benchmarks and other DeFi products.

Sharpe Ratio

Sharpe Ratio measures risk-adjusted return by comparing the strategy's excess returns to its volatility. A higher ratio indicates better return per unit of risk taken.

  • Calculate using the strategy's historical daily returns versus a "risk-free" benchmark like stablecoin yields.
  • A consistently low or negative Sharpe Ratio suggests the high APY is not compensating for its inherent risk.
  • This matters for users seeking sustainable growth, as it helps filter out strategies that are overly volatile.
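
A minimal sketch of this calculation in Python, assuming you already have a pandas Series of the strategy's daily net returns and an annual stablecoin benchmark rate (the names and the 5% default are illustrative, not protocol figures):

python
import numpy as np
import pandas as pd

def sharpe_ratio(daily_returns: pd.Series, benchmark_apy: float = 0.05) -> float:
    # Convert the annual "risk-free" benchmark (e.g., a stablecoin yield) to a daily rate
    daily_benchmark = (1 + benchmark_apy) ** (1 / 365) - 1
    excess = daily_returns - daily_benchmark
    # Annualize: mean excess return per unit of volatility, scaled by sqrt(365)
    return float(np.sqrt(365) * excess.mean() / excess.std())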

Maximum Drawdown (MDD)

Maximum Drawdown (MDD) is the largest peak-to-trough decline in the value of a deposited position, expressed as a percentage. It quantifies the worst historical loss.

  • A 40% MDD means a user's capital would have been temporarily worth 40% less at its lowest point.
  • Examine the duration of recovery; a long drawdown period indicates high correlation with market downturns.
  • This matters profoundly for capital preservation and emotional tolerance, revealing the strategy's downside risk.
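
A short sketch of the computation, assuming a pandas Series of historical share prices or position values indexed by date (names are illustrative):

python
import pandas as pd

def max_drawdown(values: pd.Series) -> float:
    running_peak = values.cummax()          # highest value seen so far
    drawdown = values / running_peak - 1    # decline from that peak at each point
    return float(drawdown.min())            # e.g. -0.40 for a 40% MDD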

Fee Efficiency

Fee Efficiency analyzes the proportion of generated yield that is captured by the user versus taken as fees. It's a key determinant of net returns.

  • Calculate the fee drag: (Gross APY - Net APY) / Gross APY.
  • High performance fees are acceptable only if the strategy's alpha (outperformance) justifies them.
  • This matters because opaque or excessive fees can erode compounding benefits, making a high-gross APY strategy unattractive.
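
As a worked example of the fee-drag formula above (the APY figures are hypothetical):

python
gross_apy, net_apy = 0.12, 0.09               # hypothetical 12% gross, 9% net
fee_drag = (gross_apy - net_apy) / gross_apy
print(f"Fee drag: {fee_drag:.0%}")            # 25% of generated yield is lost to fees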

Strategy Utilization & TVL

Total Value Locked (TVL) and its growth trajectory indicate market confidence and strategy capacity. Sudden inflows or outflows can impact performance.

  • Monitor TVL relative to the strategy's designed capacity; over-saturation can dilute yields.
  • Analyze correlation between TVL changes and APY; a declining APY with rising TVL suggests diminishing returns to scale.
  • This matters for assessing strategy scalability and potential future yield compression as more capital competes for rewards.
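
One rough way to test the TVL-to-APY relationship, assuming a daily DataFrame with tvl_usd and apy columns (column names and the 90-day window are illustrative):

python
import pandas as pd

def tvl_apy_correlation(df: pd.DataFrame, window: int = 90) -> pd.Series:
    # Rolling correlation between TVL growth and APY changes; persistently negative
    # values suggest diminishing returns to scale as capital flows in
    return df['tvl_usd'].pct_change().rolling(window).corr(df['apy'].pct_change())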

Win Rate & Consistency

Win Rate measures the frequency of positive return periods (e.g., daily or weekly) versus negative ones. Consistency shows the strategy's reliability across market cycles.

  • A 70% weekly win rate means profits were generated in 7 out of 10 weeks historically.
  • Combine with the Profit/Loss Ratio to see if wins are larger than losses.
  • This matters for psychological comfort and predictable income streams, distinguishing between steady compounders and lottery-ticket strategies.
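
A compact sketch combining win rate with the profit/loss ratio, assuming a pandas Series of weekly returns (the weekly resampling period is illustrative):

python
import pandas as pd

def win_rate_and_profit_factor(weekly_returns: pd.Series) -> tuple[float, float]:
    wins = weekly_returns[weekly_returns > 0]
    losses = weekly_returns[weekly_returns < 0]
    win_rate = len(wins) / len(weekly_returns)
    # Profit factor: gross profits divided by gross losses
    profit_factor = float(wins.sum() / abs(losses.sum())) if len(losses) else float('inf')
    return win_rate, profit_factor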

Sourcing and Structuring Historical Data

Process overview

1

Identify and Access Primary Data Sources

Locate the raw on-chain and off-chain data required for analysis.

Detailed Instructions

Begin by identifying the smart contract addresses for the target yield optimizer vaults and their underlying strategies. Use block explorers like Etherscan or blockchain-specific RPC nodes to query historical transaction logs and event emissions. For off-chain data such as APY calculations and strategy descriptions, access the protocol's official API endpoints or subgraph queries. For example, Yearn's v3 vault data can be sourced from its public subgraph. Establish a reliable connection to an archival node or a service like The Graph to ensure access to full historical state, not just recent blocks.

  • Sub-step 1: Compile a list of vault contract addresses (e.g., 0x19D3364A399d251E894aC732651be8B0E4e85001 for yvDAI).
  • Sub-step 2: Identify the relevant event signatures for deposits, withdrawals, and strategy reports.
  • Sub-step 3: Set up a connection to a node provider (Infura, Alchemy) with archival capabilities.
javascript
// Example: Fetching vault creation events from the Yearn Registry
// Assumes ethers v6; RPC_URL and registryAddress are placeholders for your own endpoint and registry
import { ethers } from "ethers";

const provider = new ethers.JsonRpcProvider(RPC_URL); // archival node endpoint
const eventSignature = 'VaultCreated(address,address,address,string,string)';
const topics = [ethers.id(eventSignature)]; // keccak256 of the signature = topic0
const logs = await provider.getLogs({
  address: registryAddress,
  topics: topics,
  fromBlock: 12000000,
});

Tip: Bookmark the protocol's official GitHub repository for contract ABIs and documentation to ensure accurate event decoding.

2

Extract and Parse On-Chain Transaction Logs

Collect and decode event data to reconstruct vault activity.

Detailed Instructions

Query the blockchain for all logs emitted by the vault and strategy contracts over your analysis period. The key events are Deposit, Withdraw, and StrategyReported. Decode the log data using the contract's Application Binary Interface (ABI) to extract meaningful parameters: user address, asset amount, share price, and total assets. For performance analysis, the StrategyReported event is critical as it contains the gain, loss, debt paid, and new total debt for a strategy. Pay close attention to block timestamps to sequence events correctly. Query in batched block ranges and respect provider rate limits when covering large historical spans; log reads cost no gas, but most RPC providers throttle them.

  • Sub-step 1: Filter logs by the specific vault address and event topics for a defined block range.
  • Sub-step 2: Decode the data and topics fields using the ethers.js or web3.py ABI.
  • Sub-step 3: Calculate derived fields like effective share price at the time of each user action.
python
# web3.py example (v6 API): decoding StrategyReported events from a vault
logs = w3.eth.get_logs({
    'address': vault_address,
    'fromBlock': start_block,
    'toBlock': end_block,
})
for log in logs:
    try:
        event = vault_contract.events.StrategyReported().process_log(log)
    except Exception:
        continue  # log belongs to a different event type
    gain = event['args']['gain']
    total_debt = event['args']['totalDebt']

Tip: Store raw log data with block numbers and transaction hashes to maintain an immutable audit trail for your analysis.

3

Calculate Time-Series Performance Metrics

Transform raw event data into comparable yield and risk metrics.

Detailed Instructions

Process the parsed event data to construct a daily time-series for each vault. The core metric is the vault share price (total assets per share, i.e. totalAssets / totalSupply), which can be reconstructed from sequential StrategyReported events or sampled from the vault's price-per-share view (e.g., Yearn v2's pricePerShare()) at historical blocks. From this, compute the daily return. Calculate the Annual Percentage Yield (APY) by compounding daily returns over 365 days. Also, derive risk metrics like drawdown (peak-to-trough decline in share price) and volatility (standard deviation of daily returns). For multi-asset vaults, normalize all values to a common denomination like USD using historical price oracles (e.g., Chainlink data feeds) to enable cross-vault comparison.

  • Sub-step 1: For each day, determine the closing share price from the last relevant event.
  • Sub-step 2: Compute daily return: (Price_t / Price_{t-1}) - 1.
  • Sub-step 3: Annualize returns: APY = ((1 + avg_daily_return) ^ 365) - 1.
  • Sub-step 4: Calculate the maximum drawdown over rolling 30-day windows.
sql
-- Example SQL to calculate daily returns from a share_price table
SELECT
  date,
  share_price,
  LAG(share_price, 1) OVER (ORDER BY date) AS prev_price,
  (share_price / LAG(share_price, 1) OVER (ORDER BY date)) - 1 AS daily_return
FROM vault_share_prices
ORDER BY date;
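
The SQL above produces the daily returns; sub-steps 3 and 4 (annualization and rolling drawdown) can be sketched in pandas as follows, assuming a daily DataFrame with daily_return and share_price columns (names are illustrative):

python
import pandas as pd

def add_rolling_metrics(df: pd.DataFrame) -> pd.DataFrame:
    # Sub-step 3: annualize the trailing 30-day average daily return
    df['apy_30d'] = (1 + df['daily_return'].rolling(30).mean()) ** 365 - 1
    # Sub-step 4: maximum drawdown within a rolling 30-day window
    rolling_peak = df['share_price'].rolling(30, min_periods=1).max()
    df['drawdown_30d'] = df['share_price'] / rolling_peak - 1
    return df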

Tip: Use a consistent compounding model (daily vs. continuous) when calculating APY and clearly document your methodology.

4

Structure Data into an Analysis-Ready Schema

Organize cleaned metrics into a queryable database or dataframe.

Detailed Instructions

Design a normalized database schema or Pandas DataFrame structure to support efficient analysis. Core tables should include vaults (metadata), daily_metrics (share price, TVL, APY), user_actions (deposits/withdrawals), and strategy_reports. Establish clear foreign key relationships. For time-series analysis, ensure the daily_metrics table has a composite key of (vault_id, date) with no gaps. Include derived columns for rolling volatility (e.g., 7-day and 30-day standard deviation) and risk-adjusted returns like the Sharpe Ratio (assuming a risk-free rate). This structure allows for complex queries comparing performance across vaults, strategies, and market conditions.

  • Sub-step 1: Create a vaults dimension table with columns: id, address, name, underlying_asset, version.
  • Sub-step 2: Populate a daily_metrics fact table with calculated fields for each vault and date.
  • Sub-step 3: Create indexes on frequently queried columns like vault_id and date.
  • Sub-step 4: Add a market_data table for historical ETH/USD or relevant asset prices.
python
# Pandas DataFrame structure example
import pandas as pd

df_metrics = pd.DataFrame({
    'date': pd.date_range(start='2023-01-01', periods=365),
    'vault_id': 'yvUSDC',
    'share_price': [...],
    'tvl_usd': [...],
    'daily_return': [...],
    'apy_7d_ma': [...],  # 7-day moving average APY
})
df_metrics.set_index(['vault_id', 'date'], inplace=True)

Tip: Use a data versioning system (like DVC) to track changes to your processed datasets, especially when recalculating metrics with updated logic.

5

Validate Data Integrity and Consistency

Implement checks to ensure the dataset is accurate and reliable.

Detailed Instructions

Before analysis, run a series of validation rules on your structured dataset. Check for data completeness—ensure there are no missing days in the time-series for active vaults. Verify logical consistency: the total value locked (TVL) should be non-negative, and share price should be monotonically non-decreasing outside of fee events or losses. Cross-validate your calculated APY against official protocol dashboards for a sample period, allowing for minor discrepancies due to calculation timing. Perform sanity checks on outlier values; a daily return exceeding 50% likely indicates a data parsing error or a unique event like a massive liquidity injection. Automate these checks in your data pipeline.

  • Sub-step 1: Query for gaps in the daily time-series for each vault.
  • Sub-step 2: Assert that TVL equals (share_price * total_supply) within a small tolerance.
  • Sub-step 3: Spot-check APY calculations for specific dates against third-party sources like DefiLlama.
  • Sub-step 4: Identify and investigate days with returns beyond +/- 3 standard deviations from the mean.
python
# Example validation function for share price consistency
import logging

logger = logging.getLogger(__name__)

def validate_share_price_integrity(df):
    # Check for non-negative share price
    assert (df['share_price'] > 0).all(), "Share price must be positive"
    # Check for drastic single-day drops not explained by fees (e.g., >5%)
    df['price_change'] = df['share_price'].pct_change()
    drastic_drops = df[(df['price_change'] < -0.05) & (df['fee_event'] == False)]
    if not drastic_drops.empty:
        logger.warning(f"Unexpected drops detected: {drastic_drops.index.tolist()}")
    return True

Tip: Maintain a validation log that records the date, check performed, and any anomalies found for audit purposes.

Analytical Frameworks and Methodologies

Foundational Performance Metrics

Risk-Adjusted Returns are the cornerstone of evaluation, moving beyond raw APY. The Sharpe Ratio measures excess return per unit of risk (volatility), while the Sortino Ratio focuses only on downside deviation, which is more relevant for asymmetric crypto yields.

Key Metrics to Track

  • Maximum Drawdown (MDD): The largest peak-to-trough decline in a strategy's value over a period. A high MDD indicates significant capital erosion risk, as seen in some leveraged farming strategies on Aave or Compound during volatile markets.
  • Win Rate & Profit Factor: The percentage of profitable periods and the ratio of gross profits to gross losses. A strategy on Convex Finance might have a high win rate but a low profit factor if small gains are frequently wiped out by occasional large losses.
  • Beta to ETH/BTC: Measures correlation and sensitivity to broader market movements. A yield optimizer with low beta, like a stablecoin-focused strategy on Yearn, may offer genuine alpha versus one that simply mirrors ETH staking rewards.

Practical Analysis

Start by collecting historical daily APY data and the underlying token price for the vault. Calculate rolling 30-day volatility and compare it to the benchmark return (e.g., ETH staking yield). This reveals if the extra yield compensates for the additional risk taken.
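
A minimal sketch of that comparison, assuming daily return series for the vault and the benchmark are already aligned on the same date index (the 30-day window and sqrt(365) annualization are conventions, not protocol figures):

python
import numpy as np
import pandas as pd

def risk_vs_benchmark(vault_daily: pd.Series, benchmark_daily: pd.Series) -> pd.DataFrame:
    out = pd.DataFrame(index=vault_daily.index)
    out['vol_30d'] = vault_daily.rolling(30).std() * np.sqrt(365)           # annualized volatility
    out['excess_30d'] = (vault_daily - benchmark_daily).rolling(30).sum()   # excess return vs benchmark
    return out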

Comparative Analysis of Major Optimizers

Key performance and operational metrics for leading yield optimizer protocols.

Metric | Yearn Finance | Beefy Finance | Convex Finance
Average APY (30-day, ETH vaults) | 4.2% | 5.8% | 3.9%
Performance Fee | 20% | 4.95% | 17%
Withdrawal Fee | 0.5% | 0.1% | 0%
TVL (USD, approx.) | $450M | $1.2B | $2.1B
Supported Chains | Ethereum, Fantom, Arbitrum | 15+ EVM chains | Ethereum
Average Harvest Gas Cost | ~$45 | ~$8 | ~$35
Native Token Utility | Governance, fee share | Governance, boost | Governance, fee share, vote-locking

Assessing Risk from Historical Events

Process for analyzing past incidents to quantify protocol risk exposure.

1

Identify and Categorize Past Incidents

Compile a comprehensive list of historical events affecting the optimizer or its underlying protocols.

Detailed Instructions

Begin by gathering data on all historical incidents involving the yield optimizer, its integrated protocols (e.g., Aave, Compound, Curve), and its underlying infrastructure (e.g., oracles, keepers). Sources include protocol post-mortems, DeFi security databases (Rekt.news, DeFiYield), and community forums. Categorize each event by type: smart contract exploit, oracle failure, governance attack, economic design flaw, or liquidity crisis. For example, note the specific optimizer vault (e.g., Yearn's yvUSDC) and the date of the incident. This creates a structured event log for quantitative analysis.

  • Sub-step 1: Query on-chain analysis platforms like Dune Analytics for unusual withdrawal spikes or TVL drops.
  • Sub-step 2: Review the optimizer's official GitHub repository for incident-related issue reports and commits.
  • Sub-step 3: Cross-reference with blockchain explorers (Etherscan) to verify transaction activity during suspected event periods.

Tip: Maintain a spreadsheet linking each event to its root cause analysis report and the affected total value locked (TVL).

2

Quantify Financial Impact and User Losses

Calculate the monetary damage and recovery rates from each historical event.

Detailed Instructions

For each categorized incident, calculate the financial impact. Determine the total value at risk (VaR) at the time of the event and the actual realized loss. Use on-chain data to track fund movements from optimizer vaults to attacker addresses or during emergency withdrawals. Calculate the recovery rate if funds were reimbursed via treasury or insurance. For instance, analyze the 2021 Yearn v1 DAI vault exploit, in which roughly $11 million was lost and subsequently reimbursed in full. Use block explorers to trace fund flows through addresses such as 0x6c905b4108a87499ced1e0498721f2b831c6ab13 (identified as the attacker's contract). This quantifies the protocol's historical loss severity and its financial resilience.

  • Sub-step 1: Use Dune dashboards to query vault balance changes before, during, and after the incident.
  • Sub-step 2: Calculate the loss as a percentage of the vault's TVL at the block height of the exploit.
  • Sub-step 3: Verify any reimbursement transactions from the protocol's treasury or governance-controlled addresses.
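
The arithmetic in sub-steps 2 and 3 reduces to a simple helper; every figure passed in would come from your own on-chain queries (the numbers in the usage comment are placeholders, not verified data):

python
def incident_impact(loss_usd: float, tvl_at_exploit_usd: float, reimbursed_usd: float = 0.0) -> dict:
    return {
        'loss_pct_of_tvl': loss_usd / tvl_at_exploit_usd,
        'recovery_rate': reimbursed_usd / loss_usd if loss_usd else 0.0,
    }

# incident_impact(11_000_000, 400_000_000, 11_000_000)
# -> {'loss_pct_of_tvl': 0.0275, 'recovery_rate': 1.0}  (illustrative TVL figure)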

Tip: Distinguish between permanent loss and temporary insolvency; the latter may indicate robust emergency mechanisms.

3

Analyze Protocol Response and Mitigation Actions

Evaluate the effectiveness of the team's incident response and subsequent security upgrades.

Detailed Instructions

Assess the protocol's response timeline and mitigation actions. Review official communications (Discord, Twitter) for time-stamped announcements. Evaluate the technical response: were emergency pauses (pause() function) invoked, and how quickly? Examine post-incident governance proposals for security upgrades, such as migrating to a new vault version or adding multisig signers. For example, after the Pickle Finance "Evil Jar" exploit, the team implemented a new time-lock on critical functions. Analyze the code diff in the repository commit to understand the fix. This step measures the team's operational security maturity and commitment to improving system robustness after a failure.

  • Sub-step 1: Check the optimizer's smart contract for pause function calls on the incident date using Etherscan's "Internal Txns" tab.
  • Sub-step 2: Review all governance forum posts and Snapshot votes created in the 30 days following the event.
  • Sub-step 3: Audit the commit history in the vault's GitHub repository for security-related patches post-incident.
solidity
// Example of a time-lock modifier added post-incident
modifier timelock() {
    require(block.timestamp >= executionTime, "Timelock: too early");
    require(block.timestamp <= executionTime + gracePeriod, "Timelock: too late");
    _;
}

Tip: A swift, transparent response with verifiable on-chain actions is a positive risk indicator.

4

Calculate Key Risk Metrics from Event History

Derive quantitative risk metrics like Mean Time Between Failures and Probability of Loss.

Detailed Instructions

Synthesize the collected data into actionable risk metrics. Calculate the Mean Time Between Failures (MTBF) for the optimizer by averaging the days between significant incidents. Estimate the historical Probability of Loss per epoch (e.g., annually) by dividing the number of loss events by the protocol's operational lifespan. Compute the Average Loss Given Default (LGD) as a percentage of TVL. For a concrete example, if an optimizer experienced 2 loss events over 800 days with an average loss of 5% of TVL, its MTBF is 400 days, which annualizes to roughly 0.9 incidents per year. These metrics allow for comparative risk assessment against other yield protocols and inform position sizing decisions.

  • Sub-step 1: Define a "significant incident" threshold (e.g., loss > 1% of vault TVL) to filter noise.
  • Sub-step 2: Use your event log to calculate the exact number of days between qualifying incidents.
  • Sub-step 3: Aggregate total losses and average them against the relevant TVL figures for each event to find LGD.
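
A small sketch of these calculations using the illustrative figures above (2 qualifying incidents over 800 operational days, 5% average loss):

python
def event_risk_metrics(n_incidents: int, operational_days: int, losses_pct_tvl: list[float]) -> dict:
    mtbf_days = operational_days / n_incidents if n_incidents else float('inf')
    annual_rate = n_incidents / operational_days * 365                 # expected incidents per year
    avg_lgd = sum(losses_pct_tvl) / n_incidents if n_incidents else 0.0
    return {'mtbf_days': mtbf_days, 'incidents_per_year': annual_rate, 'avg_lgd_pct_tvl': avg_lgd}

# event_risk_metrics(2, 800, [0.05, 0.05]) -> MTBF 400 days, ~0.91 incidents/year, 5% LGD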

Tip: Combine these historical metrics with current security measures (audit scores, bug bounty payouts) for a forward-looking view.

5

Benchmark Against Industry Peers and Correlate with Market Stress

Contextualize the optimizer's risk profile by comparing it to competitors and market downturns.

Detailed Instructions

Benchmarking provides relative risk context. Compare your calculated metrics (MTBF, Probability of Loss) with those of direct competitors (e.g., Yearn vs. Convex vs. Beefy). Also, analyze if incidents correlate with broader market stress, such as the May 2022 UST depeg or the March 2020 Black Thursday. Did the optimizer experience disproportionate withdrawals or losses during these periods? Use total crypto market cap or DeFi TVL charts as a baseline. For instance, assess if a Curve pool exploit that impacted a Convex wrapper occurred during low liquidity conditions. This identifies systemic risk linkages and determines if the optimizer's risk is idiosyncratic or market-driven.

  • Sub-step 1: Gather publicly available post-mortems from competitor protocols to build a comparable incident dataset.
  • Sub-step 2: Overlay the optimizer's incident dates on a chart of ETH price or DeFi Pulse TVL to visually identify correlations.
  • Sub-step 3: Calculate the optimizer's TVL drawdown during market-wide events versus the sector average.

Tip: An optimizer with incidents isolated to its own design flaws may be riskier than one only affected by widespread, exogenous shocks.

Tools for Performance Analysis

Essential platforms and methodologies for analyzing the historical returns, risks, and operational metrics of yield optimization protocols.

On-Chain Analytics Dashboards

Protocol-specific dashboards provide granular, real-time data on vault performance.

  • Track metrics like APY history, total value locked (TVL), and user counts over custom timeframes.
  • Example: Analyzing a Yearn Finance vault's APY stability through multiple market cycles.
  • This matters for assessing a protocol's consistency and scalability under different conditions.

DeFi Llama & TVL Aggregators

Total Value Locked (TVL) aggregators offer a macro view of protocol growth and market share.

  • Compare TVL trends across competing optimizers like Convex Finance and Aura Finance.
  • Identify correlations between TVL inflows and reward token emissions.
  • This matters for evaluating protocol adoption, liquidity depth, and potential sustainability concerns.

Blockchain Explorers & Event Logs

Raw transaction analysis using Etherscan or Arbiscan reveals on-chain execution details.

  • Audit harvest transactions to verify strategy efficiency and fee costs.
  • Trace fund flows during a vault migration or strategy update.
  • This matters for technical due diligence, verifying protocol claims, and understanding smart contract risks.

Custom Scripts & Subgraphs

Programmatic data queries enable bespoke historical analysis not available on standard dashboards.

  • Use The Graph subgraphs to calculate user-level ROI or impermanent loss for LP vaults.
  • Script historical backtests of strategy performance using archived RPC data.
  • This matters for advanced users and researchers requiring tailored, reproducible performance audits.
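
As one possible starting point, a subgraph can be queried over HTTP from a script; the endpoint URL, entity, and field names below are hypothetical placeholders and must be replaced with the target subgraph's actual schema:

python
import requests

SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/name/<org>/<subgraph>"  # placeholder endpoint

QUERY = """
{
  vaultDayDatas(first: 30, orderBy: date, orderDirection: desc) {
    date
    pricePerShare
    totalAssets
  }
}
"""  # hypothetical entity and fields; check the subgraph schema before use

def fetch_recent_daily_data() -> list[dict]:
    resp = requests.post(SUBGRAPH_URL, json={"query": QUERY}, timeout=30)
    resp.raise_for_status()
    return resp.json()["data"]["vaultDayDatas"]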

Risk Assessment Frameworks

Structured risk models evaluate non-financial metrics critical to long-term viability.

  • Analyze governance decentralization, multisig signer activity, and timelock durations.
  • Audit smart contract upgrade histories and dependency risks (e.g., oracle failures).
  • This matters for assessing protocol resilience, administrative risk, and potential single points of failure.