Inside the Probabilistic Equity Valuator
Published: February 2026
There is a peculiar form of self-deception practiced daily in the temples of finance — a quiet, almost ceremonial act of faith in which intelligent men and women take a handful of assumptions about the unknowable future, feed them into an equation, and accept the single number that emerges as though it were a revelation. They call this number the "fair value" of a stock, and they speak it with a conviction that would be admirable were it not so dangerously misplaced.
When someone tells you a stock is "undervalued," you would do well to pause and ask two questions. The first: according to what model? The second, and by far the more important: how confident is the model?
Most valuation tools answer the first question with admirable precision and ignore the second entirely. They accept a growth rate, a discount rate, a terminal value — each one a guess dressed in the respectability of a decimal point — and run them through a single deterministic calculation that produces a single number. That number arrives with the quiet authority of arithmetic, as though the future were a spreadsheet that someone has already filled in, as though the world were not, at every moment, trembling with possibilities that no formula can contain.
It is not. The future is a distribution — vast, uncertain, and humbling. Every assumption you feed into a valuation model carries within it the seeds of error. The growth rate could be twelve percent or it could be eight; nobody knows, and those who claim to know are merely those who have forgotten their ignorance. The discount rate depends on a risk-free rate that shifts daily with the restless tides of bond markets, an equity premium that academics have debated for decades without resolution, and a beta coefficient that measures the past with mathematical exactitude and then pretends, with touching naïveté, to predict the future. And the terminal growth rate — that fateful number which typically drives sixty to seventy percent of the final answer — is nothing less than a guess about what happens in perpetuity, which is, one must confess, rather a long time to be guessing.
The Probabilistic Equity Valuator was built around a conviction so simple it ought to be obvious, and yet so rarely honoured that it amounts to something almost radical: if the inputs are uncertain, the output should say so. It combines a Discounted Cash Flow engine with Monte Carlo simulation, M2 money supply normalization, and sentiment-adjusted growth priors — and then tells you not merely what a stock might be worth, but how wide the range of plausible values truly is, and how much of what you think you know is, in truth, a hope.
This post explains how it works, what changed in the February 2026 update, and why the mathematics matter even if you choose to skip the equations.
What Changed (February 2026)
Every model, however lovingly constructed, must eventually face the tribunal of reality — and the verdict, when it came, was sobering. A backtest of the original model across ten major stocks (AAPL, MSFT, GOOGL, AMZN, NVDA, META, TSLA, JPM, JNJ, WMT) over the course of one year revealed truths that were, to put it charitably, uncomfortable:
- 30% directional accuracy — worse than the blind flip of a coin at predicting whether a stock would rise or fall
- -34.7 percentage point bias — the model was systematically, almost stubbornly pessimistic, declaring six out of ten stocks "overvalued" in a market that rose by +22.8%
- 73.5pp mean absolute error — predictions that missed their marks by staggering margins
And yet — and here is the turn in the story, the glimmer of redemption — the relative ranking proved informative: a Spearman correlation of 0.585. The framework possessed signal. It had, buried within its errors, a genuine intuition for value. What it lacked was calibration — the patient, humbling work of bringing theory into accord with the world as it actually is, rather than as one imagines it ought to be.
The update addressed six root causes:
| Problem | Fix | Impact |
|---|---|---|
| Growth anchored to historical FCF CAGR | Forward-heavy consensus weighting (75% forward, 10% historical) | Less backward-looking |
| Flat 2.5% terminal growth for all sectors | Sector-adaptive terminal rates (2.0%–4.0%) | Tech gets 4%, utilities get 2% |
| Single-year FCF base was noisy | 3-year trailing average FCF | Smooths cyclical swings |
| Step-function WACC premiums | Continuous log-scale size premium, linear leverage premium | No artificial cliff edges |
| Static equity risk premium | VIX-responsive ERP | Adapts to market fear |
| No reality check | EV/FCF multiples cross-check with blended fair value | Catches DCF outliers |
After the update: 40% directional accuracy, -26.9pp bias (twenty-two percent less pessimistic), 61.0pp MAE (seventeen percent lower), and — most tellingly — stocks flagged as "undervalued" returned +44.5% versus the +22.8% benchmark, generating +21.7 percentage points of alpha.
Not perfect. But better — and in the uncertain art of valuation, better is the only honest ambition.
I. The Core: Discounted Cash Flow
The Idea in One Sentence
A company is worth the sum of all the cash it will generate in the future, adjusted for the melancholy fact that a dollar tomorrow is worth less than a dollar today.
The Math
The enterprise value is the present value of projected free cash flows, augmented by a terminal value that reaches — with a kind of mathematical bravery — beyond the projection horizon into the indefinite future:
\[EV = \sum_{i=1}^{N} \frac{FCF_i}{(1 + r)^i} + \frac{FCF_N \cdot (1 + g_\infty)}{(r - g_\infty) \cdot (1 + r)^N}\]
Where:
- \(FCF_i\) is the projected free cash flow in year \(i\)
- \(r\) is the discount rate (WACC)
- \(g_\infty\) is the terminal growth rate
- \(N\) is the total projection period (typically 10 years)
To arrive at a per-share fair value from this enterprise-level figure:
\[\text{Fair Value} = \frac{EV - \text{Total Debt} + \text{Cash}}{\text{Shares Outstanding}}\]
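As a concrete illustration, here is a minimal sketch of that calculation in Python. The function name, inputs, and example figures are illustrative placeholders, not the valuator's actual API:

```python
# Minimal DCF sketch: present value of projected FCFs plus a Gordon-growth
# terminal value, converted to a per-share figure.

def dcf_fair_value(fcf0, growth_path, r, g_term, debt, cash, shares):
    """fcf0: base free cash flow; growth_path: list of N annual growth rates."""
    fcf, ev = fcf0, 0.0
    for i, g in enumerate(growth_path, start=1):
        fcf *= 1 + g                      # project next year's FCF
        ev += fcf / (1 + r) ** i          # discount it to the present
    n = len(growth_path)
    # Terminal value: year-N FCF grown at g_term, capitalised and discounted
    ev += fcf * (1 + g_term) / ((r - g_term) * (1 + r) ** n)
    return (ev - debt + cash) / shares

# Hypothetical example: $5B base FCF, 10 years at 8%, 9% WACC, 2.5% terminal growth
print(dcf_fair_value(5e9, [0.08] * 10, 0.09, 0.025, 20e9, 10e9, 1.5e9))
```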
What Is Free Cash Flow?
Free cash flow is the cash a business generates after paying for everything it needs to keep the machinery of its existence running. It is, in the purest sense, the surplus — the money that could, in theory, be handed to shareholders without diminishing the enterprise itself:
\[FCF = \text{Operating Cash Flow} - \text{Capital Expenditures}\]
The valuator computes four FCF bases from the financial statements, each offering a slightly different lens through which to view the same underlying reality:
- Raw FCF — Operating cash flow minus capex, as reported, in all its unvarnished immediacy
- Normalized FCF — Adds back depreciation & amortization, then subtracts capex, smoothing the distortions that different accounting conventions introduce
- Operating FCF — Uses operating income plus D&A minus capex, filtering out the noise of one-time items
- 3-Year Average FCF (new, and recommended) — Averages normalized FCF across the three most recent fiscal years, reducing the outsized influence of any single extraordinary year
The three-year average was added in recognition of a truth that experience teaches painfully: a single year of cash flow can be wildly misleading. A company that enjoyed a brilliant 2024 but endured mediocre performance in 2022 and 2023 should not be valued as though every future year will resemble its best year — just as one would not judge a person's character by a single afternoon.
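A compact sketch of the four bases follows. The field names are assumed, and the normalized base is shown starting from net income, which is one common convention; the valuator's exact definition may differ:

```python
# Sketch of the four FCF bases. Input: a list of dicts for the three most
# recent fiscal years, each with ocf / capex / net_income / da / op_income.

def fcf_bases(years):
    y = years[0]                                        # most recent year
    raw = y["ocf"] - y["capex"]                         # as reported
    normalized = y["net_income"] + y["da"] - y["capex"] # smooths accounting noise
    operating = y["op_income"] + y["da"] - y["capex"]   # filters one-time items
    # Recommended base: average the normalized figure across all three years
    three_yr_avg = sum(v["net_income"] + v["da"] - v["capex"] for v in years) / len(years)
    return {"raw": raw, "normalized": normalized,
            "operating": operating, "3yr_avg": three_yr_avg}
```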
II. Growth Projection: Looking Forward, Not Backward
The Problem With History
Older valuation models lean heavily — one might say fatally — on historical free cash flow growth rates. If FCF grew at fifteen percent per year over the preceding four years, the model projects something like fifteen percent going forward, as though the future were obliged to repeat the past.
This assumption breaks badly, and sometimes catastrophically, for companies whose past bears no resemblance to their future. NVIDIA's historical FCF CAGR exceeded ninety percent — a rate that, if projected forward with a straight face, would eventually value the company at more than the entire global economy. Tesla's was negative — and projecting that forward produces a fair value near zero, which is manifestly not what the market, with all its collective intelligence and folly, believes.
Consensus Growth Estimation
The valuator constructs a forward-looking growth estimate from four sources, weighted deliberately to favour what analysts expect over what has already occurred:
\[g = \frac{\sum_{j} w_j \cdot g_j}{\sum_{j} w_j}\]
| Source | Weight | What It Captures |
|---|---|---|
| Earnings growth forecast (FY) | 0.35 | Analyst consensus on near-term profitability |
| Revenue growth forecast (FY) | 0.40 | Top-line momentum — harder to manipulate than earnings, and therefore more trustworthy |
| Target price implied upside | 0.15 | The market's forward view, embedded in analyst targets |
| Historical FCF CAGR | 0.10 | A backward-looking sanity check, nothing more |
The final estimate is capped at \(\pm50\%\) to prevent extreme values from overwhelming the projection with their reckless enthusiasm or despair.
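In code, the blend is a few lines. This sketch assumes missing sources are passed as `None` and that the remaining weights are renormalised, which is a guess about the implementation rather than a documented behaviour:

```python
# Consensus growth sketch: weighted blend of the four sources from the
# table above, capped at +/-50%.

def consensus_growth(earnings_g, revenue_g, target_upside, hist_cagr):
    sources = [(0.35, earnings_g), (0.40, revenue_g),
               (0.15, target_upside), (0.10, hist_cagr)]
    # Drop unavailable sources (None) and renormalise the weights (assumption)
    avail = [(w, g) for w, g in sources if g is not None]
    g = sum(w * g_ for w, g_ in avail) / sum(w for w, _ in avail)
    return max(-0.50, min(0.50, g))       # cap at +/-50%

# Even a 90% historical CAGR barely moves the estimate at a 10% weight
print(consensus_growth(0.12, 0.10, 0.18, 0.90))   # ~0.20
```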
Growth Fade: The Convex Decay Function
No company grows at thirty percent forever. This is one of the iron laws of economics — perhaps its most melancholy one — that all extraordinary growth eventually bends toward the ordinary, that every empire of commerce, however triumphant its ascent, must eventually submit to the gravitational pull of competition, saturation, and time.
The valuator models this inevitability by splitting the projection into two phases:
- High-growth phase (default 5 years): growth holds at the initial rate, like a runner still buoyed by the momentum of the start
- Fade phase (default 5 years): growth declines toward the terminal rate, tracing a path toward the long, quiet plateau
The fade follows a convex decay curve rather than a straight line:
\[g_i = g_0 + (g_\infty - g_0) \cdot t^{\alpha}\]
Where \(t = \frac{i - N_{\text{high}}}{N_{\text{fade}}}\) measures progress through the fade phase, rising from \(1/N_{\text{fade}}\) in the first fade year to \(1\) in the last, and \(\alpha\) is the decay exponent.
When \(\alpha > 1\), the curve is convex: growth holds up in the early years of the fade and drops more precipitously toward the end. This models the real-world observation — confirmed again and again in the annals of corporate history — that competitive advantages erode gradually at first, almost imperceptibly, and then with sudden, devastating speed as new entrants close the gap.
The decay exponent is now dynamic, calibrated to the particular character of each company's historical cash flow consistency:
\[\alpha = \text{clip}\left(1.0 + 1.5 \cdot (1 - CV), \; 1.0, \; 2.5\right)\]
Where \(CV\) is the coefficient of variation of historical FCF growth rates:
\[CV = \frac{\sigma(g_{\text{hist}})}{\left|\mu(g_{\text{hist}})\right|}\]
- Stable companies (low \(CV\)): \(\alpha \approx 2.0\text{–}2.5\) — a gentler fade, rewarding the quiet virtue of consistency
- Volatile companies (high \(CV\)): \(\alpha \approx 1.0\text{–}1.2\) — a swifter fade, penalising the chaos of unpredictability
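Putting the schedule and the \(CV\)-based calibration together, here is a sketch of the full growth path. All names are illustrative, and the fade uses the \(t^\alpha\) schedule above:

```python
import numpy as np

# Growth-fade sketch: growth holds near g0 early in the fade and bends
# toward g_inf late, with alpha calibrated from historical consistency.

def growth_path(g0, g_inf, n_high=5, n_fade=5, hist_growth=None):
    alpha = 1.5                                    # neutral default (assumption)
    if hist_growth is not None:
        cv = np.std(hist_growth) / abs(np.mean(hist_growth))
        alpha = np.clip(1.0 + 1.5 * (1 - cv), 1.0, 2.5)
    path = [g0] * n_high                           # high-growth phase
    for i in range(n_high + 1, n_high + n_fade + 1):
        t = (i - n_high) / n_fade                  # 1/n_fade ... 1.0
        path.append(g0 + (g_inf - g0) * t ** alpha)
    return path

# A steady grower earns alpha near 2.4: growth barely fades in year 6,
# then drops decisively toward 3% by year 10
print(growth_path(0.20, 0.03, hist_growth=[0.18, 0.22, 0.19, 0.21]))
```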
Sector-Adaptive Terminal Growth
The terminal growth rate \(g_\infty\) was once set at a flat 2.5% for every company — a uniformity that, while tidy, ignored a basic truth about the world: technology companies operate in markets that are still expanding, reaching hungrily into new territories of human activity, while utilities operate in landscapes that matured decades ago. The valuator now assigns sector-specific defaults:
| Sector | Terminal Growth |
|---|---|
| Technology | 4.0% |
| Communication Services | 3.5% |
| Healthcare | 3.5% |
| Consumer Cyclical | 3.0% |
| Consumer Defensive | 2.5% |
| Financial Services | 2.5% |
| Industrials | 2.5% |
| Energy / Utilities / Real Estate / Basic Materials | 2.0% |
These can be overridden manually. The point is not to dictate a truth but to offer reasonable starting points — defaults that are sensible rather than arbitrary, so that the analyst begins from solid ground rather than from the false neutrality of a universal constant.
III. The Discount Rate: WACC
The Weighted Average Cost of Capital determines with what severity future cash flows are discounted — how ruthlessly the present punishes the future for the crime of not yet existing. A higher WACC means future cash is worth less today, reflecting the premium that risk exacts from hope.
\[WACC = R_f + \beta \cdot ERP + S_{\text{size}} + S_{\text{leverage}} + S_{\text{country}}\]
Risk-Free Rate (\(R_f\))
Fetched live from the 10-Year U.S. Treasury yield (^TNX via yfinance), cached for twenty-four hours. This is the theoretical return on a "riskless" investment — the quiet, unexciting baseline against which all risky ventures are measured, the financial equivalent of solid ground.
Equity Risk Premium (ERP) — Now VIX-Responsive
The equity risk premium represents the additional return that investors demand for bearing the turbulence of the stock market rather than resting in the shelter of bonds. Traditionally modelled as a static 5.5% — a number plucked from the long averages of history — the valuator now adjusts it in response to the market's own measure of its fear:
\[ERP = 0.05 + 0.001 \cdot \max(0, \; VIX - 20)\]
| VIX Level | Interpretation | ERP |
|---|---|---|
| 15 | Calm markets | 5.0% |
| 20 | Normal | 5.0% |
| 30 | Elevated fear | 6.0% |
| 40 | Crisis | 7.0% |
When markets are afraid, the cost of equity rises — as it must, for fear is not irrational but is itself a form of information, a collective signal about the perceived fragility of the world. The model does not presume to judge whether the fear is justified. It merely acknowledges that fear is a fact, and facts have consequences.
Size Premium (\(S_{\text{size}}\)) — Now Continuous
Smaller companies are riskier — more exposed to the caprices of fortune, less insulated by scale and diversification, more likely to be swept away by a single misfortune. The old model captured this truth crudely, using step functions that assigned fixed premiums at arbitrary thresholds: below $300M in market capitalisation received two percent; above $50B received nothing. The new model employs a smooth logarithmic scale:
\[S_{\text{size}} = \max\left(0, \; 0.02 - 0.004 \cdot \log_{10}\!\left(\frac{\text{Market Cap}}{10^8}\right)\right)\]
This yields roughly 2% at a $100M market cap, tapering smoothly to under 1% by $50B and onward toward zero for the largest mega-caps. No cliffs, no discontinuities — no company is "suddenly" less risky because it crossed an arbitrary threshold that exists nowhere except in the imagination of the modeller.
Leverage Premium (\(S_{\text{leverage}}\)) — Now Continuous
Debt is a wager against the future — a bet that tomorrow's revenues will be sufficient to service today's obligations. The more a company borrows, the higher the stakes. The premium scales linearly with the debt-to-equity ratio:
\[S_{\text{leverage}} = \min(0.015, \; 0.0075 \cdot D/E)\]
Zero debt pays no premium. A debt-to-equity ratio of 2.0 pays the maximum of 1.5%.
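The components combine into a short function. This sketch omits the country premium \(S_{\text{country}}\) and takes the risk-free rate and VIX as inputs rather than fetching them live; names and example values are illustrative:

```python
import math

# WACC sketch combining the components above.

def wacc(rf, beta, vix, market_cap, debt_to_equity):
    erp = 0.05 + 0.001 * max(0.0, vix - 20)                       # VIX-responsive ERP
    size = max(0.0, 0.02 - 0.004 * math.log10(market_cap / 1e8))  # continuous size premium
    leverage = min(0.015, 0.0075 * debt_to_equity)                # continuous leverage premium
    return rf + beta * erp + size + leverage

# Hypothetical example: 4.3% 10Y yield, beta 1.1, calm VIX of 18,
# $800B market cap, D/E of 0.4 -> WACC of about 10.5%
print(wacc(0.043, 1.1, 18, 800e9, 0.4))
```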
IV. Monte Carlo Simulation: Embracing Uncertainty
Why Single-Point Estimates Fail
A deterministic DCF pronounces, with the false confidence of an oracle: "The fair value is $185." A Monte Carlo DCF speaks with altogether greater honesty: "The fair value is probably between $140 and $240, with a median around $185, and there is a 65% chance the stock is currently undervalued."
The second answer is more truthful. It is also, for the thoughtful investor, immeasurably more useful — for it does not merely tell you what to think, but shows you how much room there is for doubt.
How It Works
The simulation runs thousands of independent DCF calculations — ten thousand by default, each one a separate imagining of how the future might unfold. In each run, the growth trajectory is perturbed using log-normal increments, a technique drawn from the companion paper (Section 4.1), which guarantees, by mathematical construction, that cash flows remain positive in every scenario, as they must in the real world.
In each simulated year \(i\), the cash flow evolves as:
\[CF_i = CF_0 \cdot \exp\!\left(\sum_{k=1}^{i} l_k\right)\]
Where each log-increment \(l_k\) is drawn from a normal distribution:
\[l_k \sim \mathcal{N}\!\left(\ln(1 + g_k), \; \sigma_l^2\right)\]
The key parameter is \(\sigma_l\), which governs how far each simulation may wander from the expected growth path — how much the imagined future is permitted to diverge from the central forecast. This parameter is now data-driven, calibrated to the particular history of each company:
\[\sigma_l = \text{clip}(CV \cdot 0.5, \; 0.02, \; 0.15)\]
A company with erratic cash flow history — one that has lurched from abundance to scarcity and back again — receives wider simulation bands, as befits a future that is genuinely harder to foresee. A company with steady, dependable cash generation receives tighter bands, honouring the relative predictability that its track record has earned.
The output is not a number but a landscape: a full probability distribution of fair values — percentiles, histogram, and explicit probabilities of the stock being undervalued or overvalued. It is, one might say, a portrait of uncertainty itself.
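The mechanics fit in a short, vectorised sketch. It reuses the DCF structure from earlier and takes \(\sigma_l\) as an input, per the \(CV\)-based rule above; as before, names and figures are illustrative:

```python
import numpy as np

# Monte Carlo sketch: perturb the log-growth path with normal increments so
# cash flows stay positive, then summarise the fair-value distribution.

def mc_fair_values(fcf0, growth_path, r, g_term, debt, cash, shares,
                   sigma_l, n_sims=10_000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(growth_path)
    mu = np.log1p(np.asarray(growth_path))          # ln(1 + g_k) for each year
    # One matrix of log-increments: n_sims independent growth trajectories
    incr = rng.normal(mu, sigma_l, size=(n_sims, n))
    cfs = fcf0 * np.exp(np.cumsum(incr, axis=1))    # positive by construction
    years = np.arange(1, n + 1)
    ev = (cfs / (1 + r) ** years).sum(axis=1)
    ev += cfs[:, -1] * (1 + g_term) / ((r - g_term) * (1 + r) ** n)
    return (ev - debt + cash) / shares

fv = mc_fair_values(5e9, [0.08] * 10, 0.09, 0.025, 20e9, 10e9, 1.5e9, sigma_l=0.06)
# Percentile range, plus P(undervalued) against a hypothetical $120 price
print(np.percentile(fv, [10, 50, 90]), (fv > 120).mean())
```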
V. M2 Money Supply Normalization
The Hidden Variable
Between 2020 and 2022, the U.S. M2 money supply grew by roughly forty percent — an expansion of extraordinary magnitude, undertaken in extraordinary circumstances. If you had valued a stock in January 2020 and then again in January 2022, the raw DCF fair value might have appeared similar. But the dollars in the second calculation were no longer the same dollars. They were lighter, thinner, diminished by the sheer profusion of their kind. Asset prices rose in part not because the assets had become more valuable, but because there were simply more dollars in the world, each one chasing the same finite collection of real things.
The M2 normalization module adjusts for this hidden, pervasive influence. It fetches M2 data from the Federal Reserve (FRED API) and computes a scaling ratio:
\[\text{M2 Factor} = \frac{M2_{\text{current}}}{M2_{\text{baseline}}}\]
When M2 has expanded, the factor exceeds 1, nudging fair values upward to reflect that nominal prices should naturally be higher in an environment saturated with additional money. When M2 contracts — as it did during 2023's quantitative tightening — the factor pulls fair values back, acknowledging the receding tide.
This module does not make a claim about inflation, that contested and politically fraught concept. It makes a simpler, more defensible claim: the quantity of dollars in circulation affects the price at which assets trade, and to ignore this fact is itself a choice — one with consequences that compound silently over time.
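The adjustment itself is a single ratio. The series values below are placeholders; the valuator pulls the actual M2 series from FRED:

```python
# M2 normalization sketch: scale a fair value by current-to-baseline M2.

def m2_adjusted(fair_value, m2_current, m2_baseline):
    return fair_value * (m2_current / m2_baseline)

# ~6.7% more dollars in circulation -> proportionally higher nominal fair value
print(m2_adjusted(185.0, 20_800, 19_500))
```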
VI. Sentiment-Enhanced Growth Priors
What Sentiment Adds
A pure DCF model exists in a kind of Platonic isolation, concerned only with cash flows and discount rates, indifferent to the passions and convictions of the human beings who actually buy and sell stocks. It ignores the mood of the market — and mood, as any student of financial history knows, is not merely noise but a force that shapes prices with a power sometimes exceeding that of fundamentals themselves.
Sentiment analysis bridges this gap by adjusting the growth prior according to how analysts and the market currently feel about a stock. The aggregated sentiment score combines three sources, weighted by credibility and recency:
\[S_{\text{agg}} = \frac{\sum_{i} c_i \cdot r_i \cdot s_i}{\sum_{i} c_i \cdot r_i}\]
| Source | Credibility (\(c_i\)) | What It Measures |
|---|---|---|
| Analyst recommendation mean | 0.50 | Consensus view (bias-corrected: "Buy" is treated as neutral, and only "Strong Buy" registers as genuinely positive) |
| Target price vs current price | 0.30 | Implied upside (bias-corrected: 15% upside = neutral, reflecting the structural optimism that is Wall Street's native tongue) |
| Quarterly earnings surprise | 0.20 | Recent execution versus expectations |
The sentiment score enters the growth prior through a sensitivity parameter \(\gamma\):
\[\mu_{\text{adjusted}} = \mu_{\text{base}} + \gamma \cdot S_{\text{agg}}\]
With \(\gamma = 0.05\) (the default), a maximally bullish sentiment (\(S_{\text{agg}} = +1\)) adds five percentage points to the base growth rate; a maximally bearish sentiment subtracts five. In practice, most stocks land between \(-0.3\) and \(+0.3\), producing growth adjustments of roughly \(\pm1.5\%\) — a gentle nudge, not a revolution.
The bias corrections are essential, and they reveal something instructive about the nature of the financial world. Wall Street's average recommendation hovers at approximately 2.0, which translates to "Buy," and the average analyst target price sits roughly fifteen percent above the current price. This is not analysis; it is institutional temperament. Without re-centering, the model would interpret this structural optimism — this permanent, reflexive cheerfulness — as a genuine bullish signal. After correction, only relative deviations from the market's habitual sunniness register as meaningful information.
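A sketch of the aggregation, with the bias corrections applied, looks like this. The exact re-centering and scaling constants are assumptions of this sketch, and the recency weights \(r_i\) are omitted for brevity:

```python
# Sentiment sketch: credibility-weighted average of bias-corrected signals,
# then a gamma-scaled nudge to the growth prior.

def sentiment_adjusted_growth(mu_base, rec_mean, upside, surprise, gamma=0.05):
    # Recommendation mean: 1 = Strong Buy ... 5 = Sell; ~2.0 ("Buy") is the
    # structural average, so treat it as neutral
    s_rec = 2.0 - rec_mean
    s_tgt = (upside - 0.15) / 0.30    # ~15% implied upside is the norm (assumed scale)
    s_eps = surprise / 0.10           # map a +/-10% surprise to +/-1 (assumed scale)
    clip = lambda x: max(-1.0, min(1.0, x))   # per-signal clipping (assumption)
    signals = [(0.50, s_rec), (0.30, s_tgt), (0.20, s_eps)]
    s_agg = sum(c * clip(s) for c, s in signals) / sum(c for c, _ in signals)
    return mu_base + gamma * s_agg

# Mildly bullish inputs yield S_agg ~ 0.28, nudging growth from 10.0% to 11.4%
print(sentiment_adjusted_growth(0.10, rec_mean=1.8, upside=0.25, surprise=0.04))
```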
VII. The Reality Check: Multiples Cross-Validation
DCF models, for all their mathematical elegance, can produce results that are frankly absurd. A small error in the growth rate or discount rate, innocent enough in isolation, is compounded over ten years and then capitalised into a terminal value that extends to infinity — and so a $200 stock can, through the quiet accumulation of modest errors, appear to be worth $50 or $500. The model, left to its own devices, has no mechanism for recognising its own madness.
It needs, therefore, a sanity check — a tether to the observable world.
The multiples cross-check provides one. It computes a second fair value estimate using the sector-median EV/FCF multiple, grounding the soaring abstractions of the DCF in the concrete reality of what the market, with all its imperfections, actually pays for comparable cash flow streams:
\[FV_{\text{multiples}} = \frac{FCF \cdot M_{\text{sector}} - \text{Debt} + \text{Cash}}{\text{Shares Outstanding}}\]
Where \(M_{\text{sector}}\) is the median Enterprise Value to Free Cash Flow multiple for the company's sector:
| Sector | Median EV/FCF |
|---|---|
| Technology | 30x |
| Healthcare | 25x |
| Communication Services | 22x |
| Consumer Cyclical | 20x |
| Consumer Defensive | 18x |
| Industrials | 18x |
| Real Estate | 16x |
| Utilities | 14x |
| Financial Services | 12x |
| Basic Materials | 12x |
| Energy | 10x |
When the DCF and multiples estimates diverge by more than fifty percent — a gap too wide to be dismissed as mere difference of method — the valuator displays a blended fair value:
\[FV_{\text{blended}} = 0.70 \cdot FV_{\text{DCF}} + 0.30 \cdot FV_{\text{multiples}}\]
This prevents the DCF from drifting too far into the realm of pure theory, divorced from the messy, human-driven market that it purports to describe. The seventy-thirty weighting preserves the DCF's primacy — for it is, after all, the more rigorous framework — while anchoring it, gently but firmly, to observable reality.
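In code, the cross-check and blend reduce to a few lines. The sector table is excerpted, and measuring divergence relative to the multiples estimate is an assumption of this sketch:

```python
# Cross-check sketch: multiples-based fair value from the sector median
# EV/FCF, blended 70/30 with the DCF when the two diverge by more than 50%.

SECTOR_EV_FCF = {"Technology": 30, "Healthcare": 25, "Utilities": 14}  # excerpt

def cross_checked_fv(fv_dcf, fcf, sector, debt, cash, shares):
    m = SECTOR_EV_FCF[sector]
    fv_mult = (fcf * m - debt + cash) / shares
    divergence = abs(fv_dcf - fv_mult) / fv_mult   # base: multiples estimate
    if divergence > 0.50:                          # too wide to be method noise
        return 0.70 * fv_dcf + 0.30 * fv_mult      # blended fair value
    return fv_dcf

# A $310 DCF against a ~$93 multiples value triggers the blend -> ~$245
print(cross_checked_fv(310.0, 5e9, "Technology", 20e9, 10e9, 1.5e9))
```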
VIII. Limitations and Honest Caveats
No valuation model predicts the future. This one does not either. And it would be a disservice — to the reader, to the discipline, to the truth — to present it otherwise. Some specific limitations deserve honest acknowledgment:
Financial sector stocks — banks, insurance companies, the great intermediaries of modern capitalism — generate cash through lending and underwriting, not through operations in the traditional sense. Their "free cash flow" is, in economic terms, a fiction — a number that the financial statements dutifully report but that bears no meaningful relationship to the cash-generative capacity of the enterprise. DCF produces unreliable results for such companies, and the valuator displays a warning to this effect, suggesting alternative approaches such as dividend discount models.
Negative FCF companies — the young, the ambitious, the not-yet-profitable — cannot be meaningfully valued by discounting cash that does not yet exist. One cannot compound what has not been generated. The model handles such cases by falling back to operating metrics, but the uncertainty here is not merely high; it is, in the deepest sense, irreducible.
Terminal value dominance — in a standard ten-year DCF, the terminal value typically accounts for sixty to eighty percent of the total enterprise value. Consider what this means: the number about which you are most uncertain — the terminal growth rate, a guess about what will happen in perpetuity — exerts more influence over the final answer than the carefully projected near-term cash flows about which you actually possess some knowledge. The Monte Carlo simulation partially addresses this paradox by varying the terminal assumptions across thousands of scenarios, but the fundamental sensitivity remains, stubborn and irreducible, a reminder that all long-horizon forecasting is, at bottom, an act of imagination.
Sector medians are approximations. The EV/FCF multiples used in the cross-check are historical medians, not live market data. They provide a reasonable anchor — a sense of where the ground lies — but not a precise one.
Sentiment is backward-looking. Analyst recommendations and target prices reflect current consensus, which may already be embedded in the stock price. Sentiment adjustments improve calibration on average but cannot anticipate sentiment shifts — those sudden reversals of collective mood that are, perhaps, the most consequential and least predictable events in all of finance.
IX. How to Use It
- Enter a ticker. The system fetches financial data automatically, drawing from the vast repositories of market information.
- Review the auto-filled assumptions. Growth rate, WACC, and terminal growth are pre-populated from market data and sector defaults. Adjust them if you have a thesis — if you believe you see something the consensus has missed.
- Choose your FCF base. The 3-Year Average is recommended for its steadying effect. Use raw or normalized if you have a specific reason to prefer the unsmoothed view.
- Run the valuation. The system computes the deterministic DCF, Monte Carlo simulation, M2 adjustment, and multiples cross-check simultaneously — a symphony of calculations, each one illuminating the question from a different angle.
- Read the distribution, not just the number. The Monte Carlo histogram reveals the range of plausible fair values in all its breadth and uncertainty. The probability of being undervalued is, in almost every case, more informative than the point estimate — for it tells you not what the answer is, but how much you should trust it.
- Check the cross-validation. If the DCF and multiples estimates diverge significantly, pause and investigate why. Large divergences are not errors to be dismissed but signals to be understood — they often mean the growth or discount assumptions need revisiting.
The goal, ultimately, is not to produce a number that is "right." Such a number does not exist, and the pretence that it does is the source of more financial folly than any other single delusion. The goal is to produce a range that is honest — and to understand what drives the answer so thoroughly that you can decide, with open eyes and clear judgment, whether you believe the assumptions upon which the entire edifice rests.
For in the end, every valuation is an act of belief — and the only dishonesty is to pretend otherwise.
The Probabilistic Equity Valuator is part of the MacroPulze platform. The mathematical framework is described in the companion paper: "Probabilistic Framework for AI-Enhanced Equity Valuation."