Note: Real estate data APIs have inherent lag (1–24h for listings, 30–60 days for sold data). Always verify critical decisions with direct MLS access.
Real Estate Series Part 4 of 5

Part 4: Comparable Sales & Market Reports — Automate CMA Research in Seconds

Updated March 26, 2026 · 15 min read · Advanced

Use cases at a glance

Comparable sales analysis is the foundation of valuation and investment decisions. When you're evaluating a property to buy, refinance, or hold, you need recent comparable sales data filtered by proximity, size, condition, and sale date. OpenClaw agents automate this research, pulling sold data, applying filters, and generating formatted reports in seconds.

Instant comps for target address

Pull recent comps for a specific property address to estimate current market value.

Price-per-sqft analysis

Compute median price/sqft from comps and apply to subject property for quick valuation.

Adjusted value range

Generate low/mid/high estimate based on comp adjustments for quality, condition, updates.

Market absorption rate

Divide active listings by monthly sold count to determine buyer vs. seller market.

Weekly sold data digest

Automated report of all closed sales in your market for the prior week, sorted by price.

CMA report generation

Formatted markdown/HTML report ready to send to clients or file in your records.

Comp filtering criteria

Strong comps are essential. Weak filters lead to junk comparables. Here are the key parameters:

| Parameter | Recommended Value | Reasoning |
| --- | --- | --- |
| distance_miles | 0.25 (urban), 0.5-1 (suburban), 1-2 (rural) | Tighter is better. Real estate is hyper-local; 1/4 mile in dense areas captures the same micro-market. |
| sold_within_days | 180 (6 months, default), 90 (hot market), 365 (thin market) | Market conditions change. Six months is a reasonable window balancing freshness and sample size. |
| sqft_variance_pct | 15% | A 2,000 sqft subject allows comps of 1,700-2,300 sqft. Broaden the range in a thin market. |
| property_type_match | Exact (SFR=SFR, condo=condo) | Do not mix property types. A condo comp is not valid for an SFR subject. |
| year_built_variance | ±15-20 years | Allows for construction-quality differences. 1990 vs. 2010 is meaningful; 1960 vs. 1990 less so in older areas. |
| beds_exact_match | True (or ±0.5 in thin markets) | Bedroom count significantly impacts value. Prefer exact matches; allow variance only in thin markets. |
| baths_min_threshold | Within 0.5 baths | 1 BA vs. 1.5 BA is material. Allow only minor variance. |
| exclude_sale_types | Foreclosures, short sales, gifts, estate sales | These distressed sales skew valuation. Use a separate analysis if tracking distressed properties. |
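As a sketch, the filters in the table could be applied in code like this. The Comp record and its field names are illustrative, not the Zillow API schema, and the defaults mirror the recommended values above:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical comp record for illustration; real API fields will differ.
@dataclass
class Comp:
    distance_miles: float
    sale_date: date
    sqft: int
    property_type: str
    year_built: int
    beds: int
    baths: float
    sale_type: str

def passes_filters(comp, subject_sqft, subject_year, subject_beds, subject_baths,
                   max_distance=0.25, max_age_days=180, sqft_variance_pct=15,
                   property_type="Single Family", year_variance=20,
                   baths_variance=0.5,
                   excluded=("foreclosure", "short_sale", "gift", "estate")):
    """Return True if the comp survives every filter from the table."""
    if comp.distance_miles > max_distance:
        return False
    if (date.today() - comp.sale_date).days > max_age_days:
        return False
    if abs(comp.sqft - subject_sqft) > subject_sqft * sqft_variance_pct / 100:
        return False
    if comp.property_type != property_type:   # exact property-type match
        return False
    if abs(comp.year_built - subject_year) > year_variance:
        return False
    if comp.beds != subject_beds:             # exact bed match by default
        return False
    if abs(comp.baths - subject_baths) > baths_variance:
        return False
    if comp.sale_type in excluded:            # drop distressed sales
        return False
    return True

# Example: a qualifying comp for a 2,150 sqft, 2005-build, 3/2 subject.
ok = passes_filters(
    Comp(0.2, date.today() - timedelta(days=30), 2050,
         "Single Family", 2003, 3, 2.0, "standard"),
    subject_sqft=2150, subject_year=2005, subject_beds=3, subject_baths=2.0)
```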

Comps agent configuration

Here's a complete comps agent configuration in AGENTS.md format:

---
name: comps-report
tools:
  - zillow-rapidapi
config:
  subject:
    address: "123 Oak Street, Austin, TX 78704"
    zpid: 12345678

  comp_filters:
    distance_miles: 0.25
    sold_within_days: 180
    sqft_variance_pct: 15
    property_type: "Single Family"
    year_built_variance: 20
    beds_match: true
    baths_variance: 0.5
    exclude_sale_types: ["foreclosure", "short_sale", "gift", "estate"]

  output:
    format: "markdown"
    include_price_sqft: true
    include_adjusted_values: true
    include_market_stats: true
---

job: |
  subject = get_subject_property(config.subject.zpid)
  comps = await zillow_rapidapi.search_comps(
    address=config.subject.address,
    filters=config.comp_filters
  )

  # Sort by sale date (most recent first)
  comps = sorted(comps, key=lambda x: x.sale_date, reverse=True)

  # Calculate price per sqft (median comes from Python's statistics module)
  from statistics import median
  comps_price_sqft = [
    (c.sale_price / c.sqft) for c in comps if c.sqft > 0
  ]
  median_price_sqft = median(comps_price_sqft)

  # Estimate subject value
  estimated_value = subject.sqft * median_price_sqft

  # Generate report
  report = format_cma_report(subject, comps, estimated_value)
  print(report)
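The job above ends by calling a format_cma_report helper. Here is a minimal sketch of what that helper might look like; the dict-based inputs and keys are illustrative, not the agent's actual data model:

```python
def format_cma_report(subject, comps, estimated_value):
    """Render a minimal markdown CMA report.
    `subject` is a dict with address/sqft; each comp is a dict with
    address/sale_price/sqft (illustrative shapes, not a real API schema)."""
    lines = [
        "# Comparable Sales Analysis",
        f"**Subject:** {subject['address']} ({subject['sqft']:,} sqft)",
        "",
        "| Comp | Sold | SqFt | $/SqFt |",
        "| --- | --- | --- | --- |",
    ]
    for c in comps:
        ppsf = c["sale_price"] / c["sqft"]  # price per square foot
        lines.append(
            f"| {c['address']} | ${c['sale_price']:,} "
            f"| {c['sqft']:,} | ${ppsf:,.0f} |")
    lines.append("")
    lines.append(f"**Estimated market value:** ${estimated_value:,.0f}")
    return "\n".join(lines)

# Example with one comp from the worked example below.
report = format_cma_report(
    {"address": "123 Oak Street, Austin, TX 78704", "sqft": 2150},
    [{"address": "100 Main St", "sale_price": 550000, "sqft": 2050}],
    561150)
```

For client-facing output you would swap the markdown for an HTML template, but the structure stays the same.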

Price-per-sqft analysis and adjustment

Price per square foot is a normalized metric for quick valuation. Extract price/sqft from each comp, compute the median, and apply to your subject property:

Subject Property: 2,150 sqft, list price TBD

Comps (past 6 months, within 0.25 mi):
- 100 Main St: sold $550k, 2,050 sqft → $268/sqft
- 200 Oak Ave: sold $530k, 2,100 sqft → $252/sqft
- 300 Pine Rd: sold $575k, 2,200 sqft → $261/sqft
- 400 Elm Ct: sold $560k, 2,080 sqft → $269/sqft
- 500 Maple Dr: sold $545k, 2,150 sqft → $253/sqft

Median price/sqft: $261
Subject estimated value: 2,150 sqft × $261/sqft = $561,150

Valuation range (80%-120% of median price/sqft):
- Conservative (80%): 2,150 × ($261 × 0.8) = $449k
- Market (100%): 2,150 × $261 = $561k
- Aggressive (120%): 2,150 × ($261 × 1.2) = $673k

This gives you a quick range. For more precision, apply manual adjustments for condition, updates, or amenities not captured in the comps.
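The worked example above reduces to a few lines of Python (value_range is a hypothetical helper name):

```python
from statistics import median

def value_range(subject_sqft, comp_prices_sqft, band_pct=20):
    """Apply the median comp price/sqft to the subject, with a +/- band
    (20% here, matching the 80%-120% range above)."""
    mid = subject_sqft * median(comp_prices_sqft)
    return mid * (1 - band_pct / 100), mid, mid * (1 + band_pct / 100)

# The five comps from the example, subject at 2,150 sqft:
low, mid, high = value_range(2150, [268, 252, 261, 269, 253])
# mid = 2,150 x 261 = 561,150
```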

Market absorption rate

Absorption rate tells you if it's a buyer's or seller's market:

Absorption Rate = Active Listings ÷ Monthly Closed Sales

For Austin, TX 78704:
- Active listings: 42
- Closed sales last month: 28
- Absorption rate: 42 ÷ 28 = 1.5 months

Interpretation:
- <2 months = Seller's market (inventory moves quickly, prices rise)
- 2-4 months = Balanced market (normal conditions)
- >4 months = Buyer's market (inventory sits, price pressure down)

Use this context in your reports: "At current sales pace, it would take 1.5 months to sell all active inventory. This is a seller's market where price reductions are rare and negotiating room is limited."
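The calculation is trivial to script. A minimal helper applying the thresholds above (the function name is illustrative):

```python
def absorption(active_listings, monthly_closed_sales):
    """Months of inventory at the current sales pace, labeled with the
    thresholds above (<2 seller's, 2-4 balanced, >4 buyer's)."""
    months = active_listings / monthly_closed_sales
    if months < 2:
        label = "seller's market"
    elif months <= 4:
        label = "balanced market"
    else:
        label = "buyer's market"
    return months, label

# The Austin 78704 example: 42 active listings, 28 closed sales last month.
months, label = absorption(42, 28)
```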

RESO Web API for direct MLS access

Zillow's RapidAPI has a lag of 30-60 days for sold data in most markets. If you need same-week sold data, you need direct MLS access via RESO Web API (Real Estate Standards Organization).

To use RESO:

  1. Get MLS credentials: You must be a licensed agent with an MLS subscription. Work with your brokerage's IT to get RESO API credentials.
  2. Authenticate: RESO uses OAuth 2.0. Exchange your credentials for an access token.
  3. Query the API: Pull listings and closed sales from the MLS directly.
  4. Data freshness: RESO data is typically 1–3 hours old (24 hours at worst), vs. 30–60 days for Zillow's sold data.

RESO setup is more complex than RapidAPI, but it's worth it for serious agents working multiple markets where data lag is material. The guide for setting up RESO access is beyond this article's scope; work with your MLS to provision credentials.
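The authenticate-then-query flow can be sketched with the standard library. The endpoint URLs below are placeholders (substitute whatever your MLS provisions); the field names StandardStatus, PostalCode, and CloseDate are standard RESO Data Dictionary fields:

```python
import json
import urllib.parse
import urllib.request
from datetime import date, timedelta

# Placeholder endpoints: replace with the OAuth and OData URLs your MLS provides.
TOKEN_URL = "https://auth.example-mls.com/oauth2/token"
API_BASE = "https://api.example-mls.com/reso/odata"

def closed_sales_filter(zip_code, days=7):
    """Build an OData $filter for recent closed sales, using RESO
    Data Dictionary field names."""
    since = (date.today() - timedelta(days=days)).isoformat()
    return (f"StandardStatus eq 'Closed' and PostalCode eq '{zip_code}' "
            f"and CloseDate ge {since}")

def fetch_closed_sales(client_id, client_secret, zip_code):
    # Step 2: exchange credentials for a token (OAuth 2.0 client_credentials).
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    with urllib.request.urlopen(urllib.request.Request(TOKEN_URL, data=body)) as r:
        token = json.load(r)["access_token"]
    # Step 3: query the Property resource with the OData filter.
    query = urllib.parse.urlencode({
        "$filter": closed_sales_filter(zip_code),
        "$top": "50",
    })
    req = urllib.request.Request(f"{API_BASE}/Property?{query}",
                                 headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as r:
        return json.load(r)["value"]

# Filter string for last week's closed sales in 78704:
example_filter = closed_sales_filter("78704")
```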

Tip: If you're a solo investor without MLS access, stay with Zillow/RapidAPI. If you're an agent or have brokerage partnerships, pursuing RESO access will unlock faster comps and sold data.

HEARTBEAT schedule

HEARTBEAT: Comps and market reporting

On-demand comps: Run when needed (not on a schedule). Triggered manually when you need comps for a specific address.

Weekly sold digest: 0 9 * * 1 (Monday 9 AM). Generate a report of all closed sales in your target zips for the prior week, sorted by price descending.

Monthly market snapshot: 0 8 1 * * (8 AM on the 1st of the month). Compute market absorption rate, median price/sqft, and month-over-month price trends.
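One possible way to express this schedule in the agent's config. The shape of the schedules block below is an assumption for illustration, not documented OpenClaw syntax:

```yaml
schedules:
  weekly_sold_digest:
    cron: "0 9 * * 1"   # Monday 9 AM
    job: sold-digest
  monthly_market_snapshot:
    cron: "0 8 1 * *"   # 8 AM on the 1st of the month
    job: market-snapshot
# On-demand comps runs have no cron entry; trigger the comps-report agent manually.
```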

Sample output

Here's a formatted CMA report generated by your agent:

COMPARABLE SALES ANALYSIS

Subject Property
123 Oak Street, Austin, TX 78704
ZPID: 12345678
Beds/Baths/SqFt: 3 / 2 / 2,150
Year Built: 2005

Comparable Sales (past 6 months, within 0.25 mi)
100 Main St | Sold: $550,000 | Beds: 3 | Baths: 2 | SqFt: 2,050 | $/SqFt: $268
200 Oak Ave | Sold: $530,000 | Beds: 3 | Baths: 2 | SqFt: 2,100 | $/SqFt: $252
300 Pine Rd | Sold: $575,000 | Beds: 3 | Baths: 2 | SqFt: 2,200 | $/SqFt: $261

Valuation Summary
Median Price/SqFt (comps): $261
Subject SqFt: 2,150
Estimated Market Value: $561,150
Range (±20%): $449k – $673k

Market Conditions
Absorption Rate: 1.5 months (seller's market)
Avg Days-to-Sell: 18 days
Price Trend (3 mo): +2.3%

FAQ

How current is the sold data in Zillow's API?

Zillow's sold data via RapidAPI typically reflects MLS-reported sales within 30–60 days of closing in most markets. For fast-moving markets where you need same-week sold data, you'll need direct MLS access via the RESO Web API; see the RESO section above for the high-level setup steps if you have a brokerage or MLS subscription.

What makes a good comp?

A strong comp is: within ¼ mile in urban areas (½-1 mile suburban, 1-2 miles rural), sold within the past 6 months (3 months in hot markets), within 15% of the subject property's square footage, the same property type (SFR, condo, townhouse), and of similar age (within 15-20 years, unless significantly renovated). The agent applies these filters automatically based on your configured thresholds.

Can I generate a full CMA report to send to a client?

The agent generates a markdown or HTML report with the comps table, price-per-sqft analysis, adjusted value range, and local market context. It's a starting point for a CMA, not a replacement for a licensed agent's analysis. For client-ready reports, you'd want to post-process the output into a branded template.

What if there are fewer than 3 comps available?

Thin markets (rural areas, new subdivisions) may have few comps. If <3 comps after filtering, the agent should alert you and suggest expanding search parameters (increase distance, relax sqft variance, broaden time window). Do not generate a report with <3 comps—it's unreliable. Instead, note the limitation in your analysis.
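One way to implement that fallback is a retry loop that widens the filters one step at a time. The function names and relaxation steps below are illustrative, not part of any OpenClaw API; the filter keys mirror the comp_filters config above:

```python
def widen_filters(filters):
    """One relaxation step for thin markets: double the radius and time
    window, widen the sqft band, each with a sanity cap."""
    f = dict(filters)
    f["distance_miles"] = min(f["distance_miles"] * 2, 2.0)
    f["sold_within_days"] = min(f["sold_within_days"] * 2, 365)
    f["sqft_variance_pct"] = min(f["sqft_variance_pct"] + 10, 30)
    return f

def comps_with_fallback(search, filters, min_comps=3, max_rounds=3):
    """Retry the comp search with progressively wider filters.
    `search` is any callable taking a filters dict (e.g. a wrapper
    around the Zillow comp search tool)."""
    for _ in range(max_rounds):
        comps = search(filters)
        if len(comps) >= min_comps:
            return comps, filters
        filters = widen_filters(filters)
    return comps, filters  # caller should flag <3 comps as unreliable

# Demo with a stub search that only finds comps once the radius reaches 0.5 mi:
stub = lambda f: ["c1", "c2", "c3"] if f["distance_miles"] >= 0.5 else []
found, used = comps_with_fallback(
    stub, {"distance_miles": 0.25, "sold_within_days": 180, "sqft_variance_pct": 15})
```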

Should I adjust comps for condition or updates?

Yes, if the subject property has material differences from comps (new roof, HVAC, foundation issues, etc.). Standard CMA practice is to compute an adjustment schedule: e.g., +$10k for a new roof, -$15k for deferred maintenance. The agent can automate this if you provide adjustment rules, but it requires manual observation of condition.
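A minimal sketch of such an adjustment schedule, assuming you supply the dollar amounts yourself (the helper name and dict shape are illustrative):

```python
def adjusted_comp_price(sale_price, adjustments):
    """Apply a manual adjustment schedule to one comp's sale price.
    Standard CMA convention: adjust the comp toward the subject, so an
    amount is positive when the SUBJECT is superior for that feature."""
    return sale_price + sum(adjustments.values())

# Example: comp sold at $550k; the subject has a new roof (+$10k) but the
# comp has an updated kitchen the subject lacks (-$15k).
price = adjusted_comp_price(550_000, {"new_roof": 10_000, "kitchen": -15_000})
```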

How do I account for outlier sales (fire sales, cash deals)?

Exclude foreclosures and short sales from standard comps (configure exclude_sale_types). If you're specifically analyzing a distressed property, create a separate agent with sale_type = ["foreclosure", "short_sale"]. This gives you two perspectives: market comps and distressed comps.