
Know Your Numbers: How to Measure Email Deliverability and Reputation

July 11, 2023 · 14 min read

Before you can improve your email deliverability, you need a way to accurately measure it. There is no single dashboard metric that tells the whole story — instead, strong programs combine several methods and tools to measure deliverability and reputation over time.

Common approaches include:

Inbox placement tests

One of the best ways to measure how your mail is actually landing is inbox placement testing — either run manually or with automated services. In essence, the workflow is:

  1. Create a seedlist of email addresses. This is a controlled set of inboxes across providers (Gmail, Yahoo, AOL, etc.). Seedlists can range from dozens to hundreds of addresses.
  2. Send a live message to the seedlist. Use a real production send, not a preview: preview sends often use different infrastructure and can misrepresent placement. Ideally, include seed addresses as part of a normal campaign to subscribers so results match real-world conditions.
  3. Analyze where each message landed — inbox, spam, or missing — and roll up results by domain. That gives you a clear picture of placement percentages per provider.
Inbox placement test results across mailbox providers
Placement testing shows where a live send landed — inbox, spam, or missing — rolled up by provider.

Placement tests are among the more accurate ways to measure inboxing, but they are not perfect. Seed addresses typically do not open, click, or engage. Because many providers weight past user behavior, seed-only results can sometimes differ from what engaged subscribers see. Use placement tests as a strong signal — especially when you change domains, IPs, or volume — not as the only truth.
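The roll-up in step 3 is simple to reason about in code. The sketch below uses hypothetical seed results (address plus observed folder) and groups them by provider domain; the data and function names are illustrative, not any particular vendor's API.

```python
from collections import Counter, defaultdict

# Hypothetical seed results: (seed address, observed folder).
# "missing" means the message never arrived at that seed inbox.
seed_results = [
    ("seed1@gmail.com", "inbox"),
    ("seed2@gmail.com", "spam"),
    ("seed3@gmail.com", "inbox"),
    ("seed1@yahoo.com", "spam"),
    ("seed2@yahoo.com", "missing"),
]

def placement_by_provider(results):
    """Roll up inbox/spam/missing percentages per mailbox provider."""
    buckets = defaultdict(Counter)
    for address, folder in results:
        provider = address.split("@", 1)[1]
        buckets[provider][folder] += 1
    summary = {}
    for provider, counts in buckets.items():
        total = sum(counts.values())
        summary[provider] = {
            folder: round(100 * counts[folder] / total, 1)
            for folder in ("inbox", "spam", "missing")
        }
    return summary

print(placement_by_provider(seed_results))
```

With the sample data above, Gmail shows roughly two-thirds inbox placement while half the Yahoo seeds never received the message at all, which is exactly the kind of per-provider gap a placement test is meant to surface.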

Delivery rates

Most ESPs show an email delivery rate. The name is misleading: it is not the percentage of mail that reached the inbox. It is usually calculated as non-bouncing sends divided by total sends. Bounces are rejections (invalid address, full mailbox, server issues, etc.).

Because bounces are often a small fraction of volume, delivery rates often look very high (often 95–99%). That does not mean 95–99% inbox placement — a large share could still land in spam. Delivery rate does not tell you where the message landed.

Still, delivery rate is useful when it is abnormally low: that usually means a real infrastructure or list-quality problem worth investigating immediately.
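The arithmetic behind the metric makes the limitation concrete. A minimal sketch (illustrative numbers, not real campaign data):

```python
def delivery_rate(sent, bounced):
    """Delivery rate = non-bouncing sends / total sends.

    This says nothing about inbox vs. spam placement.
    """
    if sent == 0:
        return 0.0
    return round(100 * (sent - bounced) / sent, 2)

# 50,000 sends with 600 bounces still shows a healthy-looking 98.8%,
# even if a large share of the delivered mail lands in spam.
print(delivery_rate(50_000, 600))  # 98.8
```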

Email delivery rate as shown in an ESP dashboard
Delivery rate reflects non-bounces, not inbox placement — useful when it drops unexpectedly.

DMARC reports

Every sender should have a proper DMARC policy. Beyond policy enforcement, DMARC generates aggregate reports that show SPF and DKIM pass/fail across sending sources and IPs. Reviewing those reports on a cadence helps you catch authentication drift, unauthorized senders, and alignment issues before they hurt reputation. For a deeper dive on tags and setup, see our DMARC record guide.
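A DMARC record is published as a DNS TXT record of semicolon-separated `tag=value` pairs, so it is easy to inspect programmatically. A minimal sketch, using an illustrative record (the `rua` address and domain are assumptions, not a real destination):

```python
def parse_dmarc(txt_record):
    """Split a DMARC TXT record into its tag=value pairs."""
    tags = {}
    for part in txt_record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

# Illustrative record: p= sets the policy, rua= is where
# aggregate XML reports are delivered.
record = "v=DMARC1; p=quarantine; rua=mailto:dmarc@yourbrand.com; pct=100"
print(parse_dmarc(record))
```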

DMARC aggregate report (sample)
yourbrand.com · 24h window

Aggregate XML reports (rua) roll up message counts by source IP with SPF/DKIM alignment and how your DMARC policy was applied — useful for spotting auth drift and unknown senders.

Source IP        Messages  SPF   DKIM  Disposition
192.0.2.10       38,420    pass  pass  none
198.51.100.44    12,104    pass  pass  none
203.0.113.8      2,891     fail  pass  quarantine
203.0.113.201    412       fail  fail  reject

Raw aggregate XML (excerpt)

Reports arrive as XML. Each <record> bundles row, identifiers, and auth_results — the table under it is the same two records flattened into columns.

<?xml version="1.0" encoding="UTF-8" ?>
<feedback>
  <version>1.0</version>
  <report_metadata>
    <org_name>Enterprise Gmail</org_name>
    <email>noreply-dmarc-support@google.com</email>
    <report_id>2847562148901627394</report_id>
    <date_range>
      <begin>1743206400</begin>
      <end>1743292799</end>
    </date_range>
  </report_metadata>
  <policy_published>
    <domain>yourbrand.com</domain>
    <adkim>r</adkim>
    <aspf>r</aspf>
    <p>quarantine</p>
    <pct>100</pct>
  </policy_published>
  <!-- Each <record> is one row in the table below -->
  <record>
    <row>
      <source_ip>192.0.2.10</source_ip>
      <count>38420</count>
      <policy_evaluated>
        <disposition>none</disposition>
        <dkim>pass</dkim>
        <spf>pass</spf>
      </policy_evaluated>
    </row>
    <identifiers>
      <header_from>yourbrand.com</header_from>
    </identifiers>
    <auth_results>
      <spf>
        <domain>yourbrand.com</domain>
        <result>pass</result>
      </spf>
      <dkim>
        <domain>yourbrand.com</domain>
        <result>pass</result>
        <selector>k1</selector>
      </dkim>
    </auth_results>
  </record>
  <record>
    <row>
      <source_ip>203.0.113.8</source_ip>
      <count>2891</count>
      <policy_evaluated>
        <disposition>quarantine</disposition>
        <dkim>pass</dkim>
        <spf>fail</spf>
      </policy_evaluated>
    </row>
    <identifiers>
      <header_from>yourbrand.com</header_from>
    </identifiers>
    <auth_results>
      <spf>
        <domain>mail.bad-relay.example</domain>
        <result>fail</result>
      </spf>
      <dkim>
        <domain>yourbrand.com</domain>
        <result>pass</result>
        <selector>k1</selector>
      </dkim>
    </auth_results>
  </record>
</feedback>
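Flattening `<record>` elements into rows like these needs only the standard library. A minimal sketch against a trimmed report in the same shape as the excerpt above (only a subset of fields is pulled out here):

```python
import xml.etree.ElementTree as ET

# Trimmed aggregate report, same structure as the excerpt above.
REPORT = """<?xml version="1.0" encoding="UTF-8"?>
<feedback>
  <record>
    <row>
      <source_ip>203.0.113.8</source_ip>
      <count>2891</count>
      <policy_evaluated>
        <disposition>quarantine</disposition>
        <dkim>pass</dkim>
        <spf>fail</spf>
      </policy_evaluated>
    </row>
    <identifiers><header_from>yourbrand.com</header_from></identifiers>
    <auth_results>
      <spf><domain>mail.bad-relay.example</domain><result>fail</result></spf>
      <dkim><domain>yourbrand.com</domain><result>pass</result></dkim>
    </auth_results>
  </record>
</feedback>"""

def flatten_records(xml_text):
    """Turn each <record> into one flat dict of the fields we care about."""
    rows = []
    for record in ET.fromstring(xml_text).iter("record"):
        rows.append({
            "source_ip": record.findtext("row/source_ip"),
            "count": int(record.findtext("row/count")),
            "disposition": record.findtext("row/policy_evaluated/disposition"),
            "spf": record.findtext("auth_results/spf/result"),
            "dkim": record.findtext("auth_results/dkim/result"),
        })
    return rows

print(flatten_records(REPORT))
```

In practice you would feed this thousands of records across many daily reports, then aggregate by source IP to spot unknown senders and alignment failures.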

Parsed from <record> elements

Field                          Record 1        Record 2
row/source_ip                  192.0.2.10      203.0.113.8
row/count                      38420           2891
policy_evaluated/disposition   none            quarantine
policy_evaluated/dkim          pass            pass
policy_evaluated/spf           pass            fail
identifiers/header_from        yourbrand.com   yourbrand.com
auth_results/spf/domain        yourbrand.com   mail.bad-relay.example
auth_results/spf/result        pass            fail
auth_results/dkim/domain       yourbrand.com   yourbrand.com
auth_results/dkim/result       pass            pass
auth_results/dkim/selector     k1              k1
Top: dashboard-style summary. Below: excerpt from aggregate XML and the same two <record> blocks as a field-aligned table — real files add more records and optional elements (e.g. reason, envelope_from).

Postmaster reports

Providers such as Gmail offer Postmaster Tools: verified domains can surface sender reputation, domain health, volume, and spam complaint feedback. This adds another layer to your measurement stack. Note that smaller senders may see little or no data until volume crosses minimum thresholds.

Gmail Postmaster Tools domain reputation and delivery metrics
Postmaster Tools adds provider-level reputation and complaint signals when volume is sufficient.

Relative inbox open rates

A rough, indirect check is to compare open rates by inbox provider in your campaign reports. This method is imperfect — Apple Mail Privacy and similar features skew opens — but big gaps between providers can flag deliverability issues worth investigating.

For example, you might see something like:

Inbox   Open rate
All     31.2%
Gmail   32.5%
Apple   47.0%
Yahoo   12.2%
AOL     7.6%

Here, Gmail might look close to your overall average, while Apple looks inflated (privacy). Yahoo and AOL far below average could indicate placement or engagement issues for those domains — always cross-check list size per domain so small segments do not over-influence the story.
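That "far below average" check is easy to automate. A minimal sketch, assuming a simple threshold rule (half the overall average, an arbitrary cutoff you would tune) and the illustrative rates from the table above:

```python
def flag_low_openers(rates, overall, threshold=0.5):
    """Flag providers whose open rate falls below threshold * overall average."""
    return [p for p, rate in rates.items() if rate < threshold * overall]

# Illustrative per-provider open rates (percent), matching the table above.
open_rates = {"Gmail": 32.5, "Apple": 47.0, "Yahoo": 12.2, "AOL": 7.6}
print(flag_low_openers(open_rates, overall=31.2))  # ['Yahoo', 'AOL']
```

Remember the caveats above: Apple's number is inflated by Mail Privacy Protection, and small segments can swing wildly, so treat flagged providers as leads to investigate, not verdicts.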

Open rates broken down by email client or inbox provider in campaign reporting
ESP reporting by domain or client helps surface gaps — interpret alongside MPP-inflated Apple opens.

SenderScore

Third-party tools can add reputation context. For example, Validity Sender Score (and similar services) score sending IPs. Scores are evaluated at the IP level; if you send from a shared pool, run the relevant IPs through the tool to understand the distribution — not just a single number.

Content spam detectors

Before send, you can run creative through spam / content checkers to catch triggers that might affect filtering. Tools such as Email on Acid, Litmus, or similar preflight flows can flag issues aligned with filters like SpamAssassin or Barracuda so you can adjust copy, links, and markup before the message leaves your ESP.

Pre-flight spam or content check results for an email campaign
Pre-send checks surface content and DNS issues before the message hits real inboxes.

Summary

Measuring deliverability is the prerequisite to improving it. Once you have a baseline across placement, authentication, reputation, and content, you can prioritize fixes — authentication and list hygiene first, then content and cadence — and monitor whether changes move the right numbers.

If you want help auditing, troubleshooting, and monitoring deliverability end to end, our team offers a Find & Fix deliverability program built for eCommerce senders. You can also book a call to discuss your current setup.


Need help with your email program?

Schedule a free consultation with our team.

Book a Call →