Digital JEDI Compliance
for B Corp Certification
PEER (Performance, Experience, Emissions, Ranking) is a four-pillar digital audit framework that measures a website's sustainability, accessibility, and inclusivity. It produces an evidence-backed report covering over 200 individual checks — built specifically to address B Lab's JEDI Standards V2.2.
- 200+ individual checks
- 87 WCAG rules tested
- <4 min scan duration
- 4 carbon methodologies
The Problem We Solve
B Corp certification requires companies to demonstrate commitment to justice, equity, diversity, and inclusion across their entire operation — including their digital presence. JEDI Standards V2.2 includes specific requirements around website accessibility, inclusive communications, and product inclusivity that most companies have no way to measure.
The typical alternative is a manual audit costing thousands of pounds, delivered months after engagement, with findings that are already outdated on arrival. PEER automates the measurable parts and produces auditor-grade evidence in under four minutes.
What We Test
WCAG 2.2 AAA Accessibility
JEDI Standard 2.m
B Lab's JEDI Standards V2.2 explicitly recommends websites meet the AAA criteria of the Web Content Accessibility Guidelines. PEER tests against the full WCAG 2.2 specification — not just Level A or AA, but the complete AAA standard.
Flagged for Manual Review
WCAG 2.2 includes criteria that require human judgement and cannot be fully automated. PEER identifies these and lists them as remaining manual steps:
- Focus Appearance (2.4.13) — focus indicator area and contrast
- Dragging Movements (2.5.7) — single-pointer alternatives for drag actions
- Accessible Authentication (3.3.8 / 3.3.9) — no cognitive function tests for login
- Consistent Help (3.2.6) — help mechanism placement across pages
- Redundant Entry (3.3.7) — previously entered information auto-populated
Product and Service Inclusivity
JEDI Standard 2.q
JEDI Standards require that products and services are assessed for inclusivity — specifically font size, colour, and screen reader compatibility. PEER tests all three.
Font Size Assessment
Detects text rendered below 12px (desktop) or 16px (mobile). Counts affected elements and samples the smallest font sizes found.
Colour and Contrast
Every text element checked against WCAG thresholds. Pixel-level verification using screenshots with RGBA alpha channel support for semi-transparent text.
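The pixel-level contrast check follows WCAG's published formula. A minimal Python sketch (function names are illustrative, not PEER's actual API) that linearises sRGB channels, composites semi-transparent text over its background, and computes the ratio:

```python
def _linearize(channel: float) -> float:
    """sRGB channel (0-1) to linear light, per the WCAG definition."""
    return channel / 12.92 if channel <= 0.03928 else ((channel + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """WCAG relative luminance: 0.2126 R + 0.7152 G + 0.0722 B (linearised)."""
    r, g, b = (_linearize(c / 255) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def composite(fg_rgba: tuple[int, int, int, float],
              bg_rgb: tuple[int, int, int]) -> tuple[int, int, int]:
    """Alpha-composite semi-transparent text over its sampled background pixel."""
    *fg, alpha = fg_rgba
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg_rgb))

def contrast_ratio(fg_rgb, bg_rgb) -> float:
    """WCAG contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter luminance on top."""
    lighter, darker = sorted((relative_luminance(fg_rgb),
                              relative_luminance(bg_rgb)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# AAA requires 7:1 for normal text and 4.5:1 for large text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # black on white → 21.0
```

Compositing first is what makes screenshot-based verification meaningful for RGBA text: a 50% opaque black on white must be tested as mid-grey, not black.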
Screen Reader Compatibility
Scored 0–100. ARIA landmark presence, unlabelled icon controls, heading level jumps — each with weighted penalties.
Keyboard Navigation
Scored 0–100. Tab traversal, focus movement, trap detection, focus indicator visibility, skip link functionality.
Mobile Usability
Scored 0–100. Responsive viewport, horizontal overflow, fixed-width detection, tap target sizing (48×48px minimum), viewport obstruction detection with four-stage classification.
Inclusive Communications
JEDI Standard 2.p
JEDI Standards require that external communications — including websites — use inclusive, accessible language. PEER tests content readability and inclusive language alongside structural accessibility.
Content Readability
- Flesch-Kincaid Grade Level (target: Grade 10 or below)
- Flesch Reading Ease (0–100 scale)
- Gunning Fog Index
- Reading age, words per sentence, syllables per word
- Analyses main content area only for accuracy
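These metrics are standard published formulas. A rough Python sketch follows; the syllable counter is a naive vowel-group heuristic, so its scores will differ slightly from production tools:

```python
import re

def count_syllables(word: str) -> int:
    """Naive heuristic: count vowel groups, dropping a trailing silent 'e'."""
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def readability(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    complex_words = sum(1 for w in words if count_syllables(w) >= 3)
    wps = len(words) / len(sentences)   # words per sentence
    spw = syllables / len(words)        # syllables per word
    return {
        "fk_grade": 0.39 * wps + 11.8 * spw - 15.59,          # Flesch-Kincaid Grade Level
        "reading_ease": 206.835 - 1.015 * wps - 84.6 * spw,   # Flesch Reading Ease
        "gunning_fog": 0.4 * (wps + 100 * complex_words / len(words)),
        "words_per_sentence": wps,
        "syllables_per_word": spw,
    }
```

Short sentences and short words push the Flesch-Kincaid grade down and the Reading Ease score up, which is why the Grade 10 target rewards plain language directly.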
Inclusive Language Scanning
- 30+ patterns across five categories
- Gendered, ableist, exclusionary, cultural, age-related
- Each flag includes category, count, and suggested replacement
- Density metric: flags per 1,000 words scanned
| Found | Category | Suggested Alternative |
|---|---|---|
| chairman | Gendered | chairperson / chair |
| manpower | Gendered | workforce / staffing |
| handicapped | Ableist | disabled / person with a disability |
| suffering from | Ableist | living with / has |
| wheelchair-bound | Ableist | wheelchair user |
| rockstar | Exclusionary | skilled / talented |
| tribe | Cultural | team / community / group |
| elderly | Age | older adults / seniors |
| digital native | Age | digitally skilled |
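A scanner of this kind can be sketched as a small set of regex rules. The patterns below are an illustrative subset drawn from the table above, not PEER's full 30+ ruleset, and the function names are assumptions:

```python
import re

# Illustrative subset; the full ruleset covers 30+ patterns in five categories.
PATTERNS = {
    r"\bchairman\b": ("Gendered", "chairperson / chair"),
    r"\bmanpower\b": ("Gendered", "workforce / staffing"),
    r"\bwheelchair[- ]bound\b": ("Ableist", "wheelchair user"),
    r"\brockstar\b": ("Exclusionary", "skilled / talented"),
    r"\belderly\b": ("Age", "older adults / seniors"),
}

def scan(text: str) -> dict:
    """Return each flag (category, count, suggestion) plus flags per 1,000 words."""
    word_count = len(re.findall(r"\b\w+\b", text))
    findings = []
    for pattern, (category, suggestion) in PATTERNS.items():
        hits = re.findall(pattern, text, re.IGNORECASE)
        if hits:
            findings.append({"found": hits[0].lower(), "category": category,
                             "count": len(hits), "suggestion": suggestion})
    total = sum(f["count"] for f in findings)
    return {"flags": findings,
            "density_per_1000_words": 1000 * total / max(word_count, 1)}
```

Normalising to flags per 1,000 words keeps the density metric comparable between a short landing page and a long policy document.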
Carbon Emissions Measurement
Every PEER audit includes a full carbon footprint assessment using the EcoPigs Digital Carbon Methodology v2.0 — covering page weight, resource breakdown, data centre energy, network energy, user device energy, live grid intensity, green hosting verification, and waste analysis.
| Score | Purpose |
|---|---|
| Baseline | Global grid average, no green hosting credit. For cross-site comparison. |
| Traditional | What Website Carbon Calculator would report (SWDM methodology). Like-for-like comparison. |
| Live | Real-time emissions using live grid intensity for the hosting country. Most accurate snapshot. |
| Measured | CDP-instrumented device energy with per-segment grid splitting. Most granular methodology. |
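As a rough illustration of how a "Traditional" SWDM-style figure is derived, the sketch below converts transferred bytes to energy and multiplies by grid intensity. The 0.81 kWh/GB factor follows the Sustainable Web Design Model, 473 gCO2/kWh is the Ember global average cited later in this document, and the function name is an assumption; PEER's own graded Baseline score uses its calibrated methodology, so its numbers differ from this back-of-envelope figure:

```python
def grams_co2_per_view(page_bytes: int,
                       kwh_per_gb: float = 0.81,
                       grid_gco2_per_kwh: float = 473.0) -> float:
    """Estimate emissions per first-time page view: bytes -> kWh -> gCO2."""
    energy_kwh = (page_bytes / 1e9) * kwh_per_gb
    return energy_kwh * grid_gco2_per_kwh

# A ~2.5 MB page on the global average grid:
print(round(grams_co2_per_view(2_520_000), 3))  # → 0.965
```

The Live and Measured scores replace the fixed global intensity and energy factor with real-time grid data and instrumented device energy, which is why all four scores are reported side by side.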
| Grade | Max gCO2/visit | Approx Page Weight |
|---|---|---|
| A+ | ≤ 0.095g | ~700 KB (top 12%) |
| A | ≤ 0.186g | ~1,370 KB (top 28%) |
| B | ≤ 0.341g | ~2,520 KB (median) |
| C | ≤ 0.493g | ~3,640 KB |
| D | ≤ 0.656g | ~4,840 KB |
| F | > 0.846g | > 6,250 KB |
Performance and Core Web Vitals
Google's Core Web Vitals are confirmed ranking signals and directly affect user experience. PEER measures LCP and CLS from the Core Web Vitals set, plus TTFB as a supporting responsiveness diagnostic, using real browser instrumentation.
| Metric | Good | Needs Improvement | Poor |
|---|---|---|---|
| LCP | < 2.5s | 2.5–4.0s | > 4.0s |
| CLS | < 0.1 | 0.1–0.25 | > 0.25 |
| TTFB | < 600ms | 600–1,800ms | > 1,800ms |
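The thresholds above translate directly into a three-band classifier; a minimal sketch with illustrative names:

```python
# (good_max, poor_min) per metric; LCP and TTFB in milliseconds, CLS unitless.
THRESHOLDS = {"LCP": (2500, 4000), "CLS": (0.1, 0.25), "TTFB": (600, 1800)}

def rate(metric: str, value: float) -> str:
    """Classify a measured value into Good / Needs Improvement / Poor."""
    good_max, poor_min = THRESHOLDS[metric]
    if value < good_max:
        return "good"
    if value > poor_min:
        return "poor"
    return "needs-improvement"

print(rate("LCP", 2100))   # → good
print(rate("CLS", 0.18))   # → needs-improvement
```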
Additional Performance Checks
SEO and Search Visibility
| Check | What We Test |
|---|---|
| Page titles | Presence, length (50–60 chars optimal), uniqueness |
| Meta descriptions | Presence, length (150–160 chars optimal) |
| Heading structure | Single H1, valid hierarchy, no empty headings |
| Structured data | JSON-LD presence, Schema.org types detected |
| Crawlability | robots.txt, sitemap.xml, indexability status |
| Canonical URLs | Presence, self-referencing, host consistency |
| Internal linking | Orphan page detection, content depth analysis |
| Mobile-first | Viewport, overflow, tap targets, text size, mobile speed |
How the P.E.E.R. Score Works
| Pillar | Weight | What It Covers |
|---|---|---|
| Performance | 25% | Core Web Vitals, resource optimisation, network performance |
| Experience | 25% | WCAG compliance, mobile usability, forms, keyboard, screen reader |
| Emissions | 30% | Carbon per page view, green hosting, waste, grid intensity |
| Ranking | 20% | SEO technical health, content quality, mobile-first, structured data |
Grade Scale
A+ (95+) → A (90+) → B+ (85+) → B (80+) → C+ (75+) → C (70+) → D (60+) → F (below 60)
Floor Rules
A website cannot achieve an overall A+ unless every pillar scores at least B+. High performance cannot mask poor accessibility or excessive emissions.
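Taken together, the weights, grade bands, and floor rule can be sketched as follows. The demotion target (A+ dropping to A) is an assumption for illustration; the document states only that the overall A+ is withheld:

```python
WEIGHTS = {"performance": 0.25, "experience": 0.25, "emissions": 0.30, "ranking": 0.20}
GRADE_BANDS = [(95, "A+"), (90, "A"), (85, "B+"), (80, "B"),
               (75, "C+"), (70, "C"), (60, "D")]

def grade(score: float) -> str:
    """Map a 0-100 score onto the published grade scale."""
    for floor, letter in GRADE_BANDS:
        if score >= floor:
            return letter
    return "F"

def peer_grade(pillars: dict[str, float]) -> str:
    overall = sum(WEIGHTS[p] * s for p, s in pillars.items())
    letter = grade(overall)
    # Floor rule: no overall A+ unless every pillar scores at least B+ (85).
    if letter == "A+" and any(s < 85 for s in pillars.values()):
        letter = "A"  # assumed demotion target; the source only says A+ is withheld
    return letter

# A ranking score of 80 blocks an otherwise A+ site:
print(peer_grade({"performance": 99, "experience": 99, "emissions": 99, "ranking": 80}))  # → A
```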
JEDI Standards V2.2 Coverage
| JEDI Requirement | What It Requires | What PEER Tests |
|---|---|---|
| JEDI2.m | Website meets WCAG AAA criteria | 87 WCAG rules, pixel-verified contrast, full axe-core scan |
| JEDI2.q | Product/service assessed for inclusivity (font, colour, screen readers) | Font size, contrast ratio, screen reader score, keyboard score, mobile usability |
| JEDI2.p | Inclusive external communications | Readability (Flesch-Kincaid), inclusive language scan (30+ bias patterns), structural accessibility |
| Status | Meaning |
|---|---|
| ON_TRACK | All three requirements pass |
| PARTIAL_COMPLIANCE | Some pass, none fail |
| ACTION_REQUIRED | One or more requirements fail |
What Makes PEER Different
Built for B Corp, not bolted on
Most accessibility tools test against WCAG AA. PEER tests against AAA because that is what B Lab's JEDI Standards actually recommend. The readability analysis, inclusive language scanning, and JEDI-specific evidence structure exist because B Corp requires them.
Evidence, not opinions
Every finding includes the specific element, the standard violated, and the measured value. Contrast failures are pixel-verified against screenshots. Readability scores are computed per-page using established algorithms. Nothing is subjective.
Carbon methodology that stands up to scrutiny
Four independent carbon scores using different methodologies, referenced against peer-reviewed research (IEA, Ember, Malmodin), with live grid data from the National Grid ESO API. Green hosting verified through the Green Web Foundation.
Multi-page, multi-device
PEER scans across desktop and mobile viewports, testing up to 14 page views per audit. Scores reflect the whole site, not just the homepage.
Actionable remediation
Every issue maps to a specific remediation code with severity, description, and implementation guidance. 39 issue codes across accessibility (15), performance (10), SEO (8), and sustainability (6).
Frameworks and Standards Referenced
| Standard | Version | How PEER Uses It |
|---|---|---|
| WCAG | 2.2 | Full Level A, AA, and AAA automated testing |
| B Lab JEDI Standards | V2.2 (Feb 2026) | JEDI2.m, JEDI2.q, JEDI2.p compliance assessment |
| Google Core Web Vitals | 2024 thresholds | LCP, CLS, TTFB measurement and grading |
| GHG Protocol | Corporate Standard | Scope 3 Category 1 alignment for digital emissions |
| IEA World Energy Outlook | 2024 | Energy intensity baselines |
| Ember Global Electricity Review | 2024 | Grid carbon intensity (473 gCO2/kWh global average) |
| Malmodin & Lundén | 2023 | Data centre and network energy factors |
| Green Web Foundation | Current | Green hosting verification |
| National Grid ESO | Live API | Real-time UK grid carbon intensity |
| HTTP Archive Web Almanac | 2025 | Carbon grade calibration against real web distribution |
| DEFRA | 2025 | UK electricity emission factors |
| Sustainable Web Design Model | v4 | Comparative reference (traditional score) |
| Flesch-Kincaid / Gunning Fog | — | Content readability grading |
What PEER Does Not Do
PEER is an automated measurement tool. It covers the measurable, verifiable parts of JEDI compliance. It does not replace:
Manual Accessibility Testing
Keyboard-only walkthroughs, screen reader testing with NVDA/JAWS/VoiceOver, testing with real users with disabilities
Organisational JEDI Measures
Equity audits, diversity policies, stakeholder engagement, workforce demographics
Policy Review
JEDI2.c policy review against JEDI principles
Supplier Diversity Tracking
JEDI2.o demographic data on suppliers
Commitment Statements
JEDI2.a public JEDI commitment approved by management
The Digital Audit?
PEER tells you exactly where you stand on the digital requirements, with evidence an assessor can verify. That part is covered.
Ready to Audit Your Digital JEDI Compliance?
Get an evidence-backed PEER Audit report covering accessibility, inclusivity, carbon emissions, performance, and SEO — mapped directly to B Lab's JEDI Standards V2.2.
Under 4 minutes. Over 200 checks. Auditor-grade evidence.