How We Do It

Data You Can Actually Trust

No opinions. No subjective curation. We analyze 20,000+ real website screenshots every month—desktop and mobile captures of 10,000+ sites—so you get design intelligence you can act on with confidence.

Why Our Data Is Different

Three principles that make the difference between noise and insight

Scale That Matters

10,000+ sites analyzed every month. Large enough to detect real patterns, not cherry-picked examples.

Freshness That Counts

Monthly updates, not annual reports. When trends shift, you know within weeks—not months after your competitors.

Objectivity You Need

Automated analysis removes human bias. Expert validation adds context. You get insights you can defend.

How We Turn Screenshots Into Insights

Five steps from raw data to decisions you can act on

1. Data Collection

We capture screenshots of 10,000+ websites monthly from the performant, accessible web—sites that load quickly, respect users, and prioritize good UX. Our analysis covers both desktop (1920x1080) and mobile (375x812) viewports across 8 major industry verticals. A minimal capture sketch, assuming Playwright, follows the list below.

  • Desktop + mobile screenshots for every site
  • Sites must load within our 30-second capture timeout, or they're excluded that month
  • Balanced across verticals and geographies
  • Focus on accessible, public-facing production sites
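
To make the capture step concrete, here is a minimal sketch of a dual-viewport screenshot pass using Playwright's Python API. The capture helper, the VIEWPORTS table, and the file-naming scheme are illustrative assumptions, not our production pipeline:

```python
from playwright.sync_api import sync_playwright

# The two viewports described above: desktop (1920x1080) and mobile (375x812)
VIEWPORTS = {
    "desktop": {"width": 1920, "height": 1080},
    "mobile": {"width": 375, "height": 812},
}

def capture(url: str, slug: str) -> None:
    """Hypothetical helper: screenshot one site at both viewports."""
    with sync_playwright() as p:
        browser = p.chromium.launch()
        for name, viewport in VIEWPORTS.items():
            page = browser.new_page(viewport=viewport)
            # 30-second budget: a timeout here drops the site from the sample
            page.goto(url, timeout=30_000, wait_until="networkidle")
            page.screenshot(path=f"{slug}-{name}.png", full_page=True)
            page.close()
        browser.close()
```

The 30-second timeout doubles as the inclusion filter described above: a site that can't finish loading in time never enters the month's dataset.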

2. Automated Analysis

Our proprietary technology processes each website to extract quantitative design data across 15+ metrics covering visual design, layout, and UI patterns. One simplified extraction is sketched after the list below.

  • Consistent measurement methodology
  • Multi-dimensional data extraction
  • Quality validation checks
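
As one simplified example of what multi-dimensional extraction can look like, here is a toy dominant-color extractor built on Pillow. The function name and the downscale-then-count approach are illustrative assumptions, not our actual extraction code:

```python
from collections import Counter
from PIL import Image

def dominant_colors(screenshot_path: str, n: int = 5) -> list[tuple[int, int, int]]:
    """Toy palette extraction: downscale, then count the most common RGB values."""
    img = Image.open(screenshot_path).convert("RGB").resize((200, 200))
    counts = Counter(img.getdata())  # maps each pixel tuple to its frequency
    return [color for color, _ in counts.most_common(n)]
```

A production extractor would cluster perceptually similar colors (for example in CIELAB space) rather than counting exact RGB values, but the input and output shapes are the same.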

3. Pattern Recognition

Advanced algorithms identify emerging patterns, trend shifts, and outliers. We track month-over-month changes to detect design evolution in real time; a simplified version of this kind of test is sketched after the list below.

  • Statistical trend analysis
  • Vertical-specific pattern detection
  • Historical comparison tracking
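
For intuition, statistical trend analysis at this scale can be as simple as a two-proportion z-test on month-over-month adoption counts. This sketch uses only the standard library; the function and the example numbers are illustrative:

```python
from math import sqrt

def adoption_shift(k_prev: int, n_prev: int, k_curr: int, n_curr: int):
    """Two-proportion z-test: did pattern adoption really change between months?
    k = sites showing the pattern, n = sites sampled that month."""
    p_prev, p_curr = k_prev / n_prev, k_curr / n_curr
    pooled = (k_prev + k_curr) / (n_prev + n_curr)
    se = sqrt(pooled * (1 - pooled) * (1 / n_prev + 1 / n_curr))
    return p_curr - p_prev, (p_curr - p_prev) / se  # (rate shift, z-score)

# e.g., 1,200 of 10,000 sites last month vs 1,450 of 10,000 this month
shift, z = adoption_shift(1_200, 10_000, 1_450, 10_000)  # z ≈ 5.2: a real shift
```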

4. Expert Validation

Our design team reviews automated findings to ensure accuracy, filter noise, and provide context. Human expertise validates the data before publication.

  • Design team review process
  • Contextual interpretation
  • Actionable recommendations

5. Report Generation

Insights are compiled into comprehensive reports with visualizations, benchmarks, and actionable takeaways tailored for designers and product teams. A toy benchmarking example follows the list below.

  • Interactive data visualizations
  • Cross-vertical comparisons
  • Industry-specific insights
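
As a toy illustration of how cross-vertical benchmarks can be compiled, this pandas sketch aggregates per-site metric rows into vertical-level adoption rates. The column names and rows are hypothetical:

```python
import pandas as pd

# Hypothetical input: one row per site per month, with 0/1 pattern flags
df = pd.DataFrame([
    {"month": "2026-01", "vertical": "ecommerce", "dark_mode": 1, "sticky_nav": 1},
    {"month": "2026-01", "vertical": "saas",      "dark_mode": 0, "sticky_nav": 1},
    {"month": "2026-01", "vertical": "ecommerce", "dark_mode": 1, "sticky_nav": 0},
    # ...roughly 10,000 rows per month in practice
])

# Cross-vertical benchmark: share of sites using each pattern, per vertical
benchmarks = df.groupby(["month", "vertical"]).mean(numeric_only=True)
print(benchmarks.round(2))
```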

What We Measure (And Why It Matters)

15+ metrics that give you the complete picture

Visual Design

  • Color palette extraction and trend analysis
  • Typography classification and usage patterns
  • Dark mode vs light mode adoption rates (a detection sketch follows this list)
  • Image treatment styles and ratios
  • Brand aesthetic categorization
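
As flagged in the list above, dark mode adoption can be approximated from mean screenshot luminance. This Pillow sketch and its 0.35 threshold are illustrative assumptions:

```python
from PIL import Image

def looks_dark(screenshot_path: str, threshold: float = 0.35) -> bool:
    """Crude dark-mode detector: mean grayscale brightness below a threshold."""
    img = Image.open(screenshot_path).convert("L").resize((100, 100))  # grayscale
    pixels = list(img.getdata())
    mean_brightness = sum(pixels) / len(pixels) / 255  # normalize to 0..1
    return mean_brightness < threshold
```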

Layout & Structure

  • Navigation patterns and menu structures
  • Hero section layout variations (desktop vs mobile)
  • Grid system and spacing analysis
  • Page structure and information hierarchy
  • Responsive design and mobile-specific patterns

UI Components

  • Button styles, sizes, and states
  • Form field design patterns
  • CTA placement and design
  • Card component variations
  • Micro-interactions and animations

User Experience

  • Loading states and skeleton screens
  • Error message patterns
  • Accessibility features (ARIA, contrast; see the contrast-ratio sketch after this list)
  • Mobile-first vs desktop-first design approaches
  • Touch target sizes and mobile interactions
  • Progressive disclosure patterns
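
The contrast measurement behind the accessibility bullet follows the published WCAG 2.x formula. The helper names here are ours; the formula itself is standard:

```python
def _srgb_to_linear(channel: int) -> float:
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """WCAG 2.x relative luminance of an sRGB color."""
    r, g, b = (_srgb_to_linear(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """WCAG contrast ratio, ranging from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

contrast_ratio((255, 255, 255), (0, 0, 0))  # 21.0; WCAG AA body text needs >= 4.5
```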

The Numbers Behind Our Numbers

Quality at scale isn't easy—here's how we ensure accuracy

99.5%

Accuracy Rate

Validated against manual design audits by our expert team

10K+

Sites Monthly

Largest dataset in the industry, sized for statistically significant trend detection

8

Verticals

Industry-specific analysis for relevant benchmarking

Coming Q1 2026

Next-Generation Analysis

Expanding beyond visual design into sentiment, ethics, and cultural intelligence

Frustration Index (coming soon)

Which UX patterns frustrate users most? We're developing proprietary methods to identify design decisions that correlate with negative user experiences—so you can avoid the patterns that hurt satisfaction before you ship. A toy correlation sketch follows the list below.

  • Regional differences in user expectations
  • Industry-specific pain points
  • Pattern-to-outcome correlation
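
To show the shape of pattern-to-outcome correlation, here is a toy point-biserial correlation between a binary pattern flag and a satisfaction score, via SciPy. The data, and the choice of infinite scroll as the pattern, are entirely hypothetical:

```python
from scipy.stats import pointbiserialr

# Hypothetical paired sample: does the site use infinite scroll (0/1),
# and how satisfied were surveyed users (1-5)?
has_infinite_scroll = [1, 0, 1, 1, 0, 0, 1, 0]
satisfaction = [2.1, 4.0, 2.8, 1.9, 3.7, 4.2, 2.5, 3.9]

r, p_value = pointbiserialr(has_infinite_scroll, satisfaction)
# A strongly negative r flags the pattern as a frustration candidate
```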

Dark Patterns Tolerance Matrix (coming soon)

Not all manipulative UX is equally rejected. We're measuring regional tolerance for dark patterns—forced continuity, hidden costs, privacy zuckering, and more. Design ethically for your specific market; one reading of the scoring scale is sketched after the list below.

  • 6 regions: US/UK, Germany, Japan, France, Brazil, India
  • Tolerance scores: -100 (zero tolerance) to +100 (high acceptance)
  • GDPR vs non-GDPR market differences
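
One way to read the -100 to +100 scale above is as a rescaled acceptance rate. This mapping is our own reading of the scale, not a published formula:

```python
def tolerance_score(acceptance_rate: float) -> int:
    """Map an acceptance rate in [0, 1] onto the -100..+100 tolerance scale.
    Linear rescaling is an assumption; the final methodology may differ."""
    return round((2 * acceptance_rate - 1) * 100)

tolerance_score(0.25)  # -> -50: most users in the region reject the pattern
tolerance_score(0.80)  # -> 60: broad acceptance in that market
```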

Cultural Context Frameworks (coming soon)

Raw design data without cultural interpretation misleads. We're building frameworks that explain why patterns work differently across cultures—so you can design respectfully for global audiences.

  • Germany: Datensparsamkeit (data minimization as default)
  • Japan: Wa (harmony)—flow disruptions violate expectations
  • Brazil: WhatsApp-first interaction model
  • India: Data frugality on budget Android devices

Pattern Lifecycle Tracking (coming soon)

Not all trends are created equal. We're adding lifecycle tags to every pattern—emerging, mature, or declining—so you know when to adopt and when to hold back. A simple tagging heuristic is sketched after the list below.

  • Month-over-month adoption rate tracking
  • Early adopter vs mainstream timing
  • Industry-specific adoption curves
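
A lifecycle tag can fall out of the adoption-rate trajectory directly. This heuristic and its thresholds are illustrative, not the final methodology:

```python
def lifecycle_tag(monthly_adoption: list[float],
                  rising: float = 0.01, falling: float = -0.01) -> str:
    """Tag a pattern from its recent adoption trajectory (rates in 0..1)."""
    recent = monthly_adoption[-3:]  # the last three monthly snapshots
    slope = (recent[-1] - recent[0]) / max(len(recent) - 1, 1)
    if slope >= rising:
        return "emerging"
    if slope <= falling:
        return "declining"
    return "mature"

lifecycle_tag([0.08, 0.11, 0.15])  # -> "emerging": adopt early, with eyes open
```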

Data Quality

Why We Analyze the Performant Web

Sites that can't load in 30 seconds aren't just slow—they're bad UX by definition. Our methodology naturally filters for sites that care about user experience.

What We Include

  • Sites optimized for real-world connections
  • Sites that prioritize Core Web Vitals
  • Publicly accessible, user-focused designs
  • Sites where performance IS a UX decision

What We Exclude

  • Bloated enterprise sites with 10MB JS bundles
  • Heavy SPAs that sacrifice speed for features
  • Paywalled or gated content
  • Sites inaccessible to users on slower connections

If a site can't load for our analysis, it can't load for users on slower connections either. Our data represents what real users actually experience—not what enterprise developers wish they experienced.

Methodology Limitations

We believe in transparency. Here are the known limitations of our approach:

  • Static Analysis: We analyze visual design and layout patterns, not user behavior or conversion data.
  • Public Sites Only: Analysis is limited to publicly accessible websites. No analysis of authenticated experiences.
  • Temporal Snapshots: Monthly snapshots may miss short-lived experiments or A/B tests.
  • Standard Viewports: Mobile analysis uses a standard iPhone viewport (375x812), which may not capture tablets or larger phones.

Ready to See the Data?

First report launches January 2026. Sign up to get notified.