# Bias Detection
ATHENA detects bias across demographic subgroups, satisfying FDA AI/ML Guidelines Section IV.B and EU AI Act Article 10 requirements.
## What is Bias?
Bias occurs when AI system performance varies systematically across protected groups:
- **Accuracy Disparity:** the AI is less accurate for some groups. Example: 85% accuracy for men vs. 62% for women.
- **Representation Disparity:** some groups are underrepresented in the training data. Example: 90% of training data comes from ages 25-45.
- **Treatment Disparity:** similar cases receive different recommendations. Example: different treatments recommended by race.
## Supported Subgroups
ATHENA supports both standard and custom demographic attributes:
### Standard Attributes

| Attribute | Values |
|-----------|--------|
| Gender | male, female, non-binary, other |
| Age Group | 18-24, 25-34, 35-44, 45-54, 55-64, 65+ |
| Ethnicity | Configurable per customer |
| Region | Configurable per customer |
### Custom Attributes
Define any subgroup relevant to your domain:
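For example, a custom attribute can be registered over the REST API. This is a minimal sketch: the host, endpoint path, and field names below are illustrative assumptions, not the documented contract, so consult the ATHENA API reference for the actual details.

```python
import requests

# Hypothetical sketch: host, path, and field names are assumptions;
# see the ATHENA API reference for the actual contract.
response = requests.post(
    "https://api.athena.example.com/v1/subgroups/attributes",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "name": "insurance_type",                       # custom attribute name
        "values": ["private", "medicare", "medicaid"],  # allowed subgroup values
    },
)
response.raise_for_status()
```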
## Detection Methods
### 1. Statistical Parity
Compare positive-outcome rates across groups:
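A minimal sketch of the idea in Python; the function name and data shape are illustrative assumptions, not ATHENA's internals.

```python
def statistical_parity_gap(outcomes_by_group: dict[str, list[int]]) -> float:
    """Gap between the highest and lowest positive-outcome rates.

    Outcomes are coded 1 (positive) / 0 (negative), keyed by subgroup.
    """
    rates = {
        group: sum(outcomes) / len(outcomes)
        for group, outcomes in outcomes_by_group.items()
        if outcomes
    }
    return max(rates.values()) - min(rates.values())

# 80% positive rate for men vs. 60% for women -> gap of 20 percentage points.
gap = statistical_parity_gap({
    "male":   [1, 1, 1, 1, 0],
    "female": [1, 1, 1, 0, 0],
})
print(f"{gap:.0%}")  # 20%
```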
### 2. Accuracy Disparity
Compare AI accuracy across groups:
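Sketched the same way; the record layout and names are illustrative assumptions.

```python
from collections import defaultdict

def accuracy_by_group(records: list[dict]) -> dict[str, float]:
    """Per-subgroup accuracy; each record has 'group', 'prediction', 'actual'."""
    correct: dict[str, int] = defaultdict(int)
    total: dict[str, int] = defaultdict(int)
    for record in records:
        total[record["group"]] += 1
        correct[record["group"]] += record["prediction"] == record["actual"]
    return {group: correct[group] / n for group, n in total.items()}

rates = accuracy_by_group([
    {"group": "male",   "prediction": "admit", "actual": "admit"},
    {"group": "male",   "prediction": "admit", "actual": "admit"},
    {"group": "female", "prediction": "admit", "actual": "refer"},
    {"group": "female", "prediction": "admit", "actual": "admit"},
])
print(rates)                                      # {'male': 1.0, 'female': 0.5}
print(max(rates.values()) - min(rates.values()))  # accuracy disparity: 0.5
```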
### 3. Four-Fifths Rule (EEOC Standard)

If any group's selection rate is less than 80% of the highest group's rate, the rule is violated:
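A sketch of that check; the 80% threshold comes from the EEOC rule itself, while the function name and input shape are illustrative.

```python
def four_fifths_violations(selection_rates: dict[str, float]) -> list[str]:
    """Flag groups whose selection rate falls below 80% of the highest rate."""
    highest = max(selection_rates.values())
    return [
        group for group, rate in selection_rates.items()
        if rate < 0.8 * highest
    ]

# Women selected at 50% vs. men at 70%: 0.50 / 0.70 is roughly 71% < 80%.
print(four_fifths_violations({"male": 0.70, "female": 0.50}))  # ['female']
```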
## Severity Levels

| Severity | Criteria | Response |
|----------|----------|----------|
| High | Disparity >20% OR affects >100 users | Immediate intervention required |
| Medium | Disparity 10-20% OR affects 50-100 users | Review within 7 days |
| Low | Disparity <10% OR affects <50 users | Monitor and track |
## API Examples
### Detect Bias in Real-Time
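A hedged sketch of what such a call might look like; the endpoint, request fields, and response schema are assumptions, so check the ATHENA API reference for the real contract.

```python
import requests

# Hypothetical endpoint and fields; adapt to the actual ATHENA API reference.
response = requests.post(
    "https://api.athena.example.com/v1/bias/detect",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "model_id": "triage-v2",   # model under evaluation
        "attribute": "gender",     # subgroup attribute to slice on
        "window": "24h",           # evaluate interactions from the last 24 hours
    },
)
response.raise_for_status()
for finding in response.json().get("findings", []):
    print(finding["subgroup"], finding["metric"], finding["severity"])
```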
### Get Subgroup Performance
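Again as an illustrative sketch, with the path, query parameters, and response fields assumed:

```python
import requests

# Hypothetical endpoint; the path and response schema are assumptions.
response = requests.get(
    "https://api.athena.example.com/v1/models/triage-v2/subgroup-performance",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    params={"attribute": "age_group", "metric": "accuracy"},
)
for row in response.json().get("subgroups", []):
    print(f'{row["value"]}: {row["accuracy"]:.1%} (n={row["sample_size"]})')
```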
## Bias Alert Feed
Real-time feed of bias alerts:
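One way to consume the feed, sketched with an assumed polling endpoint and filter parameters:

```python
import requests

# Hypothetical feed endpoint; parameter and field names are assumptions.
response = requests.get(
    "https://api.athena.example.com/v1/bias/alerts",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    params={"since": "2025-01-01T00:00:00Z", "severity": "high"},
)
for alert in response.json().get("alerts", []):
    print(alert["detected_at"], alert["subgroup"], alert["disparity"])
```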
## Compliance Mapping

| Regulation | Requirement | ATHENA Capability |
|------------|-------------|-------------------|
| EU AI Act Art. 10 | Training data bias | Representation disparity detection |
| FDA AI/ML IV.B | Demographic performance | Accuracy disparity by subgroup |
| Texas TRAIGA | Bias detection | Real-time bias feed |
| Colorado AI Act | Impact assessment | Subgroup performance reports |
## Webhooks
Set up real-time alerts for bias events:
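A sketch of registering a webhook subscription; the registration endpoint and event names below are assumptions.

```python
import requests

# Hypothetical registration endpoint and event names.
response = requests.post(
    "https://api.athena.example.com/v1/webhooks",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "url": "https://your-app.example.com/hooks/athena",  # your receiver
        "events": ["bias.detected"],                          # assumed event name
    },
)
response.raise_for_status()
```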
**Webhook Payload:**
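An illustrative payload shape; the field names below are assumptions and the actual schema may differ.

```json
{
  "event": "bias.detected",
  "detected_at": "2025-06-01T12:00:00Z",
  "model_id": "triage-v2",
  "attribute": "gender",
  "subgroup": "female",
  "metric": "accuracy_disparity",
  "disparity": 0.23,
  "severity": "high",
  "affected_users": 142
}
```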