
Conformance Testing

How to run conformance tests locally and in CI, interpret results, and update fixtures.

Running Conformance Tests

Local Execution

# Clone the conformance repository
git clone https://github.com/RegistryAccord/registryaccord-conformance.git
cd registryaccord-conformance

# Install dependencies
npm install

# Start devstack in another terminal
# (from devstack directory)
make up

# Run all conformance tests
make test

# Run specific service tests
make test-identity
make test-cdv
make test-gateway

# Run with verbose output
make test VERBOSE=1

Test Environment Setup

Ensure the following environment variables are set:

export RA_IDENTITY_BASE_URL=http://localhost:3000
export RA_CDV_BASE_URL=http://localhost:3001
export RA_GATEWAY_BASE_URL=http://localhost:3002

Or create a .env.local file with these values.
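If you are building helpers on top of the suite, resolving these variables is straightforward. Below is a minimal TypeScript sketch using the variable names above; the localhost fallbacks mirror the devstack ports and are an assumed convenience, not part of the suite.

// env.ts — resolve service base URLs from the environment.
// Variable names come from the setup above; the localhost defaults are an
// assumption mirroring the devstack ports.
export interface ServiceUrls {
  identity: string;
  cdv: string;
  gateway: string;
}

export function loadServiceUrls(env: NodeJS.ProcessEnv = process.env): ServiceUrls {
  return {
    identity: env.RA_IDENTITY_BASE_URL ?? "http://localhost:3000",
    cdv: env.RA_CDV_BASE_URL ?? "http://localhost:3001",
    gateway: env.RA_GATEWAY_BASE_URL ?? "http://localhost:3002",
  };
}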

CI Execution

Conformance tests run automatically in CI for all pull requests:

# GitHub Actions workflow snippet
- name: Run conformance tests
  run: |
    cd registryaccord-conformance
    npm install
    make test
  env:
    RA_IDENTITY_BASE_URL: http://localhost:3000
    RA_CDV_BASE_URL: http://localhost:3001
    RA_GATEWAY_BASE_URL: http://localhost:3002

Quality Gates

  • Threshold: ≥95% pass rate required for all services
  • Blocking: PRs are blocked if the threshold is not met (see the gate-check sketch after this list)
  • Reporting: JSON, JUnit, and HTML reports are generated
  • Artifacts: Test results are attached to the CI build
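The gate itself reduces to a pass-rate comparison against the JSON report. A minimal gate-check sketch, assuming the report shown later on this page is written to reports/conformance.json (the path and this script are illustrative, not part of the suite):

// check-gate.ts — fail the build when the pass rate drops below the threshold.
import { readFileSync } from "node:fs";

const THRESHOLD = 95; // ≥95% pass rate required for all services

// Path is an assumption; point it at wherever the suite writes its JSON report.
const report = JSON.parse(readFileSync("reports/conformance.json", "utf8"));
const passRate: number = report.summary.passRate;

if (passRate < THRESHOLD) {
  console.error(`Conformance gate failed: ${passRate}% < ${THRESHOLD}%`);
  process.exit(1); // non-zero exit marks the PR as blocked in CI
}
console.log(`Conformance gate passed: ${passRate}%`);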

Test Categories

Positive Tests

Verify correct behavior for valid inputs:

  • Successful DID creation and resolution (see the sketch after this list)
  • Proper JWT issuance and validation
  • Correct record creation and listing
  • Valid media upload and finalization
  • Proper feed generation and search
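To make the first item concrete, here is a minimal sketch of a DID create-and-resolve check in a Jest-style test. The /v1/dids routes, the 201 status, and the did response field are illustrative assumptions; the real suite defines its own helpers and endpoints.

// Sketch of a positive conformance check: create a DID, then resolve it.
// Endpoint paths and response fields below are assumptions for illustration.
import { test, expect } from "@jest/globals";

const identityBase = process.env.RA_IDENTITY_BASE_URL ?? "http://localhost:3000";

test("IdentityService: created DID can be resolved", async () => {
  const createRes = await fetch(`${identityBase}/v1/dids`, { method: "POST" });
  expect(createRes.status).toBe(201);

  const { did } = await createRes.json();
  const resolveRes = await fetch(`${identityBase}/v1/dids/${encodeURIComponent(did)}`);
  expect(resolveRes.status).toBe(200);
});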

Negative Tests

Verify proper error handling for invalid inputs:

  • Expired or already-used nonce handling
  • Unknown or retired key ID (kid) validation
  • Malformed cursor rejection (see the sketch after this list)
  • Oversized or disallowed media rejection
  • Schema-invalid record rejection
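As an example of the malformed-cursor case, a negative check only needs to confirm that the service rejects the input with an error envelope rather than returning a partial result. The route, query parameter, and error field below are illustrative assumptions.

// Sketch of a negative conformance check: a malformed cursor must be rejected.
// The route, query parameter, and error envelope shape are assumptions.
import { test, expect } from "@jest/globals";

const cdvBase = process.env.RA_CDV_BASE_URL ?? "http://localhost:3001";

test("CDV: malformed pagination cursor is rejected", async () => {
  const res = await fetch(`${cdvBase}/v1/records?cursor=not-a-valid-cursor`);
  expect(res.status).toBe(400);

  const body = await res.json();
  expect(body.error).toBeDefined(); // error envelope, not a partial result
});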

Performance Tests

Verify system performance under load:

  • P95 latency assertions (see the sketch after this list)
  • Concurrent request handling
  • Resource utilization monitoring
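A P95 assertion boils down to sampling request latencies and comparing the 95th percentile against a budget. Below is a minimal sketch; the percentile helper and sample count are our own, and the 500 ms budget is taken from the Performance Requirements later on this page.

// Sketch: measure a read endpoint N times and compute the 95th percentile
// latency (nearest-rank method). Helper code is illustrative.
function p95(samples: number[]): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const idx = Math.min(sorted.length - 1, Math.ceil(sorted.length * 0.95) - 1);
  return sorted[idx];
}

async function measureP95(url: string, samples = 50): Promise<number> {
  const latencies: number[] = [];
  for (let i = 0; i < samples; i++) {
    const start = performance.now();
    await fetch(url);
    latencies.push(performance.now() - start);
  }
  return p95(latencies);
}

// Usage against the read-path budget (route is illustrative):
// expect(await measureP95(`${gatewayBase}/v1/feeds/following`)).toBeLessThan(500);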

Interpreting Results

Report Formats

JSON Report

{
  "summary": {
    "total": 100,
    "passed": 96,
    "failed": 4,
    "passRate": 96.0
  },
  "failedTests": [
    {
      "name": "IdentityService.keyRotationOverlap",
      "error": "Key rotation window not properly validated",
      "correlationId": "abc123"
    }
  ]
}
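When post-processing this report in tooling, it helps to give it a type. A minimal sketch that matches the fields shown above and prints failed tests with their correlation IDs for log lookup (the report path is an assumption):

// Sketch: a typed view of the JSON report for triaging failures.
import { readFileSync } from "node:fs";

interface ConformanceReport {
  summary: { total: number; passed: number; failed: number; passRate: number };
  failedTests: { name: string; error: string; correlationId: string }[];
}

const report: ConformanceReport = JSON.parse(
  readFileSync("reports/conformance.json", "utf8"), // path is an assumption
);

for (const failure of report.failedTests) {
  console.log(`${failure.name}: ${failure.error} (correlationId=${failure.correlationId})`);
}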

JUnit Report

Compatible with CI systems for test result visualization.

HTML Report

Human-readable format with detailed test information.

Common Failure Patterns

  1. Envelope Format Issues: Response envelope doesn't match the specification
  2. Error Taxonomy Mismatches: Error codes or error structure are incorrect
  3. Timestamp Inconsistencies: RFC 3339 formatting issues (a validation sketch follows this list)
  4. Pagination Problems: Cursor encoding/decoding failures
  5. Authentication Failures: JWT validation issues
  6. Schema Validation Errors: Records are not properly validated against their schema
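Pattern 3 is usually caught with a strict shape check rather than Date parsing alone, since Date.parse accepts many strings that are not valid RFC 3339. A minimal sketch; the regex is our own simplification of RFC 3339, not something taken from the suite:

// Sketch: strict-ish RFC 3339 timestamp check. Validate the shape first,
// then confirm the value actually parses as a date.
const RFC3339 =
  /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?(Z|[+-]\d{2}:\d{2})$/;

function isRfc3339(value: string): boolean {
  return RFC3339.test(value) && !Number.isNaN(Date.parse(value));
}

// isRfc3339("2025-01-15T12:34:56Z")  -> true
// isRfc3339("2025-01-15 12:34:56")   -> false (space instead of 'T', no offset)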

Updating Golden Fixtures

When API changes require fixture updates:

# Regenerate fixtures after API changes
make regenerate-fixtures

# Update specific service fixtures
make regenerate-identity-fixtures
make regenerate-cdv-fixtures
make regenerate-gateway-fixtures

Fixture Structure

fixtures/
├── identity/
│   ├── create-valid.json
│   ├── resolve-valid.json
│   └── session-valid.json
├── cdv/
│   ├── record-create-valid.json
│   ├── record-list-valid.json
│   └── media-finalize-valid.json
└── gateway/
    ├── feed-following-valid.json
    ├── feed-author-valid.json
    └── search-valid.json
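Golden fixtures are typically consumed by loading the stored JSON and deep-comparing it against a live response. Below is a minimal sketch; treating each fixture file as an expected response body is an assumption about its contents, and the example route is illustrative.

// Sketch: compare a live response against a golden fixture.
import { readFileSync } from "node:fs";
import { deepStrictEqual } from "node:assert";

async function checkAgainstFixture(fixturePath: string, url: string): Promise<void> {
  const expected = JSON.parse(readFileSync(fixturePath, "utf8"));
  const actual = await (await fetch(url)).json();
  deepStrictEqual(actual, expected); // throws with a readable diff on mismatch
}

// Usage (route is illustrative):
// await checkAgainstFixture(
//   "fixtures/gateway/search-valid.json",
//   `${process.env.RA_GATEWAY_BASE_URL}/v1/search?q=example`,
// );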

Thresholds and Requirements

Service Requirements

Service  | Pass Rate | Total Tests | Notes
Identity | ≥95%      | ~50 tests   | Includes security validation
CDV      | ≥95%      | ~70 tests   | Includes schema validation
Gateway  | ≥95%      | ~40 tests   | Includes pagination tests

Performance Requirements

  • P95 latency < 500ms for read operations
  • P95 latency < 1000ms for write operations
  • Concurrent requests: 100 simultaneous users (see the sketch after this list)
  • Resource utilization < 80% under load
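The concurrency requirement can be exercised by firing the requests in parallel and confirming that every one succeeds. A minimal sketch; the target route is illustrative and the concurrency level matches the 100 simultaneous users above:

// Sketch: issue 100 concurrent requests and verify that all of them succeed.
async function concurrentCheck(url: string, users = 100): Promise<void> {
  const results = await Promise.all(
    Array.from({ length: users }, () => fetch(url)),
  );
  const failures = results.filter((res) => !res.ok);
  if (failures.length > 0) {
    throw new Error(`${failures.length}/${users} concurrent requests failed`);
  }
}

// Usage (route is illustrative):
// await concurrentCheck(`${process.env.RA_GATEWAY_BASE_URL}/v1/feeds/following`);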

Debugging Test Failures

Local Debugging

# Run specific failing test
npm test -- --testNamePattern="IdentityService.keyRotation"

# Run with debug logging
DEBUG=registryaccord:* make test

# Run with increased timeout
TIMEOUT=30000 make test

CI Debugging

  1. Check test logs in GitHub Actions
  2. Download artifact reports
  3. Reproduce locally with same environment
  4. Verify against specification requirements

Best Practices

  1. Run tests regularly during development
  2. Check conformance before submitting PRs
  3. Update fixtures when API changes
  4. Add negative tests for new error cases
  5. Monitor performance metrics
  6. Review reports for patterns

Conformance testing ensures RegistryAccord implementations meet protocol requirements and maintain interoperability.