
hitspec contract

Verify API contracts against a live provider.
hitspec contract verify <contracts-dir> [flags]

Arguments

Argument          Description
<contracts-dir>   Directory containing contract .http files (or a single file)

Flags

Flag              Short   Description                      Default
--provider        -p      Provider URL (required)
--state-handler           Path to a state handler script
--verbose         -v      Enable verbose output            false

Contract Annotations

Define contract metadata using annotations in your .http files:
### Get User
# @name getUser
# @contract.provider UserService
# @contract.state "user exists with id 1"
GET {{provider}}/users/1

>>>
expect status == 200
expect body.id == 1
expect body.name type string
<<<

Annotation           Description
@contract.provider   Name of the provider service
@contract.state      Provider state required for this interaction

State Handler

The --state-handler flag specifies a script that sets up provider state before each interaction. The script receives the state description as an argument:
#!/bin/bash
# setup-states.sh
case "$1" in
  "user exists with id 1")
    curl -s -X POST http://localhost:3000/_setup -d '{"user_id": 1}'
    ;;
  "no users exist")
    curl -s -X POST http://localhost:3000/_reset
    ;;
esac
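
Before wiring a handler into hitspec, it can help to dry-run the dispatch pattern offline. The sketch below mirrors the script above but echoes instead of calling the provider (the echoed messages and the setup_state name are illustrative, not part of hitspec):

```shell
#!/bin/bash
# Offline sketch of the state-dispatch pattern: echo instead of real
# setup calls, so the case arms can be sanity-checked before a live run.
setup_state() {
  case "$1" in
    "user exists with id 1")
      echo "would seed user 1"
      ;;
    "no users exist")
      echo "would reset users"
      ;;
    *)
      echo "unhandled state: $1" >&2
      return 1
      ;;
  esac
}

setup_state "user exists with id 1"   # prints: would seed user 1
setup_state "no users exist"          # prints: would reset users
```

The catch-all arm returns non-zero for an unrecognized state, which surfaces a missing case early instead of silently skipping setup.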

Examples

hitspec contract verify contracts/ --provider http://localhost:3000

Sample Output

Contract Verification Results

contracts/user-service.http
  + getUser [passed] (45ms)
  + createUser [passed] (120ms)
  x deleteUser [failed] (89ms)
    -> status ==
      Expected: 204
      Actual:   404

Summary
  Contracts: 1 file(s)
  Interactions: 2 passed, 1 failed, 3 total

Use Cases

  • Consumer-driven contracts — Define what your service expects from its dependencies, then verify those contracts against the real provider.
  • API versioning — Ensure that provider changes do not break existing consumers.
  • CI/CD gates — Run contract verification as part of the provider’s deployment pipeline.

hitspec diff

Compare two JSON test result files to identify regressions, improvements, and changes between test runs.
hitspec diff <results1.json> <results2.json> [flags]

Arguments

Argument          Description
<results1.json>   Baseline test results (JSON output from hitspec run --output json)
<results2.json>   Current test results to compare against the baseline

Flags

Flag          Short   Description                                                 Default
--output      -o      Output format: console, json, html                          console
--threshold           Fail if any test is slower by this percentage (e.g., 10%)

How It Works

The diff command compares tests by name and file, then categorizes each as:

Status      Meaning
improved    Test now passes (previously failed) or is >10% faster
regressed   Test now fails (previously passed) or is >10% slower
unchanged   No significant change in status or duration
new         Test exists only in the second result file
removed     Test exists only in the first result file
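
The classification rules can be sketched in a few lines of awk over simplified "name status duration-ms" records (a hypothetical flat format for illustration only; hitspec itself compares the JSON files directly):

```shell
# Hypothetical flat records: "name status duration_ms". This is NOT
# hitspec's JSON schema, just enough data to demonstrate the rules.
cat > baseline.txt <<'EOF'
healthCheck pass 45
login pass 120
getProfile pass 89
oldEndpoint pass 60
EOF
cat > current.txt <<'EOF'
healthCheck pass 42
login pass 125
getProfile pass 150
newEndpoint pass 30
EOF

# Classify each test: status flips win over the >10% duration rule.
result=$(awk '
NR==FNR { seen1[$1]=1; st1[$1]=$2; d1[$1]=$3; next }
{
  seen2[$1]=1
  if (!($1 in seen1))                     c = "new"
  else if (st1[$1]=="fail" && $2=="pass") c = "improved"
  else if (st1[$1]=="pass" && $2=="fail") c = "regressed"
  else if ($3 > d1[$1] * 1.10)            c = "regressed"
  else if ($3 < d1[$1] * 0.90)            c = "improved"
  else                                    c = "unchanged"
  print $1, c
}
END { for (t in seen1) if (!(t in seen2)) print t, "removed" }
' baseline.txt current.txt)
echo "$result"
```

With this data, getProfile regresses (150ms is more than 10% over 89ms), newEndpoint is new, oldEndpoint is removed, and the small swings on healthCheck and login fall within the 10% band, so they stay unchanged.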

Generating Input Files

First, generate JSON results from your test runs:
# Baseline run
hitspec run tests/ --output json --output-file baseline.json

# After changes
hitspec run tests/ --output json --output-file current.json

Examples

hitspec diff baseline.json current.json

Console Output

Test Results Comparison
  File 1: baseline.json
  File 2: current.json

Summary
  Total Tests:    5
  Improved:       1
  Regressed:      1
  Unchanged:      3

Duration
  Total (File 1): 450ms
  Total (File 2): 520ms
  Avg (File 1):   90ms
  Avg (File 2):   104ms

Test Details
  ^ healthCheck  45ms -> 42ms  -6.7%
  = login        120ms -> 125ms +4.2%
  v getProfile   89ms -> 150ms +68.5%
  + newEndpoint  (new, 30ms)
  - oldEndpoint  (removed)

x Threshold check failed (some tests exceeded 10% regression)

Threshold Check

When --threshold is set, the command exits with a non-zero status if any test’s duration increases by more than the specified percentage. This is useful for catching performance regressions in CI:
hitspec diff baseline.json current.json --threshold 15%
# Exit code 0 if all tests are within 15% of baseline
# Exit code 1 if any test regressed by more than 15%
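
The per-test percentage works out as (current − baseline) / baseline × 100; the +68.5% shown next to getProfile in the sample output above follows from its 89ms → 150ms durations:

```shell
# Regression percentage for the getProfile row in the sample output:
# (current - baseline) / baseline * 100, so 150ms vs 89ms is +68.5%.
awk 'BEGIN { baseline = 89; current = 150
             printf "%+.1f%%\n", (current - baseline) / baseline * 100 }'
# prints: +68.5%
```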

CI/CD Integration

# GitHub Actions
- name: Run baseline tests
  run: hitspec run tests/ --output json --output-file baseline.json

- name: Apply changes
  run: # ... your deployment or code change steps

- name: Run current tests
  run: hitspec run tests/ --output json --output-file current.json

- name: Check for regressions
  run: hitspec diff baseline.json current.json --threshold 10%