
Category: Testing · October 1, 2025 · Source: awesome-copilot


Tags: Testing, Unit Tests, TDD, Quality Assurance
# Generate Unit Tests

Create comprehensive unit tests for the selected code, function, or class following testing best practices and using appropriate testing frameworks.

## Test Generation Guidelines

### 1. Test Framework Selection
- Detect the project's testing framework (e.g., Jest, pytest, JUnit) and its mocking library (e.g., Moq, unittest.mock)
- Use framework-specific syntax and conventions
- Include necessary imports and setup code

### 2. Test Coverage Areas

**Functionality Testing**
- Happy path scenarios
- Expected inputs and outputs
- Return values and state changes
- Method chaining and composition
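
A happy-path test can be sketched as follows, in pytest style; `calculate_sum` is a hypothetical function standing in for the real code under test:

```python
def calculate_sum(numbers):
    # Hypothetical function under test; normally this would be imported.
    return sum(numbers)

def test_calculate_sum_returns_total_for_positive_numbers():
    # Arrange: prepare the expected input
    numbers = [1, 2, 3]
    # Act: run the code under test
    result = calculate_sum(numbers)
    # Assert: verify the return value
    assert result == 6
```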

**Edge Cases**
- Boundary values
- Empty inputs (null, undefined, empty string, empty array)
- Maximum and minimum values
- Special characters and unusual inputs
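
Edge cases like these are a natural fit for parameterized tests. A sketch using pytest's `parametrize`, again with a hypothetical `calculate_sum` that treats `None` as an empty input:

```python
import pytest

def calculate_sum(numbers):
    # Hypothetical function under test; None is treated as empty input.
    return sum(numbers or [])

@pytest.mark.parametrize("numbers, expected", [
    ([], 0),                   # empty list
    (None, 0),                 # null input
    ([0], 0),                  # boundary value
    ([-1, 1], 0),              # mixed signs
    ([10**9, 1], 10**9 + 1),   # large values
])
def test_calculate_sum_handles_edge_cases(numbers, expected):
    assert calculate_sum(numbers) == expected
```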

**Error Handling**
- Invalid inputs
- Exception scenarios
- Error messages validation
- Graceful degradation
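
Exception scenarios can be verified with pytest's `raises` context manager, which also lets you match on the error message. A sketch with a hypothetical `divide` function:

```python
import pytest

def divide(a, b):
    # Hypothetical function under test.
    if b == 0:
        raise ValueError("division by zero is not allowed")
    return a / b

def test_divide_raises_value_error_when_divisor_is_zero():
    # The match argument checks the error message as well as the type.
    with pytest.raises(ValueError, match="division by zero"):
        divide(10, 0)
```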

**Integration Points**
- External dependencies (mock or stub)
- Database interactions
- API calls
- File system operations
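
One way to stub out an external call is `unittest.mock.patch.object`. In this sketch, `UserClient` is a hypothetical API client whose network call is replaced for the test:

```python
from unittest.mock import patch

class UserClient:
    # Hypothetical API client; in production fetch_user performs HTTP I/O.
    def fetch_user(self, user_id):
        raise RuntimeError("no network access in tests")

def get_user_name(client, user_id):
    # Code under test: reads a field from the API response.
    return client.fetch_user(user_id)["name"]

def test_get_user_name_returns_name_from_api_response():
    # Stub the network call so the test is fast and isolated.
    with patch.object(UserClient, "fetch_user",
                      return_value={"name": "Ada"}) as mock_fetch:
        client = UserClient()
        assert get_user_name(client, 1) == "Ada"
        mock_fetch.assert_called_once_with(1)
```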

### 3. Test Structure

Each test should follow the **Arrange-Act-Assert** (AAA) pattern:

    // Arrange: Set up test data and dependencies
    // Act: Execute the code under test
    // Assert: Verify the expected outcome


Or **Given-When-Then** for BDD:

    // Given: Initial context
    // When: Action taken
    // Then: Expected result
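
Both patterns map onto the same three-phase shape. A small sketch, with a hypothetical `apply_discount` function and the phases labeled in both vocabularies:

```python
def apply_discount(price, percent):
    # Hypothetical function under test.
    return round(price * (1 - percent / 100), 2)

def test_apply_discount_reduces_price():
    # Arrange (Given): a price and a discount rate
    price, percent = 100.0, 20
    # Act (When): the discount is applied
    result = apply_discount(price, percent)
    # Assert (Then): the price is reduced accordingly
    assert result == 80.0
```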


### 4. Test Naming Convention

Use descriptive test names that clearly indicate:
- What is being tested
- Under what conditions
- What the expected outcome is

Examples:
- `test_calculate_sum_returns_correct_result_for_positive_numbers`
- `should_throw_error_when_input_is_null`
- `getUserById_returns_user_when_exists`

### 5. Mocking and Stubbing

- Mock external dependencies (APIs, databases, file systems)
- Use dependency injection for testability
- Isolate the unit under test
- Verify mock interactions when relevant
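
Dependency injection plus a mock makes both isolation and interaction verification straightforward. A sketch with a hypothetical `OrderService` that receives its payment gateway through the constructor:

```python
from unittest.mock import Mock

class OrderService:
    # Hypothetical class under test; the gateway is injected for testability.
    def __init__(self, payment_gateway):
        self.payment_gateway = payment_gateway

    def checkout(self, amount):
        self.payment_gateway.charge(amount)
        return "confirmed"

def test_checkout_charges_gateway_exactly_once():
    gateway = Mock()                 # stands in for the real dependency
    service = OrderService(gateway)
    assert service.checkout(42) == "confirmed"
    # Verify the interaction with the dependency.
    gateway.charge.assert_called_once_with(42)
```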

### 6. Test Data

- Use realistic test data
- Create test fixtures or factories for complex objects
- Consider parameterized tests for multiple scenarios
- Use descriptive variable names in tests
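
A simple factory plus a pytest fixture keeps test data realistic while letting each test override only what it cares about. The `make_user` factory and its fields here are illustrative:

```python
import pytest

def make_user(**overrides):
    # Factory: sensible defaults, overridable per test (fields are illustrative).
    user = {"id": 1, "name": "Ada Lovelace", "active": True}
    user.update(overrides)
    return user

@pytest.fixture
def inactive_user():
    return make_user(active=False)

def test_inactive_user_is_flagged(inactive_user):
    assert inactive_user["active"] is False
```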

## Expected Output

Generate tests that include:

1. **Test file structure** matching project conventions
2. **Setup and teardown** methods if needed
3. **Mock definitions** for dependencies
4. **Individual test cases** covering all scenarios
5. **Assertions** verifying expected behavior
6. **Comments** explaining complex test logic
7. **Test utilities** or helper functions if beneficial

## Best Practices

- Keep tests independent and isolated
- Make tests deterministic (no random values or time dependencies)
- Test one thing per test case
- Use meaningful assertion messages
- Avoid testing implementation details
- Make tests fast and reliable
- Follow the F.I.R.S.T. principles:
  - Fast
  - Independent
  - Repeatable
  - Self-validating
  - Timely
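
One way to keep a test repeatable despite randomness is to inject a seeded random generator rather than relying on the global one. A sketch with a hypothetical `sample_winner` function:

```python
import random

def sample_winner(entrants, rng=None):
    # Hypothetical function under test; accepts an injectable RNG
    # so tests can make the "random" choice repeatable.
    rng = rng or random.Random()
    return rng.choice(entrants)

def test_sample_winner_is_repeatable_with_seeded_rng():
    entrants = ["alice", "bob", "carol"]
    first = sample_winner(entrants, rng=random.Random(42))
    second = sample_winner(entrants, rng=random.Random(42))
    assert first == second  # same seed, same result
```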

## Coverage Goals

Aim for:
- 100% coverage of public interfaces
- Critical path coverage
- Error handling coverage
- Edge case coverage
- Integration point coverage (with mocks)