Testing Strategy - Garume/Manifold GitHub Wiki

Testing Strategy

This page documents the testing approach used across the Manifold project, covering all five test projects, the xUnit v3 framework configuration, code coverage enforcement, test organization patterns, and strategies for testing source-generated code. The testing infrastructure ensures correctness of the core contracts, the source generator, both runtime surfaces (CLI and MCP), and the sample host applications.

The test suite is tightly integrated into the build system and CI/CD pipeline, where the quality.ps1 orchestration script runs all tests as part of a unified quality gate that also includes formatting checks and architecture validation.

Test Framework and Infrastructure

xUnit v3 with Microsoft Testing Platform

All test projects use xUnit v3 (version 3.2.2) with the Microsoft Testing Platform runner (xunit.v3.mtp-v2). This is configured globally in the repository root Directory.Build.props for any project that sets IsManifoldTestProject=true.

<PropertyGroup Condition="'$(IsManifoldTestProject)' == 'true'">
  <OutputType>Exe</OutputType>
  <UseMicrosoftTestingPlatformRunner>true</UseMicrosoftTestingPlatformRunner>
  <TestingPlatformDotnetTestSupport>true</TestingPlatformDotnetTestSupport>
</PropertyGroup>

<ItemGroup Condition="'$(IsManifoldTestProject)' == 'true'">
  <PackageReference Include="coverlet.MTP" Version="8.0.0" />
  <PackageReference Include="xunit.analyzers" Version="1.27.0">
    <PrivateAssets>all</PrivateAssets>
    <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
  </PackageReference>
  <Content Include="$(MSBuildThisFileDirectory)xunit.runner.json"
           Link="xunit.runner.json"
           CopyToOutputDirectory="PreserveNewest" />
</ItemGroup>

Sources: Directory.Build.props:18-33

Key aspects of this configuration:

| Setting | Value | Purpose |
| --- | --- | --- |
| OutputType | Exe | Required by the xUnit v3 MTP runner |
| UseMicrosoftTestingPlatformRunner | true | Enables the Microsoft Testing Platform |
| TestingPlatformDotnetTestSupport | true | Enables dotnet test integration |
| coverlet.MTP | 8.0.0 | Code coverage collection |
| xunit.analyzers | 1.27.0 | Static analysis rules for test code |

All test projects declare a global using for Xunit and target net10.0.
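
Such a global using is conventionally declared through MSBuild; a hedged sketch of what that declaration typically looks like (the exact props entry in this repository is not reproduced on this page):

```xml
<!-- Illustrative: makes "using Xunit;" implicit in every test source file. -->
<ItemGroup Condition="'$(IsManifoldTestProject)' == 'true'">
  <Using Include="Xunit" />
</ItemGroup>
```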

Test Projects Overview

The repository contains five test projects under the tests/ directory, each targeting a specific package or integration concern.

flowchart TD
    A[Manifold.Tests] -->|tests| B[Manifold]
    C[Manifold.Cli.Tests] -->|tests| D[Manifold.Cli]
    C -->|uses generator| E[Manifold.Generators]
    F[Manifold.Generators.Tests] -->|tests| E
    G[Manifold.Mcp.Tests] -->|tests| H[Manifold.Mcp]
    G -->|uses generator| E
    I[Manifold.Samples.Tests] -->|smoke tests| J[Sample Hosts]

    style A fill:#4a9eff,color:#fff
    style C fill:#4a9eff,color:#fff
    style F fill:#4a9eff,color:#fff
    style G fill:#4a9eff,color:#fff
    style I fill:#f5a623,color:#fff
| Test Project | Package Under Test | Coverage Threshold | Key Dependencies |
| --- | --- | --- | --- |
| Manifold.Tests | Manifold | 90% | xunit.v3.mtp-v2 |
| Manifold.Cli.Tests | Manifold.Cli | 90% | xunit.v3.mtp-v2, DI, Generator (as Analyzer) |
| Manifold.Generators.Tests | Manifold.Generators | 90% | xunit.v3.mtp-v2, Roslyn CSharp 4.14.0 |
| Manifold.Mcp.Tests | Manifold.Mcp | 90% | xunit.v3.mtp-v2, DI, Generator (as Analyzer) |
| Manifold.Samples.Tests | Sample host apps | 0% | xunit.v3.mtp-v2 |

Sources: Manifold.Tests.csproj:1-18, Manifold.Cli.Tests.csproj:1-23, Manifold.Samples.Tests.csproj:1-15

Generator Reference Pattern

The CLI and MCP test projects reference Manifold.Generators as an analyzer rather than a standard project reference, enabling the source generator to run during their compilation. This allows these test projects to exercise the actual generated code.

<ProjectReference Include="..\..\src\Manifold.Generators\Manifold.Generators.csproj"
                  OutputItemType="Analyzer"
                  ReferenceOutputAssembly="false" />

Sources: Manifold.Cli.Tests.csproj:19-21

Code Coverage Enforcement

90% Threshold Policy

Every test project must declare a CoverageThreshold MSBuild property. The Directory.Build.targets file contains a target that fails the build if this property is missing:

<Target Name="FailIfCoverageThresholdMissing"
        BeforeTargets="Test"
        Condition="'$(IsManifoldTestProject)' == 'true' and '$(CoverageThreshold)' == ''">
  <Error Text="CoverageThreshold must be set for all test projects." />
</Target>

Sources: Directory.Build.targets:13-17

The four package test projects enforce a 90% code coverage threshold. The Manifold.Samples.Tests project uses a threshold of 0% because it performs smoke tests against compiled sample host processes, where coverage collection does not apply.
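
A test project therefore opts in with two properties; a plausible csproj fragment (illustrative, assembled from the property names above rather than copied from the repository):

```xml
<PropertyGroup>
  <!-- Marks the project so Directory.Build.props applies the MTP/xUnit configuration. -->
  <IsManifoldTestProject>true</IsManifoldTestProject>
  <!-- Minimum coverage percentage; the build fails if this property is absent. -->
  <CoverageThreshold>90</CoverageThreshold>
</PropertyGroup>
```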

flowchart TD
    A[Test Project Build] --> B{CoverageThreshold set?}
    B -->|No| C[Build Error]
    B -->|Yes| D{Run Tests}
    D --> E[Collect Coverage]
    E --> F{Coverage >= Threshold?}
    F -->|Yes| G[Pass]
    F -->|No| H[Fail]

    style C fill:#e74c3c,color:#fff
    style H fill:#e74c3c,color:#fff
    style G fill:#27ae60,color:#fff

Coverage is collected using Coverlet (version 8.0.0) via the coverlet.MTP package, which integrates directly with the Microsoft Testing Platform runner.

Sources: Directory.Build.props:25

Test Execution Pipeline

Build Scripts

The test.ps1 script discovers test projects from the solution file using XPath, then runs each project individually via dotnet test:

[xml]$solutionDocument = Get-Content (Resolve-Path $Solution)
$testProjects =
    $solutionDocument.SelectNodes('//Project[@Path]') |
    ForEach-Object { $_.Path } |
    Where-Object { $_ -match '^(tests[\\/].+\.csproj)$' } |
    ForEach-Object { (Resolve-Path (Join-Path $root $_)).Path }

Test output is written to .artifacts/test-output/{guid}/{projectName}/ for each run, using a unique GUID directory per execution to avoid collisions.

Sources: build/test.ps1:15-27
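
The path scheme can be sketched in C# terms (a hedged illustration of the layout, not code from test.ps1; the GUID format is an assumption):

```csharp
using System;
using System.IO;

static class TestOutputLayout
{
    // Builds .artifacts/test-output/{guid}/{projectName}/ under the given root;
    // a fresh GUID per run keeps repeated or concurrent executions from colliding.
    public static string CreateRunOutputDirectory(string repositoryRoot, string projectName)
    {
        string path = Path.Combine(
            repositoryRoot, ".artifacts", "test-output",
            Guid.NewGuid().ToString("N"), projectName);
        Directory.CreateDirectory(path);
        return path;
    }
}
```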

Quality Gate Orchestration

The quality.ps1 script runs tests as part of a five-step quality gate:

flowchart TD
    A[restore.ps1] --> B[build.ps1]
    B --> C[format.ps1]
    C --> D[test.ps1]
    D --> E[architecture.ps1]

    style D fill:#4a9eff,color:#fff
  1. Restore — package restoration
  2. Build — compile all projects
  3. Format — code style enforcement
  4. Test — run all test projects with coverage
  5. Architecture — structural invariant checks

Any step failure halts the pipeline immediately.

Sources: build/quality.ps1:1-35

CI Integration

The GitHub Actions CI workflow runs quality.ps1 on every push and pull request:

jobs:
  quality-and-test:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v5
      - uses: actions/setup-dotnet@v5
        with:
          global-json-file: global.json
          cache: true
      - name: Quality
        shell: pwsh
        run: ./build/quality.ps1

Sources: .github/workflows/ci.yml:1-24

Test Organization Patterns

File and Class Conventions

Each test file contains a single sealed test class following the naming pattern {Feature}Tests. All test methods use the [Fact] attribute (no parameterized [Theory] tests). File-scoped namespaces match the project namespace.

tests/
├── Manifold.Tests/
│   ├── ClassBasedOperationTests.cs
│   ├── OperationAttributeTests.cs
│   ├── OperationContextTests.cs
│   └── ParameterAttributeTests.cs
├── Manifold.Cli.Tests/
│   ├── CliApplicationTests.cs
│   ├── CliPerformanceTests.cs
│   ├── GeneratedCliInvokerTests.cs
│   └── Samples/SampleCliOperations.cs
├── Manifold.Generators.Tests/
│   ├── GeneratedOperationRegistryTests.cs
│   ├── OperationDescriptorGeneratorDiagnosticsTests.cs
│   └── Samples/
├── Manifold.Mcp.Tests/
│   ├── GeneratedMcpCatalogTests.cs
│   ├── GeneratedMcpInvokerTests.cs
│   ├── GeneratedMcpToolsTests.cs
│   ├── McpTextContentResponseWriterTests.cs
│   └── Samples/SampleMcpOperations.cs
└── Manifold.Samples.Tests/
    └── SampleHostSmokeTests.cs

Sample Operation Classes

Each test project that exercises generated code includes a Samples/ directory containing operation definitions annotated with [Operation] and related attributes. These serve as concrete inputs for the source generator during test compilation:

| Sample File | Project | Purpose |
| --- | --- | --- |
| SampleCliOperations.cs | Manifold.Cli.Tests | CLI-specific operations with aliases, DI, formatters |
| SampleMcpOperations.cs | Manifold.Mcp.Tests | MCP-specific operations with class-based patterns |
| SampleOperations.cs | Manifold.Generators.Tests | Static-method operations for registry validation |
| SampleClassOperations.cs | Manifold.Generators.Tests | IOperation<TRequest, TResult> class-based pattern |
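
To make the shape concrete, here is a hedged sketch of a static-method sample operation. The attribute declarations are stand-ins defined inline so the sketch compiles on its own; only the operation id "sample.hello", the description "Say hello.", and the method name Hello are confirmed by the registry tests below, and the real Manifold attribute signatures may differ.

```csharp
using System;

// Stand-in attributes for illustration only; the real Manifold attributes
// live in the Manifold package and may have different signatures.
[AttributeUsage(AttributeTargets.Method | AttributeTargets.Class)]
sealed class OperationAttribute : Attribute
{
    public OperationAttribute(string id) => Id = id;
    public string Id { get; }
    public string? Description { get; set; }
}

[AttributeUsage(AttributeTargets.Parameter)]
sealed class ArgumentAttribute : Attribute { }

// Shape of a Samples/ definition: a static method the generator turns
// into an operation descriptor at build time.
static class SampleOperations
{
    [Operation("sample.hello", Description = "Say hello.")]
    public static string Hello([Argument] string name) => $"Hello, {name}!";
}
```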

Test Helper Types

Tests use lightweight helper types rather than a shared test utilities library:

| Type | Location | Purpose |
| --- | --- | --- |
| IMathOffsetProvider / ConstantMathOffsetProvider | SampleCliOperations.cs | DI testing for CLI surface |
| IGreetingPrefixProvider / ConstantGreetingPrefixProvider | SampleMcpOperations.cs | DI testing for MCP surface |
| WeatherPreviewFormatter | SampleCliOperations.cs | IResultFormatter<T> testing |
| NullCliInvoker | CliApplicationTests.cs | Mock invoker for failure paths |
| RawJsonCliInvoker | CliApplicationTests.cs | Mock invoker for raw JSON output |
| DictionaryServiceProvider | OperationContextTests.cs | Minimal IServiceProvider mock |
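
As an example of the lightweight-helper style, here is a hedged sketch of what a dictionary-backed IServiceProvider mock might look like (the actual DictionaryServiceProvider in OperationContextTests.cs may differ):

```csharp
using System;
using System.Collections.Generic;

sealed class DictionaryServiceProvider : IServiceProvider
{
    private readonly Dictionary<Type, object> _services = new();

    // Fluent registration keeps test setup terse.
    public DictionaryServiceProvider With<T>(T instance) where T : class
    {
        _services[typeof(T)] = instance;
        return this;
    }

    // IServiceProvider contract: return null for unregistered types.
    public object? GetService(Type serviceType) =>
        _services.TryGetValue(serviceType, out object? service) ? service : null;
}
```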

Assertion Patterns

Tests use standard xUnit assertions without custom helper libraries:

  • Assert.Equal() / Assert.NotEqual() — value comparisons
  • Assert.True() / Assert.False() — boolean conditions
  • Assert.Collection() — ordered collection element validation
  • Assert.Single() — single-element collections with optional predicate
  • Assert.Contains() — substring and collection membership
  • Assert.InRange() — numeric range validation (used in performance tests)
  • Assert.Throws<T>() — exception verification
  • Assert.IsType<T>() — type checking

CancellationToken via TestContext

All async tests use TestContext.Current.CancellationToken provided by the xUnit v3 runner for cooperative cancellation:

string stdout = await process.StandardOutput.ReadToEndAsync(TestContext.Current.CancellationToken);

Sources: SampleHostSmokeTests.cs:23

Generated Code Testing Strategy

Testing source-generated code requires two distinct approaches: testing the generated output (registry, invokers, catalogs) and testing the generator's diagnostic reporting.

flowchart TD
    A[Generated Code Tests] --> B[Output Validation]
    A --> C[Diagnostic Validation]

    B --> D[GeneratedOperationRegistry]
    B --> E[GeneratedCliInvoker]
    B --> F[GeneratedMcpInvoker]
    B --> G[GeneratedMcpCatalog]

    C --> H[Inline C# Compilation]
    C --> I[Generator Driver Execution]
    C --> J[Diagnostic Message Assertion]

    style B fill:#4a9eff,color:#fff
    style C fill:#f5a623,color:#fff

Testing Generated Output

Test projects that reference the generator as an analyzer compile their Samples/ operation definitions through the generator at build time. The tests then validate the generated types at runtime.

Registry validation checks descriptor counts, operation IDs, metadata, and parameter details:

Assert.Equal(4, GeneratedOperationRegistry.Operations.Count);
Assert.Collection(
    GeneratedOperationRegistry.Operations.Select(static operation => operation.OperationId),
    static operationId => Assert.Equal("math.add", operationId),
    static operationId => Assert.Equal("sample.class-hello", operationId),
    static operationId => Assert.Equal("sample.hello", operationId),
    static operationId => Assert.Equal("weather.get", operationId));

Sources: GeneratedOperationRegistryTests.cs:10-16

Descriptor metadata validation covers declaring type, method name, result type, visibility, descriptions, CLI command paths, and MCP tool names:

Assert.Equal(typeof(Samples.SampleOperations), descriptor!.DeclaringType);
Assert.Equal("Hello", descriptor.MethodName);
Assert.Equal(typeof(string), descriptor.ResultType);
Assert.Equal(OperationVisibility.Both, descriptor.Visibility);
Assert.Equal("Say hello.", descriptor.Description);

Sources: GeneratedOperationRegistryTests.cs:22-35

Parameter metadata validation covers name, type, source binding, position, aliases, CLI/MCP name overrides, and required/optional status:

Assert.Equal("name", classHelloParameter.Name);
Assert.Equal("person", classHelloParameter.CliName);
Assert.Equal("targetName", classHelloParameter.McpName);
Assert.Equal(typeof(string), classHelloParameter.ParameterType);
Assert.Equal(["n", "username"], classHelloParameter.Aliases);

Sources: GeneratedOperationRegistryTests.cs:109-114

Testing Generator Diagnostics

The OperationDescriptorGeneratorDiagnosticsTests class validates all five compile-time diagnostic rules (DMCF001–DMCF005) by compiling inline C# source through the Roslyn API and running the generator driver:

private static ImmutableArray<Diagnostic> RunGenerator(string source)
{
    CSharpCompilation compilation = CSharpCompilation.Create(
        assemblyName: "Manifold.Generators.Diagnostics.Tests",
        syntaxTrees: [CSharpSyntaxTree.ParseText(source)],
        references: GetMetadataReferences(),
        options: new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary));

    GeneratorDriver driver = CSharpGeneratorDriver.Create(
        new global::Manifold.Generators.OperationDescriptorGenerator());
    GeneratorDriverRunResult runResult = driver.RunGenerators(compilation).GetRunResult();
    return [.. runResult.Diagnostics];
}

Sources: OperationDescriptorGeneratorDiagnosticsTests.cs:114-125

Each diagnostic rule has a dedicated test that provides invalid inline source and asserts the expected diagnostic ID and message:

| Diagnostic | Rule | Invalid Pattern |
| --- | --- | --- |
| DMCF001 | Conflicting visibility | [CliOnly] + [McpOnly] on the same operation |
| DMCF002 | Conflicting parameter binding | [Option] + [Argument] on the same parameter |
| DMCF003 | Unsupported parameter binding | Parameter without any binding attribute |
| DMCF004 | Missing IOperation implementation | [Operation] on a class without IOperation<,> |
| DMCF005 | Non-writable request property | Read-only property in a request class |

Diagnostic diagnostic = Assert.Single(diagnostics, static candidate => candidate.Id == "DMCF001");
Assert.Contains("both [CliOnly] and [McpOnly]",
    diagnostic.GetMessage(CultureInfo.InvariantCulture), StringComparison.Ordinal);

Sources: OperationDescriptorGeneratorDiagnosticsTests.cs:28-29

Metadata references are resolved from the runtime's trusted platform assemblies plus the Manifold assembly:

string trustedPlatformAssemblies = (string?)AppContext.GetData("TRUSTED_PLATFORM_ASSEMBLIES")
    ?? throw new InvalidOperationException("Could not resolve trusted platform assemblies.");

Sources: OperationDescriptorGeneratorDiagnosticsTests.cs:127-138
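
The rest of that resolution can be sketched without the Roslyn types: TRUSTED_PLATFORM_ASSEMBLIES is a path-separator-delimited list of assembly file paths, each of which would feed MetadataReference.CreateFromFile. The helper name below is illustrative, and the Roslyn call is elided to keep the sketch dependency-free:

```csharp
using System;
using System.IO;

static class TrustedAssemblies
{
    // Splits the runtime's trusted-platform-assembly list into file paths;
    // in the real test each path becomes a MetadataReference.
    public static string[] GetTrustedAssemblyPaths()
    {
        string tpa = (string?)AppContext.GetData("TRUSTED_PLATFORM_ASSEMBLIES")
            ?? throw new InvalidOperationException("Could not resolve trusted platform assemblies.");
        return tpa.Split(Path.PathSeparator, StringSplitOptions.RemoveEmptyEntries);
    }
}
```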

Performance Testing

The CliPerformanceTests class enforces allocation budgets using GC.GetAllocatedBytesForCurrentThread() to verify the zero-allocation fast-path design.

flowchart TD
    A[Warm-up Invocation] --> B[Record GC Bytes Before]
    B --> C[Execute Operation]
    C --> D[Record GC Bytes After]
    D --> E{Allocation Delta <= Budget?}
    E -->|Yes| F[Pass]
    E -->|No| G[Fail]

    style F fill:#27ae60,color:#fff
    style G fill:#e74c3c,color:#fff

Three allocation budgets are enforced:

| Test | Operation | Budget |
| --- | --- | --- |
| TryFindOptionValue_primary_name_does_not_allocate | Option lookup by primary name | 0 bytes |
| ExecuteAsync_common_path_stays_under_allocation_budget | Full CLI invocation (argument path) | 1,024 bytes |
| ExecuteAsync_option_path_stays_under_allocation_budget | Full CLI invocation (option path) | 2,048 bytes |

The tests use synchronous .GetAwaiter().GetResult() to keep execution on the current thread for accurate GC measurement, suppressing xUnit warning xUnit1031:

[Fact]
[SuppressMessage(
    "xUnit",
    "xUnit1031:Test methods should not use blocking task operations",
    Justification = "Allocation measurement must stay on the current thread.")]
public void ExecuteAsync_common_path_stays_under_allocation_budget()
{
    // Warm-up invocation (primes caches, JIT)
    _ = application.ExecuteAsync(["math", "add", "4", "5"], TextWriter.Null, TextWriter.Null, ...)
        .GetAwaiter().GetResult();

    long before = GC.GetAllocatedBytesForCurrentThread();
    int exitCode = application.ExecuteAsync(["math", "add", "4", "5"], TextWriter.Null, TextWriter.Null, ...)
        .GetAwaiter().GetResult();
    long after = GC.GetAllocatedBytesForCurrentThread();

    Assert.Equal(CliExitCodes.Success, exitCode);
    Assert.InRange(after - before, 0, 1024);
}

Sources: CliPerformanceTests.cs:29-50

Integration Smoke Tests

The Manifold.Samples.Tests project validates that all three sample host applications build and run correctly. These are true end-to-end tests that compile and execute sample projects as separate processes.

Test Collection and Parallelization

Smoke tests are grouped into a named collection with parallelization disabled to avoid port conflicts and resource contention:

[CollectionDefinition(Name, DisableParallelization = true)]
public sealed class SampleHostsCollectionDefinition
{
    public const string Name = "sample-hosts";
}

[Collection(SampleHostsCollectionDefinition.Name)]
public sealed class SampleHostSmokeTests

Sources: SampleHostSmokeTests.cs:5-12

Three Smoke Test Scenarios

sequenceDiagram
    participant T as Test Runner
    participant B as dotnet build
    participant P as Sample Process
    participant H as HTTP Client

    Note over T,H: CLI Sample Host Test
    T->>B: Build CliHost project
    B-->>T: DLL path
    T->>P: Start with args [math, add, 2, 3]
    P-->>T: stdout = "5", exit code 0

    Note over T,H: MCP HTTP Sample Host Test
    T->>B: Build McpHttpHost project
    B-->>T: DLL path
    T->>P: Start (no args)
    T->>H: GET http://127.0.0.1:38474/
    H-->>T: Response body
    T->>P: Kill process

    Note over T,H: MCP Stdio Sample Host Test
    T->>B: Build McpStdioHost project
    B-->>T: DLL path
    T->>P: Start (no args)
    T->>T: Wait 750ms
    T->>T: Assert not exited
    T->>P: Kill process
| Test | Validates |
| --- | --- |
| Cli_sample_host_executes_a_command | CLI host builds, runs math add 2 3, returns "5" with exit code 0 |
| Mcp_http_sample_host_serves_root_endpoint | HTTP host builds, starts, and responds at the root endpoint within a 10s timeout |
| Mcp_stdio_sample_host_starts_and_stays_alive | Stdio host builds, starts, and remains alive for 750ms |

Sources: SampleHostSmokeTests.cs:14-78

Process Management Utilities

The smoke tests include several helper methods for process lifecycle management:

  • BuildSampleAsync — Compiles a sample project via dotnet build and returns the output DLL path. Throws XunitException on build failure or missing output.
  • StartDotNetProcess — Creates a Process with redirected I/O streams for capturing output.
  • WaitForHttpRootAsync — Polls an HTTP endpoint with a 10-second deadline and 200ms retry interval.
  • StopProcessAsync — Kills the process tree and waits for exit.
  • GetRepositoryRoot — Walks up from AppContext.BaseDirectory looking for Manifold.slnx to locate the repository root.

Sources: SampleHostSmokeTests.cs:80-211
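
The GetRepositoryRoot walk generalizes to a small upward-search helper; a hedged sketch of the pattern (names are illustrative, not taken from the test file):

```csharp
using System;
using System.IO;

static class RepositoryLocator
{
    // Walks upward from startDirectory until a directory containing the
    // marker file (e.g. Manifold.slnx) is found; returns null if the
    // filesystem root is reached without a match.
    public static string? FindUpward(string startDirectory, string markerFileName)
    {
        DirectoryInfo? current = new(startDirectory);
        while (current is not null)
        {
            if (File.Exists(Path.Combine(current.FullName, markerFileName)))
            {
                return current.FullName;
            }
            current = current.Parent;
        }
        return null;
    }
}
```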

Test Categories Summary

flowchart TD
    A[Manifold Test Suite] --> B[Unit Tests]
    A --> C[Generated Code Tests]
    A --> D[Performance Tests]
    A --> E[Integration Smoke Tests]

    B --> B1[Attribute Validation]
    B --> B2[Context & Service Resolution]
    B --> B3[CLI Application Logic]
    B --> B4[MCP Response Formatting]

    C --> C1[Registry Output Validation]
    C --> C2[Invoker Fast-Path Testing]
    C --> C3[MCP Catalog Validation]
    C --> C4[Diagnostic Rule Verification]

    D --> D1[Zero-Alloc Option Lookup]
    D --> D2[CLI Invocation Budgets]

    E --> E1[CLI Host Process]
    E --> E2[MCP HTTP Host Process]
    E --> E3[MCP Stdio Host Process]
| Category | Projects | Focus |
| --- | --- | --- |
| Unit tests | Manifold.Tests, Manifold.Cli.Tests, Manifold.Mcp.Tests | Core contracts, attribute behavior, CLI/MCP logic |
| Generated code tests | Manifold.Generators.Tests, Manifold.Cli.Tests, Manifold.Mcp.Tests | Registry, invokers, catalogs, diagnostics |
| Performance tests | Manifold.Cli.Tests | Allocation budgets for fast paths |
| Integration smoke tests | Manifold.Samples.Tests | End-to-end sample host validation |
