Parallelism

Intro

Parallelism is about finishing CPU-bound work faster by using multiple cores at the same time. In .NET, the main tools are Parallel.ForEachAsync, PLINQ, and custom partitioned pipelines. Effective parallel code maximizes throughput while preserving determinism, bounded resource usage, and observability.

How It Works

The practical pipeline is: partition work, execute partitions concurrently, collate results safely.

Two patterns remain useful decision anchors: a Parallel.ForEachAsync loop for CPU-bound batch work, and a PLINQ query for pure in-memory transforms.

Example

public async Task<IReadOnlyList<Result>> ComputeAsync(
    IReadOnlyList<Job> jobs,
    CancellationToken cancellationToken)
{
    // An index-addressed array keeps results in input order without locking:
    // each parallel body writes only to its own slot.
    var results = new Result[jobs.Count];

    await Parallel.ForEachAsync(
        Enumerable.Range(0, jobs.Count),
        new ParallelOptions
        {
            // Cap parallelism at the core count for CPU-bound work.
            MaxDegreeOfParallelism = Environment.ProcessorCount,
            CancellationToken = cancellationToken
        },
        (index, ct) =>
        {
            results[index] = ExpensiveTransform(jobs[index], ct);
            return ValueTask.CompletedTask;
        });

    return results;
}

PLINQ example for pure transforms

public int[] ComputePrimes(int fromInclusive, int toExclusive)
{
    return Enumerable.Range(fromInclusive, toExclusive - fromInclusive)
        .AsParallel()
        .AsOrdered() // preserve input order in the output (adds merge cost)
        .Where(n => n > 1 && Enumerable.Range(2, (int)Math.Sqrt(n) - 1)
            .All(i => n % i != 0)) // trial division up to sqrt(n)
        .ToArray();
}

PLINQ works best when each element has enough CPU work to amortize partitioning and merge costs.
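When per-element work is too small to amortize those costs, batching elements first can restore the benefit. A minimal sketch, assuming .NET 6+ (where Enumerable.Chunk is available); the type and method names here are illustrative, not from the text above:

```csharp
using System.Linq;

public static class PlinqBatching
{
    // When per-element work is tiny, batch elements first so each parallel
    // task carries enough CPU work to amortize partitioning and merge costs.
    public static long SumOfSquares(int[] values) =>
        values
            .Chunk(4096)                                   // .NET 6+: group tiny items into batches
            .AsParallel()
            .Select(chunk => chunk.Sum(v => (long)v * v))  // a heavy-enough unit of work per task
            .Sum();                                        // merge the partial sums
}
```

The batch size (4096 here) is a tuning knob: large enough to dominate scheduling overhead, small enough to keep all cores busy near the end of the input.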

Pitfalls

Shared mutable state: parallel bodies that write to a plain List<T> or Dictionary<TKey, TValue> corrupt it; use thread-safe collections or index-addressed writes.
Oversubscription: nested parallel loops or an unbounded MaxDegreeOfParallelism can schedule far more work than there are cores.
Ordering assumptions: ConcurrentBag<T> and unordered PLINQ return results in nondeterministic order.
Swallowed cancellation: forgetting to pass the CancellationToken into ParallelOptions means the loop runs to completion even after cancellation.

Tradeoffs

Approach                       | Best for                                    | Cost
Parallel.ForEachAsync          | CPU-bound work with async-compatible bodies | Overhead per partition; requires CancellationToken threading
PLINQ                          | Pure transforms on in-memory sequences      | Merge cost; ordering adds extra overhead; harder to debug
Task.WhenAll fan-out           | I/O-bound work (HTTP, DB)                   | Thread-pool friendly; no CPU parallelism benefit
Manual partitioning + channels | Streaming pipelines with backpressure       | Most complex; best throughput for producer/consumer patterns
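The Task.WhenAll row covers I/O fan-out, where the goal is overlapping waits rather than using more cores. A minimal sketch, assuming a shared HttpClient; the class name, method name, and the maxInFlight default are illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public static class FanOut
{
    // I/O fan-out: Task.WhenAll overlaps the waits without burning extra cores.
    // A SemaphoreSlim caps in-flight requests so the fan-out stays bounded.
    public static async Task<string[]> FetchAllAsync(
        HttpClient client, IReadOnlyList<Uri> urls, int maxInFlight = 8)
    {
        using var gate = new SemaphoreSlim(maxInFlight);

        var tasks = urls.Select(async url =>
        {
            await gate.WaitAsync();              // wait for a free slot
            try { return await client.GetStringAsync(url); }
            finally { gate.Release(); }          // free the slot even on failure
        });

        return await Task.WhenAll(tasks);        // results in input order
    }
}
```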

Decision rule: start with Parallel.ForEachAsync for CPU-bound batch work. Switch to PLINQ when the operation is a pure transform and you want terse syntax. Use Task.WhenAll for I/O. Reach for channels only when you need backpressure or streaming.
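The channels case can be sketched as a minimal producer/consumer pipeline using System.Threading.Channels; the class name, method name, and the capacity of 256 are illustrative assumptions:

```csharp
using System.Collections.Generic;
using System.Threading.Channels;
using System.Threading.Tasks;

public static class ChannelPipeline
{
    // A bounded channel provides backpressure: the producer awaits
    // whenever the buffer is full, so memory use stays bounded.
    public static async Task<long> SumAsync(IEnumerable<int> source)
    {
        var channel = Channel.CreateBounded<int>(capacity: 256);

        var producer = Task.Run(async () =>
        {
            foreach (var item in source)
                await channel.Writer.WriteAsync(item); // waits when the buffer is full
            channel.Writer.Complete();                 // signal end of stream
        });

        long total = 0;
        await foreach (var item in channel.Reader.ReadAllAsync())
            total += item;

        await producer; // observe any producer exception
        return total;
    }
}
```

The same shape scales to multiple consumers: start several reader tasks over the same channel.Reader and combine their partial results at the end.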

Questions


What's next