Queue<T> is a FIFO (first in, first out) collection. The earliest enqueued item is processed first. Use it for buffering, breadth-first traversal, and producer-consumer style pipelines.
Queue<T> is implemented as a circular buffer in .NET:
Enqueue adds at the tail; Dequeue removes from the head. Because the head and tail indices advance modulo the array length, neither Enqueue nor Dequeue shifts elements — only index arithmetic occurs. When the buffer fills completely, the queue copies all elements to a larger array with the head reset to index 0, then resumes the circular layout. This occasional O(n) resize is amortized over many operations, so steady-state throughput stays O(1).
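The mechanics above can be sketched with a minimal circular-buffer queue. This is an illustration of the technique, not the actual BCL source; the type name `RingQueue<T>` and the starting capacity are made up:

```csharp
// Simplified circular-buffer queue (illustrative sketch, not Queue<T>'s real source).
public class RingQueue<T>
{
    private T[] _buffer = new T[4]; // small starting capacity, chosen for illustration
    private int _head;              // index of the next item to dequeue
    private int _tail;              // index of the next free slot
    private int _count;

    public int Count => _count;

    public void Enqueue(T item)
    {
        if (_count == _buffer.Length) Grow();          // one-time O(n) resize
        _buffer[_tail] = item;
        _tail = (_tail + 1) % _buffer.Length;          // wrap around: index arithmetic only
        _count++;
    }

    public T Dequeue()
    {
        if (_count == 0) throw new InvalidOperationException("Queue empty.");
        T item = _buffer[_head];
        _head = (_head + 1) % _buffer.Length;          // no elements are shifted
        _count--;
        return item;
    }

    private void Grow()
    {
        var bigger = new T[_buffer.Length * 2];
        // Copy in logical order so the head resets to index 0.
        for (int i = 0; i < _count; i++)
            bigger[i] = _buffer[(_head + i) % _buffer.Length];
        _buffer = bigger;
        _head = 0;
        _tail = _count;
    }
}
```

The resize copies the wrapped layout back into a contiguous run, which is why the head can safely reset to index 0.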
```mermaid
graph LR
F[front] --> A[item one] --> B[item two] --> C[item three] --> T[back]
```

```csharp
var jobs = new Queue<string>();
jobs.Enqueue("job-1");
jobs.Enqueue("job-2");

Console.WriteLine(jobs.Dequeue()); // job-1
Console.WriteLine(jobs.Peek());    // job-2
```
Dequeue/Peek on an empty queue throws InvalidOperationException. Guard with Count (or use TryDequeue/TryPeek) when queue emptiness is expected.

Related choices:
- PriorityQueue<TElement, TPriority>: use when ordering by priority is required.
- Queue<T> vs Stack<T>: a queue preserves arrival order; a stack prioritizes the newest items.
- Queue<T> vs Channel<T>: a queue is simple in-memory buffering; channels provide richer async coordination for concurrent producers/consumers.

Why is Queue<T> suitable for BFS? BFS processes nodes in discovery order, level by level, and FIFO behavior naturally enforces this traversal order.
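The BFS point can be shown concretely. A minimal sketch over a small hand-made adjacency list (the graph data here is invented for illustration); note the `Count > 0` guard, which is exactly the empty-queue check mentioned above:

```csharp
// BFS over a tiny diamond-shaped graph: A -> {B, C}, B -> D, C -> D.
var graph = new Dictionary<string, string[]>
{
    ["A"] = new[] { "B", "C" },
    ["B"] = new[] { "D" },
    ["C"] = new[] { "D" },
    ["D"] = Array.Empty<string>(),
};

var order = new List<string>();             // visit order, for inspection
var visited = new HashSet<string> { "A" };
var frontier = new Queue<string>();
frontier.Enqueue("A");

while (frontier.Count > 0)                  // guard: never Dequeue an empty queue
{
    var node = frontier.Dequeue();
    order.Add(node);
    foreach (var next in graph[node])
        if (visited.Add(next))              // Add returns false if already visited
            frontier.Enqueue(next);
}

Console.WriteLine(string.Join(" -> ", order)); // A -> B -> C -> D
```

Because the frontier is FIFO, all of level n is dequeued before any node of level n+1, which is the defining property of BFS.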
When should you replace Queue<T> with PriorityQueue<TElement, TPriority>? When business correctness depends on priority rather than arrival time, such as shortest-path algorithms, schedulers, or SLA-driven dispatching.
Complexity is not the only risk: if producers outpace consumers, memory grows and latency spikes. Throughput and backpressure design often matter more than per-method complexity.
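One way to get backpressure instead of unbounded growth is a bounded channel from System.Threading.Channels. A minimal sketch, with the capacity and item count chosen arbitrarily for illustration:

```csharp
// Bounded channel: when the buffer holds `capacity` items, WriteAsync suspends
// the producer until the consumer catches up, instead of letting memory grow.
using System.Threading.Channels;

var channel = Channel.CreateBounded<string>(new BoundedChannelOptions(capacity: 8)
{
    FullMode = BoundedChannelFullMode.Wait   // block the producer rather than drop items
});

// Producer: writes 1000 jobs, awaiting whenever the buffer is full.
var producer = Task.Run(async () =>
{
    for (int i = 0; i < 1000; i++)
        await channel.Writer.WriteAsync($"job-{i}");
    channel.Writer.Complete();               // signal "no more items"
});

// Consumer: drains the channel until the writer completes.
var processed = 0;
await foreach (var job in channel.Reader.ReadAllAsync())
    processed++;                             // a real consumer would handle the job here

await producer;
Console.WriteLine(processed); // 1000
```

With a plain Queue<T>, the same mismatch would simply accumulate 1000 items in memory; the bounded channel caps the in-flight count at 8 and pushes the slowdown back onto the producer.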