The Rust Collections Guide

Queues and Alternative Sequential Collections

Last Updated: 2026-04-05

Why sequential collections are not all the same

At first glance, Vec<T>, VecDeque<T>, and LinkedList<T> can all look like containers that hold items in order. That is true at a high level, but their strengths come from different access patterns. The important question is not whether a collection is sequential. The important question is how the sequence is used.

A queue-like workload usually cares about insertion at one end and removal at the other. Some workloads need efficient pushing and popping at both ends. Others mostly append and iterate, in which case Vec<T> is often enough. Understanding these distinctions matters because a collection that looks abstractly appropriate can still be the wrong fit in practice.

This lesson focuses on the main standard-library alternatives for front-heavy sequential work: VecDeque<T> and LinkedList<T>. The broader goal is to learn how to reason from workload shape rather than from data-structure folklore.

Why `Vec<T>` struggles with front-heavy workloads

Vec<T> is the default sequential collection in Rust because it is contiguous, flexible, and fast for append-heavy workloads. But front operations are not its strength. Removing an item from the front with remove(0) shifts every later element one position to the left, so each front removal costs time proportional to the length of the vector. Inserting at the front shifts them to the right.

fn main() {
    let mut values = vec![10, 20, 30, 40];
    let first = values.remove(0);
 
    println!("removed = {}", first);
    println!("remaining = {:?}", values);
}

This works, but repeated front removal is a bad fit. If your algorithm regularly pops from the front, the issue is not correctness. The issue is that the data structure does not match the access pattern.

That is the first major reason VecDeque<T> exists.

What `VecDeque<T>` is for

VecDeque<T> is a double-ended queue. It is designed for efficient pushing and popping at both the front and the back. That makes it a natural fit for queues, breadth-first search frontiers, sliding windows, schedulers, and buffering situations where items enter and leave from both ends.

Unlike Vec<T>, VecDeque<T> is not conceptually just a contiguous prefix of storage that grows only at the back. It is implemented as a ring buffer, which means the logical beginning of the queue can move around within the allocated storage.

use std::collections::VecDeque;
 
fn main() {
    let mut queue = VecDeque::new();
    queue.push_back("task-1");
    queue.push_back("task-2");
    queue.push_front("urgent");
 
    println!("queue = {:?}", queue);
    println!("front = {:?}", queue.pop_front());
    println!("back = {:?}", queue.pop_back());
    println!("remaining = {:?}", queue);
}

This is the core value proposition. If both ends matter, VecDeque<T> is usually the first standard-library type to consider.

Ring-buffer behavior and why it matters

A ring buffer stores elements in allocated space that can wrap around. The logical front of the sequence does not need to live at the first physical slot of the buffer. Instead, front and back positions move as items are added and removed.

That design avoids repeated element shifting for front operations. A pop from the front can simply advance the logical starting position. A push to the front can move the start backward within the ring if capacity allows.

You usually do not need to think about the internal indices directly, but understanding the model helps explain why VecDeque<T> supports efficient double-ended operations while still avoiding the pointer-heavy structure of a linked list.

A useful mental picture is that VecDeque<T> keeps sequence semantics while relaxing the requirement that the first logical element must be stored at the first physical slot.

Basic `VecDeque<T>` operations

The most common VecDeque<T> operations are push_back, push_front, pop_back, and pop_front. These directly match queue and deque workloads.

use std::collections::VecDeque;
 
fn main() {
    let mut deque = VecDeque::new();
 
    deque.push_back(1);
    deque.push_back(2);
    deque.push_front(0);
 
    println!("deque = {:?}", deque);
    println!("pop_front = {:?}", deque.pop_front());
    println!("pop_back = {:?}", deque.pop_back());
    println!("after pops = {:?}", deque);
}

These methods express the intent of the algorithm more clearly than trying to force the same pattern through repeated Vec<T> insertions and removals at index 0.

Using `VecDeque<T>` as a queue

A common way to use VecDeque<T> is as a FIFO queue: push at the back, pop from the front.

use std::collections::VecDeque;
 
fn main() {
    let mut jobs = VecDeque::new();
    jobs.push_back("parse");
    jobs.push_back("validate");
    jobs.push_back("write");
 
    while let Some(job) = jobs.pop_front() {
        println!("processing {job}");
    }
}

This is one of the cleanest examples of choosing a collection by access pattern. The data structure maps directly to the algorithm. That is usually a good sign.

A breadth-first search style example

Breadth-first search is a classic front-heavy workload. New nodes are discovered and added to the back, while pending nodes are processed from the front.

use std::collections::VecDeque;
 
fn main() {
    let graph = vec![
        vec![1, 2],
        vec![3],
        vec![3, 4],
        vec![],
        vec![]
    ];
 
    let mut visited = vec![false; graph.len()];
    let mut queue = VecDeque::new();
 
    visited[0] = true;
    queue.push_back(0);
 
    while let Some(node) = queue.pop_front() {
        println!("visit {node}");
 
        for &next in &graph[node] {
            if !visited[next] {
                visited[next] = true;
                queue.push_back(next);
            }
        }
    }
}

This example is simple, but it captures the main pattern. When the front advances steadily and new work arrives at the back, VecDeque<T> is usually the natural standard-library answer.

Indexing and iteration with `VecDeque<T>`

VecDeque<T> is sequential, so it can still be iterated in logical order. It also supports indexing, though its main value is efficient operations at both ends rather than random-access performance.

use std::collections::VecDeque;
 
fn main() {
    let mut deque = VecDeque::new();
    deque.push_back("a");
    deque.push_back("b");
    deque.push_back("c");
 
    println!("index 1 = {}", deque[1]);
 
    for item in &deque {
        println!("item = {item}");
    }
}

This matters because VecDeque<T> is not only for abstract queue operations. It is still a usable sequential collection. But when most of your work is just append, iterate, and occasionally index, Vec<T> will often remain the simpler default.

Capacity and growth in `VecDeque<T>`

Like vectors, deques grow dynamically and track spare capacity. They can allocate additional space as needed, and you can preallocate when you expect a substantial number of elements.

use std::collections::VecDeque;
 
fn main() {
    let mut deque = VecDeque::with_capacity(16);
    println!("len = {}, cap = {}", deque.len(), deque.capacity());
 
    for n in 0..8 {
        deque.push_back(n);
    }
 
    println!("deque = {:?}", deque);
    println!("len = {}, cap = {}", deque.len(), deque.capacity());
}

The exact internal growth strategy is not something to depend on, but the same high-level reasoning applies as with Vec<T>: reserve space when you have useful knowledge about future size, especially in performance-sensitive code.

When `VecDeque<T>` is better than `Vec<T>`

Choose VecDeque<T> when front and back operations are both central to the workload. This includes classic queues, double-ended task scheduling, rolling buffers, and algorithms that repeatedly consume the front while producing more items at the back.

Do not choose it just because it sounds more specialized or more advanced. If your workload mostly pushes at the back, iterates, sorts, or transforms data, Vec<T> is usually still the better default.

A good rule is that VecDeque<T> earns its complexity when front removal or front insertion is not occasional but structural to the algorithm.

What `LinkedList<T>` is

LinkedList<T> is a doubly linked list in the standard library. Each element is stored in its own heap-allocated node, and each node points to its neighbors. In principle, that structure allows insertion and removal at a known node without shifting any contiguous region of elements.

use std::collections::LinkedList;
 
fn main() {
    let mut list = LinkedList::new();
    list.push_back(10);
    list.push_back(20);
    list.push_front(5);
 
    println!("list = {:?}", list);
}

At an abstract data-structures level, linked lists can sound appealing because insertion and removal do not require bulk shifting the way vectors do. But that is not the whole performance story.

Why linked lists are rarely preferred in Rust

In practice, linked lists are rarely the best default in Rust. The reason is not that they are incorrect or useless. The reason is that they trade away too much for the workloads most real programs actually have.

Each node in a linked list is separately allocated. That means more pointer chasing, poorer cache locality, more allocator interaction, and less efficient traversal compared with contiguous storage. Many operations that look attractive in asymptotic terms perform worse in real code because memory layout matters a great deal.

A vector or deque often wins simply because its elements are stored much more compactly and are traversed more efficiently. In other words, linked lists often lose not on theory but on actual machine behavior.

A practical comparison mindset

Suppose you need a queue. A vector gives excellent appends at the back but poor removal from the front. A deque gives efficient operations at both ends while keeping sequence semantics and avoiding pointer-heavy traversal. A linked list also supports end operations well, but its node-based layout often makes iteration and memory behavior worse.

That is why VecDeque<T> is usually preferred over LinkedList<T> for queue-like workloads in Rust. It handles the access pattern you care about while preserving much better locality.

This is a general lesson in data-structure selection. The collection with the best-looking isolated operation is not always the best choice for the whole workload.

When `LinkedList<T>` might still make sense

LinkedList<T> is not forbidden or broken. It can make sense in niche situations where stable node-based structure matters more than contiguous traversal, or when you are explicitly modeling linked behavior and understand the costs.

But those cases are uncommon in everyday Rust. Most application code that thinks it wants a linked list actually wants one of three things instead: a Vec<T>, a VecDeque<T>, or a more domain-specific structure built around explicit indices or handles.

That is why many Rust programmers treat LinkedList<T> as a type to know about, but not as a routine first-choice collection.

A side-by-side queue example

The following example shows the same queue idea implemented first with Vec<T> and then with VecDeque<T>. The point is not that the vector version is impossible. The point is that the deque version better matches the intended operations.

use std::collections::VecDeque;
 
fn process_with_vec(mut items: Vec<&str>) {
    while !items.is_empty() {
        let item = items.remove(0);
        println!("vec processed {item}");
    }
}
 
fn process_with_vecdeque(mut items: VecDeque<&str>) {
    while let Some(item) = items.pop_front() {
        println!("deque processed {item}");
    }
}
 
fn main() {
    process_with_vec(vec!["a", "b", "c"]);
 
    let mut deque = VecDeque::new();
    deque.push_back("a");
    deque.push_back("b");
    deque.push_back("c");
    process_with_vecdeque(deque);
}

The difference is small in code and large in fit. The deque version expresses queue semantics directly.

Choosing among `Vec<T>`, `VecDeque<T>`, and `LinkedList<T>`

Use Vec<T> when the main operations are append, iterate, transform, sort, search, or occasional indexing.

Use VecDeque<T> when your workload naturally pushes and pops from both ends, especially when front removal is frequent.

Be cautious with LinkedList<T>. Reach for it only when you have a concrete reason tied to node-based structure and understand why contiguous storage is not serving the workload.

This is less about memorizing rules and more about recognizing shapes. Ordered append-heavy batch processing suggests Vec<T>. FIFO processing suggests VecDeque<T>. A linked list is usually not the winning answer unless the problem is unusually specific.

API design for queue-oriented code

When designing APIs, the collection you store internally does not always need to be the collection you expose. A system may store pending work in a VecDeque<T> internally because queue operations matter, while still returning slices, iterators, or snapshots elsewhere depending on what callers need.

The general lesson from earlier collection topics still applies here: choose internal representation based on workload, and choose interfaces based on what downstream code actually needs.

If a function only consumes queue items one by one, it may accept an iterator rather than a concrete deque. If a type manages a queue internally, VecDeque<T> is often a good implementation detail rather than a public commitment.

A small decision checklist

Ask four quick questions.

Are items mostly appended and later iterated in order? Start with Vec<T>.

Are items frequently removed from the front or added to both ends? Consider VecDeque<T>.

Do you think you want a linked list mainly because insertion and removal sound cheap? Pause and check whether locality and traversal costs matter more. They usually do.

Do you need a node-based structure for a specific reason rather than a general sequential container? Only then seriously consider LinkedList<T>.

A small sandbox project

A tiny Cargo project is enough to compare queue-oriented behavior between vectors, deques, and linked lists.

[package]
name = "queues-guide"
version = "0.1.0"
edition = "2024"

Create it and run it like this.

cargo new queues-guide
cd queues-guide
cargo run

A minimal src/main.rs could look like this.

use std::collections::{LinkedList, VecDeque};
 
fn main() {
    let mut deque = VecDeque::new();
    deque.push_back("job-1");
    deque.push_back("job-2");
    deque.push_front("urgent");
 
    println!("processing deque:");
    while let Some(job) = deque.pop_front() {
        println!("  {job}");
    }
 
    let mut list = LinkedList::new();
    list.push_back(1);
    list.push_back(2);
    list.push_front(0);
 
    println!("linked list iteration:");
    for value in &list {
        println!("  {value}");
    }
}

This small project shows the main contrast. VecDeque<T> is a practical queue-oriented sequential collection. LinkedList<T> exists and works, but its niche is much narrower than many beginners first assume.