What are the trade-offs between quote::TokenStreamExt::append_all and extend for combining token streams?

append_all and extend both add tokens to the end of a TokenStream, and for many inputs they are interchangeable. append_all is a quote-specific extension method that accepts any IntoIterator whose items implement ToTokens, so it works with collections of token streams, idents, literals, and syn types alike. extend comes from the standard library's Extend trait, which proc_macro2::TokenStream implements for both TokenTree and TokenStream items. The key difference is the range of input types each accepts: extend is limited to token trees and token streams, while append_all takes anything that can produce tokens.

Basic TokenStream Extension

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn basic_extension() {
    let mut stream = TokenStream::new();
    
    // Using extend from std::iter::Extend
    let one = quote! { let x = 1; };
    stream.extend(one);
    
    // extend accepts IntoIterator<Item = TokenTree> or
    // IntoIterator<Item = TokenStream>; a TokenStream iterates as TokenTrees
    
    // Using append_all from quote::TokenStreamExt
    let two = quote! { let y = 2; };
    stream.append_all(two);
    
    // Result: stream contains both statements
    println!("{}", stream);
}

Both methods add tokens to the stream, but with different input expectations.

The TokenStreamExt Trait

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn trait_overview() {
    // TokenStreamExt is a trait from the quote crate
    // It extends proc_macro2::TokenStream with additional methods
    
    // The trait provides:
    // - append(token: U) where U: Into<TokenTree>
    // - append_all(iter: I) where I: IntoIterator, I::Item: ToTokens
    // - append_separated(iter: I, sep: S) (separator between items)
    // - append_terminated(iter: I, term: T) (separator after each item)
    
    let mut stream = TokenStream::new();
    
    // append adds a single TokenTree
    let tree: proc_macro2::TokenTree = quote!(x).into_iter().next().unwrap();
    stream.append(tree);
    
    // append_all adds each item's tokens to the stream in turn
    let items = vec![quote!(a), quote!(b), quote!(c)];
    stream.append_all(items);
    
    // append_separated adds items with a separator
    let items = vec![quote!(a), quote!(b), quote!(c)];
    stream.append_separated(items, quote!(,));
}

TokenStreamExt provides quote-specific methods optimized for macro generation.

extend from std::iter::Extend

use proc_macro2::TokenStream;
use quote::quote;
 
fn extend_trait() {
    // TokenStream implements std::iter::Extend twice:
    // - Extend<TokenTree>: extend from iterators of token trees
    // - Extend<TokenStream>: extend from iterators of token streams
    // (extend_one and extend_reserve exist but are nightly-only)
    
    let mut stream = TokenStream::new();
    
    // extend with a TokenStream
    let one = quote! { let x = 1; };
    stream.extend(one);
    
    // extend with an iterator of TokenTrees
    let trees: Vec<proc_macro2::TokenTree> = quote!(let y = 2;).into_iter().collect();
    stream.extend(trees);
    
    // extend with an iterator of TokenStreams
    // This compiles directly: proc_macro2::TokenStream also
    // implements Extend<TokenStream>
    let streams = vec![quote!(a), quote!(b)];
    stream.extend(streams);
}

extend is the standard library's trait for adding items to a collection.

Key Difference: Input Type Handling

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn input_type_difference() {
    let mut stream1 = TokenStream::new();
    let mut stream2 = TokenStream::new();
    
    // Both work with a single TokenStream
    let single = quote! { fn foo() {} };
    stream1.extend(single.clone());
    stream2.append_all(single);
    // Result is the same for single streams
    
    // Both also work with collections of token streams:
    // proc_macro2::TokenStream implements Extend<TokenStream>,
    // so extend accepts Vec<TokenStream> directly
    let items = vec![quote!(a), quote!(b), quote!(c)];
    stream1.extend(items.clone());
    stream2.append_all(items);
    assert_eq!(stream1.to_string(), stream2.to_string());
    
    // The real difference is the item bound: extend is limited to
    // Item = TokenTree or Item = TokenStream, while append_all accepts
    // any Item: ToTokens (Ident, Literal, syn types, references, ...)
    let ident = proc_macro2::Ident::new("x", proc_macro2::Span::call_site());
    stream2.append_all(vec![ident]);
    // stream1.extend(vec![ident]);  // would NOT compile: no Extend<Ident>
}

Both accept single token streams and collections of them; append_all additionally accepts any items that implement ToTokens.

Flattening Behavior

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn flattening_behavior() {
    // append_all flattens each item into the stream
    // Each TokenStream is iterated and its TokenTrees appended
    
    let mut stream = TokenStream::new();
    
    // These are equivalent:
    let items = vec![quote!(a), quote!(b)];
    
    // Approach 1: append_all
    stream.append_all(items.clone());
    
    // Approach 2: Manual iteration with extend
    let mut manual_stream = TokenStream::new();
    for item in items {
        manual_stream.extend(item);
    }
    
    assert_eq!(stream.to_string(), manual_stream.to_string());
    
    // Both produce: "a b"
    // Each TokenStream's tokens are appended in order
}

append_all is syntactic sugar for iterating and extending each item.

Iterators of Different Types

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn different_iterators() {
    let mut stream = TokenStream::new();
    
    // append_all works with various iterator types
    // The key is that each item must implement ToTokens
    
    // Vec<TokenStream>
    let vec_items = vec![quote!(a), quote!(b)];
    stream.append_all(vec_items);
    
    // Iterator of generated streams
    // (#i interpolates the integer i, producing tokens like `x 0`)
    let iter_items = (0..3).map(|i| quote!(x #i));
    stream.append_all(iter_items);
    
    // Array
    let arr_items = [quote!(p), quote!(q)];
    stream.append_all(arr_items);
    
    // Slice: references work too, since &TokenStream implements ToTokens
    let slice_items: &[TokenStream] = &[quote!(m), quote!(n)];
    stream.append_all(slice_items.iter());
    
    // The flexibility comes from I: IntoIterator, I::Item: ToTokens
}

append_all accepts any iterator whose items implement ToTokens.

Working with TokenTree Iterators

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn token_tree_iterators() {
    let mut stream = TokenStream::new();
    
    // TokenStream itself is IntoIterator<Item = TokenTree>
    // So TokenStream can be passed to extend directly
    
    let one = quote! { a + b };
    stream.extend(one.clone());  // Works: TokenStream is IntoIterator
    
    // append_all also works with TokenStream
    stream.append_all(one.clone());  // Also works
    
    // With TokenTree iterators
    let trees: Vec<proc_macro2::TokenTree> = quote!(x + y).into_iter().collect();
    
    // extend accepts IntoIterator<Item = TokenTree>
    stream.extend(trees.clone());
    
    // append_all requires I::Item: ToTokens, and TokenTree
    // implements ToTokens, so the Vec works directly here too
    stream.append_all(trees.clone());
    
    // Both produce the same result for TokenTree iterators
}

Both methods work with TokenTree iterators, though with slightly different ergonomics.

Performance Characteristics

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn performance() {
    // Both methods are O(n) in the number of token trees
    // The difference is in allocation patterns
    
    let mut stream = TokenStream::new();
    
    // extend with a single TokenStream
    // appends that stream's token trees in one pass
    let single = quote! { fn example() {} };
    stream.extend(single);
    
    // append_all with iterator
    // May allocate multiple times depending on iterator
    let items: Vec<TokenStream> = (0..10).map(|i| quote!(x #i)).collect();
    stream.append_all(items);
    
    // TokenStream does not expose a reserve method,
    // so there is little capacity tuning available either way
    
    // Performance difference is minimal in practice
    // Choose based on ergonomics, not performance
    
    // append_all is more ergonomic for collections
    // extend is standard and works with iterators of TokenTree or TokenStream
}

Performance is similar; choose based on code clarity.

Common Pattern: Generating Repeated Code

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn repeated_code_generation() {
    let mut output = TokenStream::new();
    
    // Generate multiple struct definitions
    // (quote::format_ident! builds the concatenated names Struct0, Struct1, ...;
    // interpolating a bare integer with #i would emit a separate token instead)
    let structs: Vec<TokenStream> = (0..3)
        .map(|i| {
            let name = quote::format_ident!("Struct{}", i);
            quote! {
                struct #name {
                    field: i32,
                }
            }
        })
        .collect();
    
    // append_all is ergonomic for this pattern
    output.append_all(structs);
    
    // vs. manual iteration with extend
    let mut output2 = TokenStream::new();
    for struct_def in (0..3).map(|i| {
        let name = quote::format_ident!("Struct{}", i);
        quote! {
            struct #name {
                field: i32,
            }
        }
    }) {
        output2.extend(struct_def);
    }
    
    assert_eq!(output.to_string(), output2.to_string());
    // Result: three struct definitions, Struct0 through Struct2, in order
}

append_all provides cleaner syntax for generating repeated code structures.

append_separated for Delimited Lists

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn separated_lists() {
    let mut stream = TokenStream::new();
    
    // append_separated adds a separator between items
    // (no extend equivalent; extend would need manual separator handling)
    let items = vec![quote!(a), quote!(b), quote!(c)];
    stream.append_separated(items, quote!(,));
    // Result: a , b , c
    
    // Common pattern: generating function arguments
    // (quote::format_ident! builds the names arg0, arg1, ...)
    let args: Vec<TokenStream> = (0..3)
        .map(|i| {
            let name = quote::format_ident!("arg{}", i);
            quote!(#name: i32)
        })
        .collect();
    
    let mut arg_list = TokenStream::new();
    arg_list.append_separated(args, quote!(,));
    
    // Token streams must have balanced delimiters, so the argument
    // list is interpolated into a complete signature rather than
    // spliced between unmatched parentheses
    let function = quote! { fn example(#arg_list) {} };
    let _ = function;
    // Result: fn example (arg0 : i32 , arg1 : i32 , arg2 : i32) { }
}

append_separated is unique to TokenStreamExt and has no extend equivalent.

extend with Option Types

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn option_handling() {
    let mut stream = TokenStream::new();
    
    // extend works with Option<TokenStream>
    // Because Option<T> implements IntoIterator
    let maybe_tokens: Option<TokenStream> = Some(quote!(let x = 1;));
    
    // extend accepts the Option directly
    stream.extend(maybe_tokens.clone());
    // If Some, tokens are added; if None, nothing is added
    
    let none_tokens: Option<TokenStream> = None;
    stream.extend(none_tokens);  // No effect
    
    // append_all also handles Option, since Option<T> is IntoIterator
    // and TokenStream implements ToTokens
    stream.append_all(maybe_tokens);
    
    // Both work the same for Option<TokenStream>
    // extend is slightly more idiomatic for Option
}

Both methods handle Option<TokenStream>, though extend is often clearer.

Interpolation with quote!

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn interpolation() {
    // The quote! macro provides its own interpolation
    // Which handles repeated elements with #(...)
    
    let items = vec![quote!(a), quote!(b), quote!(c)];
    
    // Using quote! with repetition
    let stream1 = quote! {
        #(#items)*
    };
    
    // Using append_all
    let mut stream2 = TokenStream::new();
    stream2.append_all(&items);
    
    // Both produce: a b c
    // quote! with #(...) interpolation is usually more ergonomic
    
    // Use append_all when building streams programmatically
    // outside of quote! macro context
    
    let mut dynamic_stream = TokenStream::new();
    
    // Conditional appending
    let condition = true;
    if condition {
        dynamic_stream.append_all(quote!(enabled));
    } else {
        dynamic_stream.append_all(quote!(disabled));
    }
}

quote! interpolation is preferred when possible; append_all is for programmatic construction.

Practical Example: Derive Macro

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn derive_macro_example() {
    // Imagine implementing a derive macro
    // that generates methods for struct fields
    
    let fields = vec!["name", "age", "email"];
    
    // Generate getter methods for each field
    let mut getters = TokenStream::new();
    
    for field in &fields {
        let field_name = proc_macro2::Ident::new(field, proc_macro2::Span::call_site());
        let getter = quote! {
            pub fn #field_name(&self) -> &str {
                &self.#field_name
            }
        };
        getters.extend(getter);
    }
    
    // Using append_all for the same pattern
    let getters2: Vec<TokenStream> = fields.iter()
        .map(|field| {
            let field_name = proc_macro2::Ident::new(field, proc_macro2::Span::call_site());
            quote! {
                pub fn #field_name(&self) -> &str {
                    &self.#field_name
                }
            }
        })
        .collect();
    
    let mut getters_stream = TokenStream::new();
    getters_stream.append_all(getters2);
    
    // Both approaches work; append_all is cleaner for collecting
    // mapped iterators into streams
}

append_all is often cleaner when collecting mapped iterators into streams.

Comparison Summary

use proc_macro2::TokenStream;
use quote::{quote, TokenStreamExt};
 
fn comparison() {
    let mut stream = TokenStream::new();
    
    // extend (from std::iter::Extend):
    // - Standard library trait
    // - Accepts IntoIterator<Item = TokenTree> or <Item = TokenStream>
    // - Works with a single TokenStream or a Vec<TokenStream> directly
    // - Idiomatic for Option<TokenStream>
    
    let single = quote! { let x = 1; };
    stream.extend(single);
    
    // append_all (from quote::TokenStreamExt):
    // - quote crate extension
    // - Accepts IntoIterator where Item: ToTokens
    // - Works with streams, idents, literals, syn types, references
    // - Ergonomic for items that are not yet token streams
    
    let items = vec![quote!(a), quote!(b)];
    stream.append_all(items);
    
    // Related: append (single TokenTree)
    // append_separated (with separator)
    
    // When to use which:
    // - Single TokenStream or Vec<TokenStream>: either works
    // - Items that are ToTokens but not streams (Ident, syn types): append_all
    // - Option<TokenStream>: extend (more idiomatic)
    // - With separator: append_separated (unique to TokenStreamExt)
    // - Single TokenTree: append
}

Quick reference:

Method             Source                 Input Type                                Use Case
extend             std::iter::Extend      IntoIterator of TokenTree or TokenStream  Standard extension
append             quote::TokenStreamExt  Into<TokenTree>                           Single token
append_all         quote::TokenStreamExt  IntoIterator, Item: ToTokens              Collections
append_separated   quote::TokenStreamExt  IntoIterator plus separator               Delimited lists
append_terminated  quote::TokenStreamExt  IntoIterator plus terminator              Trailing-separator lists

Key insight: The trade-off between append_all and extend is entirely ergonomic; both are O(n) in the number of tokens. append_all accepts any IntoIterator whose items implement ToTokens, so it works with Vec<TokenStream>, iterators of Ident or Literal, syn types, and references alike. extend implements the standard library's Extend trait, which proc_macro2::TokenStream provides for both TokenTree and TokenStream items, so it handles single streams, Vec<TokenStream>, and Option<TokenStream> naturally but nothing outside those two item types. The TokenStreamExt trait also provides append_separated and append_terminated, which have no extend equivalent and are designed for generating comma-separated lists and other delimited structures common in macro output. In practice: use extend for streams, trees, and Option values; use append_all when the items are arbitrary ToTokens types; use append_separated when generating delimited lists; and when writing code inside quote!, prefer #(...)* interpolation over both methods for readability.