What is the purpose of quote::ToTokens trait and how is it used in procedural macros?

ToTokens is the foundational trait in the quote crate that defines how Rust syntax elements convert into TokenStream objects—the raw sequence of tokens that the Rust compiler processes. Every type that can appear in a quote! macro must implement ToTokens, including primitive types, strings, and syn's syntax tree types like Ident, Type, and Expr. The trait enables the core abstraction of procedural macros: transforming parsed syntax trees back into compilable token streams. When you write quote! { let #name: #ty = #expr; }, each interpolated variable (#name, #ty, #expr) has its to_tokens method called, converting that value into tokens that are spliced into the output stream.

The ToTokens Trait Definition

// From the quote crate
pub trait ToTokens {
    fn to_tokens(&self, tokens: &mut TokenStream);
    
    // Provided methods
    fn to_token_stream(&self) -> TokenStream {
        let mut tokens = TokenStream::new();
        self.to_tokens(&mut tokens);
        tokens
    }
    
    fn into_token_stream(self) -> TokenStream
    where
        Self: Sized,
    {
        self.to_token_stream()
    }
}

The trait is simple: implement to_tokens to append your type's tokens to an existing TokenStream.

Basic ToTokens Implementations

use quote::quote;
use proc_macro2::TokenStream;
 
// Built-in implementations for primitive types. Note that quote!
// interpolates *variables*, not literal tokens, so bind the values first
fn primitive_implementations() {
    let number = 42i32;     // i32 implements ToTokens
    let float = 3.14f64;    // f64 implements ToTokens
    let string = "hello";   // &str implements ToTokens
    let boolean = true;     // bool implements ToTokens
    let char_val = 'x';     // char implements ToTokens
    
    let tokens: TokenStream = quote! {
        let number = #number;
        let float = #float;
        let string = #string;
        let boolean = #boolean;
        let char_val = #char_val;
    };
    
    println!("{}", tokens);
    // let number = 42i32 ; let float = 3.14f64 ; let string = "hello" ; ...
}

The quote crate provides ToTokens implementations for all Rust primitives.

ToTokens for syn Types

use quote::{quote, ToTokens};
use syn::{Ident, Type, Expr, parse_quote};
use proc_macro2::TokenStream;
 
fn syn_type_implementations() {
    // syn types implement ToTokens
    let name: Ident = parse_quote!(my_variable);
    let ty: Type = parse_quote!(Vec<String>);
    let expr: Expr = parse_quote!(vec!["hello", "world"]);
    
    // Each can be interpolated into quote!
    let tokens: TokenStream = quote! {
        let #name: #ty = #expr;
    };
    
    println!("{}", tokens);
    // let my_variable : Vec < String > = vec ! [ "hello" , "world" ] ;
}

All syn syntax tree types implement ToTokens, enabling round-trip parsing and generation.

How quote! Uses ToTokens

use quote::{quote, format_ident, ToTokens};
use proc_macro2::TokenStream;
 
fn quote_mechanics() {
    let name = format_ident!("my_function"); // an Ident, not a &str
    let count = 42usize;
    
    // The quote! macro converts each #variable to tokens
    // by calling ToTokens::to_tokens(&variable, &mut output)
    let tokens: TokenStream = quote! {
        fn #name() -> usize {
            #count
        }
    };
    
    // Conceptually, this expands to something like the following
    // (not literally compilable: quote! itself requires balanced
    // delimiters, and the real expansion builds groups directly):
    //
    // let mut output = TokenStream::new();
    // output.extend(/* tokens for `fn` */);
    // name.to_tokens(&mut output);   // ToTokens for Ident
    // output.extend(/* tokens for `() -> usize {` */);
    // count.to_tokens(&mut output);  // ToTokens for usize
    // output.extend(/* tokens for `}` */);
}

The # interpolation syntax is syntactic sugar for calling to_tokens on the value.

Implementing ToTokens for Custom Types

use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
use syn::Ident;
 
// A custom structure in a procedural macro
struct FieldDefinition {
    name: Ident,
    ty: syn::Type,
    default_value: Option<syn::Expr>,
}
 
impl ToTokens for FieldDefinition {
    fn to_tokens(&self, tokens: &mut TokenStream) {
        // Generate: name: ty = default
        self.name.to_tokens(tokens);
        tokens.extend(quote!(:));
        self.ty.to_tokens(tokens);
        
        if let Some(default) = &self.default_value {
            tokens.extend(quote!( = ));
            default.to_tokens(tokens);
        }
    }
}
 
fn custom_totokens_usage() {
    let field = FieldDefinition {
        name: syn::parse_quote!(count),
        ty: syn::parse_quote!(usize),
        default_value: Some(syn::parse_quote!(0)),
    };
    
    // Can now use field in quote!
    let tokens = quote! {
        struct MyStruct {
            #field,
        }
    };
    
    println!("{}", tokens);
    // struct MyStruct { count : usize = 0 , }
    // Note: these are valid *tokens* but not valid Rust (struct fields
    // have no default-value syntax); quote! never validates its output
}

Implementing ToTokens for your types enables clean interpolation into quote!.

ToTokens for Enum Variants

use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
 
enum Accessor {
    Field(syn::Ident),
    Index(syn::Expr),
    Method(syn::Ident, Vec<syn::Expr>),
}
 
impl ToTokens for Accessor {
    fn to_tokens(&self, tokens: &mut TokenStream) {
        match self {
            Accessor::Field(ident) => {
                tokens.extend(quote!(.#ident));
            }
            Accessor::Index(expr) => {
                tokens.extend(quote!([#expr]));
            }
            Accessor::Method(name, args) => {
                tokens.extend(quote!(.#name(#(#args),*)));
            }
        }
    }
}
 
fn enum_accessor_example() {
    let accessors = vec![
        Accessor::Field(syn::parse_quote!(name)),
        Accessor::Index(syn::parse_quote!(0)),
        Accessor::Method(syn::parse_quote!(len), vec![]),
    ];
    
    let base: syn::Ident = syn::parse_quote!(data);
    let tokens = quote! {
        let value = #base #(#accessors)*;
    };
    
    println!("{}", tokens);
    // let value = data . name [0] . len () ;
}

Enums with ToTokens can represent different code generation strategies.

Repetition with ToTokens

use quote::quote;
use proc_macro2::TokenStream;
 
fn repetition_patterns() {
    let fields: Vec<(syn::Ident, syn::Type)> = vec![
        (syn::parse_quote!(name), syn::parse_quote!(String)),
        (syn::parse_quote!(age), syn::parse_quote!(u32)),
        (syn::parse_quote!(active), syn::parse_quote!(bool)),
    ];
    
    // Tuples do not implement ToTokens, so a tuple cannot be interpolated
    // directly (e.g. #fields.0 is not valid quote! syntax). Split the
    // pairs into parallel iterators instead; quote! zips multiple
    // variables inside a single #(...)* repetition.
    let names: Vec<_> = fields.iter().map(|(n, _)| n).collect();
    let types: Vec<_> = fields.iter().map(|(_, t)| t).collect();
    
    // #(...)* repeats for each element
    let tokens: TokenStream = quote! {
        struct Person {
            #(
                #names: #types,
            )*
        }
    };
    
    println!("{}", tokens);
    // struct Person { name : String , age : u32 , active : bool , }
}

Repetition uses ToTokens internally, calling to_tokens for each element.

ToTokens in Derive Macros

use quote::{quote, ToTokens};
use syn::{parse_macro_input, DeriveInput, Data, Fields};
use proc_macro::TokenStream as ProcTokenStream;
 
#[proc_macro_derive(Builder)]
pub fn derive_builder(input: ProcTokenStream) -> ProcTokenStream {
    let input = parse_macro_input!(input as DeriveInput);
    
    // The struct name implements ToTokens
    let name = &input.ident;
    
    // Build the builder struct name
    let builder_name = syn::Ident::new(
        &format!("{}Builder", name),
        name.span(),
    );
    
    // Extract fields
    let fields = match &input.data {
        Data::Struct(data) => match &data.fields {
            Fields::Named(fields) => &fields.named,
            _ => panic!("Only named fields supported"),
        },
        _ => panic!("Only structs supported"),
    };
    
    // Generate builder struct fields
    let builder_fields = fields.iter().map(|f| {
        let name = &f.ident;
        let ty = &f.ty;
        quote! {
            #name: Option<#ty>,
        }
    });
    
    // Generate build method fields; build(self) takes ownership, so
    // ok_or can consume each Option directly (take would need &mut self)
    let build_fields = fields.iter().map(|f| {
        let name = &f.ident;
        quote! {
            #name: self.#name.ok_or(concat!("missing ", stringify!(#name)))?,
        }
    });
    
    let expanded = quote! {
        pub struct #builder_name {
            #(#builder_fields)*
        }
        
        impl #builder_name {
            pub fn build(self) -> Result<#name, &'static str> {
                Ok(#name {
                    #(#build_fields)*
                })
            }
        }
    };
    
    ProcTokenStream::from(expanded)
}

Derive macros use ToTokens on syn types to generate code from parsed input.

ToTokens for Code Generation Patterns

use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
use syn::Ident;
 
struct MatchArm {
    pattern: syn::Pat,
    guard: Option<syn::Expr>,
    body: syn::Expr,
}
 
impl ToTokens for MatchArm {
    fn to_tokens(&self, tokens: &mut TokenStream) {
        self.pattern.to_tokens(tokens);
        
        if let Some(guard) = &self.guard {
            tokens.extend(quote!( if ));
            guard.to_tokens(tokens);
        }
        
        tokens.extend(quote!( => ));
        self.body.to_tokens(tokens);
    }
}
 
fn match_generation() {
    let arms = vec![
        MatchArm {
            pattern: syn::parse_quote!(Some(x)),
            guard: Some(syn::parse_quote!(x > 0)),
            body: syn::parse_quote!(x),
        },
        MatchArm {
            pattern: syn::parse_quote!(Some(_)),
            guard: None,
            body: syn::parse_quote!(0),
        },
        MatchArm {
            pattern: syn::parse_quote!(None),
            guard: None,
            body: syn::parse_quote!(-1),
        },
    ];
    
    let expr: syn::Expr = syn::parse_quote!(option_value);
    let tokens = quote! {
        match #expr {
            #(#arms),*
        }
    };
    
    println!("{}", tokens);
}

Complex code structures can implement ToTokens for clean generation.

Conditional Code Generation

use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
 
struct OptionalField {
    name: syn::Ident,
    ty: syn::Type,
    is_optional: bool,
}
 
impl ToTokens for OptionalField {
    fn to_tokens(&self, tokens: &mut TokenStream) {
        self.name.to_tokens(tokens);
        tokens.extend(quote!(:));
        
        if self.is_optional {
            tokens.extend(quote!(Option<));
            self.ty.to_tokens(tokens);
            tokens.extend(quote!(>));
        } else {
            self.ty.to_tokens(tokens);
        }
    }
}
 
// Alternative: use quote! with conditional logic
fn conditional_generation() {
    let optional = true;
    let inner_type: syn::Type = syn::parse_quote!(String);
    
    let ty = if optional {
        quote!(Option<#inner_type>)
    } else {
        quote!(#inner_type)
    };
    
    let tokens = quote! {
        let value: #ty = None;
    };
}

ToTokens implementations can include conditional logic for different code paths.

ToTokens for Attribute Macros

use quote::quote;
use syn::{parse_macro_input, ItemFn};
use proc_macro::TokenStream as ProcTokenStream;
 
#[proc_macro_attribute]
pub fn log_calls(_attr: ProcTokenStream, item: ProcTokenStream) -> ProcTokenStream {
    // Parse the function
    let mut input = parse_macro_input!(item as ItemFn);
    
    // Get function name
    let name = &input.sig.ident;
    
    // Get function inputs
    let inputs = &input.sig.inputs;
    
    // Generate logging for each parameter
    let log_statements = inputs.iter().filter_map(|arg| {
        if let syn::FnArg::Typed(typed) = arg {
            if let syn::Pat::Ident(pat_ident) = &*typed.pat {
                let ident = &pat_ident.ident;
                return Some(quote! {
                    println!("{}: {:?}", stringify!(#ident), #ident);
                });
            }
        }
        None
    });
    
    // Wrap the original block
    let original_block = &input.block;
    
    // Generate new block with logging
    let new_block = quote! {
        {
            println!("Entering function {}", stringify!(#name));
            #(#log_statements)*
            #original_block
        }
    };
    
    // The ItemFn's ToTokens generates the full function; swap in the new
    // body (input.block is a Box<Block>, so wrap the parsed block)
    input.block = Box::new(syn::parse2(new_block).unwrap());
    
    ProcTokenStream::from(quote!(#input))
}

Attribute macros use ToTokens on the entire ItemFn to regenerate the modified function.

Span Preservation with ToTokens

use quote::{quote, ToTokens};
use proc_macro2::Span;
use syn::Ident;
 
fn span_preservation() {
    // Create identifiers with specific spans
    let original_span = Span::call_site();
    let name = Ident::new("my_var", original_span);
    
    // ToTokens preserves the span
    let tokens = quote! {
        let #name = 42;
    };
    
    // Error messages will point to the original location
    // This is crucial for good error messages in proc macros
}
 
fn span_for_error_messages() {
    // When generating errors, preserve spans for good diagnostics
    let field_name: syn::Ident = syn::parse_quote!(my_field);
    let field_type: syn::Type = syn::parse_quote!(String);
    
    // The span of field_name carries through ToTokens
    // If this code causes an error, the error points to the original source
    let tokens = quote! {
        #field_name: #field_type
    };
}

ToTokens preserves spans, enabling error messages that point to the original source location.

ToTokens vs Display

use quote::{quote, format_ident};
use proc_macro2::TokenStream;
 
fn totokens_vs_display() {
    // Display tells you what a value looks like as text; ToTokens must
    // decide what *kind of token* to emit. For &str, that is a string
    // literal, not an identifier
    let name = "my_function";
    let tokens: TokenStream = quote! {
        let s = #name;
    };
    // Output: let s = "my_function" ;
    
    // So interpolating the &str into a function position would generate
    // `fn "my_function" () {}`, which is not valid Rust:
    // let tokens = quote! { fn #name() {} };
    
    // To produce an identifier from a string, build an Ident
    // (format_ident! is shorthand for Ident::new with call_site spans):
    let name_ident = format_ident!("{}", name);
    let tokens = quote! {
        fn #name_ident() {}
    };
    // Output: fn my_function () { }
}

ToTokens for strings produces string literals; build identifiers from strings with format_ident! or syn::Ident::new.

Advanced ToTokens Patterns

use quote::{quote, ToTokens, format_ident};
use proc_macro2::TokenStream;
 
// Generating code with generated identifiers
fn generated_identifiers() {
    let prefix = "field";
    let count = 3;
    
    let fields: Vec<_> = (0..count)
        .map(|i| format_ident!("{}_{}", prefix, i))
        .collect();
    
    let tokens = quote! {
        struct Fields {
            #(#fields: i32),*
        }
    };
    
    println!("{}", tokens);
    // struct Fields { field_0 : i32 , field_1 : i32 , field_2 : i32 }
}
 
// Chaining ToTokens implementations
struct BuilderPattern {
    receiver: syn::Expr,
    method: syn::Ident,
    args: Vec<syn::Expr>,
}
 
impl ToTokens for BuilderPattern {
    fn to_tokens(&self, tokens: &mut TokenStream) {
        self.receiver.to_tokens(tokens);
        tokens.extend(quote!(.));
        self.method.to_tokens(tokens);
        // #self.args is not valid interpolation syntax; bind a local first
        let args = &self.args;
        tokens.extend(quote!((#(#args),*)));
    }
}
 
fn chained_calls() {
    let calls = vec![
        BuilderPattern {
            receiver: syn::parse_quote!(builder),
            method: syn::parse_quote!(name),
            args: vec![syn::parse_quote!("MyStruct")],
        },
        BuilderPattern {
            receiver: syn::parse_quote!(builder.name("MyStruct")),  // Result of previous
            method: syn::parse_quote!(field),
            args: vec![syn::parse_quote!("count"), syn::parse_quote!(42)],
        },
    ];
    
    // Each call generates its own tokens
    for call in &calls {
        let tokens = quote!(#call);
        println!("{}", tokens);
    }
}

Complex generation patterns compose naturally through ToTokens.

ToTokens for Generics

use quote::{quote, ToTokens};
use syn::{Generics, GenericParam, parse_quote};
 
fn generic_handling() {
    // Parse generics
    let generics: Generics = parse_quote!(<T: Clone, U: ToString>);
    
    // ToTokens on Generics produces the full generic clause
    let tokens = quote! {
        struct Point #generics {
            x: T,
            y: U,
        }
    };
    
    // Split generics for impl blocks
    let (impl_generics, ty_generics, where_clause) = generics.split_for_impl();
    
    let tokens = quote! {
        impl #impl_generics Point #ty_generics #where_clause {
            fn new(x: T, y: U) -> Self {
                Self { x, y }
            }
        }
    };
    
    println!("{}", tokens);
}

syn::Generics provides specialized methods for generating correct generic syntax.

Real-World Example: Derive Macro with ToTokens

use quote::quote;
use syn::{parse_macro_input, DeriveInput, Data, Fields};
use proc_macro::TokenStream as ProcTokenStream;
 
#[proc_macro_derive(Deserialize)]
pub fn derive_deserialize(input: ProcTokenStream) -> ProcTokenStream {
    let input = parse_macro_input!(input as DeriveInput);
    
    let name = &input.ident;
    let generics = &input.generics;
    let (impl_generics, ty_generics, where_clause) = generics.split_for_impl();
    
    let fields = match &input.data {
        Data::Struct(data) => match &data.fields {
            Fields::Named(fields) => &fields.named,
            Fields::Unnamed(fields) => &fields.unnamed,
            Fields::Unit => panic!("Unit structs not supported"),
        },
        _ => panic!("Only structs supported"),
    };
    
    // Generate field deserialization; unnamed fields need syn::Index so
    // the struct literal gets a bare `0:` rather than a suffixed `0usize:`
    let field_desers: Vec<_> = fields.iter().enumerate().map(|(i, f)| {
        let field_ident = f.ident.as_ref()
            .map(|ident| quote!(#ident))
            .unwrap_or_else(|| {
                let index = syn::Index::from(i);
                quote!(#index)
            });
        
        quote! {
            #field_ident: map.next_value()?,
        }
    }).collect();
    
    let expanded = quote! {
        impl #impl_generics SomeDeserializeTrait for #name #ty_generics #where_clause {
            fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
            where
                D: SomeDeserializerTrait,
            {
                struct Visitor;
                
                impl<'de> SomeVisitorTrait<'de> for Visitor {
                    type Value = #name #ty_generics;
                    
                    fn expecting(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
                        f.write_str("struct ")
                    }
                    
                    fn visit_map<M>(self, mut map: M) -> Result<Self::Value, M::Error>
                    where
                        M: SomeMapAccessTrait<'de>,
                    {
                        Ok(#name {
                            #(#field_desers)*
                        })
                    }
                }
                }
                
                deserializer.deserialize_struct(Visitor)
            }
        }
    };
    
    ProcTokenStream::from(expanded)
}

Complex derive macros orchestrate many ToTokens implementations to generate correct code.

Synthesis

The ToTokens trait is the bridge between Rust's syntax tree representation and the token streams that the compiler consumes:

Core purpose:

  • Define how any value converts to a TokenStream
  • Enable clean interpolation into quote! with #variable syntax
  • Preserve span information for error reporting

Key implementations:

  • Primitives (i32, &str, bool, etc.) produce literal tokens
  • syn types (Ident, Type, Expr, Item, etc.) produce their parsed representation
  • Custom types can implement ToTokens for domain-specific code generation

Integration with quote!:

  • #variable calls ToTokens::to_tokens(&variable, &mut tokens)
  • #(#iter)* repeats for each element, calling to_tokens on each
  • Splices tokens directly into the output stream

Best practices:

  • Preserve spans from input to output for good error messages
  • Use syn::parse_quote! for parsing strings into typed syntax elements
  • Implement ToTokens on your macro's intermediate types for clean code
  • Use split_for_impl() for correct generic handling in impl blocks

Key insight: ToTokens is what makes procedural macros composable. Every syntax element—whether a primitive, a parsed type, or a custom structure—can be interpolated into generated code through the same mechanism. This uniformity means you can build up complex code generation from simple parts, each responsible for its own token representation, without manually string-building or managing token boundaries. The trait transforms Rust code generation from string manipulation into a type-safe, composable system.