What are the trade-offs between quote::ToTokens::to_tokens and into_tokens for proc macro code generation?
ToTokens::to_tokens appends tokens to an existing TokenStream without consuming self, enabling reuse and composition, while a consuming into_tokens method takes self and returns a TokenStream directly, offering simpler semantics for types that should be used only once in code generation. Note that into_tokens is not part of quote's trait: the trait's provided into_token_stream merely delegates to to_tokens, so a genuinely consuming method is a custom pattern. The fundamental trade-off is between the flexibility of non-consuming append semantics (to_tokens) and the clarity of consuming ownership semantics (into_tokens), with to_tokens being the standard trait method that all quote! interpolation relies on for token expansion.
The ToTokens Trait
use quote::{quote, ToTokens, TokenStreamExt};
use proc_macro2::TokenStream;
// The ToTokens trait definition (simplified):
pub trait ToTokens {
fn to_tokens(&self, tokens: &mut TokenStream);
// Provided method:
fn to_token_stream(&self) -> TokenStream {
let mut tokens = TokenStream::new();
self.to_tokens(&mut tokens);
tokens
}
// Provided method:
fn into_token_stream(self) -> TokenStream
where
Self: Sized,
{
self.to_token_stream()
}
}

The trait uses &self and appends to a mutable TokenStream, which enables reuse and composition.
Basic to_tokens Usage
use quote::{format_ident, quote, ToTokens};
use proc_macro2::TokenStream;
struct Field {
name: String,
ty: String,
}
impl ToTokens for Field {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Build identifiers so this expands as `count: u32`,
// not as the string literals `"count": "u32"`
let name = format_ident!("{}", self.name);
let ty = format_ident!("{}", self.ty);
tokens.extend(quote! {
#name: #ty,
});
}
}
fn basic_usage() {
let field = Field {
name: "count".to_string(),
ty: "u32".to_string(),
};
// to_tokens appends to existing TokenStream
let mut tokens = TokenStream::new();
field.to_tokens(&mut tokens);
// Or use to_token_stream for convenience
let stream = field.to_token_stream();
// Field can be reused because to_tokens takes &self
let mut more_tokens = TokenStream::new();
field.to_tokens(&mut more_tokens); // Works - field still owned
// Can use multiple times
let combined = quote! {
#field // Uses to_tokens internally
#field // Uses it again
};
}

to_tokens appends without consuming, allowing the same value to be used multiple times.
The into_tokens Pattern
use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
// into_tokens isn't a standard trait method
// It's a pattern some types implement for consuming semantics
// Some types implement Into<TokenStream> instead:
// This consumes the value and returns a TokenStream
struct OwnedTokens {
tokens: TokenStream,
}
impl From<OwnedTokens> for TokenStream {
fn from(owned: OwnedTokens) -> TokenStream {
owned.tokens
}
}
// Or manually implement a consuming method:
struct SingleUse {
data: Vec<String>,
}
impl SingleUse {
fn into_tokens(self) -> TokenStream {
// Consumes self, no reuse possible
let data = &self.data;
quote! {
#(#data),*
}
}
}
fn into_tokens_usage() {
let single = SingleUse { data: vec!["a".into(), "b".into()] };
// into_tokens consumes self
let tokens = single.into_tokens();
// single is moved, cannot use again
// let tokens2 = single.into_tokens(); // Error: use of moved value
}

into_tokens (or similar consuming patterns) moves ownership, preventing accidental reuse.
Why to_tokens Uses Append Semantics
use quote::{format_ident, quote, ToTokens};
use proc_macro2::TokenStream;
struct StructDef {
name: String,
fields: Vec<Field>,
}
impl ToTokens for StructDef {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Convert the name to an identifier; interpolating the String
// directly would produce the string literal "Point"
let name = format_ident!("{}", self.name);
// Token streams require balanced delimiters, so a lone `{` cannot
// be appended on its own; collect the field tokens first, then
// wrap them in braces in one step
let mut fields = TokenStream::new();
for field in &self.fields {
field.to_tokens(&mut fields); // Append each field - to_tokens composition
}
tokens.extend(quote!(struct #name { #fields }));
}
}
// The append pattern enables composition:
// Each component adds to the same TokenStream
fn composition_example() {
let def = StructDef {
name: "Point".into(),
fields: vec![
Field { name: "x".into(), ty: "f64".into() },
Field { name: "y".into(), ty: "f64".into() },
],
};
let tokens = def.to_token_stream();
// Result: struct Point { x: f64, y: f64, }
}

Append semantics enable building complex token streams from components.
Reuse Pattern with to_tokens
use quote::{format_ident, quote, ToTokens};
use proc_macro2::TokenStream;
// Clone is derived so the into_tokens alternative below can clone
#[derive(Clone)]
struct Attribute {
name: String,
}
impl ToTokens for Attribute {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Use an identifier so this expands as #[derive], not #["derive"]
let name = format_ident!("{}", self.name);
tokens.extend(quote!(#[#name]));
}
}
fn reuse_pattern() {
let attr = Attribute { name: "derive".into() };
// Same attribute on multiple items
let result = quote! {
#attr
struct Foo {}
#attr
struct Bar {}
};
// attr was used twice via to_tokens
// This works because to_tokens takes &self
// Alternative with into_tokens would require cloning:
// (if into_tokens consumed self)
let attr1 = Attribute { name: "derive".into() };
let attr2 = attr1.clone(); // Clone needed
let result = quote! {
#attr1
struct Foo {}
#attr2
struct Bar {}
};
}

Non-consuming semantics allow reuse without cloning.
When into_tokens Makes Sense
use quote::{format_ident, quote, ToTokens};
use proc_macro2::TokenStream;
// Some data structures are meant to be used once
struct IdGenerator {
prefix: String,
counter: usize,
}
impl IdGenerator {
// Consuming method - generator state is used up
fn into_tokens(self) -> TokenStream {
// format_ident! builds real identifiers like `prefix_0`,
// rather than the string literal "prefix_0"
let ids: Vec<proc_macro2::Ident> = (0..self.counter)
.map(|i| format_ident!("{}_{}", self.prefix, i))
.collect();
quote! {
#(#ids),*
}
}
}
// Types wrapping expensive data
struct ParsedData {
// Large data structure
items: Vec<ComplexItem>,
}
impl ParsedData {
fn into_tokens(self) -> TokenStream {
// Consume the parsed data to generate tokens
// No need to keep the original data after
let items = &self.items;
quote! {
#(#items),*
}
}
}
struct ComplexItem {
name: String,
value: i32,
}
impl ToTokens for ComplexItem {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Build an identifier; interpolating the String would emit "name" = 42
let name = proc_macro2::Ident::new(&self.name, proc_macro2::Span::call_site());
let value = self.value;
tokens.extend(quote!(#name = #value));
}
}

Consuming semantics signal intent when data should be used exactly once.
Comparison Table
// to_tokens vs into_tokens semantics:
// to_tokens (ToTokens trait):
// - Takes &self (non-consuming)
// - Appends to existing TokenStream
// - Enables reuse and composition
// - Used by quote! macro interpolation
// - Standard trait in quote ecosystem
// into_tokens (custom pattern):
// - Takes self (consuming)
// - Returns TokenStream
// - Prevents reuse
// - Signals single-use intent
// - Not a standard trait
fn comparison() {
// Reuses the Field type and imports from the earlier examples
let item = Field { name: "x".into(), ty: "i32".into() };
// to_tokens: non-consuming
let mut tokens = TokenStream::new();
item.to_tokens(&mut tokens); // item is still usable
item.to_tokens(&mut tokens); // use again
// into_tokens: consuming
// let tokens = item.into_tokens(); // item is consumed
// item.to_tokens(&mut tokens); // Error: item moved
}

quote! Macro Integration
use quote::{format_ident, quote, ToTokens};
use proc_macro2::TokenStream;
struct Ident {
name: String,
}
impl ToTokens for Ident {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Build a real identifier; interpolating the String directly
// would produce the string literal "my_var"
let name = format_ident!("{}", self.name);
tokens.extend(quote!(#name));
}
}
fn quote_integration() {
let ident = Ident { name: "my_var".into() };
// The # syntax in quote! calls to_tokens
let tokens = quote! {
let #ident = 42;
};
// This is equivalent to:
let mut manual_tokens = TokenStream::new();
manual_tokens.extend(quote!(let));
ident.to_tokens(&mut manual_tokens); // Appends ident tokens
manual_tokens.extend(quote!(= 42;));
// quote! uses ToTokens::to_tokens for # interpolation,
// borrowing each interpolated value rather than consuming it
// For consuming behavior, call your consuming method first,
// then interpolate the resulting TokenStream
}

The quote! macro is built on ToTokens::to_tokens.
Implementing ToTokens for References
use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
// ToTokens is implemented for references by default
// This enables #&value syntax
struct Container {
items: Vec<String>,
}
impl ToTokens for Container {
fn to_tokens(&self, tokens: &mut TokenStream) {
// &self is borrowed, we can iterate and append
let items = &self.items;
tokens.extend(quote! {
#(#items),*
});
}
}
fn reference_pattern() {
let container = Container {
items: vec!["a".into(), "b".into()],
};
// ToTokens is implemented for &T, so a bound reference can be
// interpolated (note that `#&container` is not valid quote! syntax;
// the reference must be bound to a variable first)
let container_ref = &container;
let tokens = quote! {
#container_ref // Takes &Container, calls to_tokens
};
// container still usable
let more = quote! {
#container // Interpolation borrows, so this also works
};
}
// The blanket impl in quote (shown for reference; defining it yourself
// would conflict with the crate's own implementation):
//
// impl<T: ToTokens + ?Sized> ToTokens for &T {
// fn to_tokens(&self, tokens: &mut TokenStream) {
// (**self).to_tokens(tokens);
// }
// }

Reference implementations enable borrowing patterns in quote!.
Performance Considerations
use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
fn performance() {
// to_tokens: Appends to existing TokenStream
// May cause reallocations as TokenStream grows
let mut tokens = TokenStream::new();
for i in 0..1000 {
let value = i.to_string();
// Each append may reallocate
tokens.extend(quote!(#value,));
}
// to_token_stream: Creates new TokenStream
// One allocation for the result
let value = "test";
let tokens = value.to_token_stream(); // Allocates once
// The trait's provided into_token_stream consumes self but simply
// delegates to to_token_stream, so it saves nothing by itself;
// a custom consuming method can avoid intermediate clones
// For large token streams, consider:
// 1. Use extend with multiple items at once
// 2. Collect into Vec and interpolate
// 3. Pre-allocate if size is known
}

Append semantics may require reallocation for growing token streams.
Custom into_tokens Implementation
use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
// Pattern: Implementing both to_tokens and into_tokens
struct SmartTokens {
data: Vec<String>,
cached_tokens: Option<TokenStream>,
}
impl ToTokens for SmartTokens {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Non-consuming: use cached if available
if let Some(ref cached) = self.cached_tokens {
tokens.extend(cached.clone());
} else {
// Generate on the fly
let data = &self.data;
let generated = quote!(#(#data),*);
tokens.extend(generated);
}
}
fn to_token_stream(&self) -> TokenStream {
// Can use cache efficiently
self.cached_tokens.clone().unwrap_or_else(|| {
let data = &self.data;
quote!(#(#data),*)
})
}
}
impl SmartTokens {
// Consuming method - can take ownership of internal data
fn into_token_stream(self) -> TokenStream {
// Consume self, can reuse cached tokens without cloning
self.cached_tokens.unwrap_or_else(|| {
let data = self.data;
quote!(#(#data),*)
})
}
}
fn smart_tokens_example() {
let smart = SmartTokens {
data: vec!["a".into(), "b".into()],
cached_tokens: None,
};
// Using to_token_stream (non-consuming, may clone)
let tokens1 = smart.to_token_stream();
// Using into_token_stream (consuming, no clone needed)
let tokens2 = smart.into_token_stream();
// smart is moved
}

A consuming method can avoid cloning cached tokens.
Standard Library Implementations
use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
fn std_implementations() {
// quote implements ToTokens for many standard types
// &str
let s = "hello";
let tokens = quote!(#s);
// String
let s = String::from("world");
let tokens = quote!(#s);
// i32, u32, etc.
let n: i32 = 42;
let tokens = quote!(#n);
// bool
let b = true;
let tokens = quote!(#b);
// TokenStream itself
let inner = quote!(fn foo() {});
let outer = quote! {
#inner // Appends inner tokens
};
// All use to_tokens internally
// The provided into_token_stream just delegates to to_tokens;
// none define a separate consuming variant
}

Standard types use to_tokens consistently.
Composition Pattern
use quote::{format_ident, quote, ToTokens};
use proc_macro2::TokenStream;
struct Function {
name: String,
params: Vec<Param>,
body: Vec<Statement>,
}
impl ToTokens for Function {
fn to_tokens(&self, tokens: &mut TokenStream) {
let name = format_ident!("{}", self.name);
// Token streams require balanced delimiters, so the parameter
// list and body are collected separately, then wrapped in one step
let mut params = TokenStream::new();
for (i, param) in self.params.iter().enumerate() {
if i > 0 {
params.extend(quote!(,));
}
param.to_tokens(&mut params); // Append each param (composition)
}
let mut body = TokenStream::new();
for stmt in &self.body {
stmt.to_tokens(&mut body); // Append each statement (composition)
}
tokens.extend(quote!(fn #name(#params) { #body }));
}
}
struct Param {
name: String,
ty: String,
}
impl ToTokens for Param {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Identifiers are sufficient for simple type names like u32
let name = format_ident!("{}", self.name);
let ty = format_ident!("{}", self.ty);
tokens.extend(quote!(#name: #ty));
}
}
struct Statement {
code: String,
}
impl ToTokens for Statement {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Statements don't typically need reuse,
// but to_tokens is still the standard pattern
// Parse the code string into real tokens rather than
// emitting it as a string literal
let code: TokenStream = self.code.parse().expect("valid tokens");
tokens.extend(quote!(#code;));
}
}

Composition is natural with append semantics.
When to Prefer into_tokens
use quote::{quote, ToTokens};
use proc_macro2::TokenStream;
// Scenario 1: Large owned data
struct BigData {
megabytes_of_strings: Vec<String>,
}
impl BigData {
// into_tokens: can move strings into tokens
// to_tokens: must clone or reference
fn into_tokens(self) -> TokenStream {
// Move strings into token generation
// No cloning needed
let strings = self.megabytes_of_strings;
quote!(#(#strings),*)
}
}
// Scenario 2: Single-use configuration
struct Config {
values: Vec<String>,
}
impl Config {
fn into_tokens(self) -> TokenStream {
// Config is consumed, can't be reused
// This signals: "use once for code generation"
let values = self.values;
quote! {
const VALUES: &[&str] = &[#(#values),*];
}
}
}
// Scenario 3: Builder pattern finalization
struct TokenBuilder {
parts: Vec<TokenStream>,
}
impl TokenBuilder {
fn into_tokens(self) -> TokenStream {
// Consume builder, produce final tokens
// Builder is no longer usable after
let mut result = TokenStream::new();
for part in self.parts {
result.extend(part);
}
result
}
}

Use consuming semantics when data should be used exactly once.
Synthesis
Core trade-off:
// to_tokens: Flexible, reusable, standard
impl ToTokens for MyType {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Append to tokens
// Takes &self - can be called multiple times
}
}
// into_tokens: Clear ownership, single-use, custom
impl MyType {
fn into_tokens(self) -> TokenStream {
// Return new TokenStream
// Takes self - consumed, can only be called once
}
}

Comparison:
| Aspect | to_tokens | into_tokens |
|---|---|---|
| Self type | &self | self |
| Output | Appends to &mut TokenStream | Returns TokenStream |
| Reuse | Yes - value remains usable | No - value consumed |
| Standard | Yes - ToTokens trait | No - custom pattern |
| quote! integration | Direct via # | Manual |
| Clone avoidance | May need to clone | Can move data |
Decision guide:
// Use to_tokens (standard):
// - When value should be reusable
// - For implementing ToTokens trait
// - For composition with other tokens
// - For quote! interpolation
impl ToTokens for MyStruct {
fn to_tokens(&self, tokens: &mut TokenStream) {
// Standard, reusable implementation
}
}
// Consider into_tokens:
// - When data should be used exactly once
// - When you want to move large data without cloning
// - For builder pattern finalization
// - To signal "consumed for code generation"
impl MyStruct {
fn into_tokens(self) -> TokenStream {
// Consuming, single-use semantics
}
}
// Practical advice:
// - Implement ToTokens for all token-generating types
// - Add into_tokens as an additional method if consuming makes sense
// - The quote! macro always uses to_tokens
// - For consuming pattern, assign the result to a variable first, then interpolate

Key insight: The ToTokens::to_tokens method uses append semantics with &self because proc macro code generation typically involves composing multiple token sources, reusing values multiple times within a single macro expansion, and building complex output incrementally. The into_tokens pattern (which isn't a standard trait) reverses this to consuming semantics with self, which makes sense when the source data should be used exactly once, when moving large data without cloning matters, or when consuming semantics signal that the data is being transformed into its final token form. The quote! macro relies entirely on to_tokens through its # interpolation syntax, making to_tokens the primary interface for proc macro code generation. For most types, implement ToTokens::to_tokens for standard interoperability; add a separate consuming method only when the ownership semantics provide meaningful benefits like avoiding clones or signaling single-use intent.
