Rust walkthroughs
How does axum's #[derive(FromRequest)] macro implement FromRequest, and what are the performance implications?

Axum's FromRequest trait enables extracting types from HTTP requests, and the #[derive(FromRequest)] macro generates the boilerplate for custom extractors. Understanding how this works reveals both the power and the performance considerations of Axum's extraction system.
The FromRequest trait defines how types are extracted from requests:
use axum::{
extract::{FromRequest, FromRequestParts, Request},
http::request::Parts,
response::IntoResponse,
};
// The trait (simplified; since axum 0.7 it uses native async fns,
// so the old #[async_trait] annotation is no longer needed)
pub trait FromRequest<S>: Sized {
type Rejection: IntoResponse;
async fn from_request(req: Request, state: &S) -> Result<Self, Self::Rejection>;
}
// For extractors that only need request parts (not body)
pub trait FromRequestParts<S>: Sized {
type Rejection: IntoResponse;
async fn from_request_parts(parts: &mut Parts, state: &S) -> Result<Self, Self::Rejection>;
}

The trait is async, allowing extractors to perform asynchronous operations during extraction.
Before seeing the derive, understanding manual implementation shows what the macro generates:
use axum::{
extract::{FromRequest, Request},
http::StatusCode,
response::{IntoResponse, Response},
};
use serde::Deserialize;
use serde::Deserialize;
#[derive(Deserialize)]
struct User {
id: u64,
name: String,
}
impl<S> FromRequest<S> for User
where
S: Send + Sync,
{
type Rejection = Response;
async fn from_request(req: Request, state: &S) -> Result<Self, Self::Rejection> {
// Parse JSON body
let body = axum::body::to_bytes(req.into_body(), 1024 * 1024)
.await
.map_err(|e| (StatusCode::BAD_REQUEST, e.to_string()).into_response())?;
let user: User = serde_json::from_slice(&body)
.map_err(|e| (StatusCode::BAD_REQUEST, e.to_string()).into_response())?;
Ok(user)
}
}

This is verbose and error-prone for complex types.
The derive macro generates this implementation automatically:
use axum::extract::{FromRequest, Json};
use serde::Deserialize;
#[derive(Deserialize, FromRequest)]
#[from_request(via(Json))]
struct User {
id: u64,
name: String,
}
// The macro generates:
// impl<S> FromRequest<S> for User { ... }

For simple cases, this eliminates all the boilerplate.
The derive supports extracting multiple types:
use axum::extract::FromRequest;
use axum::Json;
use serde::Deserialize;
// Multiple extractors combined
#[derive(FromRequest)]
struct Extractor(
axum::extract::Path<u64>,
axum::extract::Query<Params>,
Json<Body>,
);
#[derive(Deserialize)]
struct Params {
verbose: bool,
}
#[derive(Deserialize)]
struct Body {
name: String,
}
// Usage in handler
async fn handler(Extractor(path, query, body): Extractor) {
println!("Path: {}", *path);
println!("Verbose: {}", query.verbose);
println!("Name: {}", body.name);
}

The macro generates code that runs each extractor in sequence.
For a struct with named fields:
use axum::{
extract::{FromRequest, FromRequestParts, Request},
response::{IntoResponse, Response},
Json,
};
use serde::Deserialize;
#[derive(FromRequest)]
struct CreateUser {
path: axum::extract::Path<String>,
json: Json<UserData>,
}
#[derive(Deserialize)]
struct UserData {
email: String,
}
// The macro generates approximately:
impl<S> FromRequest<S> for CreateUser
impl<S> FromRequest<S> for CreateUser
where
axum::extract::Path<String>: FromRequestParts<S>,
Json<UserData>: FromRequest<S>,
S: Send + Sync,
{
type Rejection = Response;
async fn from_request(req: Request, state: &S) -> Result<Self, Self::Rejection> {
let (mut parts, body) = req.into_parts();
let path = axum::extract::Path::from_request_parts(&mut parts, state)
.await
.map_err(|e| e.into_response())?;
// Reconstruct request for body extraction
let req = Request::from_parts(parts, body);
let json = Json::from_request(req, state)
.await
.map_err(|e| e.into_response())?;
Ok(CreateUser { path, json })
}
}

The macro handles the complexity of extracting from parts vs body.
The derive understands the distinction between part-based and body-consuming fields:
use axum::extract::{FromRequest, FromRequestParts};
use axum::Json;
use serde::Deserialize;
#[derive(FromRequest)]
struct MyExtractor {
// FromRequestParts - doesn't consume body
uri: axum::extract::OriginalUri,
query: axum::extract::Query<QueryParams>,
// FromRequest - consumes body
json: Json<RequestBody>,
}
#[derive(Deserialize)]
struct QueryParams {
page: Option<u32>,
}
#[derive(Deserialize)]
struct RequestBody {
data: String,
}
// The generated code:
// 1. Extracts FromRequestParts items first
// 2. Then extracts FromRequest items (which consume the body)
// 3. At most one FromRequest extractor (body consumer)

This ordering is crucial because the body can only be consumed once.
use axum::{
extract::{FromRequest, Request},
Json,
};
use serde::Deserialize;
#[derive(FromRequest)]
struct BodyExtractor {
json: Json<Data>, // Consumes body
}
#[derive(Deserialize)]
struct Data {
value: String,
}
// Performance consideration:
// - Body is read entirely into memory
// - JSON parsing happens during extraction
// - For large bodies, this blocks the extractor
// Alternative: streaming extraction for large bodies
// But FromRequest generally buffers the body

Body-consuming extractors read the entire body into memory, which impacts memory usage for large payloads.
use axum::extract::{FromRequest, FromRequestParts, Path, Query, Json};
use serde::Deserialize;
#[derive(FromRequest)]
struct Extractors {
// These are cheap - no body reading
path: Path<String>,
query: Query<QueryParams>,
}
#[derive(FromRequest)]
struct WithBody {
path: Path<String>,
// This allocates for body + parsing
json: Json<LargeBody>,
}
#[derive(Deserialize)]
struct QueryParams {
id: u64,
}
#[derive(Deserialize)]
struct LargeBody {
data: Vec<String>, // Potentially large
}
// Performance difference:
// - Extractors: minimal allocation (just parsing URI/query)
// - WithBody: body buffered + JSON parsed + struct allocated

Extractors that don't touch the body are significantly cheaper.
use axum::{
extract::{FromRequest, Request},
Json,
http::StatusCode,
response::{IntoResponse, Response},
};
use serde::Deserialize;
#[derive(Deserialize, FromRequest)]
#[from_request(via(Json))]
struct User {
name: String,
email: String,
}
// The generated rejection handles all errors
// Custom rejection can be specified:
#[derive(Deserialize, FromRequest)]
#[from_request(via(Json), rejection(MyRejection))]
struct CustomUser {
name: String,
email: String,
}
struct MyRejection(String);
impl IntoResponse for MyRejection {
fn into_response(self) -> Response {
(StatusCode::BAD_REQUEST, self.0).into_response()
}
}
// Performance: Custom rejection avoids default error formatting
// But the difference is usually negligible

use axum::{
extract::{FromRequest, Path, Query, Json},
http::StatusCode,
response::{IntoResponse, Response},
};
use serde::Deserialize;
#[derive(FromRequest)]
#[from_request(rejection(ApiError))]
struct ApiExtractor {
path: Path<u64>,
query: Query<Params>,
json: Json<Body>,
}
#[derive(Deserialize)]
struct Params {
verbose: bool,
}
#[derive(Deserialize)]
struct Body {
data: String,
}
struct ApiError(Response);
// Custom error type for all extraction failures
impl From<axum::extract::rejection::PathRejection> for ApiError {
fn from(e: axum::extract::rejection::PathRejection) -> Self {
ApiError((StatusCode::BAD_REQUEST, e.to_string()).into_response())
}
}
impl IntoResponse for ApiError {
fn into_response(self) -> Response {
self.0
}
}
// Similar From impls for QueryRejection, JsonRejection...

use axum::{
extract::{FromRequestParts, State},
http::request::Parts,
};
use std::sync::Arc;
struct Database; // placeholder
struct Cache; // placeholder
struct AppState {
db: Database,
cache: Cache,
}
// Extracting state is cheap - just a clone of Arc
impl<S> FromRequestParts<S> for AppState
where
S: Clone + Send + Sync + 'static,
AppState: From<S>,
{
type Rejection = std::convert::Infallible;
async fn from_request_parts(_parts: &mut Parts, state: &S) -> Result<Self, Self::Rejection> {
Ok(state.clone().into())
}
}
// The State extractor provided by Axum
async fn handler(State(state): State<Arc<AppState>>) {
// state is Arc::clone, very cheap
}

State extraction is essentially free when using Arc.
use axum::extract::{FromRequest, Json, Form};
use serde::Deserialize;
// THIS WON'T WORK: two body extractors
#[derive(FromRequest)]
struct BrokenExtractor {
json: Json<Data>, // Consumes body
form: Form<Data>, // Can't consume body again!
}
// The derive will fail or generate code that doesn't compile
// At most one extractor can consume the body
// Correct approach: extract once, use multiple times
#[derive(FromRequest)]
struct CorrectExtractor {
json: Json<Data>, // Only body extractor
}
// If you need different formats, use an enum:
#[derive(Deserialize)]
#[serde(untagged)]
enum Data {
Json(DataJson),
Form(DataForm),
}
// Placeholder variants for illustration
#[derive(Deserialize)]
struct DataJson { value: String }
#[derive(Deserialize)]
struct DataForm { value: String }

use axum::{
extract::{FromRequest, FromRequestParts, Path, Query, Extension, Json},
http::Request,
};
use serde::Deserialize;
#[derive(FromRequest)]
struct OrderedExtractor {
// Cheap extractions first
path: Path<u64>,
query: Query<Params>,
extension: Extension<Config>,
// Expensive extraction last
json: Json<LargeBody>,
}
#[derive(Deserialize)]
struct Params {
page: Option<u32>,
}
#[derive(Clone)]
struct Config {
max_size: usize,
}
#[derive(Deserialize)]
struct LargeBody {
items: Vec<String>,
}
// Generated code extracts in field order
// If path extraction fails, body isn't read at all
// This is an optimization: fail fast on cheap checks

The derive extracts fields in order, allowing early exit on failures before expensive body processing.
use axum::{
extract::{FromRequest, FromRequestParts, Path, Request},
http::StatusCode,
response::{IntoResponse, Response},
Json,
};
use serde::Deserialize;
#[derive(Deserialize)]
struct UserData {
name: String,
}
// Derive version
#[derive(FromRequest)]
struct DerivedExtractor {
path: Path<u64>,
json: Json<UserData>,
}
// Manual version
struct ManualExtractor {
path: u64,
user: UserData,
}
impl<S> FromRequest<S> for ManualExtractor
where
S: Send + Sync,
{
type Rejection = Response;
async fn from_request(req: Request, state: &S) -> Result<Self, Self::Rejection> {
let (mut parts, body) = req.into_parts();
// Extract path
let path: Path<u64> = Path::from_request_parts(&mut parts, state)
.await
.map_err(|e| (StatusCode::BAD_REQUEST, e.to_string()).into_response())?;
// Extract JSON
let req = Request::from_parts(parts, body);
let json: Json<UserData> = Json::from_request(req, state)
.await
.map_err(|e| (StatusCode::BAD_REQUEST, e.to_string()).into_response())?;
Ok(ManualExtractor {
path: *path,
user: json.0,
})
}
}
// Performance is identical - derive generates similar code
// Manual gives more control over error handling

use axum::{
extract::{FromRequest, Request},
body::Bytes,
Json,
};
use serde::Deserialize;
#[derive(Deserialize)]
struct BigPayload {
data: Vec<u8>, // Could be megabytes
}
// Default Json extractor buffers entire body
#[derive(FromRequest)]
struct JsonExtractor {
json: Json<BigPayload>,
}
// For large payloads, consider streaming
// But FromRequest doesn't support streaming well
// Use IntoResponse for streaming responses instead
// If you need bounded memory:
#[derive(Deserialize, FromRequest)]
#[from_request(via(Json))]
struct BoundedPayload {
#[serde(deserialize_with = "limited_vec")]
data: Vec<u8>,
}
fn limited_vec<'de, D>(deserializer: D) -> Result<Vec<u8>, D::Error>
where
D: serde::Deserializer<'de>,
{
let vec: Vec<u8> = Vec::deserialize(deserializer)?;
if vec.len() > 1024 * 1024 { // 1MB limit
return Err(serde::de::Error::custom("payload too large"));
}
Ok(vec)
}

use axum::extract::{FromRequest, Path, Query, Json, Form};
use serde::Deserialize;
#[derive(Deserialize)]
struct QueryParams {
search: String,
page: u32,
}
#[derive(Deserialize)]
struct FormData {
username: String,
password: String,
}
// Relative CPU costs of extractors, cheapest first:
// 1. Extension<T> - very cheap, just cloning the stored value (often an Arc)
// 2. Path<T> - very cheap, string parsing
// 3. Query<T> - cheap, URL parsing + deserialization
// 4. Form<T> - medium, body buffered + URL-encoded deserialization
// 5. Json<T> - expensive, body buffered + JSON deserialization
#[derive(FromRequest)]
struct CheapExtractor {
path: Path<u64>, // Cheap
query: Query<QueryParams>, // Cheap
}
#[derive(FromRequest)]
struct ExpensiveExtractor {
json: Json<FormData>, // Expensive
}

use axum::{
extract::{FromRequest, FromRequestParts, Path, Query, Extension, Json},
http::request::Parts,
response::IntoResponse,
};
use serde::Deserialize;
// 1. Order fields from cheapest to most expensive
#[derive(FromRequest)]
struct OptimizedExtractor {
// Cheapest first - fail fast
path: Path<u64>,
query: Query<Params>,
extension: Extension<Config>,
// Most expensive last
json: Json<Body>,
}
// 2. Use FromRequestParts when you don't need the body
#[derive(Deserialize)]
struct Params {
id: u64,
}
#[derive(Clone)]
struct Config {
max_items: usize,
}
#[derive(Deserialize)]
struct Body {
items: Vec<String>,
}
// 3. Avoid redundant extraction: FromRequestParts extractors like Path
// can run more than once, but each run re-parses the same URI
#[derive(FromRequest)]
struct BadExtractor {
path: Path<String>, // Parsed here; extracting Path again would repeat the work
}
// 4. For cheap extractions, manual might be clearer
struct SimpleExtractor {
id: u64,
}
impl<S> FromRequestParts<S> for SimpleExtractor
where
S: Send + Sync,
{
type Rejection = axum::response::Response;
async fn from_request_parts(parts: &mut Parts, state: &S) -> Result<Self, Self::Rejection> {
let path: Path<u64> = Path::from_request_parts(parts, state)
.await
.map_err(|e| e.into_response())?;
Ok(SimpleExtractor { id: *path })
}
}

The #[derive(FromRequest)] macro generates FromRequest implementations that:
Extract fields in order from cheapest to most expensive (as you define them), enabling early exit on failures before expensive body parsing.
Handle the parts/body split automatically, extracting FromRequestParts types first, then consuming the body for FromRequest types.
Generate rejection types with configurable error handling via #[from_request(rejection(...))] attributes.
Performance implications:
Body extractors allocate: Any extractor consuming the body (like Json, Form) reads the entire body into memory before parsing.
Order matters: Put cheap extractors (Path, Query, Extension) before expensive ones (Json, Form) to fail fast and avoid unnecessary work.
Only one body extractor: The body can only be consumed once; the derive enforces this at compile time.
Derive vs manual: Performance is identical; the derive generates code similar to what you'd write by hand.
State extraction is cheap: Cloning Arc<AppState> is essentially free.
The derive macro provides ergonomic syntax with zero runtime overhead compared to manual implementation, while the generated code follows best practices for extraction ordering. The main performance consideration is being mindful of body consumption and structuring extractors to fail fast on cheap checks before expensive parsing.