
I'm building a FastAPI application and modeling a forum-style comment system using Pydantic v2. Each comment can contain replies, and those replies can contain more replies, recursively, similar to Reddit or GitHub threads.

A simplified version of my model (without user) looks like this:

from __future__ import annotations

from pydantic import BaseModel
from typing import List


class Comment(BaseModel):
    id: int
    text: str
    replies: List[Comment] | None = None


Comment.model_rebuild()

This works fine for basic validation. However, I need to enforce a rule:

The comment tree must not exceed a maximum nesting depth, e.g. a maximum depth of 3 (so the thread doesn't get ugly or break the UI). So with the payload:

{
  "id": 1,
  "text": "Level 1",
  "replies": [
    {
      "id": 2,
      "text": "Level 2",
      "replies": [
        {
          "id": 3,
          "text": "Level 3",
          "replies": [
            { "id": 4, "text": "Level 4 INVALID" }
          ]
        }
      ]
    }
  ]
}

I want Pydantic to raise a validation error because the depth exceeds the allowed maximum.

What I've Tried

I attempted to compute the depth inside a field_validator, but since the model validates recursively without knowing its own call depth, I hit issues like RecursionError, the inability to access context about the current recursion level, or validators running before all children are built.

from __future__ import annotations

from pydantic import BaseModel, field_validator, ValidationError

MAX_DEPTH = 3


class Comment(BaseModel):
    id: int
    text: str
    replies: list[Comment] | None = None

    @field_validator("replies", mode="after")
    def validate_depth(cls, replies):
        if replies:
            depth = cls._compute_depth(replies)
            if depth > MAX_DEPTH:
                raise ValueError(f"Maximum depth {MAX_DEPTH} exceeded (found: {depth})")
        return replies

    @classmethod
    def _compute_depth(cls, replies):
        if not replies:
            return 1
        return 1 + max(cls._compute_depth(r.replies) for r in replies)


Comment.model_rebuild()

This works in basic cases, but I want to find the best approach. Is there a more idiomatic or built-in way in Pydantic v2 to handle recursive validation with depth constraints?

1 Answer


from __future__ import annotations

I am skeptical that you need that, but fine, whatever, perhaps you'll convince me. With older interpreters it used to make sense; 3.9 is EOL and only 3.10+ is relevant nowadays.

from typing import List 

Please don't do that. Prefer
replies: list[Comment] | None = None over
replies: List[Comment] | None = None.

depth

The comment tree must not exceed a maximum nesting depth [of three].

I understand your desire to embrace the whole "recursive" aspect. But given that constraint, it really feels like you should be pursuing a Type calculus of Comment1, Comment2, and Comment3, such that nesting 4 levels deep is trivially ruled out at linting time.

runtime check

More generally, if we wish to impose some arbitrarily large max_depth constraint, it feels like it might be more appropriate to enforce that at run time rather than at import time. Consider having validate_depth() raise a fatal error when the observed depth is too deep, rather than making type calculus predictions about it.

If you really feel this should be a static analysis task, then consider giving descriptive names to the depth {1, 2, 3} cases which tie back to your business use case. If that is infeasible, then perhaps this particular validation is not a good fit for the pyright and mypy type checkers that your make lint recipe runs.
