make_openai_tools wraps a CodebaseMemory instance and an optional ImportGraph in @function_tool-decorated callables that the OpenAI Agents SDK understands natively. Pass the returned list directly to Agent(tools=[...]) alongside your own tools. The tools share the same memory and graph objects you provide, so findings stored during one step are immediately available on the next recall.
Installation
The OpenAI Agents integration requires the openai-agents package:
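If it is not already installed, a standard pip install should suffice (package name taken from the line above):

```shell
pip install openai-agents
```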
make_openai_tools
Builds @function_tool-decorated callables ready for Agent(tools=[...]). The returned list contains up to three tools, depending on the arguments and flags you provide.
Parameters
- The CodebaseMemory instance the tools read from and write to. When None, both recall_findings and store_finding are omitted from the returned list regardless of enable_recall and enable_store.
- Optional ImportGraph for the repository. When provided and enable_impact=True, an impact_analysis tool is added to the returned list.
- Maximum number of findings to return from a single recall_findings call. Increasing this value provides more context at the cost of additional tokens.
- Minimum similarity score (0–1) a finding must reach to appear in recall results. Lower values return more results with potentially lower relevance.
- Include the recall_findings tool. Set to False to produce a write-only or impact-only tool set.
- Include the store_finding tool. Set to False for read-only scenarios where the agent should not persist new observations.
- Include the impact_analysis tool when a graph is provided. Set to False to suppress it even when graph is not None.
Returns
A list of @function_tool-decorated callables. Safe to spread into an Agent tool list: tools=[*rb_tools, *your_tools].
Tools
recall_findings(query)
Searches CodebaseMemory for findings relevant to query. The agent should call this before reading a file — if findings already exist, it can skip the file read entirely and avoid wasting tokens.
| Parameter | Type | Description |
|---|---|---|
| query | str | Natural-language description of what you are looking for |
store_finding(content, file_path, finding_type)
Persists a new finding to CodebaseMemory so future agent runs can recall it. Store small, self-contained facts. Avoid storing long paragraphs; keep each entry tight and factual.
| Parameter | Type | Default | Description |
|---|---|---|---|
| content | str | — | The finding text (under 8,000 characters) |
| file_path | str | "" | Repo-relative path the finding is about, if applicable |
| finding_type | str | "note" | Short tag: bug, behavior, pattern, or note |
Returns "stored (id=<fid>)" on success or "store failed" on error.
impact_analysis(file_path)
Queries the ImportGraph to return the dependents (files that import this file) and dependencies (files this file imports). Use it to judge the blast radius of a proposed change before modifying a file.
| Parameter | Type | Description |
|---|---|---|
file_path | str | Repo-relative path, e.g. "pydantic/main.py" |
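The dependents/dependencies split can be sketched with a toy adjacency map standing in for ImportGraph. The file names and the dict representation are assumptions for illustration; the real tool queries the graph you pass to make_openai_tools:

```python
# Toy import graph: each file maps to the files it imports.
IMPORTS: dict[str, list[str]] = {
    "pydantic/main.py": ["pydantic/fields.py", "pydantic/errors.py"],
    "pydantic/fields.py": ["pydantic/errors.py"],
    "app/models.py": ["pydantic/main.py"],
}

def impact_analysis(file_path: str) -> dict[str, list[str]]:
    """Stand-in showing the documented semantics on a toy graph."""
    # Dependencies: files this file imports (outgoing edges).
    dependencies = IMPORTS.get(file_path, [])
    # Dependents: files that import this file (incoming edges).
    dependents = [f for f, deps in IMPORTS.items() if file_path in deps]
    return {"dependents": sorted(dependents), "dependencies": sorted(dependencies)}
```

A file with many dependents has a large blast radius: changing it can break every file in the dependents list.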
impact_analysis is only present when you pass a non-None graph and enable_impact=True. Check len(rb_tools) rather than assuming a fixed index if you build your tool list conditionally.
Complete example
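A minimal end-to-end sketch. The reasonblocks import path, the CodebaseMemory/ImportGraph constructors, and the prompt text are assumptions for illustration; Agent and Runner come from the openai-agents package:

```python
import asyncio

from agents import Agent, Runner
from reasonblocks import CodebaseMemory, ImportGraph, make_openai_tools  # assumed import path

memory = CodebaseMemory()  # assumed default constructor
graph = ImportGraph()      # assumed; build it from your repository

# Up to three tools, depending on the flags described above.
rb_tools = make_openai_tools(memory, graph=graph)

agent = Agent(
    name="code_reviewer",
    instructions=(
        "Call recall_findings before reading any file; "
        "persist new observations with store_finding."
    ),
    tools=[*rb_tools],  # spread alongside your own tools
)

async def main() -> None:
    result = await Runner.run(agent, "What breaks if I change pydantic/main.py?")
    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())
```

Because the same memory and graph objects back every tool, findings the agent stores in one run are recallable in the next.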
Telemetry: openai_hooks
make_openai_tools handles tool wiring only. ReasonBlocks telemetry — step scoring, FSM state tracking, and E-trace injection — is provided separately by rb.openai_hooks(), which returns a ReasonBlocksHooks object you pass to Runner.run(hooks=...).
For the full openai_hooks reference and lifecycle details, see the OpenAI Agents integration guide.