Variable name in the LLM chain to put the documents in.
LLM Wrapper to use after formatting documents.
Optional memory

Call the chain on all inputs in the list.
Optional config: any[]
Deprecated: use .batch() instead. Will be removed in 0.2.0. This feature is not recommended for use.

Run the core logic of this chain and add to output if desired. Wraps _call and handles memory.
Optional config: any
Optional tags: string[]
Deprecated: use .invoke() instead. Will be removed in 0.2.0.
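The replacement pattern is: .invoke() for a single input, .batch() for a list of inputs. Below is a minimal sketch of the .batch() path; the loadQARefineChain helper, the OpenAI wrapper, the import paths, and the input keys (input_documents, question) are assumptions about a typical langchain 0.1.x setup rather than details taken from this page.

```typescript
import { OpenAI } from "@langchain/openai";
import { loadQARefineChain } from "langchain/chains";
import { Document } from "langchain/document";

// Hedged sketch: run the chain over a list of inputs with .batch(), the
// replacement suggested by the deprecation notice above. Helper name, import
// paths, and input keys are assumptions and vary across langchain versions.
const model = new OpenAI({ temperature: 0 });
const chain = loadQARefineChain(model);

const docs = [
  new Document({ pageContent: "The lease term is 12 months." }),
  new Document({ pageContent: "An amendment extended the term to 18 months." }),
];

// One output per input object, returned in the same order.
const results = await chain.batch([
  { input_documents: docs, question: "How long is the lease term?" },
  { input_documents: docs, question: "Was the lease amended?" },
]);
```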
Invoke the chain with the provided input and return the output.
Input values for the chain run.
Optional config: any (optional configuration for the Runnable).
Returns a Promise that resolves with the output of the chain run.
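Reusing the chain and docs from the sketch above, a single run through invoke() looks like the following; the tags field and the output_text key are assumptions for illustration.

```typescript
// Hedged sketch: a single run via invoke(). The first argument carries the
// input values for the chain run; the optional second argument is the
// configuration for the Runnable (tags shown here as an illustrative field).
const output = await chain.invoke(
  { input_documents: docs, question: "How long is the lease term?" },
  { tags: ["refine-docs-example"] }
);
// The resolved value is the chain output; output_text is assumed to be the
// default output key for this kind of chain.
console.log(output.output_text);
```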
Return a JSON-like object representing this chain.
Static deserialize: load a chain from a JSON-like object describing it.
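A minimal round-trip sketch for these two methods, again reusing the chain from the earlier sketch. It assumes this page documents RefineDocumentsChain and that this chain type supports serialization in the installed version; both points should be checked against the actual API.

```typescript
import { RefineDocumentsChain } from "langchain/chains";

// Hedged sketch: serialize a chain to a JSON-like object and load it back.
// Assumes the chain type on this page is RefineDocumentsChain and that it
// implements serialization in the installed langchain version.
const serialized = chain.serialize();
// ...persist `serialized` (e.g. as JSON on disk), then later:
const restored = await RefineDocumentsChain.deserialize(serialized);
```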
Combine documents by doing a first pass and then refining on more documents.
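In plain terms, the first document produces an initial answer and each subsequent document is used to refine that answer. The sketch below illustrates that control flow in standalone TypeScript, independent of any langchain API; the function names are invented for illustration.

```typescript
// Hedged sketch of the refine strategy itself, with no langchain dependency:
// an initial pass over the first document, then one refine step per remaining
// document that feeds the running answer back in.
async function refineOverDocuments(
  docs: string[],
  question: string,
  initialAnswer: (doc: string, q: string) => Promise<string>,
  refineAnswer: (existing: string, doc: string, q: string) => Promise<string>
): Promise<string> {
  if (docs.length === 0) throw new Error("No documents to combine.");
  let answer = await initialAnswer(docs[0], question);
  for (const doc of docs.slice(1)) {
    answer = await refineAnswer(answer, doc, question);
  }
  return answer;
}
```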