LLM Wrapper to use
Key to use for the output, defaults to "text"
Prompt object to use
Optional llmKwargs to pass to LLM
Optional memory
Optional outputParser: OutputParser to use
Optional config: any[]
Use .batch() instead. Will be removed in 0.2.0.
This feature is deprecated and will be removed in the future.
It is not recommended for use.
Call the chain on all inputs in the list
Invokes the chain with the provided input and returns the output.
Input values for the chain run.
Optional config: any
Optional configuration for the Runnable.
Promise that resolves with the output of the chain run.
Static deserialize: Load a chain from a JSON-like object describing it.
Static from
Optional chain
Optional prompt?: any
Chain to run queries against LLMs.
Example