Optional memory
Optional config: any
Optional tags: string[]

Deprecated: Use .invoke() instead. Will be removed in 0.2.0.
Run the core logic of this chain and add to output if desired.
Wraps _call and handles memory.
Invokes the chain with the provided input and returns the output.
Input values for the chain run.
Optional config: any
Optional configuration for the Runnable.
Promise that resolves with the output of the chain run.
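A minimal sketch of invoking the chain. It assumes a ChatVectorDBQAChain instance created with fromLLM (see the example under fromLLM below); the input keys shown are the chain's defaults and the question text is illustrative.

```typescript
// Assumes `chain` is a ChatVectorDBQAChain built with fromLLM (example below).
// "question" and "chat_history" are the chain's default input keys.
const result = await chain.invoke({
  question: "What is LangChain?",
  chat_history: "",
});
// The answer is returned under the chain's output key ("text" by default).
console.log(result.text);
```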
Return a JSON-like object representing this chain.
Static deserialize: Load a chain from a JSON-like object describing it.
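A hedged sketch of a serialize/deserialize round trip, reusing the chain and vector store from the fromLLM example below. Runtime components such as the vector store are not serialized, so deserialize is assumed here to accept them as extra load values; verify the exact shape against the installed langchain version.

```typescript
// Serialize the chain's configuration to a JSON-like object.
const serialized = chain.serialize();

// Restore it, supplying the vector store again since it cannot be serialized.
// (The load-values shape below is an assumption; check your langchain version.)
const restored = await ChatVectorDBQAChain.deserialize(serialized, {
  vectorstore: vectorStore,
});
```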
Static fromLLM: Creates an instance of ChatVectorDBQAChain using a BaseLanguageModel and other options.
Instance of BaseLanguageModel used to generate a new question.
Instance of VectorStore used for vector operations.
(Optional) Additional options for creating the ChatVectorDBQAChain instance.
Optional inputKey
Optional k?: number
Optional outputKey
Optional qaTemplate
Optional questionGeneratorTemplate
Optional returnSourceDocuments
Optional verbose?: boolean

Returns a new instance of ChatVectorDBQAChain.
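A minimal construction sketch, assuming an OpenAI model, an in-memory vector store, and 0.1.x-era import paths (entrypoints differ across langchain versions); the texts and option values are illustrative.

```typescript
import { OpenAI } from "langchain/llms/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { ChatVectorDBQAChain } from "langchain/chains";

// Build a small in-memory vector store to retrieve from.
const vectorStore = await MemoryVectorStore.fromTexts(
  ["LangChain is a framework for building applications powered by language models."],
  [{ id: 1 }],
  new OpenAIEmbeddings()
);

// Create the chain from an LLM and the vector store; all option fields are optional.
const chain = ChatVectorDBQAChain.fromLLM(new OpenAI({ temperature: 0 }), vectorStore, {
  k: 2,                        // number of documents to retrieve
  returnSourceDocuments: true, // include retrieved documents in the output
});
```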
⚠️ Deprecated ⚠️

Use ConversationalRetrievalQAChain instead. This feature is deprecated and will be removed in the future. It is not recommended for use.
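Since this chain is deprecated, a hedged sketch of the recommended replacement, ConversationalRetrievalQAChain, is shown below. It reuses the vectorStore from the fromLLM example above and assumes the same 0.1.x-era import paths.

```typescript
import { ConversationalRetrievalQAChain } from "langchain/chains";
import { OpenAI } from "langchain/llms/openai";

// The replacement chain takes a retriever rather than a vector store directly.
const qaChain = ConversationalRetrievalQAChain.fromLLM(
  new OpenAI({ temperature: 0 }),
  vectorStore.asRetriever()
);

const answer = await qaChain.invoke({
  question: "What is LangChain?",
  chat_history: "",
});
console.log(answer.text);
```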