id | text | embedding | source
---|---|---|---|
977de7b0dc05-0 | langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceEmbeddings¶
class langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceEmbeddings(*, cache: ~typing.Optional[bool] = None, verbose: bool = None, callbacks: ~typing.Optional[~typing.Union[~typing.List[~langchain.callbacks.base.BaseCallbackHandler], ~langchain.callbacks.base.BaseCallbackManager]] = None, callback_manager: ~typing.Optional[~langchain.callbacks.base.BaseCallbackManager] = None, tags: ~typing.Optional[~typing.List[str]] = None, pipeline_ref: ~typing.Any = None, client: ~typing.Any = None, inference_fn: ~typing.Callable = <function _embed_documents>, hardware: ~typing.Any = None, model_load_fn: ~typing.Callable = <function load_embedding_model>, load_fn_kwargs: ~typing.Optional[dict] = None, model_reqs: ~typing.List[str] = ['./', 'sentence_transformers', 'torch'], inference_kwargs: ~typing.Any = None, model_id: str = 'sentence-transformers/all-mpnet-base-v2')[source]¶
Bases: SelfHostedEmbeddings
Runs sentence_transformers embedding models on self-hosted remote hardware.
Supported hardware includes auto-launched instances on AWS, GCP, Azure,
and Lambda, as well as servers specified
by IP address and SSH credentials (such as on-prem, or another cloud
like Paperspace, Coreweave, etc.).
To use, you should have the runhouse python package installed.
Example
from langchain.embeddings import SelfHostedHuggingFaceEmbeddings
import runhouse as rh
model_name = "sentence-transformers/all-mpnet-base-v2"
gpu = rh.cluster(name="rh-a10x", instance_type="A100:1") | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceEmbeddings.html |
977de7b0dc05-1 | gpu = rh.cluster(name="rh-a10x", instance_type="A100:1")
hf = SelfHostedHuggingFaceEmbeddings(model_name=model_name, hardware=gpu)
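A brief usage sketch continuing the example above; the document strings are illustrative, and all-mpnet-base-v2 typically returns 768-dimensional vectors:
texts = ["LangChain composes LLM calls into chains.", "Runhouse ships Python functions to remote GPUs."]  # illustrative inputs
doc_vectors = hf.embed_documents(texts)  # one embedding per input text
query_vector = hf.embed_query("What does LangChain do?")
print(len(doc_vectors), len(query_vector))  # 2 document vectors, one query vector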
Initialize the remote inference function.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param hardware: Any = None¶
Remote hardware to send the inference function to.
param inference_fn: Callable = <function _embed_documents>¶
Inference function to extract the embeddings.
param inference_kwargs: Any = None¶
Any kwargs to pass to the model’s inference function.
param load_fn_kwargs: Optional[dict] = None¶
Key word arguments to pass to the model load function.
param model_id: str = 'sentence-transformers/all-mpnet-base-v2'¶
Model name to use.
param model_load_fn: Callable = <function load_embedding_model>¶
Function to load the model remotely on the server.
param model_reqs: List[str] = ['./', 'sentence_transformers', 'torch']¶
Requirements to install on the hardware in order to run inference with the model.
param pipeline_ref: Any = None¶
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input. | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceEmbeddings.html |
977de7b0dc05-2 | Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
embed_documents(texts: List[str]) → List[List[float]]¶
Compute doc embeddings using a HuggingFace transformer model.
Parameters
texts – The list of texts to embed.
Returns
List of embeddings, one for each text.
embed_query(text: str) → List[float]¶
Compute query embeddings using a HuggingFace transformer model.
Parameters
text – The text to embed.
Returns
Embeddings for the text.
classmethod from_pipeline(pipeline: Any, hardware: Any, model_reqs: Optional[List[str]] = None, device: int = 0, **kwargs: Any) → LLM¶
Init the SelfHostedPipeline from a pipeline object or string. | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceEmbeddings.html |
977de7b0dc05-3 | Init the SelfHostedPipeline from a pipeline object or string.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the messages.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting. | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceEmbeddings.html |
977de7b0dc05-4 | This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
e.g. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
e.g. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceEmbeddings.html |
4c0b428d37d2-0 | langchain.embeddings.google_palm.embed_with_retry¶
langchain.embeddings.google_palm.embed_with_retry(embeddings: GooglePalmEmbeddings, *args: Any, **kwargs: Any) → Any[source]¶
Use tenacity to retry the completion call. | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.google_palm.embed_with_retry.html |
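embed_with_retry wraps the underlying Google PaLM call in a tenacity retry policy. A rough sketch of the same pattern, with illustrative retry parameters rather than the library's actual configuration:
from tenacity import retry, stop_after_attempt, wait_exponential

@retry(stop=stop_after_attempt(6), wait=wait_exponential(multiplier=1, min=1, max=10))
def _call_with_retry(fn, *args, **kwargs):
    # Re-invoke the embedding call until it succeeds or the retry budget is exhausted.
    return fn(*args, **kwargs)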
fa0e049fe27f-0 | langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceInstructEmbeddings¶
class langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceInstructEmbeddings(*, cache: ~typing.Optional[bool] = None, verbose: bool = None, callbacks: ~typing.Optional[~typing.Union[~typing.List[~langchain.callbacks.base.BaseCallbackHandler], ~langchain.callbacks.base.BaseCallbackManager]] = None, callback_manager: ~typing.Optional[~langchain.callbacks.base.BaseCallbackManager] = None, tags: ~typing.Optional[~typing.List[str]] = None, pipeline_ref: ~typing.Any = None, client: ~typing.Any = None, inference_fn: ~typing.Callable = <function _embed_documents>, hardware: ~typing.Any = None, model_load_fn: ~typing.Callable = <function load_embedding_model>, load_fn_kwargs: ~typing.Optional[dict] = None, model_reqs: ~typing.List[str] = ['./', 'InstructorEmbedding', 'torch'], inference_kwargs: ~typing.Any = None, model_id: str = 'hkunlp/instructor-large', embed_instruction: str = 'Represent the document for retrieval: ', query_instruction: str = 'Represent the question for retrieving supporting documents: ')[source]¶
Bases: SelfHostedHuggingFaceEmbeddings
Runs InstructorEmbedding embedding models on self-hosted remote hardware.
Supported hardware includes auto-launched instances on AWS, GCP, Azure,
and Lambda, as well as servers specified
by IP address and SSH credentials (such as on-prem, or another
cloud like Paperspace, Coreweave, etc.).
To use, you should have the runhouse python package installed.
Example
from langchain.embeddings import SelfHostedHuggingFaceInstructEmbeddings
import runhouse as rh | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceInstructEmbeddings.html |
fa0e049fe27f-1 | import runhouse as rh
model_name = "hkunlp/instructor-large"
gpu = rh.cluster(name='rh-a10x', instance_type='A100:1')
hf = SelfHostedHuggingFaceInstructEmbeddings(
model_name=model_name, hardware=gpu)
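Because this subclass prepends an instruction to every input, the default instructions can be overridden at construction time; the instruction strings below are illustrative only:
hf_custom = SelfHostedHuggingFaceInstructEmbeddings(
    model_name=model_name,
    hardware=gpu,
    embed_instruction="Represent the scientific paragraph for retrieval: ",
    query_instruction="Represent the scientific question for retrieving passages: ",
)
vectors = hf_custom.embed_documents(["Instructor models condition embeddings on a task instruction."])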
Initialize the remote inference function.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param embed_instruction: str = 'Represent the document for retrieval: '¶
Instruction to use for embedding documents.
param hardware: Any = None¶
Remote hardware to send the inference function to.
param inference_fn: Callable = <function _embed_documents>¶
Inference function to extract the embeddings.
param inference_kwargs: Any = None¶
Any kwargs to pass to the model’s inference function.
param load_fn_kwargs: Optional[dict] = None¶
Key word arguments to pass to the model load function.
param model_id: str = 'hkunlp/instructor-large'¶
Model name to use.
param model_load_fn: Callable = <function load_embedding_model>¶
Function to load the model remotely on the server.
param model_reqs: List[str] = ['./', 'InstructorEmbedding', 'torch']¶
Requirements to install on the hardware in order to run inference with the model.
param pipeline_ref: Any = None¶
param query_instruction: str = 'Represent the question for retrieving supporting documents: '¶
Instruction to use for embedding query.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param verbose: bool [Optional]¶
Whether to print out response text. | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceInstructEmbeddings.html |
fa0e049fe27f-2 | param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
embed_documents(texts: List[str]) → List[List[float]][source]¶
Compute doc embeddings using a HuggingFace instruct model.
Parameters
texts – The list of texts to embed.
Returns
List of embeddings, one for each text.
embed_query(text: str) → List[float][source]¶
Compute query embeddings using a HuggingFace instruct model.
Parameters
text – The text to embed.
Returns | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceInstructEmbeddings.html |
fa0e049fe27f-3 | Parameters
text – The text to embed.
Returns
Embeddings for the text.
classmethod from_pipeline(pipeline: Any, hardware: Any, model_reqs: Optional[List[str]] = None, device: int = 0, **kwargs: Any) → LLM¶
Init the SelfHostedPipeline from a pipeline object or string.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the messages.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to. | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceInstructEmbeddings.html |
fa0e049fe27f-4 | Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
e.g. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
e.g. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceInstructEmbeddings.html |
03ea109820de-0 | langchain.embeddings.openai.OpenAIEmbeddings¶
class langchain.embeddings.openai.OpenAIEmbeddings(*, client: Any = None, model: str = 'text-embedding-ada-002', deployment: str = 'text-embedding-ada-002', openai_api_version: Optional[str] = None, openai_api_base: Optional[str] = None, openai_api_type: Optional[str] = None, openai_proxy: Optional[str] = None, embedding_ctx_length: int = 8191, openai_api_key: Optional[str] = None, openai_organization: Optional[str] = None, allowed_special: Union[Literal['all'], Set[str]] = {}, disallowed_special: Union[Literal['all'], Set[str], Sequence[str]] = 'all', chunk_size: int = 1000, max_retries: int = 6, request_timeout: Optional[Union[float, Tuple[float, float]]] = None, headers: Any = None, tiktoken_model_name: Optional[str] = None)[source]¶
Bases: BaseModel, Embeddings
Wrapper around OpenAI embedding models.
To use, you should have the openai python package installed, and the
environment variable OPENAI_API_KEY set with your API key or pass it
as a named parameter to the constructor.
Example
from langchain.embeddings import OpenAIEmbeddings
openai = OpenAIEmbeddings(openai_api_key="my-api-key")
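A short usage sketch for the plain OpenAI case; the inputs are illustrative, and text-embedding-ada-002 returns 1536-dimensional vectors:
doc_vectors = openai.embed_documents(["first document", "second document"])  # one vector per text
query_vector = openai.embed_query("a short search query")
print(len(doc_vectors), len(query_vector))  # 2 documents, 1536 dimensions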
In order to use the library with Microsoft Azure endpoints, you need to set
the OPENAI_API_TYPE, OPENAI_API_BASE, OPENAI_API_KEY and OPENAI_API_VERSION.
The OPENAI_API_TYPE must be set to 'azure' and the others correspond to
the properties of your endpoint.
In addition, the deployment name must be passed as the model parameter.
Example
import os | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.openai.OpenAIEmbeddings.html |
03ea109820de-1 | In addition, the deployment name must be passed as the model parameter.
Example
import os
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "https://<your-endpoint.openai.azure.com/"
os.environ["OPENAI_API_KEY"] = "your AzureOpenAI key"
os.environ["OPENAI_API_VERSION"] = "2023-03-15-preview"
os.environ["OPENAI_PROXY"] = "http://your-corporate-proxy:8080"
from langchain.embeddings.openai import OpenAIEmbeddings
embeddings = OpenAIEmbeddings(
deployment="your-embeddings-deployment-name",
model="your-embeddings-model-name",
openai_api_base="https://your-endpoint.openai.azure.com/",
openai_api_type="azure",
)
text = "This is a test query."
query_result = embeddings.embed_query(text)
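embed_documents also accepts a per-call chunk_size that overrides the instance-level default of 1000; the batch size of 16 below is arbitrary and chosen only for illustration:
doc_results = embeddings.embed_documents(
    ["doc one", "doc two", "doc three"],
    chunk_size=16,  # send requests to the endpoint in batches of 16 texts
)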
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param allowed_special: Union[Literal['all'], Set[str]] = {}¶
param chunk_size: int = 1000¶
Maximum number of texts to embed in each batch.
param deployment: str = 'text-embedding-ada-002'¶
param disallowed_special: Union[Literal['all'], Set[str], Sequence[str]] = 'all'¶
param embedding_ctx_length: int = 8191¶
param headers: Any = None¶
param max_retries: int = 6¶
Maximum number of retries to make when generating.
param model: str = 'text-embedding-ada-002'¶
param openai_api_base: Optional[str] = None¶
param openai_api_key: Optional[str] = None¶ | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.openai.OpenAIEmbeddings.html |
03ea109820de-2 | param openai_api_key: Optional[str] = None¶
param openai_api_type: Optional[str] = None¶
param openai_api_version: Optional[str] = None¶
param openai_organization: Optional[str] = None¶
param openai_proxy: Optional[str] = None¶
param request_timeout: Optional[Union[float, Tuple[float, float]]] = None¶
Timeout in seconds for the OpenAI API request.
param tiktoken_model_name: Optional[str] = None¶
The model name to pass to tiktoken when using this class.
Tiktoken is used to count the number of tokens in documents to constrain
them to be under a certain limit. By default, when set to None, this will
be the same as the embedding model name. However, there are some cases
where you may want to use this Embedding class with a model name not
supported by tiktoken. This can include when using Azure embeddings or
when using one of the many model providers that expose an OpenAI-like
API but with different models. In those cases, in order to avoid erroring
when tiktoken is called, you can specify a model name to use here.
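For example, when pointing this class at an OpenAI-compatible endpoint whose model name tiktoken does not recognize, a known model name can be supplied for token counting; the endpoint and model name below are hypothetical:
compat_embeddings = OpenAIEmbeddings(
    model="provider-embedding-model",                       # hypothetical model name
    openai_api_base="https://example-provider.invalid/v1",  # hypothetical endpoint
    openai_api_key="my-api-key",
    tiktoken_model_name="text-embedding-ada-002",           # count tokens as if using ada-002
)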
async aembed_documents(texts: List[str], chunk_size: Optional[int] = 0) → List[List[float]][source]¶
Call out to OpenAI’s embedding endpoint async for embedding search docs.
Parameters
texts – The list of texts to embed.
chunk_size – The chunk size of embeddings. If None, will use the chunk size
specified by the class.
Returns
List of embeddings, one for each text.
async aembed_query(text: str) → List[float][source]¶
Call out to OpenAI’s embedding endpoint async for embedding query text.
Parameters
text – The text to embed.
Returns
Embedding for the text. | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.openai.OpenAIEmbeddings.html |
03ea109820de-3 | Parameters
text – The text to embed.
Returns
Embedding for the text.
embed_documents(texts: List[str], chunk_size: Optional[int] = 0) → List[List[float]][source]¶
Call out to OpenAI’s embedding endpoint for embedding search docs.
Parameters
texts – The list of texts to embed.
chunk_size – The chunk size of embeddings. If None, will use the chunk size
specified by the class.
Returns
List of embeddings, one for each text.
embed_query(text: str) → List[float][source]¶
Call out to OpenAI’s embedding endpoint for embedding query text.
Parameters
text – The text to embed.
Returns
Embedding for the text.
validator validate_environment » all fields[source]¶
Validate that the API key and python package exist in the environment.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.openai.OpenAIEmbeddings.html |
c4f8f1ffe4e8-0 | langchain.embeddings.modelscope_hub.ModelScopeEmbeddings¶
class langchain.embeddings.modelscope_hub.ModelScopeEmbeddings(*, embed: Any = None, model_id: str = 'damo/nlp_corom_sentence-embedding_english-base')[source]¶
Bases: BaseModel, Embeddings
Wrapper around modelscope_hub embedding models.
To use, you should have the modelscope python package installed.
Example
from langchain.embeddings import ModelScopeEmbeddings
model_id = "damo/nlp_corom_sentence-embedding_english-base"
embed = ModelScopeEmbeddings(model_id=model_id)
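Continuing the example, a minimal usage sketch with illustrative inputs:
doc_vectors = embed.embed_documents(["ModelScope hosts sentence embedding models."])
query_vector = embed.embed_query("Which hub serves this embedding model?")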
Initialize the modelscope.
param embed: Any = None¶
param model_id: str = 'damo/nlp_corom_sentence-embedding_english-base'¶
Model name to use.
embed_documents(texts: List[str]) → List[List[float]][source]¶
Compute doc embeddings using a modelscope embedding model.
Parameters
texts – The list of texts to embed.
Returns
List of embeddings, one for each text.
embed_query(text: str) → List[float][source]¶
Compute query embeddings using a modelscope embedding model.
Parameters
text – The text to embed.
Returns
Embeddings for the text.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.modelscope_hub.ModelScopeEmbeddings.html |
a6c99a93da9a-0 | langchain.embeddings.huggingface_hub.HuggingFaceHubEmbeddings¶
class langchain.embeddings.huggingface_hub.HuggingFaceHubEmbeddings(*, client: Any = None, repo_id: str = 'sentence-transformers/all-mpnet-base-v2', task: Optional[str] = 'feature-extraction', model_kwargs: Optional[dict] = None, huggingfacehub_api_token: Optional[str] = None)[source]¶
Bases: BaseModel, Embeddings
Wrapper around HuggingFaceHub embedding models.
To use, you should have the huggingface_hub python package installed, and the
environment variable HUGGINGFACEHUB_API_TOKEN set with your API token, or pass
it as a named parameter to the constructor.
Example
from langchain.embeddings import HuggingFaceHubEmbeddings
repo_id = "sentence-transformers/all-mpnet-base-v2"
hf = HuggingFaceHubEmbeddings(
repo_id=repo_id,
task="feature-extraction",
huggingfacehub_api_token="my-api-key",
)
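A minimal usage sketch, assuming the token above is valid; the inputs are illustrative:
doc_vectors = hf.embed_documents(["Hosted inference keeps the model off your machine."])
query_vector = hf.embed_query("Where does the model run?")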
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param huggingfacehub_api_token: Optional[str] = None¶
param model_kwargs: Optional[dict] = None¶
Key word arguments to pass to the model.
param repo_id: str = 'sentence-transformers/all-mpnet-base-v2'¶
Model name to use.
param task: Optional[str] = 'feature-extraction'¶
Task to call the model with.
embed_documents(texts: List[str]) → List[List[float]][source]¶
Call out to HuggingFaceHub’s embedding endpoint for embedding search docs.
Parameters
texts – The list of texts to embed.
Returns | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.huggingface_hub.HuggingFaceHubEmbeddings.html |
a6c99a93da9a-1 | Parameters
texts – The list of texts to embed.
Returns
List of embeddings, one for each text.
embed_query(text: str) → List[float][source]¶
Call out to HuggingFaceHub’s embedding endpoint for embedding query text.
Parameters
text – The text to embed.
Returns
Embeddings for the text.
validator validate_environment » all fields[source]¶
Validate that the API key and python package exist in the environment.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [embedding sequence omitted] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.huggingface_hub.HuggingFaceHubEmbeddings.html |
5b8d0756d5b6-0 | langchain.embeddings.dashscope.embed_with_retry¶
langchain.embeddings.dashscope.embed_with_retry(embeddings: DashScopeEmbeddings, **kwargs: Any) → Any[source]¶
Use tenacity to retry the embedding call. | [
5317,
8995,
41541,
25624,
962,
1003,
4280,
41541,
6753,
63845,
55609,
198,
5317,
8995,
41541,
25624,
962,
1003,
4280,
41541,
6753,
63845,
50825,
25624,
25,
37770,
11037,
26566,
25624,
11,
3146,
9872,
25,
5884,
8,
11651,
5884,
76747,
60,
55609,
198,
10464,
5899,
4107,
311,
23515,
279,
40188,
1650,
13
] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.dashscope.embed_with_retry.html |
4c1336b31a6f-0 | langchain.embeddings.tensorflow_hub.TensorflowHubEmbeddings¶
class langchain.embeddings.tensorflow_hub.TensorflowHubEmbeddings(*, embed: Any = None, model_url: str = 'https://tfhub.dev/google/universal-sentence-encoder-multilingual/3')[source]¶
Bases: BaseModel, Embeddings
Wrapper around tensorflow_hub embedding models.
To use, you should have the tensorflow_text python package installed.
Example
from langchain.embeddings import TensorflowHubEmbeddings
url = "https://tfhub.dev/google/universal-sentence-encoder-multilingual/3"
tf = TensorflowHubEmbeddings(model_url=url)
Initialize the tensorflow_hub and tensorflow_text.
param model_url: str = 'https://tfhub.dev/google/universal-sentence-encoder-multilingual/3'¶
Model name to use.
embed_documents(texts: List[str]) → List[List[float]][source]¶
Compute doc embeddings using a TensorflowHub embedding model.
Parameters
texts – The list of texts to embed.
Returns
List of embeddings, one for each text.
embed_query(text: str) → List[float][source]¶
Compute query embeddings using a TensorflowHub embedding model.
Parameters
text – The text to embed.
Returns
Embeddings for the text.
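A brief usage sketch, assuming tensorflow and tensorflow_text are installed; the input texts are placeholders:
from langchain.embeddings import TensorflowHubEmbeddings
# The default model_url shown above is used when none is given.
tf_embed = TensorflowHubEmbeddings()
doc_vectors = tf_embed.embed_documents(["first document", "second document"])
query_vector = tf_embed.embed_query("a search query")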
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [
5317,
8995,
41541,
25624,
41688,
5072,
95096,
45303,
5072,
19876,
26566,
25624,
55609,
198,
1058,
8859,
8995,
41541,
25624,
41688,
5072,
95096,
45303,
5072,
19876,
26566,
25624,
4163,
11,
11840,
25,
5884,
284,
2290,
11,
1646,
2975,
25,
610,
284,
364,
2485,
1129,
9112,
27780,
22247,
41789,
36317,
35052,
1355,
18886,
12,
28106,
98027,
50923,
14,
18,
13588,
2484,
60,
55609,
198,
33,
2315,
25,
65705,
11,
38168,
25624,
198,
11803,
2212,
29187,
95096,
40188,
4211,
627,
1271,
1005,
11,
499,
1288,
617,
279,
29187,
4424,
10344,
6462,
10487,
627,
13617,
198,
1527,
8859,
8995,
41541,
25624,
1179,
27127,
5072,
19876,
26566,
25624,
198,
1103,
284,
330,
2485,
1129,
9112,
27780,
22247,
41789,
36317,
35052,
1355,
18886,
12,
28106,
98027,
50923,
14,
18,
702,
9112,
284,
27127,
5072,
19876,
26566,
25624,
7790,
2975,
65663,
340,
10130,
279,
29187,
95096,
323,
29187,
4424,
627,
913,
1646,
2975,
25,
610,
284,
364,
2485,
1129,
9112,
27780,
22247,
41789,
36317,
35052,
1355,
18886,
12,
28106,
98027,
50923,
14,
18,
6,
55609,
198,
1747,
836,
311,
1005,
627,
12529,
77027,
7383,
82,
25,
1796,
17752,
2526,
11651,
1796,
53094,
96481,
28819,
2484,
60,
55609,
198,
47354,
4733,
71647,
1701,
264,
27127,
5072,
19876,
40188,
1646,
627,
9905,
198,
87042,
1389,
578,
1160,
315,
22755,
311,
11840,
627,
16851,
198,
861,
315,
71647,
11,
832,
369,
1855,
1495,
627,
12529,
5857,
7383,
25,
610,
8,
11651,
1796,
96481,
1483,
2484,
60,
55609,
198,
47354,
3319,
71647,
1701,
264,
27127,
5072,
19876,
40188,
1646,
627,
9905,
198,
1342,
1389,
578,
1495,
311,
11840,
627,
16851,
198,
26566,
25624,
369,
279,
1495,
627,
2590,
5649,
76747,
60,
55609,
198,
33,
2315,
25,
1665,
198,
7843,
369,
420,
4611,
67,
8322,
1665,
627,
15824,
284,
364,
2000,
21301,
6,
55609
] | https://langchain.readthedocs.io/en/latest/embeddings/langchain.embeddings.tensorflow_hub.TensorflowHubEmbeddings.html |
185e97d70035-0 | langchain.memory.chat_message_histories.sql.create_message_model¶
langchain.memory.chat_message_histories.sql.create_message_model(table_name, DynamicBase)[source]¶
Create a message model for a given table name.
:param table_name: The name of the table to use.
:param DynamicBase: The base class to use for the model.
Returns
The model class. | [
5317,
8995,
37711,
27215,
6598,
37499,
2490,
10251,
2581,
6598,
5156,
55609,
198,
5317,
8995,
37711,
27215,
6598,
37499,
2490,
10251,
2581,
6598,
5156,
16138,
1292,
11,
22648,
4066,
6758,
2484,
60,
55609,
198,
4110,
264,
1984,
1646,
369,
264,
2728,
2007,
836,
627,
68416,
2007,
1292,
25,
578,
836,
315,
279,
2007,
311,
1005,
627,
68416,
22648,
4066,
25,
578,
2385,
538,
311,
1005,
369,
279,
1646,
627,
16851,
198,
791,
1646,
538,
13
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.sql.create_message_model.html |
40ef31a0505e-0 | langchain.memory.chat_message_histories.dynamodb.DynamoDBChatMessageHistory¶
class langchain.memory.chat_message_histories.dynamodb.DynamoDBChatMessageHistory(table_name: str, session_id: str, endpoint_url: Optional[str] = None)[source]¶
Bases: BaseChatMessageHistory
Chat message history that stores history in AWS DynamoDB.
This class expects that a DynamoDB table named table_name,
with a partition key of SessionId, is present.
Parameters
table_name – name of the DynamoDB table
session_id – arbitrary key that is used to store the messages
of a single chat session.
endpoint_url – URL of the AWS endpoint to connect to. This argument
is optional and useful for test purposes, like using Localstack.
If you plan to use AWS cloud service, you normally don’t have to
worry about setting the endpoint_url.
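A minimal usage sketch of the class described above, assuming configured AWS credentials and an existing table; the table and session names are placeholders:
from langchain.memory.chat_message_histories.dynamodb import DynamoDBChatMessageHistory
# Assumes a table named "SessionTable" already exists with a SessionId partition key.
history = DynamoDBChatMessageHistory(
    table_name="SessionTable",
    session_id="user-123",
)
history.add_user_message("hi!")
history.add_ai_message("hello, how can I help you?")
print(history.messages)  # messages retrieved from DynamoDB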
Methods
__init__(table_name, session_id[, endpoint_url])
add_ai_message(message)
Add an AI message to the store
add_message(message)
Append the message to the record in DynamoDB
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from DynamoDB
Attributes
messages
Retrieve the messages from DynamoDB
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in DynamoDB
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from DynamoDB
property messages: List[langchain.schema.BaseMessage]¶
Retrieve the messages from DynamoDB | [
5317,
8995,
37711,
27215,
6598,
37499,
2490,
962,
84448,
920,
86708,
3590,
16047,
2097,
13730,
55609,
198,
1058,
8859,
8995,
37711,
27215,
6598,
37499,
2490,
962,
84448,
920,
86708,
3590,
16047,
2097,
13730,
16138,
1292,
25,
610,
11,
3882,
851,
25,
610,
11,
15233,
2975,
25,
12536,
17752,
60,
284,
2290,
6758,
2484,
60,
55609,
198,
33,
2315,
25,
5464,
16047,
2097,
13730,
198,
16047,
1984,
3925,
430,
10756,
3925,
304,
24124,
72913,
3590,
627,
2028,
538,
25283,
430,
264,
72913,
3590,
2007,
449,
836,
2007,
1292,
198,
438,
264,
17071,
5422,
315,
9343,
769,
374,
3118,
627,
9905,
198,
2048,
1292,
1389,
836,
315,
279,
72913,
3590,
2007,
198,
6045,
851,
1389,
25142,
1401,
430,
374,
1511,
311,
3637,
279,
6743,
198,
1073,
264,
3254,
6369,
3882,
627,
33640,
2975,
1389,
5665,
315,
279,
24124,
15233,
311,
4667,
311,
13,
1115,
5811,
198,
285,
10309,
323,
5505,
369,
1296,
10096,
11,
1093,
1701,
8949,
7848,
627,
2746,
499,
3197,
311,
1005,
24124,
9624,
2532,
11,
499,
14614,
1541,
1431,
617,
311,
198,
86,
8635,
922,
6376,
279,
15233,
2975,
627,
18337,
198,
565,
2381,
3889,
2048,
1292,
11,
4194,
6045,
851,
38372,
4194,
33640,
2975,
2608,
723,
70515,
6598,
7483,
340,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
340,
24819,
279,
1984,
311,
279,
3335,
304,
72913,
3590,
198,
723,
3398,
6598,
7483,
340,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
746,
14335,
3882,
5044,
505,
72913,
3590,
198,
10738,
198,
16727,
198,
88765,
279,
6743,
505,
72913,
3590,
198,
723,
70515,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
25,
5464,
2097,
8,
11651,
2290,
76747,
60,
55609,
198,
24819,
279,
1984,
311,
279,
3335,
304,
72913,
3590,
198,
723,
3398,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
368,
11651,
2290,
76747,
60,
55609,
198,
14335,
3882,
5044,
505,
72913,
3590,
198,
3784,
6743,
25,
1796,
58,
5317,
8995,
31992,
13316,
2097,
60,
55609,
198,
88765,
279,
6743,
505,
72913,
3590
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.dynamodb.DynamoDBChatMessageHistory.html |
a73d76acbf92-0 | langchain.memory.entity.ConversationEntityMemory¶ | [
5317,
8995,
37711,
9926,
4906,
23124,
3106,
10869,
55609
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.ConversationEntityMemory.html |
a73d76acbf92-1 | class langchain.memory.entity.ConversationEntityMemory(*, chat_memory: BaseChatMessageHistory = None, output_key: Optional[str] = None, input_key: Optional[str] = None, return_messages: bool = False, human_prefix: str = 'Human', ai_prefix: str = 'AI', llm: BaseLanguageModel, entity_extraction_prompt: BasePromptTemplate = PromptTemplate(input_variables=['history', 'input'], output_parser=None, partial_variables={}, template='You are an AI assistant reading the transcript of a conversation between an AI and a human. Extract all of the proper nouns from the last line of conversation. As a guideline, a proper noun is generally capitalized. You should definitely extract all names and places.\n\nThe conversation history is provided just in case of a coreference (e.g. "What do you know about him" where "him" is defined in a previous line) -- ignore items mentioned there that are not in the last line.\n\nReturn the output as a single comma-separated list, or NONE if there is nothing of note to return (e.g. the user is just issuing a greeting or having a simple conversation).\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff.\nOutput: Langchain\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson | [
1058,
8859,
8995,
37711,
9926,
4906,
23124,
3106,
10869,
4163,
11,
6369,
19745,
25,
5464,
16047,
2097,
13730,
284,
2290,
11,
2612,
3173,
25,
12536,
17752,
60,
284,
2290,
11,
1988,
3173,
25,
12536,
17752,
60,
284,
2290,
11,
471,
24321,
25,
1845,
284,
3641,
11,
3823,
14301,
25,
610,
284,
364,
35075,
518,
16796,
14301,
25,
610,
284,
364,
15836,
518,
9507,
76,
25,
5464,
14126,
1747,
11,
5502,
95942,
62521,
25,
5464,
55715,
7423,
284,
60601,
7423,
5498,
29282,
14314,
19375,
518,
364,
1379,
4181,
2612,
19024,
5980,
11,
7276,
29282,
68525,
3896,
1151,
2675,
527,
459,
15592,
18328,
5403,
279,
36815,
315,
264,
10652,
1990,
459,
15592,
323,
264,
3823,
13,
23673,
682,
315,
279,
6300,
90938,
505,
279,
1566,
1584,
315,
10652,
13,
1666,
264,
73545,
11,
264,
6300,
38021,
374,
8965,
98421,
13,
1472,
1288,
8659,
8819,
682,
5144,
323,
7634,
7255,
77,
1734,
791,
10652,
3925,
374,
3984,
1120,
304,
1162,
315,
264,
6332,
2251,
320,
68,
1326,
13,
330,
3923,
656,
499,
1440,
922,
1461,
1,
1405,
330,
40617,
1,
374,
4613,
304,
264,
3766,
1584,
8,
1198,
10240,
3673,
9932,
1070,
430,
527,
539,
304,
279,
1566,
1584,
7255,
77,
1734,
5715,
279,
2612,
439,
264,
3254,
32783,
73792,
1160,
11,
477,
43969,
422,
1070,
374,
4400,
315,
5296,
311,
471,
320,
68,
1326,
13,
279,
1217,
374,
1120,
43221,
264,
43213,
477,
3515,
264,
4382,
10652,
73441,
77,
1734,
96975,
1734,
61413,
3925,
7338,
77,
10909,
674,
16,
25,
1268,
10379,
82,
433,
2133,
3432,
33720,
77,
15836,
25,
330,
2181,
10379,
82,
2133,
2294,
0,
2650,
922,
499,
7673,
59,
77,
10909,
674,
16,
25,
1695,
0,
13326,
3318,
389,
23272,
8995,
13,
10283,
311,
656,
7255,
77,
15836,
25,
330,
4897,
10578,
1093,
264,
2763,
315,
990,
0,
3639,
3169,
315,
2574,
527,
499,
3815,
311,
1304,
23272,
8995,
2731,
7673,
59,
77,
5966,
1584,
7338,
77,
10909,
674,
16,
25,
602,
10379,
76,
4560,
311,
7417,
23272,
8995,
10379,
82,
25066,
11,
279,
62593,
11,
1202,
8936,
811,
449,
5370,
3956,
279,
1217,
2643,
1390,
2564,
264,
2763,
315,
6392,
7255,
77,
5207,
25,
23272,
8995,
1734,
4794,
3083,
67346,
1734,
1734,
96975,
1734,
61413,
3925,
7338,
77,
10909,
674,
16,
25,
1268,
10379,
82,
433,
2133,
3432,
33720,
77,
15836,
25,
330,
2181,
10379,
82,
2133,
2294,
0,
2650,
922,
499,
7673,
59,
77,
10909
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.ConversationEntityMemory.html |
a73d76acbf92-2 | going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff. I\'m working with Person #2.\nOutput: Langchain, Person #2\nEND OF EXAMPLE\n\nConversation history (for reference only):\n{history}\nLast line of conversation (for extraction):\nHuman: {input}\n\nOutput:', template_format='f-string', validate_template=True), entity_summarization_prompt: BasePromptTemplate = PromptTemplate(input_variables=['entity', 'summary', 'history', 'input'], output_parser=None, partial_variables={}, template='You are an AI assistant helping a human keep track of facts about relevant people, places, and concepts in their life. Update the summary of the provided entity in the "Entity" section based on the last line of your conversation with the human. If you are writing the summary for the first time, return a single sentence.\nThe update should only include facts that are relayed in the last line of conversation about the provided entity, and should only contain facts about the provided entity.\n\nIf there is no new information about the provided entity or the information is not worth noting (not an important or relevant fact to remember long-term), return the existing summary unchanged.\n\nFull conversation history (for context):\n{history}\n\nEntity to summarize:\n{entity}\n\nExisting summary of {entity}:\n{summary}\n\nLast line of conversation:\nHuman: {input}\nUpdated summary:', template_format='f-string', validate_template=True), entity_cache: | [
9738,
3432,
33720,
77,
15836,
25,
330,
2181,
10379,
82,
2133,
2294,
0,
2650,
922,
499,
7673,
59,
77,
10909,
674,
16,
25,
1695,
0,
13326,
3318,
389,
23272,
8995,
13,
10283,
311,
656,
7255,
77,
15836,
25,
330,
4897,
10578,
1093,
264,
2763,
315,
990,
0,
3639,
3169,
315,
2574,
527,
499,
3815,
311,
1304,
23272,
8995,
2731,
7673,
59,
77,
5966,
1584,
7338,
77,
10909,
674,
16,
25,
602,
10379,
76,
4560,
311,
7417,
23272,
8995,
10379,
82,
25066,
11,
279,
62593,
11,
1202,
8936,
811,
449,
5370,
3956,
279,
1217,
2643,
1390,
2564,
264,
2763,
315,
6392,
13,
358,
10379,
76,
3318,
449,
7508,
674,
17,
7255,
77,
5207,
25,
23272,
8995,
11,
7508,
674,
17,
1734,
4794,
3083,
67346,
1734,
1734,
61413,
3925,
320,
2000,
5905,
1193,
90149,
77,
90,
19375,
11281,
77,
5966,
1584,
315,
10652,
320,
2000,
33289,
90149,
77,
35075,
25,
314,
1379,
11281,
77,
1734,
5207,
17898,
3896,
9132,
1151,
69,
31981,
518,
9788,
8864,
3702,
705,
5502,
10370,
5730,
2065,
62521,
25,
5464,
55715,
7423,
284,
60601,
7423,
5498,
29282,
14314,
3069,
518,
364,
1743,
518,
364,
19375,
518,
364,
1379,
4181,
2612,
19024,
5980,
11,
7276,
29282,
68525,
3896,
1151,
2675,
527,
459,
15592,
18328,
10695,
264,
3823,
2567,
3839,
315,
13363,
922,
9959,
1274,
11,
7634,
11,
323,
19476,
304,
872,
2324,
13,
5666,
279,
12399,
315,
279,
3984,
5502,
304,
279,
330,
3106,
1,
3857,
3196,
389,
279,
1566,
1584,
315,
701,
10652,
449,
279,
3823,
13,
1442,
499,
527,
4477,
279,
12399,
369,
279,
1176,
892,
11,
471,
264,
3254,
11914,
7255,
89330,
2713,
1288,
1193,
2997,
13363,
430,
527,
32951,
291,
304,
279,
1566,
1584,
315,
10652,
922,
279,
3984,
5502,
11,
323,
1288,
1193,
6782,
13363,
922,
279,
3984,
5502,
7255,
77,
1734,
2746,
1070,
374,
912,
502,
2038,
922,
279,
3984,
5502,
477,
279,
2038,
374,
539,
5922,
27401,
320,
1962,
459,
3062,
477,
9959,
2144,
311,
6227,
1317,
9860,
705,
471,
279,
6484,
12399,
35957,
7255,
77,
1734,
9619,
10652,
3925,
320,
2000,
2317,
90149,
77,
90,
19375,
11281,
77,
1734,
3106,
311,
63179,
7338,
77,
90,
3069,
11281,
77,
1734,
54167,
12399,
315,
314,
3069,
92,
7338,
77,
90,
1743,
11281,
77,
1734,
5966,
1584,
315,
10652,
7338,
77,
35075,
25,
314,
1379,
11281,
77,
16593,
12399,
17898,
3896,
9132,
1151,
69,
31981,
518,
9788,
8864,
3702,
705,
5502,
11790,
25
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.ConversationEntityMemory.html |
a73d76acbf92-3 | {input}\nUpdated summary:', template_format='f-string', validate_template=True), entity_cache: List[str] = [], k: int = 3, chat_history_key: str = 'history', entity_store: BaseEntityStore = None)[source]¶ | [
90,
1379,
11281,
77,
16593,
12399,
17898,
3896,
9132,
1151,
69,
31981,
518,
9788,
8864,
3702,
705,
5502,
11790,
25,
1796,
17752,
60,
284,
10277,
597,
25,
528,
284,
220,
18,
11,
6369,
20389,
3173,
25,
610,
284,
364,
19375,
518,
5502,
15153,
25,
75182,
6221,
284,
2290,
6758,
2484,
60,
55609
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.ConversationEntityMemory.html |
a73d76acbf92-4 | Bases: BaseChatMemory
Entity extractor & summarizer memory.
Extracts named entities from the recent chat history and generates summaries.
With a swappable entity store, it persists entities across conversations.
Defaults to an in-memory entity store, and can be swapped out for a Redis,
SQLite, or other entity store.
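A minimal usage sketch, assuming an OpenAI API key is available; any other language model could stand in for the llm shown here:
from langchain.llms import OpenAI
from langchain.memory.entity import ConversationEntityMemory
llm = OpenAI(temperature=0)  # assumes OPENAI_API_KEY is set
memory = ConversationEntityMemory(llm=llm)
turn = {"input": "Deven and Sam are working on a hackathon project"}
# load_memory_variables fills the entity cache for this turn ...
memory.load_memory_variables(turn)
# ... and save_context then writes the generated entity summaries to the store.
memory.save_context(turn, {"output": "That sounds like a great project!"})
print(memory.load_memory_variables({"input": "Who is Deven?"}))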
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param chat_history_key: str = 'history'¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param entity_cache: List[str] = []¶ | [
33,
2315,
25,
5464,
16047,
10869,
198,
3106,
68572,
612,
29385,
3213,
5044,
627,
30059,
82,
7086,
15086,
505,
279,
3293,
6369,
3925,
323,
27983,
70022,
627,
2409,
264,
14626,
481,
5502,
3637,
11,
23135,
287,
15086,
4028,
21633,
627,
16672,
311,
459,
304,
65196,
5502,
3637,
11,
323,
649,
387,
58050,
704,
369,
264,
35258,
345,
82872,
11,
477,
1023,
5502,
3637,
627,
4110,
264,
502,
1646,
555,
23115,
323,
69772,
1988,
828,
505,
16570,
6105,
627,
36120,
54129,
422,
279,
1988,
828,
4250,
387,
16051,
311,
1376,
264,
2764,
1646,
627,
913,
16796,
14301,
25,
610,
284,
364,
15836,
6,
55609,
198,
913,
6369,
20389,
3173,
25,
610,
284,
364,
19375,
6,
55609,
198,
913,
6369,
19745,
25,
5464,
16047,
2097,
13730,
510,
15669,
60,
55609,
198,
913,
5502,
11790,
25,
1796,
17752,
60,
284,
3132,
55609
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.ConversationEntityMemory.html |
a73d76acbf92-5 | param entity_extraction_prompt: langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['history', 'input'], output_parser=None, partial_variables={}, template='You are an AI assistant reading the transcript of a conversation between an AI and a human. Extract all of the proper nouns from the last line of conversation. As a guideline, a proper noun is generally capitalized. You should definitely extract all names and places.\n\nThe conversation history is provided just in case of a coreference (e.g. "What do you know about him" where "him" is defined in a previous line) -- ignore items mentioned there that are not in the last line.\n\nReturn the output as a single comma-separated list, or NONE if there is nothing of note to return (e.g. the user is just issuing a greeting or having a simple conversation).\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff.\nOutput: Langchain\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the | [
913,
5502,
95942,
62521,
25,
8859,
8995,
61848,
13044,
9105,
13316,
55715,
7423,
284,
60601,
7423,
5498,
29282,
14314,
19375,
518,
364,
1379,
4181,
2612,
19024,
5980,
11,
7276,
29282,
68525,
3896,
1151,
2675,
527,
459,
15592,
18328,
5403,
279,
36815,
315,
264,
10652,
1990,
459,
15592,
323,
264,
3823,
13,
23673,
682,
315,
279,
6300,
90938,
505,
279,
1566,
1584,
315,
10652,
13,
1666,
264,
73545,
11,
264,
6300,
38021,
374,
8965,
98421,
13,
1472,
1288,
8659,
8819,
682,
5144,
323,
7634,
7255,
77,
1734,
791,
10652,
3925,
374,
3984,
1120,
304,
1162,
315,
264,
6332,
2251,
320,
68,
1326,
13,
330,
3923,
656,
499,
1440,
922,
1461,
1,
1405,
330,
40617,
1,
374,
4613,
304,
264,
3766,
1584,
8,
1198,
10240,
3673,
9932,
1070,
430,
527,
539,
304,
279,
1566,
1584,
7255,
77,
1734,
5715,
279,
2612,
439,
264,
3254,
32783,
73792,
1160,
11,
477,
43969,
422,
1070,
374,
4400,
315,
5296,
311,
471,
320,
68,
1326,
13,
279,
1217,
374,
1120,
43221,
264,
43213,
477,
3515,
264,
4382,
10652,
73441,
77,
1734,
96975,
1734,
61413,
3925,
7338,
77,
10909,
674,
16,
25,
1268,
10379,
82,
433,
2133,
3432,
33720,
77,
15836,
25,
330,
2181,
10379,
82,
2133,
2294,
0,
2650,
922,
499,
7673,
59,
77,
10909,
674,
16,
25,
1695,
0,
13326,
3318,
389,
23272,
8995,
13,
10283,
311,
656,
7255,
77,
15836,
25,
330,
4897,
10578,
1093,
264,
2763,
315,
990,
0,
3639,
3169,
315,
2574,
527,
499,
3815,
311,
1304,
23272,
8995,
2731,
7673,
59,
77,
5966,
1584,
7338,
77,
10909,
674,
16,
25,
602,
10379,
76,
4560,
311,
7417,
23272,
8995,
10379,
82,
25066,
11,
279,
62593,
11,
1202,
8936,
811,
449,
5370,
3956,
279,
1217,
2643,
1390,
2564,
264,
2763,
315,
6392,
7255,
77,
5207,
25,
23272,
8995,
1734,
4794,
3083,
67346,
1734,
1734,
96975,
1734,
61413,
3925,
7338,
77,
10909,
674,
16,
25,
1268,
10379,
82,
433,
2133,
3432,
33720,
77,
15836,
25,
330,
2181,
10379,
82,
2133,
2294,
0,
2650,
922,
499,
7673,
59,
77,
10909,
674,
16,
25,
1695,
0,
13326,
3318,
389,
23272,
8995,
13,
10283,
311,
656,
7255,
77,
15836,
25,
330,
4897,
10578,
1093,
264,
2763,
315,
990,
0,
3639,
3169,
315,
2574,
527,
499,
3815,
311,
1304,
23272,
8995,
2731,
7673,
59,
77,
5966,
1584,
7338,
77,
10909,
674,
16,
25,
602,
10379,
76,
4560,
311,
7417,
23272,
8995,
10379,
82,
25066,
11,
279
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.ConversationEntityMemory.html |
a73d76acbf92-6 | line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff. I\'m working with Person #2.\nOutput: Langchain, Person #2\nEND OF EXAMPLE\n\nConversation history (for reference only):\n{history}\nLast line of conversation (for extraction):\nHuman: {input}\n\nOutput:', template_format='f-string', validate_template=True)¶ | [
1074,
7338,
77,
10909,
674,
16,
25,
602,
10379,
76,
4560,
311,
7417,
23272,
8995,
10379,
82,
25066,
11,
279,
62593,
11,
1202,
8936,
811,
449,
5370,
3956,
279,
1217,
2643,
1390,
2564,
264,
2763,
315,
6392,
13,
358,
10379,
76,
3318,
449,
7508,
674,
17,
7255,
77,
5207,
25,
23272,
8995,
11,
7508,
674,
17,
1734,
4794,
3083,
67346,
1734,
1734,
61413,
3925,
320,
2000,
5905,
1193,
90149,
77,
90,
19375,
11281,
77,
5966,
1584,
315,
10652,
320,
2000,
33289,
90149,
77,
35075,
25,
314,
1379,
11281,
77,
1734,
5207,
17898,
3896,
9132,
1151,
69,
31981,
518,
9788,
8864,
3702,
8,
55609
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.ConversationEntityMemory.html |
a73d76acbf92-7 | param entity_store: langchain.memory.entity.BaseEntityStore [Optional]¶
param entity_summarization_prompt: langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['entity', 'summary', 'history', 'input'], output_parser=None, partial_variables={}, template='You are an AI assistant helping a human keep track of facts about relevant people, places, and concepts in their life. Update the summary of the provided entity in the "Entity" section based on the last line of your conversation with the human. If you are writing the summary for the first time, return a single sentence.\nThe update should only include facts that are relayed in the last line of conversation about the provided entity, and should only contain facts about the provided entity.\n\nIf there is no new information about the provided entity or the information is not worth noting (not an important or relevant fact to remember long-term), return the existing summary unchanged.\n\nFull conversation history (for context):\n{history}\n\nEntity to summarize:\n{entity}\n\nExisting summary of {entity}:\n{summary}\n\nLast line of conversation:\nHuman: {input}\nUpdated summary:', template_format='f-string', validate_template=True)¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param k: int = 3¶
param llm: langchain.base_language.BaseLanguageModel [Required]¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
clear() → None[source]¶
Clear memory contents.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Returns chat history and all generated entities with summaries if available,
and updates or clears the recent entity cache.
New entity names can be found when calling this method, before the entity
913,
5502,
15153,
25,
8859,
8995,
37711,
9926,
13316,
3106,
6221,
510,
15669,
60,
55609,
198,
913,
5502,
10370,
5730,
2065,
62521,
25,
8859,
8995,
61848,
13044,
9105,
13316,
55715,
7423,
284,
60601,
7423,
5498,
29282,
14314,
3069,
518,
364,
1743,
518,
364,
19375,
518,
364,
1379,
4181,
2612,
19024,
5980,
11,
7276,
29282,
68525,
3896,
1151,
2675,
527,
459,
15592,
18328,
10695,
264,
3823,
2567,
3839,
315,
13363,
922,
9959,
1274,
11,
7634,
11,
323,
19476,
304,
872,
2324,
13,
5666,
279,
12399,
315,
279,
3984,
5502,
304,
279,
330,
3106,
1,
3857,
3196,
389,
279,
1566,
1584,
315,
701,
10652,
449,
279,
3823,
13,
1442,
499,
527,
4477,
279,
12399,
369,
279,
1176,
892,
11,
471,
264,
3254,
11914,
7255,
89330,
2713,
1288,
1193,
2997,
13363,
430,
527,
32951,
291,
304,
279,
1566,
1584,
315,
10652,
922,
279,
3984,
5502,
11,
323,
1288,
1193,
6782,
13363,
922,
279,
3984,
5502,
7255,
77,
1734,
2746,
1070,
374,
912,
502,
2038,
922,
279,
3984,
5502,
477,
279,
2038,
374,
539,
5922,
27401,
320,
1962,
459,
3062,
477,
9959,
2144,
311,
6227,
1317,
9860,
705,
471,
279,
6484,
12399,
35957,
7255,
77,
1734,
9619,
10652,
3925,
320,
2000,
2317,
90149,
77,
90,
19375,
11281,
77,
1734,
3106,
311,
63179,
7338,
77,
90,
3069,
11281,
77,
1734,
54167,
12399,
315,
314,
3069,
92,
7338,
77,
90,
1743,
11281,
77,
1734,
5966,
1584,
315,
10652,
7338,
77,
35075,
25,
314,
1379,
11281,
77,
16593,
12399,
17898,
3896,
9132,
1151,
69,
31981,
518,
9788,
8864,
3702,
8,
55609,
198,
913,
3823,
14301,
25,
610,
284,
364,
35075,
6,
55609,
198,
913,
1988,
3173,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
597,
25,
528,
284,
220,
18,
55609,
198,
913,
9507,
76,
25,
8859,
8995,
9105,
30121,
13316,
14126,
1747,
510,
8327,
60,
55609,
198,
913,
2612,
3173,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
471,
24321,
25,
1845,
284,
3641,
55609,
198,
7574,
368,
11651,
2290,
76747,
60,
55609,
198,
14335,
5044,
8970,
627,
1096,
19745,
29282,
35099,
25,
30226,
17752,
11,
5884,
2526,
11651,
30226,
17752,
11,
5884,
1483,
2484,
60,
55609,
198,
16851,
6369,
3925,
323,
682,
8066,
15086,
449,
70022,
422,
2561,
345,
438,
9013,
477,
57698,
279,
3293,
5502,
6636,
627,
3648,
5502,
836,
649,
387,
1766,
994,
8260,
420,
1749,
11,
1603,
279,
5502
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.ConversationEntityMemory.html |
a73d76acbf92-8 | New entity name can be found when calling this method, before the entity
summaries are generated, so the entity cache values may be empty if no entity
descriptions are generated yet.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation history to the entity store.
Generates a summary for each entity in the entity cache by prompting
the model, and saves these summaries to the entity store.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property buffer: List[langchain.schema.BaseMessage]¶
Access chat memory messages.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶ | [
3648,
5502,
836,
649,
387,
1766,
994,
8260,
420,
1749,
11,
1603,
279,
5502,
198,
70644,
5548,
527,
8066,
11,
779,
279,
5502,
6636,
2819,
1253,
387,
4384,
422,
912,
5502,
198,
5919,
25712,
527,
8066,
3686,
627,
6766,
8634,
35099,
25,
30226,
17752,
11,
5884,
1145,
16674,
25,
30226,
17752,
11,
610,
2526,
11651,
2290,
76747,
60,
55609,
198,
8960,
2317,
505,
420,
10652,
3925,
311,
279,
5502,
3637,
627,
5648,
988,
264,
12399,
369,
1855,
5502,
304,
279,
5502,
6636,
555,
50745,
198,
1820,
1646,
11,
323,
27024,
1521,
70022,
311,
279,
5502,
3637,
627,
998,
9643,
368,
11651,
9323,
58,
78621,
13591,
11,
92572,
2688,
18804,
60,
55609,
198,
998,
9643,
8072,
18377,
14565,
368,
11651,
92572,
2688,
18804,
55609,
198,
3784,
4240,
25,
1796,
58,
5317,
8995,
31992,
13316,
2097,
60,
55609,
198,
6182,
6369,
5044,
6743,
627,
3784,
37313,
18741,
25,
30226,
55609,
198,
5715,
264,
1160,
315,
7180,
5144,
430,
1288,
387,
5343,
304,
279,
198,
76377,
16901,
13,
4314,
8365,
2011,
387,
11928,
555,
279,
198,
22602,
627,
3784,
37313,
42671,
25,
1796,
17752,
60,
55609,
198,
5715,
279,
4573,
315,
279,
8859,
8995,
1665,
627,
797,
13,
510,
2118,
5317,
8995,
9520,
1054,
657,
1026,
9520,
1054,
2569,
2192,
863,
933,
3784,
37313,
3537,
53810,
25,
30226,
17752,
11,
610,
60,
55609,
198,
5715,
264,
2472,
315,
4797,
5811,
5144,
311,
6367,
14483,
627,
797,
13,
314,
2118,
2569,
2192,
11959,
3173,
57633,
1054,
32033,
15836,
11669,
6738,
863,
534,
3784,
37313,
26684,
8499,
25,
1845,
55609,
198,
5715,
3508,
477,
539,
279,
538,
374,
6275,
8499,
627,
2590,
5649,
55609,
198,
33,
2315,
25,
1665,
198,
7843,
369,
420,
4611,
67,
8322,
1665,
627,
277,
88951,
9962,
43255,
284,
3082,
55609
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.ConversationEntityMemory.html |
8fbf462210ac-0 | langchain.memory.chat_message_histories.cassandra.CassandraChatMessageHistory¶
class langchain.memory.chat_message_histories.cassandra.CassandraChatMessageHistory(session_id: str, session: Session, keyspace: str, table_name: str = 'message_store', ttl_seconds: int | None = None)[source]¶
Bases: BaseChatMessageHistory
Chat message history that stores history in Cassandra.
Parameters
session_id – arbitrary key that is used to store the messages
of a single chat session.
session – a Cassandra Session object (an open DB connection)
keyspace – name of the keyspace to use.
table_name – name of the table to use.
ttl_seconds – time-to-live (seconds) for automatic expiration
of stored entries. None (default) for no expiration.
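A minimal usage sketch, assuming a reachable Cassandra cluster and an existing keyspace; the contact point and keyspace name are placeholders:
from cassandra.cluster import Cluster
from langchain.memory.chat_message_histories.cassandra import CassandraChatMessageHistory
session = Cluster(["127.0.0.1"]).connect()  # open DB connection (placeholder contact point)
history = CassandraChatMessageHistory(
    session_id="user-123",
    session=session,
    keyspace="chat_keyspace",   # placeholder; the keyspace must already exist
    ttl_seconds=60 * 60 * 24,   # expire stored messages after one day
)
history.add_user_message("hi!")
history.add_ai_message("hello!")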
Methods
__init__(session_id, session, keyspace[, ...])
add_ai_message(message)
Add an AI message to the store
add_message(message)
Write a message to the table
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from DB
Attributes
messages
Retrieve all session messages from DB
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Write a message to the table
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from DB
property messages: List[langchain.schema.BaseMessage]¶
Retrieve all session messages from DB | [
5317,
8995,
37711,
27215,
6598,
37499,
2490,
522,
71193,
732,
71193,
16047,
2097,
13730,
55609,
198,
1058,
8859,
8995,
37711,
27215,
6598,
37499,
2490,
522,
71193,
732,
71193,
16047,
2097,
13730,
16663,
851,
25,
610,
11,
3882,
25,
9343,
11,
7039,
1330,
25,
610,
11,
2007,
1292,
25,
610,
284,
364,
2037,
15153,
518,
55032,
35925,
25,
528,
765,
2290,
284,
2290,
6758,
2484,
60,
55609,
198,
33,
2315,
25,
5464,
16047,
2097,
13730,
198,
16047,
1984,
3925,
430,
10756,
3925,
304,
82342,
627,
9905,
198,
6045,
851,
1389,
25142,
1401,
430,
374,
1511,
311,
3637,
279,
6743,
198,
1073,
264,
3254,
6369,
3882,
627,
6045,
1389,
264,
82342,
9343,
1665,
320,
276,
1825,
6078,
3717,
340,
798,
8920,
1389,
836,
315,
279,
7039,
1330,
311,
1005,
627,
2048,
1292,
1389,
836,
315,
279,
2007,
311,
1005,
627,
63958,
35925,
1389,
892,
4791,
74551,
320,
17859,
8,
369,
17392,
32792,
198,
1073,
9967,
10925,
13,
2290,
320,
2309,
8,
369,
912,
32792,
627,
18337,
198,
565,
2381,
3889,
6045,
851,
11,
4194,
6045,
11,
4194,
798,
8920,
38372,
4194,
1131,
2608,
723,
70515,
6598,
7483,
340,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
340,
8144,
264,
1984,
311,
279,
2007,
198,
723,
3398,
6598,
7483,
340,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
746,
14335,
3882,
5044,
505,
6078,
198,
10738,
198,
16727,
198,
88765,
682,
3882,
6743,
505,
6078,
198,
723,
70515,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
25,
5464,
2097,
8,
11651,
2290,
76747,
60,
55609,
198,
8144,
264,
1984,
311,
279,
2007,
198,
723,
3398,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
368,
11651,
2290,
76747,
60,
55609,
198,
14335,
3882,
5044,
505,
6078,
198,
3784,
6743,
25,
1796,
58,
5317,
8995,
31992,
13316,
2097,
60,
55609,
198,
88765,
682,
3882,
6743,
505,
6078
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.cassandra.CassandraChatMessageHistory.html |
e60fe6360d07-0 | langchain.memory.motorhead_memory.MotorheadMemory¶
class langchain.memory.motorhead_memory.MotorheadMemory(*, chat_memory: BaseChatMessageHistory = None, output_key: Optional[str] = None, input_key: Optional[str] = None, return_messages: bool = False, url: str = 'https://api.getmetal.io/v1/motorhead', session_id: str, context: Optional[str] = None, api_key: Optional[str] = None, client_id: Optional[str] = None, timeout: int = 3000, memory_key: str = 'history')[source]¶
Bases: BaseChatMemory
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param api_key: Optional[str] = None¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param client_id: Optional[str] = None¶
param context: Optional[str] = None¶
param input_key: Optional[str] = None¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
param session_id: str [Required]¶
param url: str = 'https://api.getmetal.io/v1/motorhead'¶
clear() → None¶
Clear memory contents.
delete_session() → None[source]¶
Delete a session
async init() → None[source]¶
load_memory_variables(values: Dict[str, Any]) → Dict[str, Any][source]¶
Return key-value pairs given the text input to the chain.
If None, return all memories
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
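A minimal usage sketch, assuming a reachable Motorhead/Metal service; the API key and client id are placeholders:
import asyncio
from langchain.memory.motorhead_memory import MotorheadMemory
memory = MotorheadMemory(
    session_id="user-123",
    api_key="my-api-key",      # placeholder Metal API key
    client_id="my-client-id",  # placeholder Metal client id
)
asyncio.run(memory.init())  # init() is a coroutine; it loads any existing session context
memory.save_context({"input": "hi"}, {"output": "hello, how can I help you?"})
print(memory.load_memory_variables({}))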
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶ | [
5317,
8995,
37711,
749,
10088,
2025,
19745,
1345,
10088,
2025,
10869,
55609,
198,
1058,
8859,
8995,
37711,
749,
10088,
2025,
19745,
1345,
10088,
2025,
10869,
4163,
11,
6369,
19745,
25,
5464,
16047,
2097,
13730,
284,
2290,
11,
2612,
3173,
25,
12536,
17752,
60,
284,
2290,
11,
1988,
3173,
25,
12536,
17752,
60,
284,
2290,
11,
471,
24321,
25,
1845,
284,
3641,
11,
2576,
25,
610,
284,
364,
2485,
1129,
2113,
673,
55108,
4340,
5574,
16,
3262,
10088,
2025,
518,
3882,
851,
25,
610,
11,
2317,
25,
12536,
17752,
60,
284,
2290,
11,
6464,
3173,
25,
12536,
17752,
60,
284,
2290,
11,
3016,
851,
25,
12536,
17752,
60,
284,
2290,
11,
9829,
25,
528,
284,
220,
3101,
15,
11,
5044,
3173,
25,
610,
284,
364,
19375,
13588,
2484,
60,
55609,
198,
33,
2315,
25,
5464,
16047,
10869,
198,
4110,
264,
502,
1646,
555,
23115,
323,
69772,
1988,
828,
505,
16570,
6105,
627,
36120,
54129,
422,
279,
1988,
828,
4250,
387,
16051,
311,
1376,
264,
2764,
1646,
627,
913,
6464,
3173,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
6369,
19745,
25,
5464,
16047,
2097,
13730,
510,
15669,
60,
55609,
198,
913,
3016,
851,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
2317,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
1988,
3173,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
2612,
3173,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
471,
24321,
25,
1845,
284,
3641,
55609,
198,
913,
3882,
851,
25,
610,
510,
8327,
60,
55609,
198,
913,
2576,
25,
610,
284,
364,
2485,
1129,
2113,
673,
55108,
4340,
5574,
16,
3262,
10088,
2025,
6,
55609,
198,
7574,
368,
11651,
2290,
55609,
198,
14335,
5044,
8970,
627,
4644,
12596,
368,
11651,
2290,
76747,
60,
55609,
198,
6571,
264,
3882,
198,
7847,
3003,
368,
11651,
2290,
76747,
60,
55609,
198,
1096,
19745,
29282,
20706,
25,
30226,
17752,
11,
5884,
2526,
11651,
30226,
17752,
11,
5884,
1483,
2484,
60,
55609,
198,
5715,
1401,
19625,
13840,
2728,
279,
1495,
1988,
311,
279,
8957,
627,
2746,
2290,
11,
471,
682,
19459,
198,
6766,
8634,
35099,
25,
30226,
17752,
11,
5884,
1145,
16674,
25,
30226,
17752,
11,
610,
2526,
11651,
2290,
76747,
60,
55609,
198,
8960,
2317,
505,
420,
10652,
311,
4240,
627,
998,
9643,
368,
11651,
9323,
58,
78621,
13591,
11,
92572,
2688,
18804,
60,
55609
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.motorhead_memory.MotorheadMemory.html |
e60fe6360d07-1 | to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
property memory_variables: List[str]¶
Input keys this memory class will load dynamically.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶ | [
998,
9643,
368,
11651,
9323,
58,
78621,
13591,
11,
92572,
2688,
18804,
60,
55609,
198,
998,
9643,
8072,
18377,
14565,
368,
11651,
92572,
2688,
18804,
55609,
198,
3784,
37313,
18741,
25,
30226,
55609,
198,
5715,
264,
1160,
315,
7180,
5144,
430,
1288,
387,
5343,
304,
279,
198,
76377,
16901,
13,
4314,
8365,
2011,
387,
11928,
555,
279,
198,
22602,
627,
3784,
37313,
42671,
25,
1796,
17752,
60,
55609,
198,
5715,
279,
4573,
315,
279,
8859,
8995,
1665,
627,
797,
13,
510,
2118,
5317,
8995,
9520,
1054,
657,
1026,
9520,
1054,
2569,
2192,
863,
933,
3784,
37313,
3537,
53810,
25,
30226,
17752,
11,
610,
60,
55609,
198,
5715,
264,
2472,
315,
4797,
5811,
5144,
311,
6367,
14483,
627,
797,
13,
314,
2118,
2569,
2192,
11959,
3173,
57633,
1054,
32033,
15836,
11669,
6738,
863,
534,
3784,
37313,
26684,
8499,
25,
1845,
55609,
198,
5715,
3508,
477,
539,
279,
538,
374,
6275,
8499,
627,
3784,
5044,
29282,
25,
1796,
17752,
60,
55609,
198,
2566,
7039,
420,
5044,
538,
690,
2865,
43111,
627,
2590,
5649,
55609,
198,
33,
2315,
25,
1665,
198,
7843,
369,
420,
4611,
67,
8322,
1665,
627,
277,
88951,
9962,
43255,
284,
3082,
55609
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.motorhead_memory.MotorheadMemory.html |
5974bf54aced-0 | langchain.memory.chat_message_histories.cosmos_db.CosmosDBChatMessageHistory¶
class langchain.memory.chat_message_histories.cosmos_db.CosmosDBChatMessageHistory(cosmos_endpoint: str, cosmos_database: str, cosmos_container: str, session_id: str, user_id: str, credential: Any = None, connection_string: Optional[str] = None, ttl: Optional[int] = None, cosmos_client_kwargs: Optional[dict] = None)[source]¶
Bases: BaseChatMessageHistory
Chat history backed by Azure CosmosDB.
Initializes a new instance of the CosmosDBChatMessageHistory class.
Make sure to call prepare_cosmos or use the context manager to make
sure your database is ready.
Either a credential or a connection string must be provided.
Parameters
cosmos_endpoint – The connection endpoint for the Azure Cosmos DB account.
cosmos_database – The name of the database to use.
cosmos_container – The name of the container to use.
session_id – The session ID to use, can be overwritten while loading.
user_id – The user ID to use, can be overwritten while loading.
credential – The credential to use to authenticate to Azure Cosmos DB.
connection_string – The connection string to use to authenticate.
ttl – The time to live (in seconds) to use for documents in the container.
cosmos_client_kwargs – Additional kwargs to pass to the CosmosClient.
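A minimal usage sketch, assuming the azure-cosmos package is installed; the endpoint, database, container, and connection string values are placeholders:
from langchain.memory.chat_message_histories.cosmos_db import CosmosDBChatMessageHistory
history = CosmosDBChatMessageHistory(
    cosmos_endpoint="https://my-account.documents.azure.com:443/",  # placeholder endpoint
    cosmos_database="chat_db",       # placeholder database name
    cosmos_container="messages",     # placeholder container name
    session_id="session-1",
    user_id="user-123",
    connection_string="AccountEndpoint=...;AccountKey=...;",  # placeholder secret
)
history.prepare_cosmos()  # ensure the database and container are ready
history.add_user_message("hi!")
history.add_ai_message("hello!")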
Methods
__init__(cosmos_endpoint, cosmos_database, ...)
Initializes a new instance of the CosmosDBChatMessageHistory class.
add_ai_message(message)
Add an AI message to the store
add_message(message)
Add a self-created message to the store
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from this memory and cosmos.
load_messages()
Retrieve the messages from Cosmos | [
5317,
8995,
37711,
27215,
6598,
37499,
2490,
21832,
8801,
8856,
67355,
8801,
3590,
16047,
2097,
13730,
55609,
198,
1058,
8859,
8995,
37711,
27215,
6598,
37499,
2490,
21832,
8801,
8856,
67355,
8801,
3590,
16047,
2097,
13730,
89930,
8801,
37799,
25,
610,
11,
83645,
28441,
25,
610,
11,
83645,
16226,
25,
610,
11,
3882,
851,
25,
610,
11,
1217,
851,
25,
610,
11,
41307,
25,
5884,
284,
2290,
11,
3717,
3991,
25,
12536,
17752,
60,
284,
2290,
11,
55032,
25,
12536,
19155,
60,
284,
2290,
11,
83645,
8342,
37335,
25,
12536,
58,
8644,
60,
284,
2290,
6758,
2484,
60,
55609,
198,
33,
2315,
25,
5464,
16047,
2097,
13730,
198,
16047,
3925,
22126,
555,
35219,
84524,
3590,
627,
6475,
4861,
264,
502,
2937,
315,
279,
84524,
3590,
16047,
2097,
13730,
538,
627,
8238,
2771,
311,
1650,
10772,
62292,
8801,
477,
1005,
279,
2317,
6783,
311,
1304,
198,
19643,
701,
4729,
374,
5644,
627,
50344,
264,
41307,
477,
264,
3717,
925,
2011,
387,
3984,
627,
9905,
198,
9594,
8801,
37799,
1389,
578,
3717,
15233,
369,
279,
35219,
84524,
6078,
2759,
627,
9594,
8801,
28441,
1389,
578,
836,
315,
279,
4729,
311,
1005,
627,
9594,
8801,
16226,
1389,
578,
836,
315,
279,
5593,
311,
1005,
627,
6045,
851,
1389,
578,
3882,
3110,
311,
1005,
11,
649,
387,
60273,
1418,
8441,
627,
882,
851,
1389,
578,
1217,
3110,
311,
1005,
11,
649,
387,
60273,
1418,
8441,
627,
67899,
1389,
578,
41307,
311,
1005,
311,
34289,
311,
35219,
84524,
6078,
627,
7898,
3991,
1389,
578,
3717,
925,
311,
1005,
311,
34289,
627,
63958,
1389,
578,
892,
311,
3974,
320,
258,
6622,
8,
311,
1005,
369,
9477,
304,
279,
5593,
627,
9594,
8801,
8342,
37335,
1389,
24086,
16901,
311,
1522,
311,
279,
84524,
3032,
627,
18337,
198,
565,
2381,
3889,
9594,
8801,
37799,
11,
4194,
9594,
8801,
28441,
11,
4194,
32318,
6475,
4861,
264,
502,
2937,
315,
279,
84524,
3590,
16047,
2097,
13730,
538,
627,
723,
70515,
6598,
7483,
340,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
340,
2261,
264,
659,
72057,
1984,
311,
279,
3637,
198,
723,
3398,
6598,
7483,
340,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
746,
14335,
3882,
5044,
505,
420,
5044,
323,
83645,
627,
1096,
24321,
746,
88765,
279,
6743,
505,
84524
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.cosmos_db.CosmosDBChatMessageHistory.html |
5974bf54aced-1 | Clear session memory from this memory and cosmos.
load_messages()
Retrieve the messages from Cosmos
prepare_cosmos()
Prepare the CosmosDB client.
upsert_messages()
Update the cosmosdb item.
Attributes
messages
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Add a self-created message to the store
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from this memory and cosmos.
load_messages() → None[source]¶
Retrieve the messages from Cosmos
prepare_cosmos() → None[source]¶
Prepare the CosmosDB client.
Use this function or the context manager to make sure your database is ready.
upsert_messages() → None[source]¶
Update the cosmosdb item.
messages: List[BaseMessage]¶ | [
14335,
3882,
5044,
505,
420,
5044,
323,
83645,
627,
1096,
24321,
746,
88765,
279,
6743,
505,
84524,
198,
13923,
62292,
8801,
746,
51690,
279,
84524,
3590,
3016,
627,
455,
6175,
24321,
746,
4387,
279,
83645,
2042,
1537,
627,
10738,
198,
16727,
198,
723,
70515,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
25,
5464,
2097,
8,
11651,
2290,
76747,
60,
55609,
198,
2261,
264,
659,
72057,
1984,
311,
279,
3637,
198,
723,
3398,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
368,
11651,
2290,
76747,
60,
55609,
198,
14335,
3882,
5044,
505,
420,
5044,
323,
83645,
627,
1096,
24321,
368,
11651,
2290,
76747,
60,
55609,
198,
88765,
279,
6743,
505,
84524,
198,
13923,
62292,
8801,
368,
11651,
2290,
76747,
60,
55609,
198,
51690,
279,
84524,
3590,
3016,
627,
10464,
420,
734,
477,
279,
2317,
6783,
311,
1304,
2771,
701,
4729,
374,
5644,
627,
455,
6175,
24321,
368,
11651,
2290,
76747,
60,
55609,
198,
4387,
279,
83645,
2042,
1537,
627,
16727,
25,
1796,
58,
4066,
2097,
60,
55609
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.cosmos_db.CosmosDBChatMessageHistory.html |
38cfcf5ab243-0 | langchain.memory.chat_message_histories.sql.SQLChatMessageHistory¶
class langchain.memory.chat_message_histories.sql.SQLChatMessageHistory(session_id: str, connection_string: str, table_name: str = 'message_store')[source]¶
Bases: BaseChatMessageHistory
Chat message history stored in an SQL database.
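A minimal usage sketch, using a local SQLite connection string as an assumed backing database:
from langchain.memory.chat_message_histories.sql import SQLChatMessageHistory
history = SQLChatMessageHistory(
    session_id="user-123",
    connection_string="sqlite:///chat_history.db",  # any SQLAlchemy connection string
)
history.add_user_message("hi!")
history.add_ai_message("hello!")
print(history.messages)  # all messages stored for this session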
Methods
__init__(session_id, connection_string[, ...])
add_ai_message(message)
Add an AI message to the store
add_message(message)
Append the message to the record in db
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from db
Attributes
messages
Retrieve all messages from db
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in db
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from db
property messages: List[langchain.schema.BaseMessage]¶
Retrieve all messages from db | [
5317,
8995,
37711,
27215,
6598,
37499,
2490,
10251,
26151,
16047,
2097,
13730,
55609,
198,
1058,
8859,
8995,
37711,
27215,
6598,
37499,
2490,
10251,
26151,
16047,
2097,
13730,
16663,
851,
25,
610,
11,
3717,
3991,
25,
610,
11,
2007,
1292,
25,
610,
284,
364,
2037,
15153,
13588,
2484,
60,
55609,
198,
33,
2315,
25,
5464,
16047,
2097,
13730,
198,
16047,
1984,
3925,
9967,
304,
459,
8029,
4729,
627,
18337,
198,
565,
2381,
3889,
6045,
851,
11,
4194,
7898,
3991,
38372,
4194,
1131,
2608,
723,
70515,
6598,
7483,
340,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
340,
24819,
279,
1984,
311,
279,
3335,
304,
3000,
198,
723,
3398,
6598,
7483,
340,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
746,
14335,
3882,
5044,
505,
3000,
198,
10738,
198,
16727,
198,
88765,
682,
6743,
505,
3000,
198,
723,
70515,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
25,
5464,
2097,
8,
11651,
2290,
76747,
60,
55609,
198,
24819,
279,
1984,
311,
279,
3335,
304,
3000,
198,
723,
3398,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
368,
11651,
2290,
76747,
60,
55609,
198,
14335,
3882,
5044,
505,
3000,
198,
3784,
6743,
25,
1796,
58,
5317,
8995,
31992,
13316,
2097,
60,
55609,
198,
88765,
682,
6743,
505,
3000
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.sql.SQLChatMessageHistory.html |
95cbaae1aae1-0 | langchain.memory.chat_message_histories.momento.MomentoChatMessageHistory¶
class langchain.memory.chat_message_histories.momento.MomentoChatMessageHistory(session_id: str, cache_client: momento.CacheClient, cache_name: str, *, key_prefix: str = 'message_store:', ttl: Optional[timedelta] = None, ensure_cache_exists: bool = True)[source]¶
Bases: BaseChatMessageHistory
Chat message history cache that uses Momento as a backend.
See https://gomomento.com/
Instantiate a chat message history cache that uses Momento as a backend.
Note: to instantiate the cache client passed to MomentoChatMessageHistory,
you must have a Momento account at https://gomomento.com/.
Parameters
session_id (str) – The session ID to use for this chat session.
cache_client (CacheClient) – The Momento cache client.
cache_name (str) – The name of the cache to use to store the messages.
key_prefix (str, optional) – The prefix to apply to the cache key.
Defaults to “message_store:”.
ttl (Optional[timedelta], optional) – The TTL to use for the messages.
Defaults to None, ie the default TTL of the cache will be used.
ensure_cache_exists (bool, optional) – Create the cache if it doesn’t exist.
Defaults to True.
Raises
ImportError – Momento python package is not installed.
TypeError – cache_client is not of type momento.CacheClientObject
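A minimal usage sketch using the from_client_params constructor documented below; the cache name is a placeholder, and a Momento auth token is assumed to be available via the environment or the auth_token argument:
from datetime import timedelta
from langchain.memory.chat_message_histories.momento import MomentoChatMessageHistory
history = MomentoChatMessageHistory.from_client_params(
    session_id="user-123",
    cache_name="langchain",   # placeholder cache name
    ttl=timedelta(days=1),
    # auth_token="..."        # or supply the Momento auth token explicitly
)
history.add_user_message("hi!")
history.add_ai_message("hello!")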
Methods
__init__(session_id, cache_client, cache_name, *)
Instantiate a chat message history cache that uses Momento as a backend.
add_ai_message(message)
Add an AI message to the store
add_message(message)
Store a message in the cache.
add_user_message(message) | [
5317,
8995,
37711,
27215,
6598,
37499,
2490,
749,
13209,
78,
1345,
13209,
78,
16047,
2097,
13730,
55609,
198,
1058,
8859,
8995,
37711,
27215,
6598,
37499,
2490,
749,
13209,
78,
1345,
13209,
78,
16047,
2097,
13730,
16663,
851,
25,
610,
11,
6636,
8342,
25,
31221,
47230,
3032,
11,
6636,
1292,
25,
610,
11,
12039,
1401,
14301,
25,
610,
284,
364,
2037,
15153,
17898,
55032,
25,
12536,
14527,
318,
47954,
60,
284,
2290,
11,
6106,
11790,
9965,
25,
1845,
284,
3082,
6758,
2484,
60,
55609,
198,
33,
2315,
25,
5464,
16047,
2097,
13730,
198,
16047,
1984,
3925,
6636,
430,
5829,
40096,
78,
439,
264,
19713,
627,
10031,
3788,
1129,
37183,
13209,
78,
916,
6018,
81651,
264,
6369,
1984,
3925,
6636,
430,
5829,
40096,
78,
439,
264,
19713,
627,
9290,
25,
311,
42002,
279,
6636,
3016,
5946,
311,
40096,
78,
16047,
2097,
13730,
345,
9514,
2011,
617,
264,
40096,
78,
2759,
520,
3788,
1129,
37183,
13209,
78,
916,
76969,
9905,
198,
6045,
851,
320,
496,
8,
1389,
578,
3882,
3110,
311,
1005,
369,
420,
6369,
3882,
627,
9544,
8342,
320,
8397,
3032,
8,
1389,
578,
40096,
78,
6636,
3016,
627,
9544,
1292,
320,
496,
8,
1389,
578,
836,
315,
279,
6636,
311,
1005,
311,
3637,
279,
6743,
627,
798,
14301,
320,
496,
11,
10309,
8,
1389,
578,
9436,
311,
3881,
311,
279,
6636,
1401,
627,
16672,
311,
1054,
2037,
15153,
25,
863,
627,
63958,
320,
15669,
14527,
318,
47954,
1145,
10309,
8,
1389,
578,
79632,
311,
1005,
369,
279,
6743,
627,
16672,
311,
2290,
11,
30958,
279,
1670,
79632,
315,
279,
6636,
690,
387,
1511,
627,
28389,
11790,
9965,
320,
2707,
11,
10309,
8,
1389,
4324,
279,
6636,
422,
433,
3250,
1431,
3073,
627,
16672,
311,
3082,
627,
36120,
198,
11772,
1480,
1389,
40096,
78,
10344,
6462,
374,
539,
10487,
627,
81176,
1389,
6636,
8342,
374,
539,
315,
955,
31221,
47230,
3032,
1211,
198,
18337,
198,
565,
2381,
3889,
6045,
851,
11,
4194,
9544,
8342,
11,
4194,
9544,
1292,
11,
4194,
39060,
81651,
264,
6369,
1984,
3925,
6636,
430,
5829,
40096,
78,
439,
264,
19713,
627,
723,
70515,
6598,
7483,
340,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
340,
6221,
264,
1984,
304,
279,
6636,
627,
723,
3398,
6598,
7483,
8
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.momento.MomentoChatMessageHistory.html |
95cbaae1aae1-1 | add_message(message)
Store a message in the cache.
add_user_message(message)
Add a user message to the store
clear()
Remove the session's messages from the cache.
from_client_params(session_id, cache_name, ...)
Construct cache from CacheClient parameters.
Attributes
messages
Retrieve the messages from Momento.
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Store a message in the cache.
Parameters
message (BaseMessage) – The message object to store.
Raises
SdkException – Momento service or network error.
Exception – Unexpected response.
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Remove the session’s messages from the cache.
Raises
SdkException – Momento service or network error.
Exception – Unexpected response.
classmethod from_client_params(session_id: str, cache_name: str, ttl: timedelta, *, configuration: Optional[momento.config.Configuration] = None, auth_token: Optional[str] = None, **kwargs: Any) → MomentoChatMessageHistory[source]¶
Construct cache from CacheClient parameters.
property messages: list[langchain.schema.BaseMessage]¶
Retrieve the messages from Momento.
Raises
SdkException – Momento service or network error
Exception – Unexpected response
Returns
List of cached messages
Return type
list[BaseMessage] | [
723,
6598,
7483,
340,
6221,
264,
1984,
304,
279,
6636,
627,
723,
3398,
6598,
7483,
340,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
746,
13319,
279,
3882,
596,
6743,
505,
279,
6636,
627,
1527,
8342,
6887,
16663,
851,
11,
4194,
9544,
1292,
11,
4194,
32318,
29568,
6636,
505,
20044,
3032,
5137,
627,
10738,
198,
16727,
198,
88765,
279,
6743,
505,
40096,
78,
627,
723,
70515,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
459,
15592,
1984,
311,
279,
3637,
198,
723,
6598,
7483,
25,
5464,
2097,
8,
11651,
2290,
76747,
60,
55609,
198,
6221,
264,
1984,
304,
279,
6636,
627,
9905,
198,
2037,
320,
4066,
2097,
8,
1389,
578,
1984,
1665,
311,
3637,
627,
36120,
198,
58275,
1378,
1389,
40096,
78,
2532,
477,
4009,
1493,
627,
1378,
1389,
71500,
2077,
627,
723,
3398,
6598,
7483,
25,
610,
8,
11651,
2290,
55609,
198,
2261,
264,
1217,
1984,
311,
279,
3637,
198,
7574,
368,
11651,
2290,
76747,
60,
55609,
198,
13319,
279,
3882,
753,
6743,
505,
279,
6636,
627,
36120,
198,
58275,
1378,
1389,
40096,
78,
2532,
477,
4009,
1493,
627,
1378,
1389,
71500,
2077,
627,
27853,
505,
8342,
6887,
16663,
851,
25,
610,
11,
6636,
1292,
25,
610,
11,
55032,
25,
43355,
11,
12039,
6683,
25,
12536,
12335,
13209,
78,
5539,
17785,
60,
284,
2290,
11,
4259,
6594,
25,
12536,
17752,
60,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
40096,
78,
16047,
2097,
13730,
76747,
60,
55609,
198,
29568,
6636,
505,
20044,
3032,
5137,
627,
3784,
6743,
25,
1160,
58,
5317,
8995,
31992,
13316,
2097,
60,
55609,
198,
88765,
279,
6743,
505,
40096,
78,
627,
36120,
198,
58275,
1378,
1389,
40096,
78,
2532,
477,
4009,
1493,
198,
1378,
1389,
71500,
2077,
198,
16851,
198,
861,
315,
21224,
6743,
198,
5715,
955,
198,
1638,
58,
4066,
2097,
60
] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.momento.MomentoChatMessageHistory.html |
518a7e35f03f-0 | langchain.memory.combined.CombinedMemory¶
class langchain.memory.combined.CombinedMemory(*, memories: List[BaseMemory])[source]¶
Bases: BaseMemory
Class for combining multiple memories’ data together.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
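Example (an illustrative sketch; the memory keys below are arbitrary choices, not defaults):
from langchain.memory import CombinedMemory, ConversationBufferMemory
chat = ConversationBufferMemory(memory_key="chat_history", input_key="input")
extra = ConversationBufferMemory(memory_key="extra_history", input_key="input")
memory = CombinedMemory(memories=[chat, extra])
memory.save_context({"input": "hi"}, {"output": "hello"})   # saved into every sub-memory
print(memory.memory_variables)          # ['chat_history', 'extra_history']
print(memory.load_memory_variables({})) # each sub-memory contributes its own key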
param memories: List[langchain.schema.BaseMemory] [Required]¶
For tracking all the memories that should be accessed.
validator check_input_key » memories[source]¶
Check that if memories are of type BaseChatMemory that input keys exist.
validator check_repeated_memory_variable » memories[source]¶
clear() → None[source]¶
Clear context from this session for every memory.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, str][source]¶
Load all vars from sub-memories.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this session for every memory.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
property memory_variables: List[str]¶
All the memory variables that this instance provides.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.combined.CombinedMemory.html
langchain.memory.entity.BaseEntityStore¶
class langchain.memory.entity.BaseEntityStore[source]¶
Bases: BaseModel, ABC
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
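A minimal sketch of a concrete subclass backed by a plain dict (illustrative only; langchain already ships InMemoryEntityStore for this purpose):
from typing import Dict, Optional
from langchain.memory.entity import BaseEntityStore

class DictEntityStore(BaseEntityStore):
    store: Dict[str, Optional[str]] = {}

    def get(self, key: str, default: Optional[str] = None) -> Optional[str]:
        return self.store.get(key, default)

    def set(self, key: str, value: Optional[str]) -> None:
        self.store[key] = value

    def delete(self, key: str) -> None:
        self.store.pop(key, None)

    def exists(self, key: str) -> bool:
        return key in self.store

    def clear(self) -> None:
        self.store.clear()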
abstract clear() → None[source]¶
Delete all entities from store.
abstract delete(key: str) → None[source]¶
Delete entity value from store.
abstract exists(key: str) → bool[source]¶
Check if entity exists in store.
abstract get(key: str, default: Optional[str] = None) → Optional[str][source]¶
Get entity value from store.
abstract set(key: str, value: Optional[str]) → None[source]¶
Set entity value in store.
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.BaseEntityStore.html
langchain.memory.chat_message_histories.mongodb.MongoDBChatMessageHistory¶
class langchain.memory.chat_message_histories.mongodb.MongoDBChatMessageHistory(connection_string: str, session_id: str, database_name: str = 'chat_history', collection_name: str = 'message_store')[source]¶
Bases: BaseChatMessageHistory
Chat message history that stores history in MongoDB.
Parameters
connection_string – connection string to connect to MongoDB
session_id – arbitrary key that is used to store the messages
of a single chat session.
database_name – name of the database to use
collection_name – name of the collection to use
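Example (an illustrative sketch; the connection string and session id are placeholders for your own deployment):
from langchain.memory.chat_message_histories.mongodb import MongoDBChatMessageHistory
history = MongoDBChatMessageHistory(
    connection_string="mongodb://localhost:27017",
    session_id="user-42",
)
history.add_user_message("Hello!")
history.add_ai_message("Hi there!")
print(history.messages)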
Methods
__init__(connection_string, session_id[, ...])
add_ai_message(message)
Add an AI message to the store
add_message(message)
Append the message to the record in MongoDB
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from MongoDB
Attributes
messages
Retrieve the messages from MongoDB
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in MongoDB
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from MongoDB
property messages: List[langchain.schema.BaseMessage]¶
Retrieve the messages from MongoDB
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.mongodb.MongoDBChatMessageHistory.html
langchain.memory.buffer_window.ConversationBufferWindowMemory¶
class langchain.memory.buffer_window.ConversationBufferWindowMemory(*, chat_memory: BaseChatMessageHistory = None, output_key: Optional[str] = None, input_key: Optional[str] = None, return_messages: bool = False, human_prefix: str = 'Human', ai_prefix: str = 'AI', memory_key: str = 'history', k: int = 5)[source]¶
Bases: BaseChatMemory
Buffer for storing conversation memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
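Example (illustrative; shows that only the last k exchanges are kept in the buffer):
from langchain.memory import ConversationBufferWindowMemory
memory = ConversationBufferWindowMemory(k=2)
memory.save_context({"input": "hi"}, {"output": "hello"})
memory.save_context({"input": "how are you?"}, {"output": "great"})
memory.save_context({"input": "tell me a joke"}, {"output": "why did..."})
# Only the two most recent exchanges remain in the returned history buffer.
print(memory.load_memory_variables({})["history"])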
param ai_prefix: str = 'AI'¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param k: int = 5¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
clear() → None¶
Clear memory contents.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, str][source]¶
Return history buffer.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None¶
Save context from this conversation to buffer.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property buffer: List[langchain.schema.BaseMessage]¶
String buffer of memory.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.buffer_window.ConversationBufferWindowMemory.html
langchain.memory.entity.SQLiteEntityStore¶
class langchain.memory.entity.SQLiteEntityStore(session_id: str = 'default', db_file: str = 'entities.db', table_name: str = 'memory_store', *args: Any)[source]¶
Bases: BaseEntityStore
SQLite-backed Entity store
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
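Example (illustrative; the db_file path is a placeholder):
from langchain.memory.entity import SQLiteEntityStore
store = SQLiteEntityStore(session_id="demo", db_file="entities.db")
store.set("Alice", "Alice is a software engineer.")
print(store.exists("Alice"))   # True
print(store.get("Alice"))
store.delete("Alice")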
param session_id: str = 'default'¶
param table_name: str = 'memory_store'¶
clear() → None[source]¶
Delete all entities from store.
delete(key: str) → None[source]¶
Delete entity value from store.
exists(key: str) → bool[source]¶
Check if entity exists in store.
get(key: str, default: Optional[str] = None) → Optional[str][source]¶
Get entity value from store.
set(key: str, value: Optional[str]) → None[source]¶
Set entity value in store.
property full_table_name: str¶
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.SQLiteEntityStore.html
langchain.memory.summary.SummarizerMixin¶
class langchain.memory.summary.SummarizerMixin(*, human_prefix: str = 'Human', ai_prefix: str = 'AI', llm: ~langchain.base_language.BaseLanguageModel, prompt: ~langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['summary', 'new_lines'], output_parser=None, partial_variables={}, template='Progressively summarize the lines of conversation provided, adding onto the previous summary returning a new summary.\n\nEXAMPLE\nCurrent summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good.\n\nNew lines of conversation:\nHuman: Why do you think artificial intelligence is a force for good?\nAI: Because artificial intelligence will help humans reach their full potential.\n\nNew summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good because it will help humans reach their full potential.\nEND OF EXAMPLE\n\nCurrent summary:\n{summary}\n\nNew lines of conversation:\n{new_lines}\n\nNew summary:', template_format='f-string', validate_template=True), summary_message_cls: ~typing.Type[~langchain.schema.BaseMessage] = <class 'langchain.schema.SystemMessage'>)[source]¶
Bases: BaseModel
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
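An illustrative sketch of predict_new_summary (this mixin is normally used via ConversationSummaryMemory; the OpenAI LLM below assumes an API key is configured, and any BaseLanguageModel could be substituted):
from langchain.llms import OpenAI
from langchain.memory.summary import SummarizerMixin
from langchain.schema import AIMessage, HumanMessage

summarizer = SummarizerMixin(llm=OpenAI(temperature=0))
new_summary = summarizer.predict_new_summary(
    messages=[HumanMessage(content="My name is Bob."), AIMessage(content="Nice to meet you, Bob!")],
    existing_summary="",
)
print(new_summary)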
param ai_prefix: str = 'AI'¶
param human_prefix: str = 'Human'¶
param llm: langchain.base_language.BaseLanguageModel [Required]¶
param prompt: langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['summary', 'new_lines'], output_parser=None, partial_variables={}, template='Progressively summarize the lines of conversation provided, adding onto the previous summary returning a new summary.\n\nEXAMPLE\nCurrent summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good.\n\nNew lines of conversation:\nHuman: Why do you think artificial intelligence is a force for good?\nAI: Because artificial intelligence will help humans reach their full potential.\n\nNew summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good because it will help humans reach their full potential.\nEND OF EXAMPLE\n\nCurrent summary:\n{summary}\n\nNew lines of conversation:\n{new_lines}\n\nNew summary:', template_format='f-string', validate_template=True)¶
param summary_message_cls: Type[langchain.schema.BaseMessage] = <class 'langchain.schema.SystemMessage'>¶
predict_new_summary(messages: List[BaseMessage], existing_summary: str) → str[source]¶
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.summary.SummarizerMixin.html
langchain.memory.buffer.ConversationBufferMemory¶
class langchain.memory.buffer.ConversationBufferMemory(*, chat_memory: BaseChatMessageHistory = None, output_key: Optional[str] = None, input_key: Optional[str] = None, return_messages: bool = False, human_prefix: str = 'Human', ai_prefix: str = 'AI', memory_key: str = 'history')[source]¶
Bases: BaseChatMemory
Buffer for storing conversation memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
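Example (illustrative):
from langchain.memory import ConversationBufferMemory
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
memory.save_context({"input": "hi"}, {"output": "hello"})
# With return_messages=True the buffer is returned as message objects rather than a string.
print(memory.load_memory_variables({})["chat_history"])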
param ai_prefix: str = 'AI'¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
clear() → None¶
Clear memory contents.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Return history buffer.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None¶
Save context from this conversation to buffer.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property buffer: Any¶
String buffer of memory.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.buffer.ConversationBufferMemory.html
langchain.memory.chat_message_histories.firestore.FirestoreChatMessageHistory¶
class langchain.memory.chat_message_histories.firestore.FirestoreChatMessageHistory(collection_name: str, session_id: str, user_id: str)[source]¶
Bases: BaseChatMessageHistory
Chat history backed by Google Firestore.
Initialize a new instance of the FirestoreChatMessageHistory class.
Parameters
collection_name – The name of the collection to use.
session_id – The session ID for the chat.
user_id – The user ID for the chat.
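Example (an illustrative sketch; it assumes Google Cloud credentials are already configured for the environment, and the names below are placeholders):
from langchain.memory.chat_message_histories.firestore import FirestoreChatMessageHistory
history = FirestoreChatMessageHistory(
    collection_name="chat_history",
    session_id="session-1",
    user_id="user-42",
)
history.add_user_message("Hello!")
history.add_ai_message("Hi!")
print(history.messages)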
Methods
__init__(collection_name, session_id, user_id)
Initialize a new instance of the FirestoreChatMessageHistory class.
add_ai_message(message)
Add an AI message to the store
add_message(message)
Add a self-created message to the store
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from this memory and Firestore.
load_messages()
Retrieve the messages from Firestore
prepare_firestore()
Prepare the Firestore client.
upsert_messages([new_message])
Update the Firestore document.
Attributes
messages
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Add a self-created message to the store
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from this memory and Firestore.
load_messages() → None[source]¶
Retrieve the messages from Firestore
prepare_firestore() → None[source]¶
Prepare the Firestore client.
Use this function to make sure your database is ready.
upsert_messages(new_message: Optional[BaseMessage] = None) → None[source]¶
Update the Firestore document.
messages: List[BaseMessage]¶
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.firestore.FirestoreChatMessageHistory.html
langchain.memory.chat_message_histories.postgres.PostgresChatMessageHistory¶
class langchain.memory.chat_message_histories.postgres.PostgresChatMessageHistory(session_id: str, connection_string: str = 'postgresql://postgres:mypassword@localhost/chat_history', table_name: str = 'message_store')[source]¶
Bases: BaseChatMessageHistory
Chat message history stored in a Postgres database.
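Example (illustrative; the connection string mirrors the documented default and should be replaced with your own):
from langchain.memory.chat_message_histories.postgres import PostgresChatMessageHistory
history = PostgresChatMessageHistory(
    session_id="session-1",
    connection_string="postgresql://postgres:mypassword@localhost/chat_history",
)
history.add_user_message("Hello!")
history.add_ai_message("Hi!")
print(history.messages)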
Methods
__init__(session_id[, connection_string, ...])
add_ai_message(message)
Add an AI message to the store
add_message(message)
Append the message to the record in PostgreSQL
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from PostgreSQL
Attributes
messages
Retrieve the messages from PostgreSQL
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in PostgreSQL
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from PostgreSQL
property messages: List[langchain.schema.BaseMessage]¶
Retrieve the messages from PostgreSQL
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.postgres.PostgresChatMessageHistory.html
langchain.memory.chat_message_histories.zep.ZepChatMessageHistory¶
class langchain.memory.chat_message_histories.zep.ZepChatMessageHistory(session_id: str, url: str = 'http://localhost:8000', api_key: Optional[str] = None)[source]¶
Bases: BaseChatMessageHistory
A ChatMessageHistory implementation that uses Zep as a backend.
Recommended usage:
# Set up Zep Chat History
zep_chat_history = ZepChatMessageHistory(
session_id=session_id,
url=ZEP_API_URL,
api_key=<your_api_key>,
)
# Use a standard ConversationBufferMemory to encapsulate the Zep chat history
memory = ConversationBufferMemory(
memory_key="chat_history", chat_memory=zep_chat_history
)
Zep provides long-term conversation storage for LLM apps. The server stores,
summarizes, embeds, indexes, and enriches conversational AI chat
histories, and exposes them via simple, low-latency APIs.
For server installation instructions and more, see:
https://docs.getzep.com/deployment/quickstart/
This class is a thin wrapper around the zep-python package. Additional
Zep functionality is exposed via the zep_summary and zep_messages
properties.
For more information on the zep-python package, see:
https://github.com/getzep/zep-python
Methods
__init__(session_id[, url, api_key])
add_ai_message(message)
Add an AI message to the store
add_message(message)
Append the message to the Zep memory history
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from Zep.
search(query[, metadata, limit])
Search Zep memory for messages matching the query
Attributes
messages
Retrieve messages from Zep memory
zep_messages
Retrieve summary from Zep memory
zep_summary
Retrieve summary from Zep memory
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Append the message to the Zep memory history
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from Zep. Note that Zep is long-term storage for memory
and this is not advised unless you have specific data retention requirements.
search(query: str, metadata: Optional[Dict] = None, limit: Optional[int] = None) → List[MemorySearchResult][source]¶
Search Zep memory for messages matching the query
property messages: List[langchain.schema.BaseMessage]¶
Retrieve messages from Zep memory
property zep_messages: List[Message]¶
Retrieve messages from Zep memory
property zep_summary: Optional[str]¶
Retrieve summary from Zep memory
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.zep.ZepChatMessageHistory.html
langchain.memory.chat_message_histories.file.FileChatMessageHistory¶
class langchain.memory.chat_message_histories.file.FileChatMessageHistory(file_path: str)[source]¶
Bases: BaseChatMessageHistory
Chat message history that stores history in a local file.
Parameters
file_path – path of the local file to store the messages.
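Example (illustrative; the file path is a placeholder):
from langchain.memory.chat_message_histories.file import FileChatMessageHistory
history = FileChatMessageHistory(file_path="chat_history.json")
history.add_user_message("Hello!")
history.add_ai_message("Hi!")
print(history.messages)
history.clear()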
Methods
__init__(file_path)
add_ai_message(message)
Add an AI message to the store
add_message(message)
Append the message to the record in the local file
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from the local file
Attributes
messages
Retrieve the messages from the local file
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in the local file
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from the local file
property messages: List[langchain.schema.BaseMessage]¶
Retrieve the messages from the local file
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.file.FileChatMessageHistory.html
langchain.memory.chat_message_histories.in_memory.ChatMessageHistory¶
class langchain.memory.chat_message_histories.in_memory.ChatMessageHistory(*, messages: List[BaseMessage] = [])[source]¶
Bases: BaseChatMessageHistory, BaseModel
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
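Example (illustrative):
from langchain.memory import ChatMessageHistory
history = ChatMessageHistory()
history.add_user_message("Hello!")
history.add_ai_message("Hi! How can I help?")
print(history.messages)   # [HumanMessage(...), AIMessage(...)]
history.clear()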
param messages: List[langchain.schema.BaseMessage] = []¶
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Add a self-created message to the store
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Remove all messages from the store
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.in_memory.ChatMessageHistory.html
langchain.memory.chat_message_histories.redis.RedisChatMessageHistory¶
class langchain.memory.chat_message_histories.redis.RedisChatMessageHistory(session_id: str, url: str = 'redis://localhost:6379/0', key_prefix: str = 'message_store:', ttl: Optional[int] = None)[source]¶
Bases: BaseChatMessageHistory
Chat message history stored in a Redis database.
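Example (illustrative; assumes a reachable Redis server, and ttl is an optional expiry for the stored key):
from langchain.memory.chat_message_histories.redis import RedisChatMessageHistory
history = RedisChatMessageHistory(
    session_id="session-1",
    url="redis://localhost:6379/0",
    ttl=3600,
)
history.add_user_message("Hello!")
history.add_ai_message("Hi!")
print(history.messages)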
Methods
__init__(session_id[, url, key_prefix, ttl])
add_ai_message(message)
Add an AI message to the store
add_message(message)
Append the message to the record in Redis
add_user_message(message)
Add a user message to the store
clear()
Clear session memory from Redis
Attributes
key
Construct the record key to use
messages
Retrieve the messages from Redis
add_ai_message(message: str) → None¶
Add an AI message to the store
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in Redis
add_user_message(message: str) → None¶
Add a user message to the store
clear() → None[source]¶
Clear session memory from Redis
property key: str¶
Construct the record key to use
property messages: List[langchain.schema.BaseMessage]¶
Retrieve the messages from Redis
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_message_histories.redis.RedisChatMessageHistory.html
langchain.memory.chat_memory.BaseChatMemory¶
class langchain.memory.chat_memory.BaseChatMemory(*, chat_memory: BaseChatMessageHistory = None, output_key: Optional[str] = None, input_key: Optional[str] = None, return_messages: bool = False)[source]¶
Bases: BaseMemory, ABC
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
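A minimal sketch of a concrete subclass (illustrative only; real implementations such as ConversationBufferMemory add formatting options). The subclass only has to supply memory_variables and load_memory_variables; save_context and clear come from this base class:
from typing import Any, Dict, List
from langchain.memory.chat_memory import BaseChatMemory

class SimpleChatMemory(BaseChatMemory):
    memory_key: str = "history"

    @property
    def memory_variables(self) -> List[str]:
        return [self.memory_key]

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        # Expose the raw message list under a single key.
        return {self.memory_key: self.chat_memory.messages}

memory = SimpleChatMemory()
memory.save_context({"input": "hi"}, {"output": "hello"})
print(memory.load_memory_variables({}))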
param chat_memory: langchain.schema.BaseChatMessageHistory [Optional]¶
param input_key: Optional[str] = None¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
clear() → None[source]¶
Clear memory contents.
abstract load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any]¶
Return key-value pairs given the text input to the chain.
If None, return all memories
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
abstract property memory_variables: List[str]¶
Input keys this memory class will load dynamically.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.chat_memory.BaseChatMemory.html
langchain.memory.entity.RedisEntityStore¶
class langchain.memory.entity.RedisEntityStore(session_id: str = 'default', url: str = 'redis://localhost:6379/0', key_prefix: str = 'memory_store', ttl: Optional[int] = 86400, recall_ttl: Optional[int] = 259200, *args: Any, redis_client: Any = None)[source]¶
Bases: BaseEntityStore
Redis-backed Entity store. Entities get a TTL of 1 day by default, and
that TTL is extended by 3 days every time the entity is read back.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
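Example (illustrative; assumes a reachable Redis server at the given URL):
from langchain.memory.entity import RedisEntityStore
store = RedisEntityStore(session_id="demo", url="redis://localhost:6379/0")
store.set("Alice", "Alice is a software engineer.")
print(store.get("Alice"))
store.delete("Alice")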
param key_prefix: str = 'memory_store'¶
param recall_ttl: Optional[int] = 259200¶
param redis_client: Any = None¶
param session_id: str = 'default'¶
param ttl: Optional[int] = 86400¶
clear() → None[source]¶
Delete all entities from store.
delete(key: str) → None[source]¶
Delete entity value from store.
exists(key: str) → bool[source]¶
Check if entity exists in store.
get(key: str, default: Optional[str] = None) → Optional[str][source]¶
Get entity value from store.
set(key: str, value: Optional[str]) → None[source]¶
Set entity value in store.
property full_key_prefix: str¶
https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.RedisEntityStore.html
langchain.memory.entity.InMemoryEntityStore¶
class langchain.memory.entity.InMemoryEntityStore(*, store: Dict[str, Optional[str]] = {})[source]¶
Bases: BaseEntityStore
Basic in-memory entity store.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
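Example (illustrative):
from langchain.memory.entity import InMemoryEntityStore
store = InMemoryEntityStore()
store.set("Alice", "Alice is a software engineer.")
print(store.exists("Alice"))   # True
print(store.get("Alice"))
store.clear()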
param store: Dict[str, Optional[str]] = {}¶
clear() → None[source]¶
Delete all entities from store.
delete(key: str) → None[source]¶
Delete entity value from store.
exists(key: str) → bool[source]¶
Check if entity exists in store.
get(key: str, default: Optional[str] = None) → Optional[str][source]¶
Get entity value from store.
set(key: str, value: Optional[str]) → None[source]¶
Set entity value in store.
Source: https://langchain.readthedocs.io/en/latest/memory/langchain.memory.entity.InMemoryEntityStore.html
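A minimal usage sketch (not part of the original page) of the entity-store interface documented above; the entity names are made up.
from langchain.memory.entity import InMemoryEntityStore

store = InMemoryEntityStore()
store.set("Deven", "Deven is working on a hackathon project with Sam.")
print(store.exists("Deven"))            # True
print(store.get("Deven"))               # the stored description
print(store.get("Sam", default="n/a"))  # falls back to the default for a missing key
store.delete("Deven")
store.clear()                           # delete all entities from the store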
langchain.memory.readonly.ReadOnlySharedMemory¶
class langchain.memory.readonly.ReadOnlySharedMemory(*, memory: BaseMemory)[source]¶
Bases: BaseMemory
A memory wrapper that is read-only and cannot be changed.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param memory: langchain.schema.BaseMemory [Required]¶
clear() → None[source]¶
Nothing to clear, got a memory like a vault.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, str][source]¶
Load memory variables from memory.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Nothing should be saved or changed
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
property memory_variables: List[str]¶
Return memory variables.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
Source: https://langchain.readthedocs.io/en/latest/memory/langchain.memory.readonly.ReadOnlySharedMemory.html
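A minimal sketch (not part of the original page) of sharing one memory between a writer and a read-only consumer; it uses ConversationBufferMemory as the writable memory, which is an assumption about the caller's setup, not part of this page.
from langchain.memory import ConversationBufferMemory
from langchain.memory.readonly import ReadOnlySharedMemory

memory = ConversationBufferMemory(memory_key="chat_history")
memory.save_context({"input": "hi"}, {"output": "hello!"})

readonly = ReadOnlySharedMemory(memory=memory)
print(readonly.load_memory_variables({}))               # sees the shared history
readonly.save_context({"input": "x"}, {"output": "y"})  # no-op: nothing is saved or changed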
langchain.memory.summary.ConversationSummaryMemory¶
class langchain.memory.summary.ConversationSummaryMemory(*, human_prefix: str = 'Human', ai_prefix: str = 'AI', llm: ~langchain.base_language.BaseLanguageModel, prompt: ~langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['summary', 'new_lines'], output_parser=None, partial_variables={}, template='Progressively summarize the lines of conversation provided, adding onto the previous summary returning a new summary.\n\nEXAMPLE\nCurrent summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good.\n\nNew lines of conversation:\nHuman: Why do you think artificial intelligence is a force for good?\nAI: Because artificial intelligence will help humans reach their full potential.\n\nNew summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good because it will help humans reach their full potential.\nEND OF EXAMPLE\n\nCurrent summary:\n{summary}\n\nNew lines of conversation:\n{new_lines}\n\nNew summary:', template_format='f-string', validate_template=True), summary_message_cls: ~typing.Type[~langchain.schema.BaseMessage] = <class 'langchain.schema.SystemMessage'>, chat_memory: ~langchain.schema.BaseChatMessageHistory = None, output_key: ~typing.Optional[str] = None, input_key: ~typing.Optional[str] = None, return_messages: bool = False, buffer: str = '', memory_key: str = 'history')[source]¶
Bases: BaseChatMemory, SummarizerMixin
Conversation summarizer to memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param buffer: str = ''¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param llm: BaseLanguageModel [Required]¶
param output_key: Optional[str] = None¶
param prompt: BasePromptTemplate = PromptTemplate(input_variables=['summary', 'new_lines'], output_parser=None, partial_variables={}, template='Progressively summarize the lines of conversation provided, adding onto the previous summary returning a new summary.\n\nEXAMPLE\nCurrent summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good.\n\nNew lines of conversation:\nHuman: Why do you think artificial intelligence is a force for good?\nAI: Because artificial intelligence will help humans reach their full potential.\n\nNew summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good because it will help humans reach their full potential.\nEND OF EXAMPLE\n\nCurrent summary:\n{summary}\n\nNew lines of conversation:\n{new_lines}\n\nNew summary:', template_format='f-string', validate_template=True)¶
param return_messages: bool = False¶
param summary_message_cls: Type[BaseMessage] = <class 'langchain.schema.SystemMessage'>¶
clear() → None[source]¶
Clear memory contents.
classmethod from_messages(llm: BaseLanguageModel, chat_memory: BaseChatMessageHistory, *, summarize_step: int = 2, **kwargs: Any) → ConversationSummaryMemory[source]¶
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Return history buffer.
predict_new_summary(messages: List[BaseMessage], existing_summary: str) → str¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_prompt_input_variables » all fields[source]¶
Validate that prompt input variables are consistent.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
Source: https://langchain.readthedocs.io/en/latest/memory/langchain.memory.summary.ConversationSummaryMemory.html
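A minimal usage sketch (not part of the original page); it assumes an OpenAI API key is configured, but any BaseLanguageModel can be passed as llm.
from langchain.llms import OpenAI
from langchain.memory.summary import ConversationSummaryMemory

memory = ConversationSummaryMemory(llm=OpenAI(temperature=0))
memory.save_context({"input": "Hi, I'm Deven."}, {"output": "Nice to meet you, Deven!"})
memory.save_context({"input": "I'm building a memory module."}, {"output": "That sounds useful."})
print(memory.buffer)                     # the progressively updated summary
print(memory.load_memory_variables({}))  # summary returned under the 'history' key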
langchain.memory.simple.SimpleMemory¶
class langchain.memory.simple.SimpleMemory(*, memories: Dict[str, Any] = {})[source]¶
Bases: BaseMemory
Simple memory for storing context or other bits of information that shouldn’t
ever change between prompts.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param memories: Dict[str, Any] = {}¶
clear() → None[source]¶
Nothing to clear, got a memory like a vault.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, str][source]¶
Return key-value pairs given the text input to the chain.
If None, return all memories
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Nothing should be saved or changed, my memory is set in stone.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
property memory_variables: List[str]¶
Input keys this memory class will load dynamically.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
Source: https://langchain.readthedocs.io/en/latest/memory/langchain.memory.simple.SimpleMemory.html
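A minimal usage sketch (not part of the original page); the memory keys and values here are made up.
from langchain.memory.simple import SimpleMemory

memory = SimpleMemory(memories={"team": "Platform", "budget": "10k USD"})
print(memory.memory_variables)           # ['team', 'budget']
print(memory.load_memory_variables({}))  # {'team': 'Platform', 'budget': '10k USD'}
memory.save_context({"input": "x"}, {"output": "y"})  # no-op by design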
langchain.memory.summary_buffer.ConversationSummaryBufferMemory¶
class langchain.memory.summary_buffer.ConversationSummaryBufferMemory(*, human_prefix: str = 'Human', ai_prefix: str = 'AI', llm: ~langchain.base_language.BaseLanguageModel, prompt: ~langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['summary', 'new_lines'], output_parser=None, partial_variables={}, template='Progressively summarize the lines of conversation provided, adding onto the previous summary returning a new summary.\n\nEXAMPLE\nCurrent summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good.\n\nNew lines of conversation:\nHuman: Why do you think artificial intelligence is a force for good?\nAI: Because artificial intelligence will help humans reach their full potential.\n\nNew summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good because it will help humans reach their full potential.\nEND OF EXAMPLE\n\nCurrent summary:\n{summary}\n\nNew lines of conversation:\n{new_lines}\n\nNew summary:', template_format='f-string', validate_template=True), summary_message_cls: ~typing.Type[~langchain.schema.BaseMessage] = <class 'langchain.schema.SystemMessage'>, chat_memory: ~langchain.schema.BaseChatMessageHistory = None, output_key: ~typing.Optional[str] = None, input_key: ~typing.Optional[str] = None, return_messages: bool = False, max_token_limit: int = 2000, moving_summary_buffer: str = '', memory_key: str = 'history')[source]¶
Bases: BaseChatMemory, SummarizerMixin
Buffer with summarizer for storing conversation memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param llm: BaseLanguageModel [Required]¶
param max_token_limit: int = 2000¶
param memory_key: str = 'history'¶
param moving_summary_buffer: str = ''¶
param output_key: Optional[str] = None¶
param prompt: BasePromptTemplate = PromptTemplate(input_variables=['summary', 'new_lines'], output_parser=None, partial_variables={}, template='Progressively summarize the lines of conversation provided, adding onto the previous summary returning a new summary.\n\nEXAMPLE\nCurrent summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good.\n\nNew lines of conversation:\nHuman: Why do you think artificial intelligence is a force for good?\nAI: Because artificial intelligence will help humans reach their full potential.\n\nNew summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good because it will help humans reach their full potential.\nEND OF EXAMPLE\n\nCurrent summary:\n{summary}\n\nNew lines of conversation:\n{new_lines}\n\nNew summary:', template_format='f-string', validate_template=True)¶
param return_messages: bool = False¶
param summary_message_cls: Type[BaseMessage] = <class 'langchain.schema.SystemMessage'>¶
clear() → None[source]¶
Clear memory contents.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Return history buffer.
predict_new_summary(messages: List[BaseMessage], existing_summary: str) → str¶
prune() → None[source]¶
Prune the buffer if it exceeds the max token limit.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_prompt_input_variables » all fields[source]¶
Validate that prompt input variables are consistent.
property buffer: List[langchain.schema.BaseMessage]¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
Source: https://langchain.readthedocs.io/en/latest/memory/langchain.memory.summary_buffer.ConversationSummaryBufferMemory.html
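A minimal usage sketch (not part of the original page); it assumes an OpenAI API key. The max_token_limit of 40 is an arbitrary choice that just forces early pruning into moving_summary_buffer.
from langchain.llms import OpenAI
from langchain.memory.summary_buffer import ConversationSummaryBufferMemory

memory = ConversationSummaryBufferMemory(llm=OpenAI(temperature=0), max_token_limit=40)
memory.save_context({"input": "Hi there"}, {"output": "Hello! How can I help?"})
memory.save_context({"input": "Tell me about LangChain memory."}, {"output": "It keeps conversation state between calls."})
# Recent turns stay verbatim; older turns are summarized once the token limit is exceeded.
print(memory.moving_summary_buffer)
print(memory.load_memory_variables({})["history"])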
langchain.memory.vectorstore.VectorStoreRetrieverMemory¶
class langchain.memory.vectorstore.VectorStoreRetrieverMemory(*, retriever: VectorStoreRetriever, memory_key: str = 'history', input_key: Optional[str] = None, return_docs: bool = False)[source]¶
Bases: BaseMemory
Class for a VectorStore-backed memory object.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param input_key: Optional[str] = None¶
Key name to index the inputs to load_memory_variables.
param memory_key: str = 'history'¶
Key name to locate the memories in the result of load_memory_variables.
param retriever: langchain.vectorstores.base.VectorStoreRetriever [Required]¶
VectorStoreRetriever object to connect to.
param return_docs: bool = False¶
Whether or not to return the result of querying the database directly.
clear() → None[source]¶
Nothing to clear.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Union[List[Document], str]][source]¶
Return history buffer.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
property memory_variables: List[str]¶
The list of keys emitted from the load_memory_variables method.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
Source: https://langchain.readthedocs.io/en/latest/memory/langchain.memory.vectorstore.VectorStoreRetrieverMemory.html
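A minimal sketch (not part of the original page); it assumes the faiss package is installed and an OpenAI API key is available for the embeddings, but any VectorStoreRetriever works.
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.memory.vectorstore import VectorStoreRetrieverMemory
from langchain.vectorstores import FAISS

vectorstore = FAISS.from_texts(["placeholder"], OpenAIEmbeddings())
retriever = vectorstore.as_retriever(search_kwargs={"k": 2})

memory = VectorStoreRetrieverMemory(retriever=retriever)
memory.save_context({"input": "My favorite sport is curling."}, {"output": "Good to know."})
# Only the stored exchanges most relevant to the current input are returned.
print(memory.load_memory_variables({"input": "What sport do I like?"})["history"])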
langchain.memory.token_buffer.ConversationTokenBufferMemory¶
class langchain.memory.token_buffer.ConversationTokenBufferMemory(*, chat_memory: BaseChatMessageHistory = None, output_key: Optional[str] = None, input_key: Optional[str] = None, return_messages: bool = False, human_prefix: str = 'Human', ai_prefix: str = 'AI', llm: BaseLanguageModel, memory_key: str = 'history', max_token_limit: int = 2000)[source]¶
Bases: BaseChatMemory
Buffer for storing conversation memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param llm: langchain.base_language.BaseLanguageModel [Required]¶
param max_token_limit: int = 2000¶
param memory_key: str = 'history'¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
clear() → None¶
Clear memory contents.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Return history buffer.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer, pruning the oldest messages once the token limit is exceeded.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property buffer: List[langchain.schema.BaseMessage]¶
String buffer of memory.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
Source: https://langchain.readthedocs.io/en/latest/memory/langchain.memory.token_buffer.ConversationTokenBufferMemory.html
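A minimal usage sketch (not part of the original page); it assumes an OpenAI API key. The llm here is only used to count tokens for pruning.
from langchain.llms import OpenAI
from langchain.memory.token_buffer import ConversationTokenBufferMemory

memory = ConversationTokenBufferMemory(llm=OpenAI(), max_token_limit=60)
memory.save_context({"input": "Hi"}, {"output": "Hello!"})
memory.save_context({"input": "What can you do?"}, {"output": "I can answer questions."})
# Oldest turns are dropped once the buffer exceeds max_token_limit tokens.
print(memory.load_memory_variables({})["history"])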
langchain.memory.utils.get_prompt_input_key¶
langchain.memory.utils.get_prompt_input_key(inputs: Dict[str, Any], memory_variables: List[str]) → str[source]¶
Get the prompt input key.
Parameters
inputs – Dict[str, Any]
memory_variables – List[str]
Returns
A prompt input key.
Source: https://langchain.readthedocs.io/en/latest/memory/langchain.memory.utils.get_prompt_input_key.html
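A small illustration (not part of the original page); as far as I can tell, the helper excludes the memory variables and the 'stop' key and expects exactly one remaining input key.
from langchain.memory.utils import get_prompt_input_key

inputs = {"input": "hi there", "history": "previous turns...", "stop": ["\n"]}
key = get_prompt_input_key(inputs, memory_variables=["history"])
print(key)  # 'input'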
langchain.memory.buffer.ConversationStringBufferMemory¶
class langchain.memory.buffer.ConversationStringBufferMemory(*, human_prefix: str = 'Human', ai_prefix: str = 'AI', buffer: str = '', output_key: Optional[str] = None, input_key: Optional[str] = None, memory_key: str = 'history')[source]¶
Bases: BaseMemory
Buffer for storing conversation memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
Prefix to use for AI generated responses.
param buffer: str = ''¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param output_key: Optional[str] = None¶
clear() → None[source]¶
Clear memory contents.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, str][source]¶
Return history buffer.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_chains » all fields[source]¶
Validate that return messages is not True.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
property memory_variables: List[str]¶
Will always return list of memory variables.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶
Source: https://langchain.readthedocs.io/en/latest/memory/langchain.memory.buffer.ConversationStringBufferMemory.html
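A minimal usage sketch (not part of the original page); the prefixes shown are the documented defaults unless overridden.
from langchain.memory.buffer import ConversationStringBufferMemory

memory = ConversationStringBufferMemory(ai_prefix="Assistant")
memory.save_context({"input": "Hi there"}, {"output": "Hello! How can I help?"})
# The whole exchange is kept as a single formatted string under the 'history' key.
print(memory.load_memory_variables({})["history"])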
langchain.memory.kg.ConversationKGMemory¶
513c5ce67bde-1 | class langchain.memory.kg.ConversationKGMemory(*, chat_memory: ~langchain.schema.BaseChatMessageHistory = None, output_key: ~typing.Optional[str] = None, input_key: ~typing.Optional[str] = None, return_messages: bool = False, k: int = 2, human_prefix: str = 'Human', ai_prefix: str = 'AI', kg: ~langchain.graphs.networkx_graph.NetworkxEntityGraph = None, knowledge_extraction_prompt: ~langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['history', 'input'], output_parser=None, partial_variables={}, template="You are a networked intelligence helping a human track knowledge triples about all relevant people, things, concepts, etc. and integrating them with your knowledge stored within your weights as well as that stored in a knowledge graph. Extract all of the knowledge triples from the last line of conversation. A knowledge triple is a clause that contains a subject, a predicate, and an object. The subject is the entity being described, the predicate is the property of the subject that is being described, and the object is the value of the property.\n\nEXAMPLE\nConversation history:\nPerson #1: Did you hear aliens landed in Area 51?\nAI: No, I didn't hear that. What do you know about Area 51?\nPerson #1: It's a secret military base in Nevada.\nAI: What do you know about Nevada?\nLast line of conversation:\nPerson #1: It's a state in the US. It's also the number 1 producer of gold in the US.\n\nOutput: (Nevada, is a, state)<|>(Nevada, is in, US)<|>(Nevada, is the number 1 producer of, gold)\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: Hello.\nAI: Hi! How are | [
513c5ce67bde-2 | history:\nPerson #1: Hello.\nAI: Hi! How are you?\nPerson #1: I'm good. How are you?\nAI: I'm good too.\nLast line of conversation:\nPerson #1: I'm going to the store.\n\nOutput: NONE\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: What do you know about Descartes?\nAI: Descartes was a French philosopher, mathematician, and scientist who lived in the 17th century.\nPerson #1: The Descartes I'm referring to is a standup comedian and interior designer from Montreal.\nAI: Oh yes, He is a comedian and an interior designer. He has been in the industry for 30 years. His favorite food is baked bean pie.\nLast line of conversation:\nPerson #1: Oh huh. I know Descartes likes to drive antique scooters and play the mandolin.\nOutput: (Descartes, likes to drive, antique scooters)<|>(Descartes, plays, mandolin)\nEND OF EXAMPLE\n\nConversation history (for reference only):\n{history}\nLast line of conversation (for extraction):\nHuman: {input}\n\nOutput:", template_format='f-string', validate_template=True), entity_extraction_prompt: ~langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['history', 'input'], output_parser=None, partial_variables={}, template='You are an AI assistant reading the transcript of a conversation between an AI and a human. Extract all of the proper nouns from the last line of conversation. As a guideline, a proper noun is generally capitalized. You should definitely extract all names and places.\n\nThe conversation history is provided just in case of a coreference (e.g. "What do you know about him" where "him" is defined in a previous line) -- ignore items mentioned there | [
513c5ce67bde-3 | know about him" where "him" is defined in a previous line) -- ignore items mentioned there that are not in the last line.\n\nReturn the output as a single comma-separated list, or NONE if there is nothing of note to return (e.g. the user is just issuing a greeting or having a simple conversation).\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff.\nOutput: Langchain\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff. I\'m working with Person #2.\nOutput: Langchain, Person #2\nEND OF EXAMPLE\n\nConversation history (for reference only):\n{history}\nLast line of conversation (for extraction):\nHuman: {input}\n\nOutput:', template_format='f-string', validate_template=True), llm: ~langchain.base_language.BaseLanguageModel, summary_message_cls: | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.kg.ConversationKGMemory.html |
513c5ce67bde-4 | validate_template=True), llm: ~langchain.base_language.BaseLanguageModel, summary_message_cls: ~typing.Type[~langchain.schema.BaseMessage] = <class 'langchain.schema.SystemMessage'>, memory_key: str = 'history')[source]¶ | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.kg.ConversationKGMemory.html |
513c5ce67bde-5 | Bases: BaseChatMemory
Knowledge graph memory for storing conversation memory.
Integrates with an external knowledge graph to store and retrieve
information about knowledge triples in the conversation.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param chat_memory: BaseChatMessageHistory [Optional]¶ | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.kg.ConversationKGMemory.html |
513c5ce67bde-6 | param entity_extraction_prompt: langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['history', 'input'], output_parser=None, partial_variables={}, template='You are an AI assistant reading the transcript of a conversation between an AI and a human. Extract all of the proper nouns from the last line of conversation. As a guideline, a proper noun is generally capitalized. You should definitely extract all names and places.\n\nThe conversation history is provided just in case of a coreference (e.g. "What do you know about him" where "him" is defined in a previous line) -- ignore items mentioned there that are not in the last line.\n\nReturn the output as a single comma-separated list, or NONE if there is nothing of note to return (e.g. the user is just issuing a greeting or having a simple conversation).\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff.\nOutput: Langchain\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.kg.ConversationKGMemory.html |
513c5ce67bde-7 | line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff. I\'m working with Person #2.\nOutput: Langchain, Person #2\nEND OF EXAMPLE\n\nConversation history (for reference only):\n{history}\nLast line of conversation (for extraction):\nHuman: {input}\n\nOutput:', template_format='f-string', validate_template=True)¶ | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.kg.ConversationKGMemory.html |
513c5ce67bde-8 | param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param k: int = 2¶
param kg: langchain.graphs.networkx_graph.NetworkxEntityGraph [Optional]¶ | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.kg.ConversationKGMemory.html |
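The kg and k parameters listed above can be supplied explicitly. A minimal sketch, not from the original page, assuming networkx is installed and an OpenAI API key is configured:

from langchain.llms import OpenAI
from langchain.memory import ConversationKGMemory
from langchain.graphs.networkx_graph import NetworkxEntityGraph

# Pass a pre-built (empty) entity graph and widen the context window to the
# last 3 utterances instead of the default 2.
memory = ConversationKGMemory(
    llm=OpenAI(temperature=0),
    kg=NetworkxEntityGraph(),
    k=3,
)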
513c5ce67bde-9 | param knowledge_extraction_prompt: langchain.prompts.base.BasePromptTemplate = PromptTemplate(input_variables=['history', 'input'], output_parser=None, partial_variables={}, template="You are a networked intelligence helping a human track knowledge triples about all relevant people, things, concepts, etc. and integrating them with your knowledge stored within your weights as well as that stored in a knowledge graph. Extract all of the knowledge triples from the last line of conversation. A knowledge triple is a clause that contains a subject, a predicate, and an object. The subject is the entity being described, the predicate is the property of the subject that is being described, and the object is the value of the property.\n\nEXAMPLE\nConversation history:\nPerson #1: Did you hear aliens landed in Area 51?\nAI: No, I didn't hear that. What do you know about Area 51?\nPerson #1: It's a secret military base in Nevada.\nAI: What do you know about Nevada?\nLast line of conversation:\nPerson #1: It's a state in the US. It's also the number 1 producer of gold in the US.\n\nOutput: (Nevada, is a, state)<|>(Nevada, is in, US)<|>(Nevada, is the number 1 producer of, gold)\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: Hello.\nAI: Hi! How are you?\nPerson #1: I'm good. How are you?\nAI: I'm good too.\nLast line of conversation:\nPerson #1: I'm going to the store.\n\nOutput: NONE\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: What do you know about Descartes?\nAI: Descartes was a French philosopher, mathematician, and scientist who lived in the 17th | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.kg.ConversationKGMemory.html |
513c5ce67bde-10 | Descartes was a French philosopher, mathematician, and scientist who lived in the 17th century.\nPerson #1: The Descartes I'm referring to is a standup comedian and interior designer from Montreal.\nAI: Oh yes, He is a comedian and an interior designer. He has been in the industry for 30 years. His favorite food is baked bean pie.\nLast line of conversation:\nPerson #1: Oh huh. I know Descartes likes to drive antique scooters and play the mandolin.\nOutput: (Descartes, likes to drive, antique scooters)<|>(Descartes, plays, mandolin)\nEND OF EXAMPLE\n\nConversation history (for reference only):\n{history}\nLast line of conversation (for extraction):\nHuman: {input}\n\nOutput:", template_format='f-string', validate_template=True)¶ | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.kg.ConversationKGMemory.html |
513c5ce67bde-11 | param llm: langchain.base_language.BaseLanguageModel [Required]¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
param summary_message_cls: Type[langchain.schema.BaseMessage] = <class 'langchain.schema.SystemMessage'>¶
Number of previous utterances to include in the context.
clear() → None[source]¶
Clear memory contents.
get_current_entities(input_string: str) → List[str][source]¶
get_knowledge_triplets(input_string: str) → List[KnowledgeTriple][source]¶
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Return history buffer.
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
e.g. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
e.g. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶ | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/memory/langchain.memory.kg.ConversationKGMemory.html |
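A short end-to-end sketch of the ConversationKGMemory methods documented above (save_context, load_memory_variables, get_current_entities, get_knowledge_triplets). This is a minimal illustration, not from the original page, assuming an OpenAI API key is available:

from langchain.llms import OpenAI
from langchain.memory import ConversationKGMemory

memory = ConversationKGMemory(llm=OpenAI(temperature=0))

# Store one exchange; entity and knowledge-triple extraction use the prompts shown above.
memory.save_context(
    {"input": "Sam is my coworker. Sam lives in Berlin."},
    {"output": "Good to know, thanks for telling me about Sam."},
)

# Load the stored knowledge relevant to the next user message.
print(memory.load_memory_variables({"input": "Where does Sam live?"}))
print(memory.get_current_entities("Where does Sam live?"))
print(memory.get_knowledge_triplets("Sam lives in Berlin."))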
304d92a2ad36-0 | langchain.cache.RedisCache¶
class langchain.cache.RedisCache(redis_: Any)[source]¶
Bases: BaseCache
Cache that uses Redis as a backend.
Initialize by passing in Redis instance.
Methods
__init__(redis_)
Initialize by passing in Redis instance.
clear(**kwargs)
Clear cache.
lookup(prompt, llm_string)
Look up based on prompt and llm_string.
update(prompt, llm_string, return_val)
Update cache based on prompt and llm_string.
clear(**kwargs: Any) → None[source]¶
Clear cache. If asynchronous is True, flush asynchronously.
lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]][source]¶
Look up based on prompt and llm_string.
update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None[source]¶
Update cache based on prompt and llm_string. | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.RedisCache.html |
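A minimal wiring sketch for RedisCache, not from the original page, assuming a Redis server on localhost and the redis-py client installed:

import langchain
from redis import Redis
from langchain.cache import RedisCache

# Hand an existing Redis client to the cache; entries are keyed by (prompt, llm_string).
langchain.llm_cache = RedisCache(redis_=Redis(host="localhost", port=6379))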
49a3cb26a05f-0 | langchain.cache.InMemoryCache¶
class langchain.cache.InMemoryCache[source]¶
Bases: BaseCache
Cache that stores things in memory.
Initialize with empty cache.
Methods
__init__()
Initialize with empty cache.
clear(**kwargs)
Clear cache.
lookup(prompt, llm_string)
Look up based on prompt and llm_string.
update(prompt, llm_string, return_val)
Update cache based on prompt and llm_string.
clear(**kwargs: Any) → None[source]¶
Clear cache.
lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]][source]¶
Look up based on prompt and llm_string.
update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None[source]¶
Update cache based on prompt and llm_string. | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.InMemoryCache.html |
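A minimal sketch of using InMemoryCache as the global LLM cache, not from the original page, assuming an OpenAI API key is configured:

import langchain
from langchain.cache import InMemoryCache
from langchain.llms import OpenAI

langchain.llm_cache = InMemoryCache()

llm = OpenAI(model_name="text-davinci-003")
llm("Tell me a joke")  # first call goes to the API and populates the cache
llm("Tell me a joke")  # identical prompt and llm_string: answered from memory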
dda9dc2fb9d4-0 | langchain.cache.GPTCache¶
class langchain.cache.GPTCache(init_func: Optional[Union[Callable[[Any, str], None], Callable[[Any], None]]] = None)[source]¶
Bases: BaseCache
Cache that uses GPTCache as a backend.
Initialize by passing in init function (default: None).
Parameters
init_func (Optional[Callable[[Any], None]]) – init GPTCache function (default: None)
Example:
.. code-block:: python
# Initialize GPTCache with a custom init function
import langchain
import gptcache
from gptcache.processor.pre import get_prompt
from gptcache.manager.factory import manager_factory
from langchain.cache import GPTCache

# Avoid multiple caches using the same file, which would let different
# LLM model caches affect each other.
def init_gptcache(cache_obj: gptcache.Cache, llm: str):
    cache_obj.init(
        pre_embedding_func=get_prompt,
        data_manager=manager_factory(
            manager="map",
            data_dir=f"map_cache_{llm}",
        ),
    )

langchain.llm_cache = GPTCache(init_gptcache)
Methods
__init__([init_func])
Initialize by passing in init function (default: None).
clear(**kwargs)
Clear cache.
lookup(prompt, llm_string)
Look up the cache data.
update(prompt, llm_string, return_val)
Update cache.
clear(**kwargs: Any) → None[source]¶
Clear cache.
lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]][source]¶
Look up the cache data.
First, retrieve the corresponding cache object using the llm_string parameter,
and then retrieve the data from the cache based on the prompt. | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.GPTCache.html |
dda9dc2fb9d4-1 | and then retrieve the data from the cache based on the prompt.
update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None[source]¶
Update cache.
First, retrieve the corresponding cache object using the llm_string parameter,
and then store the prompt and return_val in the cache object. | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.GPTCache.html |
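For completeness, the simplest wiring passes no init_func at all; per the lookup/update description above, a separate gptcache cache object is kept per llm_string. This is a sketch, assuming the gptcache package is installed and its default initialization is acceptable:

import langchain
from langchain.cache import GPTCache

# With init_func=None the class is assumed to fall back to a default per-model cache;
# pass an init function as in the Example above for custom storage.
langchain.llm_cache = GPTCache()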
79e619d7564d-0 | langchain.cache.SQLAlchemyCache¶
class langchain.cache.SQLAlchemyCache(engine: ~sqlalchemy.engine.base.Engine, cache_schema: ~typing.Type[~langchain.cache.FullLLMCache] = <class 'langchain.cache.FullLLMCache'>)[source]¶
Bases: BaseCache
Cache that uses SQLAlchemy as a backend.
Initialize by creating all tables.
Methods
__init__(engine[, cache_schema])
Initialize by creating all tables.
clear(**kwargs)
Clear cache.
lookup(prompt, llm_string)
Look up based on prompt and llm_string.
update(prompt, llm_string, return_val)
Update based on prompt and llm_string.
clear(**kwargs: Any) → None[source]¶
Clear cache.
lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]][source]¶
Look up based on prompt and llm_string.
update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None[source]¶
Update based on prompt and llm_string. | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.SQLAlchemyCache.html |
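A minimal sketch of backing the cache with an arbitrary SQLAlchemy database; the connection URL below is a placeholder, not from the original page:

import langchain
from sqlalchemy import create_engine
from langchain.cache import SQLAlchemyCache

engine = create_engine("postgresql://user:password@localhost:5432/llm_cache")
langchain.llm_cache = SQLAlchemyCache(engine)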
b54a2876a751-0 | langchain.cache.FullLLMCache¶
class langchain.cache.FullLLMCache(**kwargs)[source]¶
Bases: Base
SQLite table for full LLM Cache (all generations).
A simple constructor that allows initialization from kwargs.
Sets attributes on the constructed instance using the names and
values in kwargs.
Only keys that are present as
attributes of the instance’s class are allowed. These could be,
for example, any mapped columns or relationships.
Methods
__init__(**kwargs)
A simple constructor that allows initialization from kwargs.
Attributes
idx
llm
metadata
prompt
registry
response
idx¶
llm¶
metadata: MetaData = MetaData()¶
prompt¶
registry: RegistryType = <sqlalchemy.orm.decl_api.registry object>¶
response¶ | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.FullLLMCache.html |
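Because FullLLMCache is an ordinary declarative SQLAlchemy model, stored rows can be inspected directly. A sketch, assuming the default SQLite file used by SQLiteCache (".langchain.db") and SQLAlchemy 1.4+:

from sqlalchemy import create_engine, select
from sqlalchemy.orm import Session
from langchain.cache import FullLLMCache

engine = create_engine("sqlite:///.langchain.db")
with Session(engine) as session:
    # Each row holds one cached generation, keyed by prompt, llm string, and generation index.
    for row in session.execute(select(FullLLMCache)).scalars():
        print(row.prompt[:40], "|", row.llm[:40], "|", row.idx, "|", row.response[:40])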
d8124b6269ec-0 | langchain.cache.BaseCache¶
class langchain.cache.BaseCache[source]¶
Bases: ABC
Base interface for cache.
Methods
__init__()
clear(**kwargs)
Clear cache that can take additional keyword arguments.
lookup(prompt, llm_string)
Look up based on prompt and llm_string.
update(prompt, llm_string, return_val)
Update cache based on prompt and llm_string.
abstract clear(**kwargs: Any) → None[source]¶
Clear cache that can take additional keyword arguments.
abstract lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]][source]¶
Look up based on prompt and llm_string.
abstract update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None[source]¶
Update cache based on prompt and llm_string. | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.BaseCache.html |
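The three abstract methods above are the whole contract, so a custom backend only needs to implement them. A toy in-process sketch, not part of langchain:

from typing import Any, Dict, Optional, Sequence, Tuple
from langchain.cache import BaseCache
from langchain.schema import Generation


class DictCache(BaseCache):
    """Minimal dictionary-backed cache implementing the BaseCache interface."""

    def __init__(self) -> None:
        self._store: Dict[Tuple[str, str], Sequence[Generation]] = {}

    def lookup(self, prompt: str, llm_string: str) -> Optional[Sequence[Generation]]:
        return self._store.get((prompt, llm_string))

    def update(self, prompt: str, llm_string: str, return_val: Sequence[Generation]) -> None:
        self._store[(prompt, llm_string)] = return_val

    def clear(self, **kwargs: Any) -> None:
        self._store.clear()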
c05e928796cd-0 | langchain.cache.RedisSemanticCache¶
class langchain.cache.RedisSemanticCache(redis_url: str, embedding: Embeddings, score_threshold: float = 0.2)[source]¶
Bases: BaseCache
Cache that uses Redis as a vector-store backend.
Initialize by passing in a Redis URL and an embedding provider.
Parameters
redis_url (str) – URL to connect to Redis.
embedding (Embedding) – Embedding provider for semantic encoding and search.
score_threshold (float, default 0.2) – Similarity score threshold used when deciding whether a stored prompt counts as a cache hit.
Example:
import langchain
from langchain.cache import RedisSemanticCache
from langchain.embeddings import OpenAIEmbeddings
langchain.llm_cache = RedisSemanticCache(
redis_url="redis://localhost:6379",
embedding=OpenAIEmbeddings()
)
Methods
__init__(redis_url, embedding[, score_threshold])
Initialize by passing in a Redis URL and an embedding provider.
clear(**kwargs)
Clear semantic cache for a given llm_string.
lookup(prompt, llm_string)
Look up based on prompt and llm_string.
update(prompt, llm_string, return_val)
Update cache based on prompt and llm_string.
clear(**kwargs: Any) → None[source]¶
Clear semantic cache for a given llm_string.
lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]][source]¶
Look up based on prompt and llm_string.
update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None[source]¶
Update cache based on prompt and llm_string. | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.RedisSemanticCache.html |
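Building on the construction example above, a short behavioral sketch; whether the second call below is a hit depends on the embedding distance versus score_threshold. This assumes an OpenAI API key and a local Redis instance with vector search support:

import langchain
from langchain.llms import OpenAI
from langchain.cache import RedisSemanticCache
from langchain.embeddings import OpenAIEmbeddings

langchain.llm_cache = RedisSemanticCache(
    redis_url="redis://localhost:6379",
    embedding=OpenAIEmbeddings(),
    score_threshold=0.2,
)

llm = OpenAI(model_name="text-davinci-003")
llm("Why is the sky blue?")  # miss: calls the API and stores the prompt, embedding, and result
llm("Why is the sky blue")   # near-duplicate prompt: may be served from the semantic cache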
6368194f6570-0 | langchain.cache.MomentoCache¶
class langchain.cache.MomentoCache(cache_client: momento.CacheClient, cache_name: str, *, ttl: Optional[timedelta] = None, ensure_cache_exists: bool = True)[source]¶
Bases: BaseCache
Cache that uses Momento as a backend. See https://gomomento.com/
Instantiate a prompt cache using Momento as a backend.
Note: to instantiate the cache client passed to MomentoCache,
you must have a Momento account. See https://gomomento.com/.
Parameters
cache_client (CacheClient) – The Momento cache client.
cache_name (str) – The name of the cache to use to store the data.
ttl (Optional[timedelta], optional) – The time to live for the cache items.
Defaults to None, i.e. use the client default TTL.
ensure_cache_exists (bool, optional) – Create the cache if it doesn’t
exist. Defaults to True.
Raises
ImportError – Momento python package is not installed.
TypeError – cache_client is not of type momento.CacheClientObject
ValueError – ttl is non-null and non-positive
Methods
__init__(cache_client, cache_name, *[, ttl, ...])
Instantiate a prompt cache using Momento as a backend.
clear(**kwargs)
Clear the cache.
from_client_params(cache_name, ttl, *[, ...])
Construct cache from CacheClient parameters.
lookup(prompt, llm_string)
Lookup llm generations in cache by prompt and associated model and settings.
update(prompt, llm_string, return_val)
Store llm generations in cache.
clear(**kwargs: Any) → None[source]¶
Clear the cache.
Raises
SdkException – Momento service or network error | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.MomentoCache.html |
6368194f6570-1 | Clear the cache.
Raises
SdkException – Momento service or network error
classmethod from_client_params(cache_name: str, ttl: timedelta, *, configuration: Optional[momento.config.Configuration] = None, auth_token: Optional[str] = None, **kwargs: Any) → MomentoCache[source]¶
Construct cache from CacheClient parameters.
lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]][source]¶
Lookup llm generations in cache by prompt and associated model and settings.
Parameters
prompt (str) – The prompt run through the language model.
llm_string (str) – The language model version and settings.
Raises
SdkException – Momento service or network error
Returns
A list of language model generations.
Return type
Optional[RETURN_VAL_TYPE]
update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None[source]¶
Store llm generations in cache.
Parameters
prompt (str) – The prompt run through the language model.
llm_string (str) – The language model string.
return_val (RETURN_VAL_TYPE) – A list of language model generations.
Raises
SdkException – Momento service or network error
Exception – Unexpected response | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.MomentoCache.html |
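A minimal sketch using the from_client_params constructor documented above; the cache name is a placeholder, and auth_token=None is assumed to fall back to your Momento credentials:

from datetime import timedelta
import langchain
from langchain.cache import MomentoCache

# Builds the Momento CacheClient internally from these parameters.
langchain.llm_cache = MomentoCache.from_client_params(
    cache_name="langchain-llm-cache",
    ttl=timedelta(days=1),
)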
22c465964089-0 | langchain.cache.SQLiteCache¶
class langchain.cache.SQLiteCache(database_path: str = '.langchain.db')[source]¶
Bases: SQLAlchemyCache
Cache that uses SQLite as a backend.
Initialize by creating the engine and all tables.
Methods
__init__([database_path])
Initialize by creating the engine and all tables.
clear(**kwargs)
Clear cache.
lookup(prompt, llm_string)
Look up based on prompt and llm_string.
update(prompt, llm_string, return_val)
Update based on prompt and llm_string.
clear(**kwargs: Any) → None¶
Clear cache.
lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]]¶
Look up based on prompt and llm_string.
update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None¶
Update based on prompt and llm_string. | [
(embedding vector omitted) ] | https://langchain.readthedocs.io/en/latest/cache/langchain.cache.SQLiteCache.html |
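A minimal sketch persisting the cache to a local SQLite file; the path shown is the documented default:

import langchain
from langchain.cache import SQLiteCache

langchain.llm_cache = SQLiteCache(database_path=".langchain.db")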