id (string, 14-15 chars) | text (string, 35-2.07k chars) | embedding (sequence) | source (string, 61-154 chars)
---|---|---|---|
8a7391548d1d-0 | langchain.document_loaders.confluence.ConfluenceLoader¶
class langchain.document_loaders.confluence.ConfluenceLoader(url: str, api_key: Optional[str] = None, username: Optional[str] = None, oauth2: Optional[dict] = None, token: Optional[str] = None, cloud: Optional[bool] = True, number_of_retries: Optional[int] = 3, min_retry_seconds: Optional[int] = 2, max_retry_seconds: Optional[int] = 10, confluence_kwargs: Optional[dict] = None)[source]¶
Bases: BaseLoader
Load Confluence pages. Port of https://llamahub.ai/l/confluence
This currently supports username/api_key, OAuth2 login, and personal access token
authentication.
Specify a list of page_ids and/or a space_key to load the corresponding pages into
Document objects; if both are specified, the union of both sets will be returned.
You can also specify a boolean include_attachments to include attachments. This
is set to False by default; if set to True, all attachments will be downloaded and
ConfluenceReader will extract the text from the attachments and add it to the
Document object. Currently supported attachment types are: PDF, PNG, JPEG/JPG,
SVG, Word and Excel.
The Confluence API supports different formats of page content. The storage format is the
raw XML representation used for storage. The view format is the HTML representation used
for viewing, with macros rendered as they appear to users. You can pass
an enum content_format argument to load() to specify the content format; this is
set to ContentFormat.STORAGE by default.
Hint: space_key and page_id can both be found in the URL of a page in Confluence
- https://yoursite.atlassian.com/wiki/spaces/<space_key>/pages/<page_id>
Example | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.confluence.ConfluenceLoader.html |
8a7391548d1d-1 | Example
from langchain.document_loaders import ConfluenceLoader
loader = ConfluenceLoader(
url="https://yoursite.atlassian.com/wiki",
username="me",
api_key="12345"
)
documents = loader.load(space_key="SPACE", limit=50)
Parameters
url (str) – _description_
api_key (str, optional) – _description_, defaults to None
username (str, optional) – _description_, defaults to None
oauth2 (dict, optional) – _description_, defaults to {}
token (str, optional) – _description_, defaults to None
cloud (bool, optional) – _description_, defaults to True
number_of_retries (Optional[int], optional) – How many times to retry, defaults to 3
min_retry_seconds (Optional[int], optional) – defaults to 2
max_retry_seconds (Optional[int], optional) – defaults to 10
confluence_kwargs (dict, optional) – additional kwargs to initialize confluence with
Raises
ValueError – Errors while validating input
ImportError – Required dependencies not installed.
Methods
__init__(url[, api_key, username, oauth2, ...])
is_public_page(page)
Check if a page is publicly accessible.
lazy_load()
A lazy loader for document content.
load([space_key, page_ids, label, cql, ...])
param space_key
Space key retrieved from a confluence URL, defaults to None
load_and_split([text_splitter])
Load documents and split into chunks.
paginate_request(retrieval_method, **kwargs)
Paginate the various methods to retrieve groups of pages.
process_attachment(page_id[, ocr_languages])
process_doc(link)
process_image(link[, ocr_languages])
process_page(page, include_attachments, ...) | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.confluence.ConfluenceLoader.html |
8a7391548d1d-2 | process_image(link[, ocr_languages])
process_page(page, include_attachments, ...)
process_pages(pages, ...[, ocr_languages])
Process a list of pages into a list of documents.
process_pdf(link[, ocr_languages])
process_svg(link[, ocr_languages])
process_xls(link)
validate_init_args([url, api_key, username, ...])
Validates proper combinations of init arguments
is_public_page(page: dict) → bool[source]¶
Check if a page is publicly accessible.
lazy_load() → Iterator[Document]¶
A lazy loader for document content.
load(space_key: Optional[str] = None, page_ids: Optional[List[str]] = None, label: Optional[str] = None, cql: Optional[str] = None, include_restricted_content: bool = False, include_archived_content: bool = False, include_attachments: bool = False, include_comments: bool = False, content_format: ContentFormat = ContentFormat.STORAGE, limit: Optional[int] = 50, max_pages: Optional[int] = 1000, ocr_languages: Optional[str] = None) → List[Document][source]¶
Parameters
space_key (Optional[str], optional) – Space key retrieved from a confluence URL, defaults to None
page_ids (Optional[List[str]], optional) – List of specific page IDs to load, defaults to None
label (Optional[str], optional) – Get all pages with this label, defaults to None
cql (Optional[str], optional) – CQL Expression, defaults to None
include_restricted_content (bool, optional) – defaults to False
include_archived_content (bool, optional) – Whether to include archived content,
defaults to False
include_attachments (bool, optional) – defaults to False
include_comments (bool, optional) – defaults to False | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.confluence.ConfluenceLoader.html |
8a7391548d1d-3 | include_comments (bool, optional) – defaults to False
content_format (ContentFormat) – Specify content format, defaults to ContentFormat.STORAGE
limit (int, optional) – Maximum number of pages to retrieve per request, defaults to 50
max_pages (int, optional) – Maximum number of pages to retrieve in total, defaults to 1000
ocr_languages (str, optional) – The languages to use for the Tesseract agent. To use a
language, you’ll first need to install the appropriate
Tesseract language pack.
Raises
ValueError – _description_
ImportError – _description_
Returns
_description_
Return type
List[Document]
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks.
paginate_request(retrieval_method: Callable, **kwargs: Any) → List[source]¶
Paginate the various methods to retrieve groups of pages.
Unfortunately, due to page size, sometimes the Confluence API
doesn’t match the limit value. If limit is >100, Confluence seems to cap
the response at 100. Also, due to the Atlassian Python
package, we don’t get the “next” values from the “_links” key because
they only return the value from the results key. So here, the pagination
starts from 0 and goes until the max_pages, getting the limit number
of pages with each request. We have to manually check if there
are more docs based on the length of the returned list of pages, rather than
just checking for the presence of a next key in the response like this page
would have you do:
https://developer.atlassian.com/server/confluence/pagination-in-the-rest-api/
Parameters
retrieval_method (callable) – Function used to retrieve docs
Returns
List of documents
Return type | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.confluence.ConfluenceLoader.html |
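To make the pagination behaviour described above concrete, here is a minimal, hypothetical sketch of the loop being described: start at offset 0, request limit pages at a time, and stop once a batch comes back short or max_pages is reached. It is not the library's actual implementation, and the start/limit keyword names are assumptions.
# Hypothetical sketch of the documented pagination behaviour, not ConfluenceLoader code.
def paginate(retrieval_method, limit=50, max_pages=1000, **kwargs):
    docs = []
    while len(docs) < max_pages:
        batch = retrieval_method(start=len(docs), limit=limit, **kwargs)
        if not batch:
            break
        docs.extend(batch)
        if len(batch) < limit:
            # A short batch means there are no more pages to fetch.
            break
    return docs[:max_pages]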
8a7391548d1d-4 | Returns
List of documents
Return type
List
process_attachment(page_id: str, ocr_languages: Optional[str] = None) → List[str][source]¶
process_doc(link: str) → str[source]¶
process_image(link: str, ocr_languages: Optional[str] = None) → str[source]¶
process_page(page: dict, include_attachments: bool, include_comments: bool, content_format: ContentFormat, ocr_languages: Optional[str] = None) → Document[source]¶
process_pages(pages: List[dict], include_restricted_content: bool, include_attachments: bool, include_comments: bool, content_format: ContentFormat, ocr_languages: Optional[str] = None) → List[Document][source]¶
Process a list of pages into a list of documents.
process_pdf(link: str, ocr_languages: Optional[str] = None) → str[source]¶
process_svg(link: str, ocr_languages: Optional[str] = None) → str[source]¶
process_xls(link: str) → str[source]¶
static validate_init_args(url: Optional[str] = None, api_key: Optional[str] = None, username: Optional[str] = None, oauth2: Optional[dict] = None, token: Optional[str] = None) → Optional[List][source]¶
Validates proper combinations of init arguments | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.confluence.ConfluenceLoader.html |
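A hedged example of calling load() with several of the options documented above. The ContentFormat import path and its VIEW member are assumptions based on the signatures shown here, and the space key and credentials are placeholders.
from langchain.document_loaders import ConfluenceLoader
from langchain.document_loaders.confluence import ContentFormat

loader = ConfluenceLoader(
    url="https://yoursite.atlassian.com/wiki",
    username="me",
    api_key="12345",
)
# Load one space as rendered HTML (view format) instead of raw storage XML,
# pull attachment text too, and cap the crawl at 200 pages in total.
documents = loader.load(
    space_key="SPACE",
    content_format=ContentFormat.VIEW,
    include_attachments=True,
    limit=50,
    max_pages=200,
)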
c6844e2cb589-0 | langchain.document_loaders.parsers.pdf.PDFMinerParser¶
class langchain.document_loaders.parsers.pdf.PDFMinerParser[source]¶
Bases: BaseBlobParser
Parse PDFs with PDFMiner.
Methods
__init__()
lazy_parse(blob)
Lazily parse the blob.
parse(blob)
Eagerly parse the blob into a document or documents.
lazy_parse(blob: Blob) → Iterator[Document][source]¶
Lazily parse the blob.
parse(blob: Blob) → List[Document]¶
Eagerly parse the blob into a document or documents.
This is a convenience method for interactive development environment.
Production applications should favor the lazy_parse method instead.
Subclasses should generally not over-ride this parse method.
Parameters
blob – Blob instance
Returns
List of documents | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.parsers.pdf.PDFMinerParser.html |
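A minimal sketch of feeding a Blob to this parser; it assumes Blob.from_path is available from langchain.document_loaders.blob_loaders, and the PDF path is a placeholder.
from langchain.document_loaders.blob_loaders import Blob
from langchain.document_loaders.parsers.pdf import PDFMinerParser

parser = PDFMinerParser()
blob = Blob.from_path("example.pdf")  # placeholder path
# Prefer lazy_parse in production code; parse() is the eager convenience wrapper.
for doc in parser.lazy_parse(blob):
    print(doc.metadata, len(doc.page_content))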
ef640147e6e9-0 | langchain.document_loaders.pdf.PyMuPDFLoader¶
class langchain.document_loaders.pdf.PyMuPDFLoader(file_path: str)[source]¶
Bases: BasePDFLoader
Loader that uses PyMuPDF to load PDF files.
Initialize with file path.
Methods
__init__(file_path)
Initialize with file path.
lazy_load()
A lazy loader for document content.
load(**kwargs)
Load file.
load_and_split([text_splitter])
Load documents and split into chunks.
Attributes
source
lazy_load() → Iterator[Document]¶
A lazy loader for document content.
load(**kwargs: Optional[Any]) → List[Document][source]¶
Load file.
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks.
property source: str¶ | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.pdf.PyMuPDFLoader.html |
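A short usage sketch based on the signatures above; the file path is a placeholder.
from langchain.document_loaders import PyMuPDFLoader

loader = PyMuPDFLoader("example.pdf")  # placeholder path
docs = loader.load()
# Each Document carries the extracted page text plus metadata such as the source path.
print(len(docs), docs[0].metadata)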
8c8275e8ec2e-0 | langchain.document_loaders.bibtex.BibtexLoader¶
class langchain.document_loaders.bibtex.BibtexLoader(file_path: str, *, parser: Optional[BibtexparserWrapper] = None, max_docs: Optional[int] = None, max_content_chars: Optional[int] = 4000, load_extra_metadata: bool = False, file_pattern: str = '[^:]+\\.pdf')[source]¶
Bases: BaseLoader
Loads a bibtex file into a list of Documents.
Each document represents one entry from the bibtex file.
If a PDF file is present in the file bibtex field, the original PDF
is loaded into the document text. If no such file entry is present,
the abstract field is used instead.
Initialize the BibtexLoader.
Parameters
file_path – Path to the bibtex file.
max_docs – Max number of associated documents to load. Use -1 for no limit.
Methods
__init__(file_path, *[, parser, max_docs, ...])
Initialize the BibtexLoader.
lazy_load()
Load bibtex file using bibtexparser and get the article texts plus the
load()
Load bibtex file documents from the given bibtex file path.
load_and_split([text_splitter])
Load documents and split into chunks.
lazy_load() → Iterator[Document][source]¶
Load bibtex file using bibtexparser and get the article texts plus the
article metadata.
See https://bibtexparser.readthedocs.io/en/master/
Returns
a list of documents with the document.page_content in text format
load() → List[Document][source]¶
Load bibtex file documents from the given bibtex file path. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.bibtex.BibtexLoader.html |
8c8275e8ec2e-1 | Load bibtex file documents from the given bibtex file path.
See https://bibtexparser.readthedocs.io/en/master/
Parameters
file_path – the path to the bibtex file
Returns
a list of documents with the document.page_content in text format
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.bibtex.BibtexLoader.html |
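A short usage sketch based on the constructor above; the .bib path is a placeholder.
from langchain.document_loaders import BibtexLoader

# PDFs referenced in each entry's file field are picked up via the default
# file_pattern ('[^:]+\\.pdf'); otherwise the abstract field is used.
loader = BibtexLoader("references.bib")
docs = loader.load()
for doc in docs:
    print(doc.metadata, len(doc.page_content))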
db63921f3b9d-0 | langchain.document_loaders.unstructured.validate_unstructured_version¶
langchain.document_loaders.unstructured.validate_unstructured_version(min_unstructured_version: str) → None[source]¶
Raises an error if the unstructured version does not exceed the
specified minimum. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.unstructured.validate_unstructured_version.html |
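For example, a loader can guard an optional feature like this; the minimum version string below is illustrative, not a documented requirement.
from langchain.document_loaders.unstructured import validate_unstructured_version

# Raises an error if the installed `unstructured` package is older than 0.5.4
# (illustrative minimum; pick whatever your code actually needs).
validate_unstructured_version(min_unstructured_version="0.5.4")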
f80942041627-0 | langchain.document_loaders.excel.UnstructuredExcelLoader¶
class langchain.document_loaders.excel.UnstructuredExcelLoader(file_path: str, mode: str = 'single', **unstructured_kwargs: Any)[source]¶
Bases: UnstructuredFileLoader
Loader that uses unstructured to load Microsoft Excel files.
Initialize with file path.
Methods
__init__(file_path[, mode])
Initialize with file path.
lazy_load()
A lazy loader for document content.
load()
Load file.
load_and_split([text_splitter])
Load documents and split into chunks.
lazy_load() → Iterator[Document]¶
A lazy loader for document content.
load() → List[Document]¶
Load file.
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.excel.UnstructuredExcelLoader.html |
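A short usage sketch; the workbook path is a placeholder, and mode="elements" is the usual unstructured option for keeping each detected element (for example a table) as its own Document rather than one combined document.
from langchain.document_loaders import UnstructuredExcelLoader

loader = UnstructuredExcelLoader("example.xlsx", mode="elements")  # placeholder path
docs = loader.load()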
6915be2869ea-0 | langchain.document_loaders.whatsapp_chat.WhatsAppChatLoader¶
class langchain.document_loaders.whatsapp_chat.WhatsAppChatLoader(path: str)[source]¶
Bases: BaseLoader
Loader that loads a WhatsApp messages text file.
Initialize with path.
Methods
__init__(path)
Initialize with path.
lazy_load()
A lazy loader for document content.
load()
Load documents.
load_and_split([text_splitter])
Load documents and split into chunks.
lazy_load() → Iterator[Document]¶
A lazy loader for document content.
load() → List[Document][source]¶
Load documents.
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.whatsapp_chat.WhatsAppChatLoader.html |
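A short usage sketch; the path is a placeholder for an exported WhatsApp chat text file.
from langchain.document_loaders import WhatsAppChatLoader

loader = WhatsAppChatLoader("whatsapp_chat.txt")  # placeholder export path
docs = loader.load()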
eaf9333a0058-0 | langchain.document_loaders.parsers.grobid.ServerUnavailableException¶
class langchain.document_loaders.parsers.grobid.ServerUnavailableException[source]¶
Bases: Exception
add_note()¶
Exception.add_note(note) –
add a note to the exception
with_traceback()¶
Exception.with_traceback(tb) –
set self.__traceback__ to tb and return self.
args¶ | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.parsers.grobid.ServerUnavailableException.html |
1cc80c0ec26e-0 | langchain.document_loaders.youtube.GoogleApiYoutubeLoader¶
class langchain.document_loaders.youtube.GoogleApiYoutubeLoader(google_api_client: GoogleApiClient, channel_name: Optional[str] = None, video_ids: Optional[List[str]] = None, add_video_info: bool = True, captions_language: str = 'en', continue_on_failure: bool = False)[source]¶
Bases: BaseLoader
Loader that loads all Videos from a Channel
To use, you should have the googleapiclient and youtube_transcript_api
python packages installed.
As the service needs a google_api_client, you first have to initialize
the GoogleApiClient.
Additionally, you have to provide either a channel name or a list of video ids
“https://developers.google.com/docs/api/quickstart/python”
Example
from pathlib import Path

from langchain.document_loaders import GoogleApiClient
from langchain.document_loaders import GoogleApiYoutubeLoader
google_api_client = GoogleApiClient(
    service_account_path=Path("path_to_your_sec_file.json")
)
loader = GoogleApiYoutubeLoader(
    google_api_client=google_api_client,
    channel_name="CodeAesthetic"
)
loader.load()
Methods
__init__(google_api_client[, channel_name, ...])
lazy_load()
A lazy loader for document content.
load()
Load documents.
load_and_split([text_splitter])
Load documents and split into chunks.
validate_channel_or_videoIds_is_set(values)
Validate that either folder_id or document_ids is set, but not both.
Attributes
add_video_info
captions_language
channel_name
continue_on_failure
video_ids
google_api_client
lazy_load() → Iterator[Document]¶
A lazy loader for document content.
load() → List[Document][source]¶
Load documents. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.youtube.GoogleApiYoutubeLoader.html |
1cc80c0ec26e-1 | load() → List[Document][source]¶
Load documents.
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks.
classmethod validate_channel_or_videoIds_is_set(values: Dict[str, Any]) → Dict[str, Any][source]¶
Validate that either folder_id or document_ids is set, but not both.
add_video_info: bool = True¶
captions_language: str = 'en'¶
channel_name: Optional[str] = None¶
continue_on_failure: bool = False¶
google_api_client: langchain.document_loaders.youtube.GoogleApiClient¶
video_ids: Optional[List[str]] = None¶ | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.youtube.GoogleApiYoutubeLoader.html |
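Since either a channel name or a list of video ids must be provided, a hedged sketch of the video_ids variant looks like this; the ids and the credentials path are placeholders.
from pathlib import Path

from langchain.document_loaders import GoogleApiClient, GoogleApiYoutubeLoader

google_api_client = GoogleApiClient(
    service_account_path=Path("path_to_your_sec_file.json")
)
loader = GoogleApiYoutubeLoader(
    google_api_client=google_api_client,
    video_ids=["your_video_id"],  # placeholder ids
    add_video_info=True,
    captions_language="en",
)
docs = loader.load()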
5f3cecb2522f-0 | langchain.document_loaders.parsers.pdf.PyPDFParser¶
class langchain.document_loaders.parsers.pdf.PyPDFParser(password: Optional[Union[str, bytes]] = None)[source]¶
Bases: BaseBlobParser
Loads a PDF with pypdf and chunks at character level.
Methods
__init__([password])
lazy_parse(blob)
Lazily parse the blob.
parse(blob)
Eagerly parse the blob into a document or documents.
lazy_parse(blob: Blob) → Iterator[Document][source]¶
Lazily parse the blob.
parse(blob: Blob) → List[Document]¶
Eagerly parse the blob into a document or documents.
This is a convenience method for interactive development environment.
Production applications should favor the lazy_parse method instead.
Subclasses should generally not over-ride this parse method.
Parameters
blob – Blob instance
Returns
List of documents | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.parsers.pdf.PyPDFParser.html |
4030dda7dd3f-0 | langchain.document_loaders.facebook_chat.concatenate_rows¶
langchain.document_loaders.facebook_chat.concatenate_rows(row: dict) → str[source]¶
Combine message information in a readable format ready to be used. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.facebook_chat.concatenate_rows.html |
005aff82fed1-0 | langchain.document_loaders.whatsapp_chat.concatenate_rows¶
langchain.document_loaders.whatsapp_chat.concatenate_rows(date: str, sender: str, text: str) → str[source]¶
Combine message information in a readable format ready to be used. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.whatsapp_chat.concatenate_rows.html |
0c9671599bb0-0 | langchain.document_loaders.unstructured.UnstructuredBaseLoader¶
class langchain.document_loaders.unstructured.UnstructuredBaseLoader(mode: str = 'single', **unstructured_kwargs: Any)[source]¶
Bases: BaseLoader, ABC
Loader that uses unstructured to load files.
Initialize with file path.
Methods
__init__([mode])
Initialize with file path.
lazy_load()
A lazy loader for document content.
load()
Load file.
load_and_split([text_splitter])
Load documents and split into chunks.
lazy_load() → Iterator[Document]¶
A lazy loader for document content.
load() → List[Document][source]¶
Load file.
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.unstructured.UnstructuredBaseLoader.html |
bf792e304978-0 | langchain.document_loaders.parsers.grobid.GrobidParser¶
class langchain.document_loaders.parsers.grobid.GrobidParser(segment_sentences: bool, grobid_server: str = 'http://localhost:8070/api/processFulltextDocument')[source]¶
Bases: BaseBlobParser
Loader that uses Grobid to load article PDF files.
Methods
__init__(segment_sentences[, grobid_server])
lazy_parse(blob)
Lazy parsing interface.
parse(blob)
Eagerly parse the blob into a document or documents.
process_xml(file_path, xml_data, ...)
Process the XML file from Grobid.
lazy_parse(blob: Blob) → Iterator[Document][source]¶
Lazy parsing interface.
Subclasses are required to implement this method.
Parameters
blob – Blob instance
Returns
Generator of documents
parse(blob: Blob) → List[Document]¶
Eagerly parse the blob into a document or documents.
This is a convenience method for interactive development environment.
Production applications should favor the lazy_parse method instead.
Subclasses should generally not over-ride this parse method.
Parameters
blob – Blob instance
Returns
List of documents
process_xml(file_path: str, xml_data: str, segment_sentences: bool) → Iterator[Document][source]¶
Process the XML file from Grobid. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.parsers.grobid.GrobidParser.html |
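GrobidParser is a blob parser, so it is normally paired with a loader that produces blobs. A hedged sketch using GenericLoader.from_filesystem, assuming that helper is available and a Grobid server is running at the default endpoint:
from langchain.document_loaders.generic import GenericLoader
from langchain.document_loaders.parsers import GrobidParser

# Assumes a Grobid server is reachable at the default
# http://localhost:8070/api/processFulltextDocument endpoint; the
# ServerUnavailableException documented above signals an unreachable server.
loader = GenericLoader.from_filesystem(
    "papers/",                       # placeholder directory of article PDFs
    glob="*",
    suffixes=[".pdf"],
    parser=GrobidParser(segment_sentences=False),
)
docs = loader.load()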
bec7acf3153d-0 | langchain.document_loaders.notebook.NotebookLoader¶
class langchain.document_loaders.notebook.NotebookLoader(path: str, include_outputs: bool = False, max_output_length: int = 10, remove_newline: bool = False, traceback: bool = False)[source]¶
Bases: BaseLoader
Loader that loads .ipynb notebook files.
Initialize with path.
Methods
__init__(path[, include_outputs, ...])
Initialize with path.
lazy_load()
A lazy loader for document content.
load()
Load documents.
load_and_split([text_splitter])
Load documents and split into chunks.
lazy_load() → Iterator[Document]¶
A lazy loader for document content.
load() → List[Document][source]¶
Load documents.
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.notebook.NotebookLoader.html |
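A short usage sketch based on the constructor above; the notebook path is a placeholder.
from langchain.document_loaders import NotebookLoader

loader = NotebookLoader(
    "analysis.ipynb",        # placeholder notebook path
    include_outputs=True,    # keep cell outputs in the document text
    max_output_length=20,    # truncate long outputs
    remove_newline=True,
)
docs = loader.load()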
a52b603347a0-0 | langchain.document_loaders.blob_loaders.schema.BlobLoader¶
class langchain.document_loaders.blob_loaders.schema.BlobLoader[source]¶
Bases: ABC
Abstract interface for blob loaders implementation.
Implementers should be able to load raw content from a storage system according
to some criteria and return the raw content lazily as a stream of blobs.
Methods
__init__()
yield_blobs()
A lazy loader for raw data represented by LangChain's Blob object.
abstract yield_blobs() → Iterable[Blob][source]¶
A lazy loader for raw data represented by LangChain’s Blob object.
Returns
A generator over blobs | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.blob_loaders.schema.BlobLoader.html |
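Because BlobLoader is only an interface, a hedged sketch of a minimal, hypothetical implementation (yield one Blob per file under a directory) could look like this; it assumes Blob and BlobLoader are importable from langchain.document_loaders.blob_loaders.
from pathlib import Path
from typing import Iterable

from langchain.document_loaders.blob_loaders import Blob, BlobLoader


class DirectoryBlobLoader(BlobLoader):
    """Hypothetical example: lazily yield one Blob per file under a directory."""

    def __init__(self, path: str) -> None:
        self.path = Path(path)

    def yield_blobs(self) -> Iterable[Blob]:
        for file in self.path.rglob("*"):
            if file.is_file():
                yield Blob.from_path(str(file))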
df5bef360958-0 | langchain.document_loaders.azlyrics.AZLyricsLoader¶
class langchain.document_loaders.azlyrics.AZLyricsLoader(web_path: Union[str, List[str]], header_template: Optional[dict] = None, verify: Optional[bool] = True, proxies: Optional[dict] = None)[source]¶
Bases: WebBaseLoader
Loader that loads AZLyrics webpages.
Initialize with webpage path.
Methods
__init__(web_path[, header_template, ...])
Initialize with webpage path.
aload()
Load text from the urls in web_path async into Documents.
fetch_all(urls)
Fetch all urls concurrently with rate limiting.
lazy_load()
Lazy load text from the url(s) in web_path.
load()
Load webpage.
load_and_split([text_splitter])
Load documents and split into chunks.
scrape([parser])
Scrape data from webpage and return it in BeautifulSoup format.
scrape_all(urls[, parser])
Fetch all urls, then return soups for all results.
Attributes
bs_get_text_kwargs
kwargs for beautifulsoup4 get_text
default_parser
Default parser to use for BeautifulSoup.
raise_for_status
Raise an exception if http status code denotes an error.
requests_kwargs
kwargs for requests
requests_per_second
Max number of concurrent requests to make.
web_path
aload() → List[Document]¶
Load text from the urls in web_path async into Documents.
async fetch_all(urls: List[str]) → Any¶
Fetch all urls concurrently with rate limiting.
lazy_load() → Iterator[Document]¶
Lazy load text from the url(s) in web_path.
load() → List[Document][source]¶
Load webpage. | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.azlyrics.AZLyricsLoader.html |
df5bef360958-1 | load() → List[Document][source]¶
Load webpage.
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks.
scrape(parser: Optional[str] = None) → Any¶
Scrape data from webpage and return it in BeautifulSoup format.
scrape_all(urls: List[str], parser: Optional[str] = None) → List[Any]¶
Fetch all urls, then return soups for all results.
bs_get_text_kwargs: Dict[str, Any] = {}¶
kwargs for beautifulsoup4 get_text
default_parser: str = 'html.parser'¶
Default parser to use for BeautifulSoup.
raise_for_status: bool = False¶
Raise an exception if http status code denotes an error.
requests_kwargs: Dict[str, Any] = {}¶
kwargs for requests
requests_per_second: int = 2¶
Max number of concurrent requests to make.
property web_path: str¶
web_paths: List[str]¶ | [embedding omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.azlyrics.AZLyricsLoader.html |
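A short usage sketch; the URL is a placeholder for an AZLyrics song page.
from langchain.document_loaders import AZLyricsLoader

loader = AZLyricsLoader("https://www.azlyrics.com/lyrics/artist/song.html")  # placeholder URL
docs = loader.load()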
0fc1eea36e97-0 | langchain.document_loaders.pyspark_dataframe.PySparkDataFrameLoader¶
class langchain.document_loaders.pyspark_dataframe.PySparkDataFrameLoader(spark_session: Optional[SparkSession] = None, df: Optional[Any] = None, page_content_column: str = 'text', fraction_of_memory: float = 0.1)[source]¶
Bases: BaseLoader
Load PySpark DataFrames
Initialize with a Spark DataFrame object.
Methods
__init__([spark_session, df, ...])
Initialize with a Spark DataFrame object.
get_num_rows()
Gets the number of "feasible" rows for the DataFrame
lazy_load()
A lazy loader for document content.
load()
Load from the dataframe.
load_and_split([text_splitter])
Load documents and split into chunks.
get_num_rows() → Tuple[int, int][source]¶
Gets the number of “feasible” rows for the DataFrame
lazy_load() → Iterator[Document][source]¶
A lazy loader for document content.
load() → List[Document][source]¶
Load from the dataframe.
load_and_split(text_splitter: Optional[TextSplitter] = None) → List[Document]¶
Load documents and split into chunks. | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.pyspark_dataframe.PySparkDataFrameLoader.html
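A minimal usage sketch based on the constructor documented above; the DataFrame contents are made up for illustration:
from pyspark.sql import SparkSession
from langchain.document_loaders import PySparkDataFrameLoader

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("first row of text",), ("second row of text",)], ["text"])
loader = PySparkDataFrameLoader(spark_session=spark, df=df, page_content_column="text")
docs = loader.load()   # one Document per "feasible" row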
67be1eefabda-0 | langchain.document_loaders.parsers.pdf.PyPDFium2Parser¶
class langchain.document_loaders.parsers.pdf.PyPDFium2Parser[source]¶
Bases: BaseBlobParser
Parse PDFs with PyPDFium2.
Initialize the parser.
Methods
__init__()
Initialize the parser.
lazy_parse(blob)
Lazily parse the blob.
parse(blob)
Eagerly parse the blob into a document or documents.
lazy_parse(blob: Blob) → Iterator[Document][source]¶
Lazily parse the blob.
parse(blob: Blob) → List[Document]¶
Eagerly parse the blob into a document or documents.
This is a convenience method for interactive development environments.
Production applications should favor the lazy_parse method instead.
Subclasses should generally not override this parse method.
Parameters
blob – Blob instance
Returns
List of documents | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/document_loaders/langchain.document_loaders.parsers.pdf.PyPDFium2Parser.html
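A minimal sketch of lazy parsing; the Blob helper import path and the local PDF path are assumptions:
from langchain.document_loaders.blob_loaders import Blob
from langchain.document_loaders.parsers.pdf import PyPDFium2Parser

parser = PyPDFium2Parser()
blob = Blob.from_path("example.pdf")        # hypothetical local file
for doc in parser.lazy_parse(blob):         # yields one Document at a time
    print(doc.metadata, len(doc.page_content))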
2c061c05ce2c-0 | langchain.graphs.networkx_graph.parse_triples¶
langchain.graphs.networkx_graph.parse_triples(knowledge_str: str) → List[KnowledgeTriple][source]¶
Parse knowledge triples from the knowledge string. | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/graphs/langchain.graphs.networkx_graph.parse_triples.html
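A small sketch of parse_triples; the `<|>` delimiter between triples is an assumption based on LangChain's knowledge-triple extraction prompt:
from langchain.graphs.networkx_graph import parse_triples

knowledge_str = "(Paris, is the capital of, France)<|>(France, is in, Europe)"
for triple in parse_triples(knowledge_str):
    print(triple.subject, triple.predicate, triple.object_)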
ece6e43742a0-0 | langchain.graphs.networkx_graph.KnowledgeTriple¶
class langchain.graphs.networkx_graph.KnowledgeTriple(subject: str, predicate: str, object_: str)[source]¶
Bases: NamedTuple
A triple in the graph.
Create new instance of KnowledgeTriple(subject, predicate, object_)
Methods
__init__()
count(value, /)
Return number of occurrences of value.
from_string(triple_string)
Create a KnowledgeTriple from a string.
index(value[, start, stop])
Return first index of value.
Attributes
object_
Alias for field number 2
predicate
Alias for field number 1
subject
Alias for field number 0
count(value, /)¶
Return number of occurrences of value.
classmethod from_string(triple_string: str) → KnowledgeTriple[source]¶
Create a KnowledgeTriple from a string.
index(value, start=0, stop=9223372036854775807, /)¶
Return first index of value.
Raises ValueError if the value is not present.
object_: str¶
Alias for field number 2
predicate: str¶
Alias for field number 1
subject: str¶
Alias for field number 0 | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/graphs/langchain.graphs.networkx_graph.KnowledgeTriple.html
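A short sketch of building a KnowledgeTriple directly and via from_string; the parenthesized, comma-separated string format is an assumption based on the extraction output it is meant to parse:
from langchain.graphs.networkx_graph import KnowledgeTriple

kt = KnowledgeTriple(subject="Paris", predicate="is the capital of", object_="France")
kt2 = KnowledgeTriple.from_string("(Paris, is the capital of, France)")  # assumed input format
print(kt2.subject, kt2.predicate, kt2.object_)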
18709f8b53f6-0 | langchain.graphs.networkx_graph.get_entities¶
langchain.graphs.networkx_graph.get_entities(entity_str: str) → List[str][source]¶
Extract entities from entity string. | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/graphs/langchain.graphs.networkx_graph.get_entities.html
2090b9740ead-0 | langchain.llms.pipelineai.PipelineAI¶
class langchain.llms.pipelineai.PipelineAI(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, pipeline_key: str = '', pipeline_kwargs: Dict[str, Any] = None, pipeline_api_key: Optional[str] = None)[source]¶
Bases: LLM, BaseModel
Wrapper around PipelineAI large language models.
To use, you should have the pipeline-ai python package installed,
and the environment variable PIPELINE_API_KEY set with your API key.
Any parameters that are valid to be passed to the call can be passed
in, even if not explicitly saved on this class.
Example
from langchain import PipelineAI
pipeline = PipelineAI(pipeline_key="")
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param pipeline_api_key: Optional[str] = None¶
param pipeline_key: str = ''¶
The id or tag of the target pipeline
param pipeline_kwargs: Dict[str, Any] [Optional]¶
Holds any pipeline parameters valid for the create call that are not explicitly specified.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param verbose: bool [Optional]¶
Whether to print out response text. | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.pipelineai.PipelineAI.html
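Building on the example above, a hedged end-to-end sketch; the API key value and pipeline key are placeholders:
import os
from langchain import PipelineAI

os.environ["PIPELINE_API_KEY"] = "<API_KEY>"        # placeholder
llm = PipelineAI(pipeline_key="public/gpt-j:base")  # hypothetical pipeline key
print(llm("Say hello in French."))                  # __call__ checks the cache and runs the LLM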
2090b9740ead-1 | param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator build_extra » all fields[source]¶
Build extra kwargs from additional params that were passed in.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input. | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.pipelineai.PipelineAI.html
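A sketch of the batch-style generate method listed above, reusing the llm instance from the previous example; the prompts are placeholders:
prompts = ["Tell me a joke.", "Tell me a riddle."]
result = llm.generate(prompts, stop=["\n\n"])   # returns an LLMResult
for generations in result.generations:          # one list of Generation objects per prompt
    print(generations[0].text)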
2090b9740ead-2 | Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that api key and python package exists in environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.pipelineai.PipelineAI.html
2090b9740ead-3 | property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.pipelineai.PipelineAI.html
326ba4aa3ed6-0 | langchain.llms.ai21.AI21¶
class langchain.llms.ai21.AI21(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, model: str = 'j2-jumbo-instruct', temperature: float = 0.7, maxTokens: int = 256, minTokens: int = 0, topP: float = 1.0, presencePenalty: AI21PenaltyData = AI21PenaltyData(scale=0, applyToWhitespaces=True, applyToPunctuations=True, applyToNumbers=True, applyToStopwords=True, applyToEmojis=True), countPenalty: AI21PenaltyData = AI21PenaltyData(scale=0, applyToWhitespaces=True, applyToPunctuations=True, applyToNumbers=True, applyToStopwords=True, applyToEmojis=True), frequencyPenalty: AI21PenaltyData = AI21PenaltyData(scale=0, applyToWhitespaces=True, applyToPunctuations=True, applyToNumbers=True, applyToStopwords=True, applyToEmojis=True), numResults: int = 1, logitBias: Optional[Dict[str, float]] = None, ai21_api_key: Optional[str] = None, stop: Optional[List[str]] = None, base_url: Optional[str] = None)[source]¶
Bases: LLM
Wrapper around AI21 large language models.
To use, you should have the environment variable AI21_API_KEY
set with your API key.
Example
from langchain.llms import AI21
ai21 = AI21(model="j2-jumbo-instruct") | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.ai21.AI21.html
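Expanding the example above into a runnable sketch; the API key value and prompt are placeholders:
import os
from langchain.llms import AI21

os.environ["AI21_API_KEY"] = "<API_KEY>"   # placeholder
ai21 = AI21(model="j2-jumbo-instruct", temperature=0.7, maxTokens=256)
print(ai21("Write a one-line product tagline for a coffee shop."))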
326ba4aa3ed6-1 | ai21 = AI21(model="j2-jumbo-instruct")
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai21_api_key: Optional[str] = None¶
param base_url: Optional[str] = None¶
Base url to use; if None, the url is chosen based on the model name.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param countPenalty: langchain.llms.ai21.AI21PenaltyData = AI21PenaltyData(scale=0, applyToWhitespaces=True, applyToPunctuations=True, applyToNumbers=True, applyToStopwords=True, applyToEmojis=True)¶
Penalizes repeated tokens according to count.
param frequencyPenalty: langchain.llms.ai21.AI21PenaltyData = AI21PenaltyData(scale=0, applyToWhitespaces=True, applyToPunctuations=True, applyToNumbers=True, applyToStopwords=True, applyToEmojis=True)¶
Penalizes repeated tokens according to frequency.
param logitBias: Optional[Dict[str, float]] = None¶
Adjust the probability of specific tokens being generated.
param maxTokens: int = 256¶
The maximum number of tokens to generate in the completion.
param minTokens: int = 0¶
The minimum number of tokens to generate in the completion.
param model: str = 'j2-jumbo-instruct'¶
Model name to use.
param numResults: int = 1¶
How many completions to generate for each prompt. | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.ai21.AI21.html
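A sketch showing how the penalty and sampling parameters documented above can be passed explicitly; the chosen scale values are arbitrary:
from langchain.llms import AI21
from langchain.llms.ai21 import AI21PenaltyData

ai21 = AI21(
    model="j2-jumbo-instruct",
    presencePenalty=AI21PenaltyData(scale=1),  # penalize tokens that already appeared
    countPenalty=AI21PenaltyData(scale=2),     # penalize proportionally to repetition count
    topP=0.9,
    numResults=1,
)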
326ba4aa3ed6-2 | How many completions to generate for each prompt.
param presencePenalty: langchain.llms.ai21.AI21PenaltyData = AI21PenaltyData(scale=0, applyToWhitespaces=True, applyToPunctuations=True, applyToNumbers=True, applyToStopwords=True, applyToEmojis=True)¶
Penalizes repeated tokens.
param stop: Optional[List[str]] = None¶
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param temperature: float = 0.7¶
What sampling temperature to use.
param topP: float = 1.0¶
Total probability mass of tokens to consider at each step.
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶ | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.ai21.AI21.html
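A short sketch of the token-count helpers listed above, reusing the ai21 instance from the earlier example:
text = "How many tokens is this sentence?"
print(ai21.get_num_tokens(text))        # number of tokens in the text
print(ai21.get_token_ids(text))         # the underlying token ids
print(ai21.predict(text, stop=["\n"]))  # like __call__, with keyword-only stop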
326ba4aa3ed6-3 | Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.ai21.AI21.html
326ba4aa3ed6-4 | Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that api key exists in environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.ai21.AI21.html
5926f56b426f-0 | langchain.llms.aviary.Aviary¶
class langchain.llms.aviary.Aviary(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, model: str = 'amazon/LightGPT', aviary_url: Optional[str] = None, aviary_token: Optional[str] = None, use_prompt_format: bool = True, version: Optional[str] = None)[source]¶
Bases: LLM
Allows you to use an Aviary.
Aviary is a backend for hosted models. You can
find out more about aviary at
http://github.com/ray-project/aviary
To get a list of the models supported on an
aviary, follow the instructions on the web site to
install the aviary CLI and then use:
aviary models
AVIARY_URL and AVIARY_TOKEN environment variables must be set.
Example
from langchain.llms import Aviary
os.environ["AVIARY_URL"] = "<URL>"
os.environ["AVIARY_TOKEN"] = "<TOKEN>"
light = Aviary(model='amazon/LightGPT')
output = light('How do you make fried rice?')
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param aviary_token: Optional[str] = None¶
param aviary_url: Optional[str] = None¶
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶ | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.aviary.Aviary.html
5926f56b426f-1 | param callbacks: Callbacks = None¶
param model: str = 'amazon/LightGPT'¶
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param use_prompt_format: bool = True¶
param verbose: bool [Optional]¶
Whether to print out response text.
param version: Optional[str] = None¶
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM. | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.aviary.Aviary.html
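A sketch of the predict and generate methods for the Aviary wrapper, reusing the light instance from the example above; the prompts are placeholders:
answer = light.predict("How do you make fried rice?", stop=["\n\n"])
batch = light.generate(["What is Ray?", "What is Aviary?"])   # returns an LLMResult
for gens in batch.generations:
    print(gens[0].text)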
5926f56b426f-2 | dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting. | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.aviary.Aviary.html
5926f56b426f-3 | This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that api key and python package exists in environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.aviary.Aviary.html
3a744f2abb6e-0 | langchain.llms.ctransformers.CTransformers¶
class langchain.llms.ctransformers.CTransformers(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, client: Any = None, model: str, model_type: Optional[str] = None, model_file: Optional[str] = None, config: Optional[Dict[str, Any]] = None, lib: Optional[str] = None)[source]¶
Bases: LLM
Wrapper around the C Transformers LLM interface.
To use, you should have the ctransformers python package installed.
See https://github.com/marella/ctransformers
Example
from langchain.llms import CTransformers
llm = CTransformers(model="/path/to/ggml-gpt-2.bin", model_type="gpt2")
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param config: Optional[Dict[str, Any]] = None¶
The config parameters.
See https://github.com/marella/ctransformers#config
param lib: Optional[str] = None¶
The path to a shared library or one of avx2, avx, basic.
param model: str [Required]¶
The path to a model file or directory or the name of a Hugging Face Hub
model repo.
param model_file: Optional[str] = None¶ | [embedding token IDs omitted] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.ctransformers.CTransformers.html
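A sketch of passing the config dict documented above; the model repo name and the max_new_tokens/temperature keys are assumptions drawn from the ctransformers README:
from langchain.llms import CTransformers

config = {"max_new_tokens": 256, "temperature": 0.5}
llm = CTransformers(
    model="marella/gpt-2-ggml",   # a Hugging Face Hub repo containing a ggml model
    model_type="gpt2",
    config=config,
)
print(llm("AI is going to"))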
3a744f2abb6e-1 | model repo.
param model_file: Optional[str] = None¶
The name of the model file in repo or directory.
param model_type: Optional[str] = None¶
The model type.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM. | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.ctransformers.CTransformers.html |
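As a brief illustrative sketch (the repo name and model_type below are placeholders rather than values taken from this page), a CTransformers wrapper can be constructed and called directly:
from langchain.llms import CTransformers

llm = CTransformers(
    model="marella/gpt-2-ggml",  # placeholder: model repo, or a local file/directory path
    model_type="gpt2",           # placeholder: the model type
)
print(llm("AI is going to"))     # __call__ checks the cache and runs the model on the prompt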
3a744f2abb6e-2 | dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting. | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.ctransformers.CTransformers.html |
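For instance, generate accepts a batch of prompts and returns an LLMResult whose generations field holds one list of candidates per prompt. A sketch, reusing the llm instance from the example above (the prompts and stop sequence are illustrative):
result = llm.generate(["Tell me a joke.", "Tell me a fact."], stop=["\n\n"])
for candidates in result.generations:
    print(candidates[0].text)  # first generation produced for each prompt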
3a744f2abb6e-3 | This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that the ctransformers package is installed.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶ | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.ctransformers.CTransformers.html |
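A short sketch of persisting the wrapper with save; whether a saved LLM can be loaded back depends on the type being registered with langchain's loader, so treat the load step as an assumption:
llm.save(file_path="my_llm.yaml")       # writes the constructor kwargs to a YAML file
from langchain.llms.loading import load_llm
llm2 = load_llm("my_llm.yaml")          # assumed to restore an equivalent instance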
f4b0df31ff60-0 | langchain.llms.databricks.get_default_host¶
langchain.llms.databricks.get_default_host() → str[source]¶
Gets the default Databricks workspace hostname.
Raises an error if the hostname cannot be automatically determined. | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.databricks.get_default_host.html |
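A small usage sketch; the hostname below is a placeholder, and setting DATABRICKS_HOST is one way the default can be resolved when running outside a Databricks notebook (see the Databricks LLM entry later on this page):
import os
from langchain.llms.databricks import get_default_host

os.environ["DATABRICKS_HOST"] = "adb-1234567890123456.7.azuredatabricks.net"  # placeholder
print(get_default_host())  # raises an error if no hostname can be determined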
d008eaa7d269-0 | langchain.llms.sagemaker_endpoint.ContentHandlerBase¶
class langchain.llms.sagemaker_endpoint.ContentHandlerBase[source]¶
Bases: Generic[INPUT_TYPE, OUTPUT_TYPE]
A handler class to transform input from the LLM to a
format that the SageMaker endpoint expects. Similarly,
the class also handles transforming output from the
SageMaker endpoint to a format that the LLM class expects.
Methods
__init__()
transform_input(prompt, model_kwargs)
Transforms the input to a format that the model can accept as the request Body.
transform_output(output)
Transforms the output from the model to string that the LLM class expects.
Attributes
accepts
The MIME type of the response data returned from endpoint
content_type
The MIME type of the input data passed to endpoint
abstract transform_input(prompt: INPUT_TYPE, model_kwargs: Dict) → bytes[source]¶
Transforms the input to a format that the model can accept
as the request Body. Should return bytes or a seekable file-like
object in the format specified in the content_type
request header.
abstract transform_output(output: bytes) → OUTPUT_TYPE[source]¶
Transforms the output from the model to string that
the LLM class expects.
accepts: Optional[str] = 'text/plain'¶
The MIME type of the response data returned from endpoint
content_type: Optional[str] = 'text/plain'¶
The MIME type of the input data passed to endpoint | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.sagemaker_endpoint.ContentHandlerBase.html |
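A minimal sketch of a concrete handler; the JSON field names ("inputs", "parameters", "generated_text") are assumptions about one particular deployed model rather than part of this API:
import json
from typing import Dict
from langchain.llms.sagemaker_endpoint import ContentHandlerBase

class JSONContentHandler(ContentHandlerBase[str, str]):
    content_type = "application/json"
    accepts = "application/json"

    def transform_input(self, prompt: str, model_kwargs: Dict) -> bytes:
        # Build the request body in the shape the (assumed) endpoint expects.
        return json.dumps({"inputs": prompt, "parameters": model_kwargs}).encode("utf-8")

    def transform_output(self, output: bytes) -> str:
        # Pull the generated text out of the (assumed) response payload.
        return json.loads(output.decode("utf-8"))[0]["generated_text"]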
badeff55f12e-0 | langchain.llms.base.LLM¶
class langchain.llms.base.LLM(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None)[source]¶
Bases: BaseLLM
LLM class that expects subclasses to implement a simpler call method.
The purpose of this class is to expose a simpler interface for working
with LLMs, rather than expect the user to implement the full _generate method.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param cache: Optional[bool] = None¶
param callback_manager: Optional[langchain.callbacks.base.BaseCallbackManager] = None¶
param callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None¶
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input. | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.base.LLM.html |
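A minimal sketch of such a subclass; the echoing behaviour is purely illustrative:
from typing import Any, List, Optional
from langchain.llms.base import LLM

class EchoLLM(LLM):
    """Toy LLM that returns the prompt, truncated at the stop strings."""

    @property
    def _llm_type(self) -> str:
        return "echo"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
        text = prompt
        for s in stop or []:
            text = text.split(s)[0]  # cut at the first occurrence of each stop string
        return text

llm = EchoLLM()
print(llm("Hello, world!"))  # __call__ checks the cache and dispatches to _call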
badeff55f12e-1 | Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text. | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.base.LLM.html |
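For example, with llm being any LLM instance (such as the EchoLLM sketch above), the convenience methods can be called directly; note that the default token counting may require an extra tokenizer package to be installed:
text = llm.predict("Count the tokens in this sentence.")
print(llm.get_num_tokens(text))     # number of tokens present in the text
print(llm.get_token_ids(text)[:5])  # the first few token ids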
badeff55f12e-2 | Get the token present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶ | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.base.LLM.html |
48773037688a-0 | langchain.llms.databricks.Databricks¶
class langchain.llms.databricks.Databricks(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, host: str = None, api_token: str = None, endpoint_name: Optional[str] = None, cluster_id: Optional[str] = None, cluster_driver_port: Optional[str] = None, model_kwargs: Optional[Dict[str, Any]] = None, transform_input_fn: Optional[Callable] = None, transform_output_fn: Optional[Callable[[...], str]] = None)[source]¶
Bases: LLM
LLM wrapper around a Databricks serving endpoint or a cluster driver proxy app.
It supports two endpoint types:
Serving endpoint (recommended for both production and development).
We assume that an LLM was registered and deployed to a serving endpoint.
To wrap it as an LLM you must have “Can Query” permission to the endpoint.
Set endpoint_name accordingly and do not set cluster_id and
cluster_driver_port.
The expected model signature is:
inputs:
[{"name": "prompt", "type": "string"},
{"name": "stop", "type": "list[string]"}]
outputs: [{"type": "string"}]
Cluster driver proxy app (recommended for interactive development).
One can load an LLM on a Databricks interactive cluster and start a local HTTP
server on the driver node to serve the model at / using HTTP POST method
with JSON input/output.
Please use a port number between [3000, 8000] and let the server listen to | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.databricks.Databricks.html |
48773037688a-1 | the driver IP address or simply 0.0.0.0 instead of localhost only.
To wrap it as an LLM you must have “Can Attach To” permission to the cluster.
Set cluster_id and cluster_driver_port and do not set endpoint_name.
The expected server schema (using JSON schema) is:
inputs:
{"type": "object",
"properties": {
"prompt": {"type": "string"},
"stop": {"type": "array", "items": {"type": "string"}}},
"required": ["prompt"]}`
outputs: {"type": "string"}
If the endpoint model signature is different or you want to set extra params,
you can use transform_input_fn and transform_output_fn to apply necessary
transformations before and after the query.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param api_token: str [Optional]¶
Databricks personal access token.
If not provided, the default value is determined by
the DATABRICKS_TOKEN environment variable if present, or
an automatically generated temporary token if running inside a Databricks
notebook attached to an interactive cluster in “single user” or
“no isolation shared” mode.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param cluster_driver_port: Optional[str] = None¶
The port number used by the HTTP server running on the cluster driver node.
The server should listen on the driver IP address or simply 0.0.0.0 to connect.
We recommend the server using a port number between [3000, 8000].
param cluster_id: Optional[str] = None¶ | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.databricks.Databricks.html |
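Two hedged sketches, one per endpoint type; the endpoint name, cluster id and port are placeholders, and host plus api_token are resolved via the host and api_token parameters (environment variables or notebook context):
from langchain.llms import Databricks

# 1) Model serving endpoint ("Can Query" permission required).
llm = Databricks(endpoint_name="my-llm-endpoint")  # placeholder endpoint name

# 2) Cluster driver proxy app ("Can Attach To" permission required).
llm = Databricks(cluster_id="0123-456789-abcdefgh",  # placeholder cluster id
                 cluster_driver_port="7777")         # a port in [3000, 8000]
print(llm("How are you?", stop=["."]))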
48773037688a-2 | param cluster_id: Optional[str] = None¶
ID of the cluster if connecting to a cluster driver proxy app.
If neither endpoint_name nor cluster_id is provided and the code runs
inside a Databricks notebook attached to an interactive cluster in “single user”
or “no isolation shared” mode, the current cluster ID is used as default.
You must not set both endpoint_name and cluster_id.
param endpoint_name: Optional[str] = None¶
Name of the model serving endpoint.
You must specify the endpoint name to connect to a model serving endpoint.
You must not set both endpoint_name and cluster_id.
param host: str [Optional]¶
Databricks workspace hostname.
If not provided, the default value is determined by
the DATABRICKS_HOST environment variable if present, or
the hostname of the current Databricks workspace if running inside
a Databricks notebook attached to an interactive cluster in “single user”
or “no isolation shared” mode.
param model_kwargs: Optional[Dict[str, Any]] = None¶
Extra parameters to pass to the endpoint.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param transform_input_fn: Optional[Callable] = None¶
A function that transforms {prompt, stop, **kwargs} into a JSON-compatible
request object that the endpoint accepts.
For example, you can apply a prompt template to the input prompt.
param transform_output_fn: Optional[Callable[[...], str]] = None¶
A function that transforms the output from the endpoint to the generated text.
param verbose: bool [Optional]¶
Whether to print out response text. | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.databricks.Databricks.html |
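A sketch of the two hooks together with model_kwargs; the wrapping template, the strip() call and the endpoint name are illustrative:
from langchain.llms import Databricks

def transform_input(**request):
    # Apply a simple prompt template before the request is sent to the endpoint.
    request["prompt"] = f"Answer briefly: {request['prompt']}"
    return request

def transform_output(response):
    # Post-process the raw endpoint output into the final generated text.
    return response.strip()

llm = Databricks(endpoint_name="my-llm-endpoint",    # placeholder endpoint name
                 model_kwargs={"temperature": 0.1},  # extra parameters passed to the endpoint
                 transform_input_fn=transform_input,
                 transform_output_fn=transform_output)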
48773037688a-3 | param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input. | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.databricks.Databricks.html |
48773037688a-4 | Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_cluster_driver_port » cluster_driver_port[source]¶
validator set_cluster_id » cluster_id[source]¶
validator set_model_kwargs » model_kwargs[source]¶
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶ | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.databricks.Databricks.html |
48773037688a-5 | to_json_not_implemented() → SerializedNotImplemented¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
extra = 'forbid'¶
underscore_attrs_are_private = True¶ | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.databricks.Databricks.html |
d0069712e7c7-0 | langchain.llms.openllm.IdentifyingParams¶
class langchain.llms.openllm.IdentifyingParams[source]¶
Bases: TypedDict
Methods
__init__(*args, **kwargs)
clear()
copy()
fromkeys([value])
Create a new dictionary with keys from iterable and values set to value.
get(key[, default])
Return the value for key if key is in the dictionary, else default.
items()
keys()
pop(k[,d])
If the key is not found, return the default if given; otherwise, raise a KeyError.
popitem()
Remove and return a (key, value) pair as a 2-tuple.
setdefault(key[, default])
Insert key with a value of default if key is not in the dictionary.
update([E, ]**F)
If E is present and has a .keys() method, then does: for k in E: D[k] = E[k] If E is present and lacks a .keys() method, then does: for k, v in E: D[k] = v In either case, this is followed by: for k in F: D[k] = F[k]
values()
Attributes
model_name
model_id
server_url
server_type
embedded
llm_kwargs
clear() → None. Remove all items from D.¶
copy() → a shallow copy of D¶
fromkeys(value=None, /)¶
Create a new dictionary with keys from iterable and values set to value.
get(key, default=None, /)¶
Return the value for key if key is in the dictionary, else default.
items() → a set-like object providing a view on D's items¶
keys() → a set-like object providing a view on D's keys¶ | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.openllm.IdentifyingParams.html |
d0069712e7c7-1 | keys() → a set-like object providing a view on D's keys¶
pop(k[, d]) → v, remove specified key and return the corresponding value.¶
If the key is not found, return the default if given; otherwise,
raise a KeyError.
popitem()¶
Remove and return a (key, value) pair as a 2-tuple.
Pairs are returned in LIFO (last-in, first-out) order.
Raises KeyError if the dict is empty.
setdefault(key, default=None, /)¶
Insert key with a value of default if key is not in the dictionary.
Return the value for key if key is in the dictionary, else default.
update([E, ]**F) → None. Update D from dict/iterable E and F.¶
If E is present and has a .keys() method, then does: for k in E: D[k] = E[k]
If E is present and lacks a .keys() method, then does: for k, v in E: D[k] = v
In either case, this is followed by: for k in F: D[k] = F[k]
values() → an object providing a view on D's values¶
embedded: bool¶
llm_kwargs: Dict[str, Any]¶
model_id: Optional[str]¶
model_name: str¶
server_type: Optional[Literal['http', 'grpc']]¶
server_url: Optional[str]¶ | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.openllm.IdentifyingParams.html |
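Since IdentifyingParams is a TypedDict, an instance is an ordinary dict with exactly these keys; the values below are placeholders:
from langchain.llms.openllm import IdentifyingParams

params: IdentifyingParams = {
    "model_name": "dolly-v2",              # placeholder model name
    "model_id": "databricks/dolly-v2-3b",  # placeholder model id
    "server_url": None,                    # or the URL of a running OpenLLM server
    "server_type": "grpc",                 # 'http' or 'grpc'
    "embedded": True,
    "llm_kwargs": {"temperature": 0.7},
}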
f74b9a11afa7-0 | langchain.llms.gpt4all.GPT4All¶
class langchain.llms.gpt4all.GPT4All(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, model: str, backend: Optional[str] = None, n_ctx: int = 512, n_parts: int = - 1, seed: int = 0, f16_kv: bool = False, logits_all: bool = False, vocab_only: bool = False, use_mlock: bool = False, embedding: bool = False, n_threads: Optional[int] = 4, n_predict: Optional[int] = 256, temp: Optional[float] = 0.8, top_p: Optional[float] = 0.95, top_k: Optional[int] = 40, echo: Optional[bool] = False, stop: Optional[List[str]] = [], repeat_last_n: Optional[int] = 64, repeat_penalty: Optional[float] = 1.3, n_batch: int = 1, streaming: bool = False, context_erase: float = 0.5, allow_download: bool = False, client: Any = None)[source]¶
Bases: LLM
Wrapper around GPT4All language models.
To use, you should have the gpt4all python package installed, the
pre-trained model file, and the model’s config information.
Example
from langchain.llms import GPT4All
model = GPT4All(model="./models/gpt4all-model.bin", n_ctx=512, n_threads=8)
# Simplest invocation
response = model("Once upon a time, ") | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.gpt4all.GPT4All.html |
f74b9a11afa7-1 | # Simplest invocation
response = model("Once upon a time, ")
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param allow_download: bool = False¶
If the model does not exist in ~/.cache/gpt4all/, download it.
param backend: Optional[str] = None¶
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param context_erase: float = 0.5¶
Leave (n_ctx * context_erase) tokens
starting from the beginning if the context has run out.
param echo: Optional[bool] = False¶
Whether to echo the prompt.
param embedding: bool = False¶
Use embedding mode only.
param f16_kv: bool = False¶
Use half-precision for key/value cache.
param logits_all: bool = False¶
Return logits for all tokens, not just the last token.
param model: str [Required]¶
Path to the pre-trained GPT4All model file.
param n_batch: int = 1¶
Batch size for prompt processing.
param n_ctx: int = 512¶
Token context window.
param n_parts: int = -1¶
Number of parts to split the model into.
If -1, the number of parts is automatically determined.
param n_predict: Optional[int] = 256¶
The maximum number of tokens to generate.
param n_threads: Optional[int] = 4¶
Number of threads to use.
param repeat_last_n: Optional[int] = 64¶
Last n tokens to penalize.
param repeat_penalty: Optional[float] = 1.3¶
The penalty to apply to repeated tokens. | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.gpt4all.GPT4All.html |
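A hedged sketch combining the sampling and streaming parameters; the model path is a placeholder for a local pre-trained model file:
from langchain.llms import GPT4All
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

llm = GPT4All(
    model="./models/gpt4all-model.bin",            # placeholder local model path
    temp=0.7, top_k=40, top_p=0.95,                # sampling parameters
    n_predict=128,                                 # maximum number of tokens to generate
    streaming=True,                                # stream results as they are produced
    callbacks=[StreamingStdOutCallbackHandler()],  # print tokens to stdout as they arrive
)
llm("Write a haiku about mountains.")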
f74b9a11afa7-2 | The penalty to apply to repeated tokens.
param seed: int = 0¶
Seed. If -1, a random seed is used.
param stop: Optional[List[str]] = []¶
A list of strings to stop generation when encountered.
param streaming: bool = False¶
Whether to stream the results or not.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param temp: Optional[float] = 0.8¶
The temperature to use for sampling.
param top_k: Optional[int] = 40¶
The top-k value to use for sampling.
param top_p: Optional[float] = 0.95¶
The top-p value to use for sampling.
param use_mlock: bool = False¶
Force system to keep model in RAM.
param verbose: bool [Optional]¶
Whether to print out response text.
param vocab_only: bool = False¶
Only load the vocabulary, no weights.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶ | [
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.gpt4all.GPT4All.html |
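A minimal construction sketch pulling the sampling parameters above together. The model path is a placeholder for a locally downloaded GPT4All weights file, and repeat_penalty is assumed to be the name of the repetition-penalty parameter described at the top of this chunk.
.. code-block:: python

    from langchain.llms import GPT4All

    # Placeholder path to a locally downloaded GPT4All weights file.
    llm = GPT4All(
        model="./models/ggml-gpt4all-j.bin",
        temp=0.8,            # sampling temperature
        top_k=40,            # top-k sampling
        top_p=0.95,          # nucleus (top-p) sampling
        repeat_penalty=1.3,  # penalty applied to repeated tokens (assumed parameter name)
        seed=0,              # fixed seed; -1 would pick a random one
        streaming=False,
    )

    print(llm("Q: Name one moon of Jupiter.\nA:", stop=["\n"]))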
f74b9a11afa7-3 | Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶ | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.gpt4all.GPT4All.html
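The methods in this chunk make up the generic LLM interface shared by every wrapper in this reference. A brief sketch, assuming llm is any instantiated LLM such as the GPT4All wrapper above:
.. code-block:: python

    answer = llm.predict("Tell me a joke", stop=["\n\n"])     # str in, str out
    result = llm.generate(["Prompt one", "Prompt two"])        # batch call returns an LLMResult
    first_text = result.generations[0][0].text                 # one list of generations per prompt
    n_tokens = llm.get_num_tokens("How many tokens is this sentence?")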
f74b9a11afa7-4 | Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that the python package exists in the environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
e.g. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
e.g. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.gpt4all.GPT4All.html
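A short sketch of the persistence and serialization hooks listed above; the output path is arbitrary and the .yaml suffix is assumed to select the YAML format:
.. code-block:: python

    llm.save(file_path="gpt4all_llm.yaml")   # persist the configured wrapper to disk

    print(llm.lc_serializable)   # whether the class opts into langchain serialization
    print(llm.to_json())         # SerializedConstructor or SerializedNotImplemented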
faa73cc1156f-0 | langchain.llms.azureml_endpoint.AzureMLOnlineEndpoint¶
class langchain.llms.azureml_endpoint.AzureMLOnlineEndpoint(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, endpoint_url: str = '', endpoint_api_key: str = '', deployment_name: str = '', http_client: Any = None, content_formatter: Any = None, model_kwargs: Optional[dict] = None)[source]¶
Bases: LLM, BaseModel
Wrapper around Azure ML Hosted models using Managed Online Endpoints.
Example
azure_llm = AzureMLOnlineEndpoint(
endpoint_url="https://<your-endpoint>.<your_region>.inference.ml.azure.com/score",
endpoint_api_key="my-api-key",
deployment_name="my-deployment-name",
content_formatter=content_formatter,
)
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param content_formatter: Any = None¶
The content formatter that provides an input and output
transform function to handle formats between the LLM and
the endpoint
param deployment_name: str = ''¶
Deployment Name for Endpoint. Should be passed to constructor or specified as
env var AZUREML_DEPLOYMENT_NAME.
param endpoint_api_key: str = ''¶
Authentication Key for Endpoint. Should be passed to constructor or specified as
env var AZUREML_ENDPOINT_API_KEY.
param endpoint_url: str = ''¶ | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.azureml_endpoint.AzureMLOnlineEndpoint.html
faa73cc1156f-1 | env var AZUREML_ENDPOINT_API_KEY.
param endpoint_url: str = ''¶
URL of pre-existing Endpoint. Should be passed to constructor or specified as
env var AZUREML_ENDPOINT_URL.
param model_kwargs: Optional[dict] = None¶
Key word arguments to pass to the model.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.azureml_endpoint.AzureMLOnlineEndpoint.html
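As the parameter descriptions above note, the endpoint settings can also come from environment variables rather than constructor arguments. A hedged sketch; the URL, key, and deployment name are placeholders, content_formatter is assumed to be defined elsewhere, and the model_kwargs keys depend on the deployed model:
.. code-block:: python

    import os
    from langchain.llms.azureml_endpoint import AzureMLOnlineEndpoint

    # Placeholders standing in for your real endpoint settings.
    os.environ["AZUREML_ENDPOINT_URL"] = "https://<your-endpoint>.<region>.inference.ml.azure.com/score"
    os.environ["AZUREML_ENDPOINT_API_KEY"] = "my-api-key"
    os.environ["AZUREML_DEPLOYMENT_NAME"] = "my-deployment-name"

    llm = AzureMLOnlineEndpoint(
        content_formatter=content_formatter,                       # assumed to be defined elsewhere
        model_kwargs={"temperature": 0.8, "max_new_tokens": 200},  # assumed keys, model dependent
    )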
faa73cc1156f-2 | dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.azureml_endpoint.AzureMLOnlineEndpoint.html
faa73cc1156f-3 | This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_client » http_client[source]¶
Validate that the API key and python package exist in the environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
e.g. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
e.g. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config¶
Bases: object
Configuration for this pydantic object.
arbitrary_types_allowed = True¶ | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.azureml_endpoint.AzureMLOnlineEndpoint.html
182132f70ebe-0 | langchain.llms.huggingface_endpoint.HuggingFaceEndpoint¶
class langchain.llms.huggingface_endpoint.HuggingFaceEndpoint(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, endpoint_url: str = '', task: Optional[str] = None, model_kwargs: Optional[dict] = None, huggingfacehub_api_token: Optional[str] = None)[source]¶
Bases: LLM
Wrapper around HuggingFaceHub Inference Endpoints.
To use, you should have the huggingface_hub python package installed, and the
environment variable HUGGINGFACEHUB_API_TOKEN set with your API token, or pass
it as a named parameter to the constructor.
Only supports text-generation and text2text-generation for now.
Example
from langchain.llms import HuggingFaceEndpoint
endpoint_url = (
"https://abcdefghijklmnop.us-east-1.aws.endpoints.huggingface.cloud"
)
hf = HuggingFaceEndpoint(
endpoint_url=endpoint_url,
huggingfacehub_api_token="my-api-key"
)
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param endpoint_url: str = ''¶
Endpoint URL to use.
param huggingfacehub_api_token: Optional[str] = None¶
param model_kwargs: Optional[dict] = None¶
Key word arguments to pass to the model. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.huggingface_endpoint.HuggingFaceEndpoint.html
182132f70ebe-1 | Key word arguments to pass to the model.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param task: Optional[str] = None¶
Task to call the model with.
Should be a task that returns generated_text or summary_text.
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.huggingface_endpoint.HuggingFaceEndpoint.html
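A hedged variation on the docstring example earlier, adding the task and model_kwargs parameters described in this chunk; the endpoint URL and token are placeholders, and the generation keys inside model_kwargs depend on the deployed model:
.. code-block:: python

    from langchain.llms import HuggingFaceEndpoint

    llm = HuggingFaceEndpoint(
        endpoint_url="https://abcdefghijklmnop.us-east-1.aws.endpoints.huggingface.cloud",  # placeholder
        huggingfacehub_api_token="my-api-key",   # or set HUGGINGFACEHUB_API_TOKEN instead
        task="text-generation",                   # must return generated_text or summary_text
        model_kwargs={"max_new_tokens": 100, "temperature": 0.7},  # assumed keys
    )

    print(llm("Write a haiku about inference endpoints:"))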
182132f70ebe-2 | dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.huggingface_endpoint.HuggingFaceEndpoint.html
182132f70ebe-3 | This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that the API key and python package exist in the environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
e.g. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
e.g. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.huggingface_endpoint.HuggingFaceEndpoint.html
c0ee403bcdb0-0 | langchain.llms.manifest.ManifestWrapper¶
class langchain.llms.manifest.ManifestWrapper(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, client: Any = None, llm_kwargs: Optional[Dict] = None)[source]¶
Bases: LLM
Wrapper around HazyResearch’s Manifest library.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param llm_kwargs: Optional[Dict] = None¶
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.manifest.ManifestWrapper.html
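A minimal sketch of wiring the wrapper to a Manifest client. The Manifest constructor arguments shown are assumptions about that external library's API rather than something this page documents:
.. code-block:: python

    from manifest import Manifest                      # external HazyResearch library
    from langchain.llms.manifest import ManifestWrapper

    manifest_client = Manifest(client_name="openai")   # assumed Manifest configuration
    llm = ManifestWrapper(
        client=manifest_client,
        llm_kwargs={"temperature": 0.0, "max_tokens": 256},  # forwarded to the Manifest call
    )

    print(llm("Say hello in French."))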
c0ee403bcdb0-1 | Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.manifest.ManifestWrapper.html
c0ee403bcdb0-2 | Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that the python package exists in the environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
e.g. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
e.g. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶ | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.manifest.ManifestWrapper.html
4da9d9bd15eb-0 | langchain.llms.aviary.get_completions¶
langchain.llms.aviary.get_completions(model: str, prompt: str, use_prompt_format: bool = True, version: str = '') → Dict[str, Union[str, float, int]][source]¶
Get completions from Aviary models. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.aviary.get_completions.html
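A short usage sketch of the helper. The model id is a placeholder, and the connection details (typically the AVIARY_URL / AVIARY_TOKEN environment variables) are assumptions about how the Aviary backend is located:
.. code-block:: python

    import os
    from langchain.llms.aviary import get_completions

    os.environ["AVIARY_URL"] = "http://localhost:8000"   # assumed backend location

    result = get_completions(
        model="amazon/LightGPT",                          # placeholder model id
        prompt="What is the capital of France?",
        use_prompt_format=True,
    )
    print(result)   # dict of strings and numbers, per the signature above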
1de06fa18214-0 | langchain.llms.databricks.get_default_api_token¶
langchain.llms.databricks.get_default_api_token() → str[source]¶
Gets the default Databricks personal access token.
Raises an error if the token cannot be automatically determined. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.databricks.get_default_api_token.html
2b233b7257e7-0 | langchain.llms.sagemaker_endpoint.SagemakerEndpoint¶
class langchain.llms.sagemaker_endpoint.SagemakerEndpoint(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, client: Any = None, endpoint_name: str = '', region_name: str = '', credentials_profile_name: Optional[str] = None, content_handler: LLMContentHandler, model_kwargs: Optional[Dict] = None, endpoint_kwargs: Optional[Dict] = None)[source]¶
Bases: LLM
Wrapper around custom Sagemaker Inference Endpoints.
To use, you must supply the endpoint name from your deployed
Sagemaker model & the region where it is deployed.
To authenticate, the AWS client uses the following methods to
automatically load credentials:
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
If a specific credential profile should be used, you must pass
the name of the profile from the ~/.aws/credentials file that is to be used.
Make sure the credentials / roles used have the required policies to
access the Sagemaker endpoint.
See: https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param content_handler: langchain.llms.sagemaker_endpoint.LLMContentHandler [Required]¶
The content handler class that provides an input and | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.sagemaker_endpoint.SagemakerEndpoint.html
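Because the wrapper requires an LLMContentHandler to translate between langchain and the endpoint, a hedged sketch of one is shown below. The JSON request and response shapes are assumptions that depend entirely on how your model was deployed:
.. code-block:: python

    import json
    from langchain.llms.sagemaker_endpoint import LLMContentHandler

    class ContentHandler(LLMContentHandler):
        content_type = "application/json"
        accepts = "application/json"

        def transform_input(self, prompt: str, model_kwargs: dict) -> bytes:
            # Assumed payload shape; match whatever your deployed model expects.
            return json.dumps({"inputs": prompt, "parameters": model_kwargs}).encode("utf-8")

        def transform_output(self, output: bytes) -> str:
            # Assumed response shape; adjust the key lookup to your model's output.
            response = json.loads(output.read().decode("utf-8"))
            return response[0]["generated_text"]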
2b233b7257e7-1 | The content handler class that provides an input and
output transform functions to handle formats between LLM
and the endpoint.
param credentials_profile_name: Optional[str] = None¶
The name of the profile in the ~/.aws/credentials or ~/.aws/config files, which
has either access keys or role information specified.
If not specified, the default credential profile or, if on an EC2 instance,
credentials from IMDS will be used.
See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
param endpoint_kwargs: Optional[Dict] = None¶
Optional attributes passed to the invoke_endpoint
function. See the boto3 docs for more info:
https://boto3.amazonaws.com/v1/documentation/api/latest/index.html
param endpoint_name: str = ''¶
The name of the endpoint from the deployed Sagemaker model.
Must be unique within an AWS Region.
param model_kwargs: Optional[Dict] = None¶
Key word arguments to pass to the model.
param region_name: str = ''¶
The aws region where the Sagemaker model is deployed, eg. us-west-2.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input. | [
(embedding vector of token IDs omitted) ] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.sagemaker_endpoint.SagemakerEndpoint.html
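Putting the parameters from this chunk together; the endpoint name, region, credentials profile, generation settings, and extra invoke_endpoint arguments are all placeholders or assumptions:
.. code-block:: python

    from langchain.llms.sagemaker_endpoint import SagemakerEndpoint

    llm = SagemakerEndpoint(
        endpoint_name="my-sagemaker-endpoint",       # placeholder endpoint name
        region_name="us-west-2",
        credentials_profile_name="default",          # profile from ~/.aws/credentials
        content_handler=ContentHandler(),            # the handler sketched earlier
        model_kwargs={"temperature": 0.7},           # assumed keys, model dependent
        endpoint_kwargs={"CustomAttributes": "accept_eula=true"},  # assumed invoke_endpoint extras
    )

    print(llm("Summarize: SageMaker endpoints serve hosted models."))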
2b233b7257e7-2 | Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
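The methods above differ mainly in what they accept and return: __call__ maps a single prompt string to a string, while generate and generate_prompt work on batches and return an LLMResult. A minimal sketch of that difference, assuming llm is the SagemakerEndpoint instance constructed earlier on this page (the prompts are illustrative):
.. code-block:: python
# Sketch only: `llm` is assumed to be an already-constructed SagemakerEndpoint.
single = llm("Tell me a joke.")  # __call__ returns a plain string
batch = llm.generate(["Tell me a joke.", "Write a haiku about rivers."])
# generate returns an LLMResult: one list of Generation objects per prompt
for gens in batch.generations:
    print(gens[0].text)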
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
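A rough sketch of the token helpers above; the exact count and ids depend on the tokenizer backing the model, and llm is again assumed to be an existing instance:
.. code-block:: python
# Sketch: token counting vs. token ids (values depend on the tokenizer used).
text = "The quick brown fox"
print(llm.get_num_tokens(text))  # an int, e.g. 4
print(llm.get_token_ids(text))   # the corresponding ids as a List[int]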
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
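To restore a saved LLM later, a sketch assuming the load_llm helper from langchain.llms.loading (the file path matches the save example above):
.. code-block:: python
# Sketch: reload an LLM previously written out with llm.save(...).
from langchain.llms.loading import load_llm  # assumed helper

llm = load_llm("path/llm.yaml")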
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that AWS credentials and the python package exist in the environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
456,
4369,
29938,
7383,
25,
610,
8,
11651,
528,
55609,
198,
1991,
279,
1396,
315,
11460,
3118,
304,
279,
1495,
627,
456,
4369,
29938,
5791,
24321,
56805,
25,
1796,
58,
4066,
2097,
2526,
11651,
528,
55609,
198,
1991,
279,
1396,
315,
11460,
304,
279,
1984,
627,
456,
6594,
8237,
7383,
25,
610,
8,
11651,
1796,
19155,
60,
55609,
198,
1991,
279,
4037,
3118,
304,
279,
1495,
627,
35798,
7383,
25,
610,
11,
12039,
3009,
25,
12536,
58,
14405,
17752,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
610,
55609,
198,
54644,
1495,
505,
1495,
627,
35798,
24321,
56805,
25,
1796,
58,
4066,
2097,
1145,
12039,
3009,
25,
12536,
58,
14405,
17752,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
5464,
2097,
55609,
198,
54644,
1984,
505,
6743,
627,
16503,
4933,
2310,
70693,
4194,
8345,
4194,
682,
5151,
55609,
198,
94201,
409,
70693,
10163,
422,
4927,
12418,
374,
1511,
627,
6766,
4971,
2703,
25,
9323,
58,
1858,
11,
610,
2526,
11651,
2290,
55609,
198,
8960,
279,
445,
11237,
627,
9905,
198,
1213,
2703,
1389,
8092,
311,
1052,
311,
3665,
279,
445,
11237,
311,
627,
13617,
512,
497,
2082,
9612,
487,
10344,
198,
657,
76,
5799,
4971,
2703,
45221,
2398,
14,
657,
76,
34506,
863,
340,
16503,
743,
69021,
4194,
8345,
4194,
14008,
55609,
198,
2746,
14008,
374,
2290,
11,
743,
433,
627,
2028,
6276,
3932,
311,
1522,
304,
2290,
439,
14008,
311,
2680,
279,
3728,
6376,
627,
998,
9643,
368,
11651,
9323,
58,
78621,
13591,
11,
92572,
2688,
18804,
60,
55609,
198,
998,
9643,
8072,
18377,
14565,
368,
11651,
92572,
2688,
18804,
55609,
198,
16503,
9788,
52874,
4194,
8345,
4194,
682,
5151,
76747,
60,
55609,
198,
18409,
430,
24124,
16792,
311,
323,
10344,
6462,
6866,
304,
4676,
627,
3784,
37313,
18741,
25,
30226,
55609,
198,
5715,
264,
1160,
315,
7180,
5144,
430,
1288,
387,
5343,
304,
279,
198,
76377,
16901,
13,
4314,
8365,
2011,
387,
11928,
555,
279,
198,
22602,
627,
3784,
37313,
42671,
25,
1796,
17752,
60,
55609,
198,
5715,
279,
4573,
315,
279,
8859,
8995,
1665,
627,
797,
13,
510,
2118,
5317,
8995,
9520,
1054,
657,
1026,
9520,
1054,
2569,
2192,
863,
933,
3784,
37313,
3537,
53810,
25,
30226,
17752,
11,
610,
60,
55609,
198,
5715,
264,
2472,
315,
4797,
5811,
5144,
311,
6367,
14483,
13
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.sagemaker_endpoint.SagemakerEndpoint.html |
2b233b7257e7-4 | Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶
langchain.llms.beam.Beam¶
class langchain.llms.beam.Beam(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, model_name: str = '', name: str = '', cpu: str = '', memory: str = '', gpu: str = '', python_version: str = '', python_packages: List[str] = [], max_length: str = '', url: str = '', model_kwargs: Dict[str, Any] = None, beam_client_id: str = '', beam_client_secret: str = '', app_id: Optional[str] = None)[source]¶
Bases: LLM
Wrapper around Beam API for gpt2 large language model.
To use, you should have the beam-sdk python package installed,
and the environment variable BEAM_CLIENT_ID set with your client id
and BEAM_CLIENT_SECRET set with your client secret. Information on how
to get these is available here: https://docs.beam.cloud/account/api-keys.
The wrapper can then be called as follows, where the name, cpu, memory, gpu,
python version, and python packages can be updated accordingly. Once deployed,
the instance can be called.
Example
llm = Beam(model_name="gpt2",
name="langchain-gpt2",
cpu=8,
memory="32Gi",
gpu="A10G",
python_version="python3.8",
python_packages=[
"diffusers[torch]>=0.10",
"transformers",
"torch",
"pillow",
"accelerate",
"safetensors",
"xformers",],
max_length=50)
llm._deploy()
call_result = llm._call(input)
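After _deploy() succeeds, the wrapper can also be used through the standard LLM interface instead of the private _call method; a short sketch with an illustrative prompt:
.. code-block:: python
# Sketch: idiomatic invocation of the deployed Beam wrapper via __call__.
response = llm("Explain beam search in one sentence.")
print(response)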
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param app_id: Optional[str] = None¶
param beam_client_id: str = ''¶
param beam_client_secret: str = ''¶
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param cpu: str = ''¶
param gpu: str = ''¶
param max_length: str = ''¶
param memory: str = ''¶
param model_kwargs: Dict[str, Any] [Optional]¶
Holds any model parameters valid for create call not
explicitly specified.
param model_name: str = ''¶
param name: str = ''¶
param python_packages: List[str] = []¶
param python_version: str = ''¶
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param url: str = ''¶
model endpoint to use
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
app_creation() → None[source]¶
Creates a Python file which will contain your Beam app definition.
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator build_extra » all fields[source]¶
Build extra kwargs from additional params that were passed in.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
run_creation() → None[source]¶
Creates a Python file which will be deployed on beam.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that the API key and python package exist in the environment.
property authorization: str¶
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶
langchain.llms.rwkv.RWKV¶
class langchain.llms.rwkv.RWKV(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, model: str, tokens_path: str, strategy: str = 'cpu fp32', rwkv_verbose: bool = True, temperature: float = 1.0, top_p: float = 0.5, penalty_alpha_frequency: float = 0.4, penalty_alpha_presence: float = 0.4, CHUNK_LEN: int = 256, max_tokens_per_generation: int = 256, client: Any = None, tokenizer: Any = None, pipeline: Any = None, model_tokens: Any = None, model_state: Any = None)[source]¶
Bases: LLM, BaseModel
Wrapper around RWKV language models.
To use, you should have the rwkv python package installed, the
pre-trained model file, and the model’s config information.
Example
from langchain.llms import RWKV
model = RWKV(model="./models/rwkv-3b-fp16.bin", strategy="cpu fp32")
# Simplest invocation
response = model("Once upon a time, ")
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param CHUNK_LEN: int = 256¶
Batch size for prompt processing.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param max_tokens_per_generation: int = 256¶
Maximum number of tokens to generate.
param model: str [Required]¶
Path to the pre-trained RWKV model file.
param penalty_alpha_frequency: float = 0.4¶
Positive values penalize new tokens based on their existing frequency
in the text so far, decreasing the model’s likelihood to repeat the same
line verbatim.
param penalty_alpha_presence: float = 0.4¶
Positive values penalize new tokens based on whether they appear
in the text so far, increasing the model’s likelihood to talk about
new topics.
param rwkv_verbose: bool = True¶
Print debug information.
param strategy: str = 'cpu fp32'¶
Strategy string describing the device and precision to run the model with, e.g. 'cpu fp32'.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param temperature: float = 1.0¶
The temperature to use for sampling.
param tokens_path: str [Required]¶
Path to the RWKV tokens file.
param top_p: float = 0.5¶
The top-p value to use for sampling.
param verbose: bool [Optional]¶
Whether to print out response text.
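Putting the sampling parameters above together, a sketch of constructing the wrapper with non-default settings; the model and tokenizer paths are placeholders:
.. code-block:: python
# Sketch: RWKV with explicit sampling settings; file paths are placeholders.
from langchain.llms import RWKV

llm = RWKV(
    model="./models/rwkv-3b-fp16.bin",          # pre-trained model file
    tokens_path="./models/20B_tokenizer.json",  # tokenizer file (placeholder name)
    strategy="cpu fp32",
    temperature=0.7,
    top_p=0.9,
    max_tokens_per_generation=128,
)
print(llm("Once upon a time, "))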
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
913,
1973,
29938,
5796,
65291,
25,
528,
284,
220,
4146,
55609,
198,
28409,
1396,
315,
11460,
311,
7068,
627,
913,
1646,
25,
610,
510,
8327,
60,
55609,
198,
1858,
311,
279,
864,
70024,
47306,
83807,
1646,
1052,
627,
913,
16750,
27731,
41232,
25,
2273,
284,
220,
15,
13,
19,
55609,
198,
36590,
2819,
47426,
553,
502,
11460,
3196,
389,
872,
6484,
11900,
198,
258,
279,
1495,
779,
3117,
11,
44649,
279,
1646,
753,
29736,
311,
13454,
279,
1890,
198,
1074,
2807,
55848,
35047,
913,
16750,
27731,
57503,
25,
2273,
284,
220,
15,
13,
19,
55609,
198,
36590,
2819,
47426,
553,
502,
11460,
3196,
389,
3508,
814,
5101,
198,
258,
279,
1495,
779,
3117,
11,
7859,
279,
1646,
753,
29736,
311,
3137,
922,
198,
943,
13650,
35047,
913,
27082,
44508,
69021,
25,
1845,
284,
3082,
55609,
198,
9171,
7542,
2038,
627,
913,
8446,
25,
610,
284,
364,
16881,
12276,
843,
6,
55609,
198,
3404,
2317,
3321,
627,
913,
9681,
25,
12536,
53094,
17752,
5163,
284,
2290,
55609,
198,
16309,
311,
923,
311,
279,
1629,
11917,
627,
913,
9499,
25,
2273,
284,
220,
16,
13,
15,
55609,
198,
791,
9499,
311,
1005,
369,
25936,
627,
913,
11460,
2703,
25,
610,
510,
8327,
60,
55609,
198,
1858,
311,
279,
47306,
83807,
11460,
1052,
627,
913,
1948,
623,
25,
2273,
284,
220,
15,
13,
20,
55609,
198,
791,
1948,
2320,
907,
311,
1005,
369,
25936,
627,
913,
14008,
25,
1845,
510,
15669,
60,
55609,
198,
25729,
311,
1194,
704,
2077,
1495,
627,
565,
6797,
3889,
41681,
25,
610,
11,
3009,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
27777,
25,
12536,
58,
33758,
53094,
58,
4066,
7646,
3126,
1145,
5464,
7646,
2087,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
610,
55609,
198,
4061,
20044,
323,
1629,
279,
445,
11237,
389,
279,
2728,
10137,
323,
1988,
627,
7847,
945,
13523,
84432,
13044,
25,
1796,
17752,
1145,
3009,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
27777,
25,
12536,
58,
33758,
53094,
58,
4066,
7646,
3126,
1145,
5464,
7646,
2087,
5163,
284,
2290,
11,
12039,
9681,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
445,
11237,
2122,
55609,
198,
6869,
279,
445,
11237,
389,
279,
2728,
10137,
323,
1988,
13
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.rwkv.RWKV.html |
bbba48e1749a-2 | Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
run_rnn(_tokens: List[str], newline_adj: int = 0) → Any[source]¶
rwkv_generate(prompt: str) → str[source]¶
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that the python package exists in the environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶
langchain.llms.writer.Writer¶
class langchain.llms.writer.Writer(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, writer_org_id: Optional[str] = None, model_id: str = 'palmyra-instruct', min_tokens: Optional[int] = None, max_tokens: Optional[int] = None, temperature: Optional[float] = None, top_p: Optional[float] = None, stop: Optional[List[str]] = None, presence_penalty: Optional[float] = None, repetition_penalty: Optional[float] = None, best_of: Optional[int] = None, logprobs: bool = False, n: Optional[int] = None, writer_api_key: Optional[str] = None, base_url: Optional[str] = None)[source]¶
Bases: LLM
Wrapper around Writer large language models.
To use, you should have the environment variable WRITER_API_KEY and
WRITER_ORG_ID set with your API key and organization ID respectively.
Example
from langchain import Writer
writer = Writer(model_id="palmyra-base")
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param base_url: Optional[str] = None¶
Base url to use; if None, the url is chosen based on the model name.
param best_of: Optional[int] = None¶
Generates this many completions server-side and returns the “best”.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param logprobs: bool = False¶
Whether to return log probabilities.
param max_tokens: Optional[int] = None¶
Maximum number of tokens to generate.
param min_tokens: Optional[int] = None¶
Minimum number of tokens to generate.
param model_id: str = 'palmyra-instruct'¶
Model name to use.
param n: Optional[int] = None¶
How many completions to generate.
param presence_penalty: Optional[float] = None¶
Penalizes repeated tokens regardless of frequency.
param repetition_penalty: Optional[float] = None¶
Penalizes repeated tokens according to frequency.
param stop: Optional[List[str]] = None¶
Sequences at which completion generation will stop.
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param temperature: Optional[float] = None¶
What sampling temperature to use.
param top_p: Optional[float] = None¶
Total probability mass of tokens to consider at each step.
param verbose: bool [Optional]¶
Whether to print out response text.
param writer_api_key: Optional[str] = None¶
Writer API key.
param writer_org_id: Optional[str] = None¶
Writer organization ID.
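A sketch combining the environment variables and parameters described above; the values are illustrative:
.. code-block:: python
# Sketch: configure the Writer wrapper via environment variables plus parameters.
import os

os.environ["WRITER_API_KEY"] = "..."  # your Writer API key
os.environ["WRITER_ORG_ID"] = "..."   # your Writer organization ID

from langchain.llms import Writer

writer = Writer(
    model_id="palmyra-instruct",  # the default; "palmyra-base" is used in the example above
    temperature=0.7,
    max_tokens=256,
)
print(writer("Write a tagline for a note-taking app."))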
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]¶
Get the token IDs present in the text.
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
predict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
validator raise_deprecation » all fields¶
Raise deprecation warning if callback_manager is used.
save(file_path: Union[Path, str]) → None¶
Save the LLM.
Parameters
file_path – Path to file to save the LLM to.
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
validator set_verbose » verbose¶
If verbose is None, set it.
This allows users to pass in None as verbose to access the global setting.
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
validator validate_environment » all fields[source]¶
Validate that the API key and organization ID exist in the environment.
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. [“langchain”, “llms”, “openai”]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {“openai_api_key”: “OPENAI_API_KEY”}
property lc_serializable: bool¶
Return whether or not the class is serializable.
model Config[source]¶
Bases: object
Configuration for this pydantic object.
extra = 'forbid'¶
langchain.llms.openai.OpenAIChat¶
class langchain.llms.openai.OpenAIChat(*, cache: Optional[bool] = None, verbose: bool = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, callback_manager: Optional[BaseCallbackManager] = None, tags: Optional[List[str]] = None, client: Any = None, model_name: str = 'gpt-3.5-turbo', model_kwargs: Dict[str, Any] = None, openai_api_key: Optional[str] = None, openai_api_base: Optional[str] = None, openai_proxy: Optional[str] = None, max_retries: int = 6, prefix_messages: List = None, streaming: bool = False, allowed_special: Union[Literal['all'], AbstractSet[str]] = {}, disallowed_special: Union[Literal['all'], Collection[str]] = 'all')[source]¶
Bases: BaseLLM
Wrapper around OpenAI Chat large language models.
To use, you should have the openai python package installed, and the
environment variable OPENAI_API_KEY set with your API key.
Any parameters that are valid to be passed to the openai.create call can be passed
in, even if not explicitly saved on this class.
Example
from langchain.llms import OpenAIChat
openaichat = OpenAIChat(model_name="gpt-3.5-turbo")
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param allowed_special: Union[Literal['all'], AbstractSet[str]] = {}¶
Set of special tokens that are allowed.
param cache: Optional[bool] = None¶
param callback_manager: Optional[BaseCallbackManager] = None¶
5317,
8995,
60098,
1026,
5949,
2192,
13250,
15836,
16047,
55609,
198,
1058,
8859,
8995,
60098,
1026,
5949,
2192,
13250,
15836,
16047,
4163,
11,
6636,
25,
12536,
58,
2707,
60,
284,
2290,
11,
14008,
25,
1845,
284,
2290,
11,
27777,
25,
12536,
58,
33758,
53094,
58,
4066,
7646,
3126,
1145,
5464,
7646,
2087,
5163,
284,
2290,
11,
4927,
12418,
25,
12536,
58,
4066,
7646,
2087,
60,
284,
2290,
11,
9681,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
3016,
25,
5884,
284,
2290,
11,
1646,
1292,
25,
610,
284,
364,
70,
418,
12,
18,
13,
20,
2442,
324,
754,
518,
1646,
37335,
25,
30226,
17752,
11,
5884,
60,
284,
2290,
11,
1825,
2192,
11959,
3173,
25,
12536,
17752,
60,
284,
2290,
11,
1825,
2192,
11959,
7806,
25,
12536,
17752,
60,
284,
2290,
11,
1825,
2192,
30812,
25,
12536,
17752,
60,
284,
2290,
11,
1973,
1311,
4646,
25,
528,
284,
220,
21,
11,
9436,
24321,
25,
1796,
284,
2290,
11,
17265,
25,
1845,
284,
3641,
11,
5535,
42729,
25,
9323,
58,
17802,
681,
543,
4181,
13822,
1681,
17752,
5163,
284,
16857,
834,
21642,
42729,
25,
9323,
58,
17802,
681,
543,
4181,
11348,
17752,
5163,
284,
364,
543,
13588,
2484,
60,
55609,
198,
33,
2315,
25,
5464,
4178,
44,
198,
11803,
2212,
5377,
15836,
13149,
3544,
4221,
4211,
627,
1271,
1005,
11,
499,
1288,
617,
279,
1825,
2192,
10344,
6462,
10487,
11,
323,
279,
198,
24175,
3977,
30941,
15836,
11669,
6738,
743,
449,
701,
5446,
1401,
627,
8780,
5137,
430,
527,
2764,
311,
387,
5946,
311,
279,
1825,
2192,
2581,
1650,
649,
387,
5946,
198,
258,
11,
1524,
422,
539,
21650,
6924,
389,
420,
538,
627,
13617,
198,
1527,
8859,
8995,
60098,
1026,
1179,
5377,
15836,
16047,
198,
2569,
64,
718,
266,
284,
5377,
15836,
16047,
7790,
1292,
429,
70,
418,
12,
18,
13,
20,
2442,
324,
754,
1158,
4110,
264,
502,
1646,
555,
23115,
323,
69772,
1988,
828,
505,
16570,
6105,
627,
36120,
54129,
422,
279,
1988,
828,
4250,
387,
16051,
311,
1376,
264,
2764,
1646,
627,
913,
5535,
42729,
25,
9323,
58,
17802,
681,
543,
4181,
13822,
1681,
17752,
5163,
284,
4792,
55609,
198,
1681,
315,
3361,
11460,
430,
527,
5535,
9174,
913,
6636,
25,
12536,
58,
2707,
60,
284,
2290,
55609,
198,
913,
4927,
12418,
25,
12536,
58,
4066,
7646,
2087,
60,
284,
2290,
55609
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.openai.OpenAIChat.html |
05e8e554f6ac-1 | param callback_manager: Optional[BaseCallbackManager] = None¶
param callbacks: Callbacks = None¶
param disallowed_special: Union[Literal['all'], Collection[str]] = 'all'¶
Set of special tokens that are not allowed.
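A minimal sketch of how allowed_special and disallowed_special are assumed to interact with token counting (they are passed through to the tiktoken encoder, so a special-token string such as <|endoftext|> must be explicitly allowed before it can appear in counted text):
# Sketch only: allowing one special token so that token counting does not
# reject text containing it; exact counts depend on the model encoding.
from langchain.llms import OpenAIChat

llm = OpenAIChat(
    model_name="gpt-3.5-turbo",
    allowed_special={"<|endoftext|>"},
)
print(llm.get_num_tokens("first part<|endoftext|>second part"))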
param max_retries: int = 6¶
Maximum number of retries to make when generating.
param model_kwargs: Dict[str, Any] [Optional]¶
Holds any model parameters that are valid for the create call but not explicitly specified above.
param model_name: str = 'gpt-3.5-turbo'¶
Model name to use.
param openai_api_base: Optional[str] = None¶
param openai_api_key: Optional[str] = None¶
param openai_proxy: Optional[str] = None¶
param prefix_messages: List [Optional]¶
Series of messages to prepend to the chat input.
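A minimal sketch, assuming prefix_messages takes OpenAI-style role/content dictionaries that are prepended to every prompt sent to the chat endpoint:
# Sketch only: a system message fixed as a prefix for every call.
from langchain.llms import OpenAIChat

llm = OpenAIChat(
    model_name="gpt-3.5-turbo",
    prefix_messages=[
        {"role": "system", "content": "You are a terse assistant; answer in one sentence."}
    ],
)
print(llm("What does a callback manager do?"))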
param streaming: bool = False¶
Whether to stream the results or not.
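A minimal streaming sketch, assuming the stock stdout streaming callback handler is available in this version of the library:
# Sketch only: tokens are assumed to be emitted to the callback handler as
# they arrive when streaming=True.
from langchain.llms import OpenAIChat
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

llm = OpenAIChat(
    model_name="gpt-3.5-turbo",
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()],
)
llm("Write a haiku about rate limits.")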
param tags: Optional[List[str]] = None¶
Tags to add to the run trace.
param verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → str¶
Check the cache and run the LLM on the given prompt and input.
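The cache referred to here is the module-level langchain.llm_cache; a minimal sketch, assuming an in-memory cache so that a repeated identical prompt is served without a second API call:
# Sketch only: enabling the global in-memory LLM cache.
import langchain
from langchain.cache import InMemoryCache
from langchain.llms import OpenAIChat

langchain.llm_cache = InMemoryCache()
llm = OpenAIChat(model_name="gpt-3.5-turbo")
llm("What is a token?")  # first call hits the API
llm("What is a token?")  # assumed to be answered from the cache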
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input. | [
913,
4927,
12418,
25,
12536,
58,
4066,
7646,
2087,
60,
284,
2290,
55609,
198,
913,
27777,
25,
23499,
82,
284,
2290,
55609,
198,
913,
834,
21642,
42729,
25,
9323,
58,
17802,
681,
543,
4181,
11348,
17752,
5163,
284,
364,
543,
6,
55609,
198,
1681,
315,
3361,
11460,
430,
527,
539,
5535,
9174,
913,
1973,
1311,
4646,
25,
528,
284,
220,
21,
55609,
198,
28409,
1396,
315,
61701,
311,
1304,
994,
24038,
627,
913,
1646,
37335,
25,
30226,
17752,
11,
5884,
60,
510,
15669,
60,
55609,
198,
39,
18938,
904,
1646,
5137,
2764,
369,
1893,
1650,
539,
21650,
5300,
627,
913,
1646,
1292,
25,
610,
284,
364,
70,
418,
12,
18,
13,
20,
2442,
324,
754,
6,
55609,
198,
1747,
836,
311,
1005,
627,
913,
1825,
2192,
11959,
7806,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
1825,
2192,
11959,
3173,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
1825,
2192,
30812,
25,
12536,
17752,
60,
284,
2290,
55609,
198,
913,
9436,
24321,
25,
1796,
510,
15669,
60,
55609,
198,
26625,
315,
6743,
369,
13149,
1988,
627,
913,
17265,
25,
1845,
284,
3641,
55609,
198,
25729,
311,
4365,
279,
3135,
477,
539,
627,
913,
9681,
25,
12536,
53094,
17752,
5163,
284,
2290,
55609,
198,
16309,
311,
923,
311,
279,
1629,
11917,
627,
913,
14008,
25,
1845,
510,
15669,
60,
55609,
198,
25729,
311,
1194,
704,
2077,
1495,
627,
565,
6797,
3889,
41681,
25,
610,
11,
3009,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
27777,
25,
12536,
58,
33758,
53094,
58,
4066,
7646,
3126,
1145,
5464,
7646,
2087,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
610,
55609,
198,
4061,
20044,
323,
1629,
279,
445,
11237,
389,
279,
2728,
10137,
323,
1988,
627,
7847,
945,
13523,
84432,
13044,
25,
1796,
17752,
1145,
3009,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
27777,
25,
12536,
58,
33758,
53094,
58,
4066,
7646,
3126,
1145,
5464,
7646,
2087,
5163,
284,
2290,
11,
12039,
9681,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
445,
11237,
2122,
55609,
198,
6869,
279,
445,
11237,
389,
279,
2728,
10137,
323,
1988,
13
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.openai.OpenAIChat.html |
05e8e554f6ac-2 | Run the LLM on the given prompt and input.
async agenerate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
classmethod all_required_field_names() → Set¶
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str¶
Predict text from text.
async apredict_messages(messages: List[BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → BaseMessage¶
Predict message from messages.
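A minimal async sketch, assuming the asynchronous variants mirror their synchronous counterparts and can be awaited from an asyncio event loop:
# Sketch only: awaiting agenerate on a single prompt.
import asyncio

from langchain.llms import OpenAIChat

async def main() -> None:
    llm = OpenAIChat(model_name="gpt-3.5-turbo")
    result = await llm.agenerate(["Name one prime number."])
    print(result.generations[0][0].text)

asyncio.run(main())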
validator build_extra » all fields[source]¶
Build extra kwargs from additional params that were passed in.
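In practice this validator is assumed to collect constructor keywords that are not declared fields into model_kwargs, for example:
# Sketch only: temperature is not a declared field, so it is assumed to end
# up in model_kwargs via build_extra.
from langchain.llms import OpenAIChat

llm = OpenAIChat(model_name="gpt-3.5-turbo", temperature=0.2)
print(llm.model_kwargs)  # expected (assumption): {'temperature': 0.2}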
dict(**kwargs: Any) → Dict¶
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → LLMResult¶
Run the LLM on the given prompt and input.
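A minimal sketch of generate, assuming the returned LLMResult holds one list of Generation objects per input prompt plus provider metadata in llm_output:
# Sketch only: reading the text of the first generation for the first prompt.
from langchain.llms import OpenAIChat

llm = OpenAIChat(model_name="gpt-3.5-turbo")
result = llm.generate(["Give a one-line definition of tokenization."])
print(result.generations[0][0].text)
print(result.llm_output)  # e.g. token usage, if the provider returns it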
generate_prompt(prompts: List[PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) → LLMResult¶
Take in a list of prompt values and return an LLMResult.
get_num_tokens(text: str) → int¶
Get the number of tokens present in the text.
get_num_tokens_from_messages(messages: List[BaseMessage]) → int¶
Get the number of tokens in the message.
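A minimal token-counting sketch; exact counts depend on the tiktoken encoding selected for the configured model, so the numbers are illustrative only:
# Sketch only: counting tokens for raw text and for a chat message.
from langchain.llms import OpenAIChat
from langchain.schema import HumanMessage

llm = OpenAIChat(model_name="gpt-3.5-turbo")
print(llm.get_num_tokens("How long is this sentence in tokens?"))
print(llm.get_num_tokens_from_messages([HumanMessage(content="Hello there")]))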
get_token_ids(text: str) → List[int][source]¶ | [
6869,
279,
445,
11237,
389,
279,
2728,
10137,
323,
1988,
627,
7847,
945,
13523,
62521,
84432,
13044,
25,
1796,
43447,
15091,
1150,
1145,
3009,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
27777,
25,
12536,
58,
33758,
53094,
58,
4066,
7646,
3126,
1145,
5464,
7646,
2087,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
445,
11237,
2122,
55609,
198,
18293,
304,
264,
1160,
315,
10137,
2819,
323,
471,
459,
445,
11237,
2122,
627,
27853,
682,
19265,
5121,
9366,
368,
11651,
2638,
55609,
198,
7847,
1469,
9037,
7383,
25,
610,
11,
12039,
3009,
25,
12536,
58,
14405,
17752,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
610,
55609,
198,
54644,
1495,
505,
1495,
627,
7847,
1469,
9037,
24321,
56805,
25,
1796,
58,
4066,
2097,
1145,
12039,
3009,
25,
12536,
58,
14405,
17752,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
5464,
2097,
55609,
198,
54644,
1984,
505,
6743,
627,
16503,
1977,
32958,
4194,
8345,
4194,
682,
5151,
76747,
60,
55609,
198,
11313,
5066,
16901,
505,
5217,
3712,
430,
1051,
5946,
304,
627,
8644,
22551,
9872,
25,
5884,
8,
11651,
30226,
55609,
198,
5715,
264,
11240,
315,
279,
445,
11237,
627,
19927,
84432,
13044,
25,
1796,
17752,
1145,
3009,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
27777,
25,
12536,
58,
33758,
53094,
58,
4066,
7646,
3126,
1145,
5464,
7646,
2087,
5163,
284,
2290,
11,
12039,
9681,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
445,
11237,
2122,
55609,
198,
6869,
279,
445,
11237,
389,
279,
2728,
10137,
323,
1988,
627,
19927,
62521,
84432,
13044,
25,
1796,
43447,
15091,
1150,
1145,
3009,
25,
12536,
53094,
17752,
5163,
284,
2290,
11,
27777,
25,
12536,
58,
33758,
53094,
58,
4066,
7646,
3126,
1145,
5464,
7646,
2087,
5163,
284,
2290,
11,
3146,
9872,
25,
5884,
8,
11651,
445,
11237,
2122,
55609,
198,
18293,
304,
264,
1160,
315,
10137,
2819,
323,
471,
459,
445,
11237,
2122,
627,
456,
4369,
29938,
7383,
25,
610,
8,
11651,
528,
55609,
198,
1991,
279,
1396,
315,
11460,
3118,
304,
279,
1495,
627,
456,
4369,
29938,
5791,
24321,
56805,
25,
1796,
58,
4066,
2097,
2526,
11651,
528,
55609,
198,
1991,
279,
1396,
315,
11460,
304,
279,
1984,
627,
456,
6594,
8237,
7383,
25,
610,
8,
11651,
1796,
19155,
1483,
2484,
60,
55609
] | https://langchain.readthedocs.io/en/latest/llms/langchain.llms.openai.OpenAIChat.html |