ashutoshchoudhari committed
Commit d679794 · 1 Parent(s): 0b756d7

Update SDK version to 1.43.2 and remove outdated app documentation

Files changed (2)
  1. README.md +1 -1
  2. app_documentation.md +0 -127
README.md CHANGED
@@ -4,7 +4,7 @@ emoji: 🌍
 colorFrom: pink
 colorTo: indigo
 sdk: streamlit
-sdk_version: 1.42.0
+sdk_version: 1.43.2
 app_file: app.py
 pinned: false
 license: mit
app_documentation.md DELETED
@@ -1,127 +0,0 @@
# Langchain Search Assistant

## Overview
This application is a research assistant built with Langchain that searches multiple knowledge sources, including Wikipedia, arXiv, and the web via DuckDuckGo, and uses Groq's LLM to produce context-aware responses to user queries.

## Features
- **Multi-source search**: Access information from Wikipedia, arXiv scientific papers, and web results
- **Conversational memory**: Retains context from previous interactions
- **Streaming responses**: See the AI's response generated in real time
- **User-friendly interface**: Clean Streamlit UI for easy interaction

## Technical Components
- **LLM**: Groq's Llama3-8b-8192 model (with fallback support for Ollama models)
- **Embeddings**: Hugging Face's all-MiniLM-L6-v2 (see the sketch after this list)
- **Search Tools**:
  - Wikipedia API
  - arXiv API
  - DuckDuckGo Search
- **Framework**: Langchain for agent orchestration
- **Frontend**: Streamlit

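
The component list mentions all-MiniLM-L6-v2 embeddings, but none of the snippets in this document show how they are configured. A minimal sketch, assuming the langchain-huggingface package from the Installation section (the actual wiring in app.py may differ):

```python
# Hypothetical embeddings setup; app.py may configure this differently.
from langchain_huggingface import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-MiniLM-L6-v2"  # the model named above
)
```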
## Project Structure
- **app.py**: Main application file containing the Streamlit UI and Langchain integration
- **requirements.txt**: Dependencies required to run the application
- **README.md**: Project metadata and description for Hugging Face Spaces
- **tools_agents.ipynb**: Jupyter notebook demonstrating how to use Langchain tools and agents
- **.github/workflows/main.yaml**: GitHub Actions workflow for deploying to Hugging Face Spaces
- **.gitattributes**: Git LFS configuration for handling large files
- **.gitignore**: Standard Python gitignore file
- **LICENSE**: MIT License file
- **app_documentation.md**: This documentation file
## Implementation Details

### LLM Integration
The application uses Groq's API to access the Llama3-8b-8192 model with streaming enabled:

```python
llm = ChatGroq(
    groq_api_key=st.session_state.api_key,
    model_name="Llama3-8b-8192",
    streaming=True
)
```
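The snippet reads the Groq key from `st.session_state.api_key`; how the key gets into session state is not shown in this document. One common Streamlit pattern is a sidebar input, sketched below with an illustrative label (not taken from app.py):

```python
# Hypothetical key-entry wiring; app.py may collect the key differently.
import streamlit as st

st.session_state.api_key = st.sidebar.text_input(
    "Groq API Key", type="password"  # masked text field in the sidebar
)
```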
Alternative local models can also be configured with Ollama (left commented out here):

```python
# llm = ChatOllama(base_url=OLLAMA_WSL_IP, model="llama3.1", streaming=True)
```
### Search Tools Configuration
The app configures three primary search tools:

1. **Wikipedia Search**:
```python
api_wrapper_wiki = WikipediaAPIWrapper(top_k_results=3, doc_content_chars_max=10000)
wiki = WikipediaQueryRun(api_wrapper=api_wrapper_wiki)
wiki_tool = Tool(
    name="Wikipedia",
    func=wiki.run,
    description="This tool uses the Wikipedia API to search for a topic."
)
```

2. **arXiv Search**:
```python
api_wrapper_arxiv = ArxivAPIWrapper(top_k_results=5, doc_content_chars_max=10000)
arxiv = ArxivQueryRun(api_wrapper=api_wrapper_arxiv)
arxiv_tool = Tool(
    name="arxiv",
    func=arxiv.run,
    description="Searches arXiv for papers matching the query.",
)
```

3. **DuckDuckGo Web Search**:
```python
api_wrapper_ddg = DuckDuckGoSearchAPIWrapper(region="us-en", time="y", max_results=10)
ddg = DuckDuckGoSearchResults(
    api_wrapper=api_wrapper_ddg,
    output_format="string",
    handle_tool_error=True,
    handle_validation_error=True)
ddg_tool = Tool(
    name="DuckDuckGo_Search",
    func=ddg.run,
    description="Searches for search queries using the DuckDuckGo Search engine."
)
```
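The agent in the next section expects these tools gathered into a single `tools` list. The sketch below shows that assembly together with the imports the snippets in this document appear to assume; exact import paths vary across LangChain versions, so treat them as a reasonable guess rather than the code in app.py:

```python
# Imports assumed by the snippets in this document (recent langchain /
# langchain-community layouts; older releases may expose different paths).
from langchain.agents import AgentType, Tool, initialize_agent
from langchain.memory import ConversationBufferWindowMemory
from langchain_community.tools import ArxivQueryRun, DuckDuckGoSearchResults, WikipediaQueryRun
from langchain_community.utilities import (
    ArxivAPIWrapper,
    DuckDuckGoSearchAPIWrapper,
    WikipediaAPIWrapper,
)
from langchain_groq import ChatGroq
# from langchain_ollama import ChatOllama  # only needed for the local Ollama option

# Tool ordering is illustrative; app.py may register them differently.
tools = [wiki_tool, arxiv_tool, ddg_tool]
```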
### Agent Configuration
The system uses the CHAT_CONVERSATIONAL_REACT_DESCRIPTION agent type with a windowed conversation memory that keeps the last five exchanges (k=5):

```python
memory = ConversationBufferWindowMemory(k=5, memory_key="chat_history", return_messages=True)

search_agent = initialize_agent(
    tools=tools,
    llm=llm,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    max_iterations=10,
    memory=memory,
    handle_parsing_errors=True)
```
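The streaming display mentioned under Features is commonly wired up with LangChain's StreamlitCallbackHandler, which renders the agent's intermediate steps and answer as they are produced. The chat loop below is an illustrative pattern, not necessarily the exact code in app.py:

```python
# Illustrative Streamlit chat loop around search_agent; prompt handling
# and layout in app.py may differ.
import streamlit as st
from langchain_community.callbacks import StreamlitCallbackHandler

if prompt := st.chat_input("Ask a research question"):
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        # Stream the agent's reasoning and tokens into the current container.
        st_callback = StreamlitCallbackHandler(st.container())
        response = search_agent.run(prompt, callbacks=[st_callback])
        st.write(response)
```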
## Setup Requirements
1. Groq API key
2. Hugging Face token (for embeddings)
3. Python environment with required dependencies

## Installation Instructions
Install the required packages using:

```bash
pip install -r requirements.txt
```

Required packages include:
- arxiv
- wikipedia
- langchain, langchain-community, langchain-huggingface, langchain-groq
- openai
- duckduckgo-search
- ollama, langchain-ollama (for local model support)

## Environment Variables
Create a `.env` file with the following variables:
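
The variable list itself does not appear in this document. Based on the Setup Requirements section, a typical `.env` would hold a Groq API key and a Hugging Face token; the names below, and the use of python-dotenv to load them, are assumptions rather than details taken from app.py:

```python
# Hypothetical .env contents (variable names are assumptions):
#   GROQ_API_KEY=...
#   HF_TOKEN=...
import os

from dotenv import load_dotenv  # python-dotenv; one possible loading mechanism

load_dotenv()  # read .env into the process environment
groq_api_key = os.getenv("GROQ_API_KEY")
hf_token = os.getenv("HF_TOKEN")
```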