ashutoshchoudhari committed on
Commit 9531029 · Parent(s): d679794

Add live demo link, usage instructions, and deployment details to README.md

Files changed (1): README.md (+47 -1)
README.md CHANGED

@@ -18,6 +18,9 @@ Check out the configuration reference at https://huggingface.co/docs/hub/spaces-
 ## Overview
 This application is a powerful research assistant built with Langchain that can search across multiple knowledge sources including Wikipedia, arXiv, and the web via DuckDuckGo. It leverages Groq's LLM capabilities to provide intelligent, context-aware responses to user queries.
 
+## Live Demo
+Try the application live at: [https://huggingface.co/spaces/ashutoshchoudhari/Search-Engine-LLM-app](https://huggingface.co/spaces/ashutoshchoudhari/Search-Engine-LLM-app)
+
 ## Features
 - **Multi-source search**: Access information from Wikipedia, arXiv scientific papers, and web results
 - **Conversational memory**: Retains context from previous interactions
@@ -139,4 +142,47 @@ Required packages include:
 - ollama, langchain-ollama (for local model support)
 
 ## Environment Variables
-Create a `.env` file with the following variables:
+Create a `.env` file with the following variables:
+```
+GROQ_API_KEY=your_groq_api_key_here
+HF_TOKEN=your_huggingface_token_here
+```
+
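A file in this shape can be parsed as simple KEY=VALUE lines; a minimal sketch of that idea (the `load_dotenv_minimal` helper is illustrative, not part of the repository, which may use the `python-dotenv` package instead):

```python
import os
import tempfile

def load_dotenv_minimal(path, env=None):
    """Parse KEY=VALUE lines from a .env file into a mapping (os.environ by default)."""
    env = os.environ if env is None else env
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks and comments; keep only KEY=VALUE lines.
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                env.setdefault(key.strip(), value.strip())

# Write a sample .env (placeholder values) and load it into a plain dict.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("GROQ_API_KEY=your_groq_api_key_here\n")
    f.write("HF_TOKEN=your_huggingface_token_here\n")
    env_path = f.name

config = {}
load_dotenv_minimal(env_path, config)
print(config["GROQ_API_KEY"])  # your_groq_api_key_here
```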
+## Usage
+1. Start the application using Streamlit:
+```bash
+streamlit run app.py
+```
+2. Enter your Groq API key in the sidebar when prompted
+3. Type your research question in the chat input box
+4. The agent will search across available sources and provide a comprehensive response
+5. Your conversation history will be maintained throughout the session
+
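Step 5's session memory follows the usual Streamlit session-state pattern; a small sketch of that idea, using a plain dict in place of `st.session_state` (the `handle_message` helper is illustrative, not taken from the app):

```python
# Sketch of the session-history pattern described in steps 2-5.
# `session_state` stands in for Streamlit's st.session_state; the real
# app's variable names may differ.
session_state = {}

def handle_message(state, role, content):
    """Append a chat message, creating the history list on first use."""
    state.setdefault("messages", []).append({"role": role, "content": content})
    return state["messages"]

handle_message(session_state, "user", "What are the latest developments in quantum computing?")
handle_message(session_state, "assistant", "Here is a summary drawn from arXiv and the web...")
print(len(session_state["messages"]))  # 2
```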
+## Example Queries
+- "What are the latest developments in quantum computing?"
+- "Explain the concept of transformer models in NLP"
+- "What were the key findings from the recent climate change report?"
+- "Tell me about the history and applications of reinforcement learning"
+
+## Deployment
+This project is configured to deploy to Hugging Face Spaces using GitHub Actions. The workflow in `.github/workflows/main.yaml` automatically syncs the repository to Hugging Face when changes are pushed to the main branch.
+
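The workflow file itself is not shown in this commit; the sync described is typically done with a workflow like the following, based on the pattern in the Hugging Face Spaces documentation (job names, action versions, and the push URL here are assumptions, not the repository's actual file):

```yaml
# Hypothetical sync workflow -- the repository's actual
# .github/workflows/main.yaml may differ.
name: Sync to Hugging Face hub
on:
  push:
    branches: [main]
jobs:
  sync-to-hub:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0   # full history so the push is not shallow
      - name: Push to hub
        env:
          HF_TOKEN: ${{ secrets.HF_TOKEN }}
        run: git push https://ashutoshchoudhari:$HF_TOKEN@huggingface.co/spaces/ashutoshchoudhari/Search-Engine-LLM-app main
```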
+### Live Application
+The app is currently deployed and accessible at: [https://huggingface.co/spaces/ashutoshchoudhari/Search-Engine-LLM-app](https://huggingface.co/spaces/ashutoshchoudhari/Search-Engine-LLM-app)
+
+### Local Development
+For local development, you can use:
+```bash
+streamlit run app.py
+```
+
+## License
+This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
+
+## Acknowledgments
+- Langchain for providing the agent and tool framework
+- Groq for the LLM API access
+- Hugging Face for embeddings and hosting capabilities
+
+## Contributing
+Contributions, issues, and feature requests are welcome. Feel free to check the issues page if you want to contribute.