wishwakankanamg committed on
Commit
fe2bb7c
·
1 Parent(s): 9a1b0b5

fixed brainstorm node malfunction issue

Files changed (4)
  1. __pycache__/graph.cpython-313.pyc +0 -0
  2. app.log +276 -0
  3. app.py +4 -4
  4. graph.py +114 -124
__pycache__/graph.cpython-313.pyc CHANGED
Binary files a/__pycache__/graph.cpython-313.pyc and b/__pycache__/graph.cpython-313.pyc differ
 
app.log CHANGED
@@ -51474,3 +51474,279 @@ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/er
51474
  2025-06-08 06:38:49:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51475
  2025-06-08 06:39:58:__main__:INFO: Starting the interface
51476
  2025-06-08 06:40:44:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.

51474
  2025-06-08 06:38:49:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51475
  2025-06-08 06:39:58:__main__:INFO: Starting the interface
51476
  2025-06-08 06:40:44:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51477
+ 2025-06-08 06:41:52:__main__:INFO: Starting the interface
51478
+ 2025-06-08 06:44:28:__main__:INFO: Starting the interface
51479
+ 2025-06-08 06:45:58:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51480
+ 2025-06-08 06:47:44:__main__:INFO: Starting the interface
51481
+ 2025-06-08 06:48:22:__main__:INFO: Starting the interface
51482
+ 2025-06-08 06:49:19:__main__:INFO: Starting the interface
51483
+ 2025-06-08 06:50:03:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51484
+ 2025-06-08 06:59:50:__main__:INFO: Starting the interface
51485
+ 2025-06-08 07:00:02:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51486
+ 2025-06-08 07:00:39:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51487
+ 2025-06-08 07:00:56:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51488
+ 2025-06-08 07:01:03:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51489
+ 2025-06-08 07:01:46:__main__:INFO: Starting the interface
51490
+ 2025-06-08 07:01:50:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51491
+ 2025-06-08 07:02:07:__main__:INFO: Prompt: You are a helpful assistant.
51492
+ 2025-06-08 07:02:47:__main__:INFO: Prompt: You are a helpful assistant.
51493
+ 2025-06-08 07:03:06:__main__:INFO: Prompt: You are a helpful assistant.
51494
+ 2025-06-08 07:03:39:__main__:INFO: Prompt: You are a helpful assistant.
51495
+ 2025-06-08 07:04:06:__main__:ERROR: Exception occurred
51496
+ Traceback (most recent call last):
51497
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51498
+ async for stream_mode, chunk in graph.astream(
51499
+ ...<56 lines>...
51500
+ yield output, gr.skip(), gr.skip()
51501
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2677, in astream
51502
+ raise GraphRecursionError(msg)
51503
+ langgraph.errors.GraphRecursionError: Recursion limit of 20 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
51504
+ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
51505
+ 2025-06-08 07:08:50:__main__:INFO: Starting the interface
51506
+ 2025-06-08 07:14:56:__main__:INFO: Starting the interface
51507
+ 2025-06-08 07:15:16:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51508
+ 2025-06-08 07:15:36:__main__:INFO: Prompt: You are a helpful assistant.
51509
+ 2025-06-08 07:16:56:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51510
+ 2025-06-08 07:18:26:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51511
+ 2025-06-08 07:19:05:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51512
+ 2025-06-08 07:19:30:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51513
+ 2025-06-08 07:20:11:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51514
+ 2025-06-08 07:20:25:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51515
+ 2025-06-08 07:21:01:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51516
+ 2025-06-08 07:28:50:__main__:INFO: Starting the interface
51517
+ 2025-06-08 07:28:54:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51518
+ 2025-06-08 07:30:10:__main__:INFO: Prompt: You are a helpful assistant.
51519
+ 2025-06-08 07:30:19:__main__:ERROR: Exception occurred
51520
+ Traceback (most recent call last):
51521
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51522
+ async for stream_mode, chunk in graph.astream(
51523
+ ...<56 lines>...
51524
+ yield output, gr.skip(), gr.skip()
51525
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2677, in astream
51526
+ raise GraphRecursionError(msg)
51527
+ langgraph.errors.GraphRecursionError: Recursion limit of 20 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
51528
+ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
51529
+ 2025-06-08 07:31:06:__main__:INFO: Starting the interface
51530
+ 2025-06-08 07:31:31:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51531
+ 2025-06-08 07:31:57:__main__:INFO: Prompt: You are a helpful assistant.
51532
+ 2025-06-08 07:32:46:__main__:INFO: Prompt: You are a helpful assistant.
51533
+ 2025-06-08 07:34:19:__main__:INFO: Prompt: You are a helpful assistant.
51534
+ 2025-06-08 07:34:45:__main__:INFO: Prompt: You are a helpful assistant.
51535
+ 2025-06-08 07:35:17:__main__:INFO: Prompt: You are a helpful assistant.
51536
+ 2025-06-08 07:35:44:__main__:INFO: Prompt: You are a helpful assistant.
51537
+ 2025-06-08 07:36:09:__main__:ERROR: Exception occurred
51538
+ Traceback (most recent call last):
51539
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51540
+ async for stream_mode, chunk in graph.astream(
51541
+ ...<56 lines>...
51542
+ yield output, gr.skip(), gr.skip()
51543
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2677, in astream
51544
+ raise GraphRecursionError(msg)
51545
+ langgraph.errors.GraphRecursionError: Recursion limit of 20 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
51546
+ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
51547
+ 2025-06-08 07:48:19:__main__:INFO: Starting the interface
51548
+ 2025-06-08 07:48:28:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51549
+ 2025-06-08 07:48:39:__main__:INFO: Prompt: You are a helpful assistant.
51550
+ 2025-06-08 07:48:54:__main__:INFO: Prompt: You are a helpful assistant.
51551
+ 2025-06-08 07:49:56:__main__:INFO: Prompt: You are a helpful assistant.
51552
+ 2025-06-08 07:50:22:__main__:INFO: Prompt: You are a helpful assistant.
51553
+ 2025-06-08 07:50:47:__main__:ERROR: Exception occurred
51554
+ Traceback (most recent call last):
51555
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51556
+ async for stream_mode, chunk in graph.astream(
51557
+ ...<56 lines>...
51558
+ yield output, gr.skip(), gr.skip()
51559
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2677, in astream
51560
+ raise GraphRecursionError(msg)
51561
+ langgraph.errors.GraphRecursionError: Recursion limit of 20 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
51562
+ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
51563
+ 2025-06-08 08:00:08:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51564
+ 2025-06-08 08:00:14:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51565
+ 2025-06-08 08:00:42:__main__:INFO: Starting the interface
51566
+ 2025-06-08 08:00:46:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51567
+ 2025-06-08 08:01:01:__main__:INFO: Prompt: You are a helpful assistant.
51568
+ 2025-06-08 08:01:32:__main__:INFO: Prompt: You are a helpful assistant.
51569
+ 2025-06-08 08:02:19:__main__:INFO: Prompt: You are a helpful assistant.
51570
+ 2025-06-08 08:02:52:__main__:INFO: Prompt: You are a helpful assistant.
51571
+ 2025-06-08 08:03:22:__main__:ERROR: Exception occurred
51572
+ Traceback (most recent call last):
51573
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51574
+ async for stream_mode, chunk in graph.astream(
51575
+ ...<56 lines>...
51576
+ yield output, gr.skip(), gr.skip()
51577
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2677, in astream
51578
+ raise GraphRecursionError(msg)
51579
+ langgraph.errors.GraphRecursionError: Recursion limit of 20 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
51580
+ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
51581
+ 2025-06-08 08:09:02:__main__:INFO: Starting the interface
51582
+ 2025-06-08 08:09:08:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51583
+ 2025-06-08 08:09:16:__main__:INFO: Prompt: You are a helpful assistant.
51584
+ 2025-06-08 08:10:30:__main__:INFO: Prompt: You are a helpful assistant.
51585
+ 2025-06-08 08:10:54:__main__:INFO: Prompt: You are a helpful assistant.
51586
+ 2025-06-08 08:11:13:__main__:INFO: Prompt: You are a helpful assistant.
51587
+ 2025-06-08 08:11:16:__main__:ERROR: Exception occurred
51588
+ Traceback (most recent call last):
51589
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51590
+ async for stream_mode, chunk in graph.astream(
51591
+ ...<56 lines>...
51592
+ yield output, gr.skip(), gr.skip()
51593
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2655, in astream
51594
+ async for _ in runner.atick(
51595
+ ...<7 lines>...
51596
+ yield o
51597
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\runner.py", line 400, in atick
51598
+ _panic_or_proceed(
51599
+ ~~~~~~~~~~~~~~~~~^
51600
+ futures.done.union(f for f, t in futures.items() if t is not None),
51601
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51602
+ timeout_exc_cls=asyncio.TimeoutError,
51603
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51604
+ panic=reraise,
51605
+ ^^^^^^^^^^^^^^
51606
+ )
51607
+ ^
51608
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\runner.py", line 509, in _panic_or_proceed
51609
+ raise exc
51610
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\retry.py", line 136, in arun_with_retry
51611
+ return await task.proc.ainvoke(task.input, config)
51612
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51613
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\utils\runnable.py", line 678, in ainvoke
51614
+ input = await step.ainvoke(input, config)
51615
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51616
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\utils\runnable.py", line 440, in ainvoke
51617
+ ret = await self.afunc(*args, **kwargs)
51618
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51619
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\graph\branch.py", line 197, in _aroute
51620
+ return self._finish(writer, input, result, config)
51621
+ ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51622
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\graph\branch.py", line 210, in _finish
51623
+ r if isinstance(r, Send) else self.ends[r] for r in result
51624
+ ~~~~~~~~~^^^
51625
+ KeyError: '__end__'
51626
+ During task with name 'guidance_node' and id '5bd9150d-2dae-234c-83db-5f948acab738'
51627
+ 2025-06-08 08:11:55:__main__:INFO: Prompt: You are a helpful assistant.
51628
+ 2025-06-08 08:12:11:__main__:INFO: Prompt: You are a helpful assistant.
51629
+ 2025-06-08 08:12:34:__main__:INFO: Prompt: You are a helpful assistant.
51630
+ 2025-06-08 08:12:53:__main__:INFO: Prompt: You are a helpful assistant.
51631
+ 2025-06-08 08:13:14:__main__:INFO: Prompt: You are a helpful assistant.
51632
+ 2025-06-08 08:13:37:__main__:INFO: Prompt: You are a helpful assistant.
51633
+ 2025-06-08 08:13:58:__main__:ERROR: Exception occurred
51634
+ Traceback (most recent call last):
51635
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51636
+ async for stream_mode, chunk in graph.astream(
51637
+ ...<56 lines>...
51638
+ yield output, gr.skip(), gr.skip()
51639
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2677, in astream
51640
+ raise GraphRecursionError(msg)
51641
+ langgraph.errors.GraphRecursionError: Recursion limit of 20 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
51642
+ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
51643
+ 2025-06-08 08:17:16:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51644
+ 2025-06-08 08:17:23:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51645
+ 2025-06-08 08:17:31:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51646
+ 2025-06-08 08:18:01:__main__:INFO: Starting the interface
51647
+ 2025-06-08 08:21:10:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51648
+ 2025-06-08 08:21:17:__main__:INFO: Prompt:
51649
+ 2025-06-08 08:22:14:__main__:INFO: Prompt:
51650
+ 2025-06-08 08:23:11:__main__:INFO: Prompt:
51651
+ 2025-06-08 08:23:48:__main__:INFO: Prompt:
51652
+ 2025-06-08 08:24:18:__main__:INFO: Prompt:
51653
+ 2025-06-08 08:24:46:__main__:ERROR: Exception occurred
51654
+ Traceback (most recent call last):
51655
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51656
+ async for stream_mode, chunk in graph.astream(
51657
+ ...<56 lines>...
51658
+ yield output, gr.skip(), gr.skip()
51659
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2677, in astream
51660
+ raise GraphRecursionError(msg)
51661
+ langgraph.errors.GraphRecursionError: Recursion limit of 20 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
51662
+ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
51663
+ 2025-06-08 08:34:19:__main__:INFO: Starting the interface
51664
+ 2025-06-08 08:34:36:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51665
+ 2025-06-08 08:36:01:__main__:INFO: Prompt: You are a helpful assistant.
51666
+ 2025-06-08 08:36:23:__main__:INFO: Prompt: You are a helpful assistant.
51667
+ 2025-06-08 08:36:44:__main__:INFO: Prompt: You are a helpful assistant.
51668
+ 2025-06-08 08:37:18:__main__:ERROR: Exception occurred
51669
+ Traceback (most recent call last):
51670
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51671
+ async for stream_mode, chunk in graph.astream(
51672
+ ...<56 lines>...
51673
+ yield output, gr.skip(), gr.skip()
51674
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2677, in astream
51675
+ raise GraphRecursionError(msg)
51676
+ langgraph.errors.GraphRecursionError: Recursion limit of 20 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
51677
+ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
51678
+ 2025-06-08 08:41:43:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51679
+ 2025-06-08 08:42:26:__main__:INFO: Starting the interface
51680
+ 2025-06-08 08:42:35:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51681
+ 2025-06-08 08:42:49:__main__:INFO: Prompt: You are a helpful assistant.
51682
+ 2025-06-08 08:43:00:__main__:INFO: Prompt: You are a helpful assistant.
51683
+ 2025-06-08 08:43:27:__main__:INFO: Prompt: You are a helpful assistant.
51684
+ 2025-06-08 08:43:57:__main__:ERROR: Exception occurred
51685
+ Traceback (most recent call last):
51686
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51687
+ async for stream_mode, chunk in graph.astream(
51688
+ ...<56 lines>...
51689
+ yield output, gr.skip(), gr.skip()
51690
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2677, in astream
51691
+ raise GraphRecursionError(msg)
51692
+ langgraph.errors.GraphRecursionError: Recursion limit of 20 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
51693
+ For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
51694
+ 2025-06-08 08:47:01:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51695
+ 2025-06-08 08:47:30:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51696
+ 2025-06-08 08:48:46:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51697
+ 2025-06-08 08:49:09:__main__:INFO: Starting the interface
51698
+ 2025-06-08 08:49:12:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51699
+ 2025-06-08 08:49:24:__main__:INFO: Prompt: You are a helpful assistant.
51700
+ 2025-06-08 08:49:43:__main__:INFO: Prompt: You are a helpful assistant.
51701
+ 2025-06-08 08:51:12:__main__:INFO: Prompt: You are a helpful assistant.
51702
+ 2025-06-08 08:51:51:__main__:INFO: Prompt: You are a helpful assistant.
51703
+ 2025-06-08 08:51:53:__main__:ERROR: Exception occurred
51704
+ Traceback (most recent call last):
51705
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\app.py", line 99, in chat_fn
51706
+ async for stream_mode, chunk in graph.astream(
51707
+ ...<56 lines>...
51708
+ yield output, gr.skip(), gr.skip()
51709
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\__init__.py", line 2655, in astream
51710
+ async for _ in runner.atick(
51711
+ ...<7 lines>...
51712
+ yield o
51713
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\runner.py", line 400, in atick
51714
+ _panic_or_proceed(
51715
+ ~~~~~~~~~~~~~~~~~^
51716
+ futures.done.union(f for f, t in futures.items() if t is not None),
51717
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51718
+ timeout_exc_cls=asyncio.TimeoutError,
51719
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51720
+ panic=reraise,
51721
+ ^^^^^^^^^^^^^^
51722
+ )
51723
+ ^
51724
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\runner.py", line 509, in _panic_or_proceed
51725
+ raise exc
51726
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\pregel\retry.py", line 136, in arun_with_retry
51727
+ return await task.proc.ainvoke(task.input, config)
51728
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51729
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\utils\runnable.py", line 678, in ainvoke
51730
+ input = await step.ainvoke(input, config)
51731
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51732
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\utils\runnable.py", line 440, in ainvoke
51733
+ ret = await self.afunc(*args, **kwargs)
51734
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51735
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\graph\branch.py", line 197, in _aroute
51736
+ return self._finish(writer, input, result, config)
51737
+ ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
51738
+ File "C:\myworkspace\huggingface_agents\mcp-hackthon\dev\DIY_assistant\env\Lib\site-packages\langgraph\graph\branch.py", line 210, in _finish
51739
+ r if isinstance(r, Send) else self.ends[r] for r in result
51740
+ ~~~~~~~~~^^^
51741
+ KeyError: '__end__'
51742
+ During task with name 'guidance_node' and id 'b37f68f6-9612-3304-c8ec-641872e32bde'
51743
+ 2025-06-08 08:53:37:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51744
+ 2025-06-08 08:53:55:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51745
+ 2025-06-08 08:54:01:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51746
+ 2025-06-08 08:54:26:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51747
+ 2025-06-08 08:54:40:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51748
+ 2025-06-08 08:56:37:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51749
+ 2025-06-08 08:56:56:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51750
+ 2025-06-08 08:57:11:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51751
+ 2025-06-08 08:57:25:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
51752
+ 2025-06-08 10:09:17:__main__:INFO: Greeting added for new user via handle_initial_greeting_load.
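The GraphRecursionError entries above repeatedly point at the `recursion_limit` config key as the immediate workaround. Below is a minimal sketch, not the app's actual code, of passing that key when streaming a compiled LangGraph graph; `graph` and `inputs` are illustrative placeholders.

    from langgraph.errors import GraphRecursionError

    async def stream_with_higher_limit(graph, inputs):
        # recursion_limit bounds the number of graph super-steps; the log above shows a limit of 20.
        config = {"recursion_limit": 50}
        try:
            async for stream_mode, chunk in graph.astream(
                inputs, config=config, stream_mode=["updates", "messages"]
            ):
                print(stream_mode, chunk)
        except GraphRecursionError:
            # A higher limit only hides the symptom if the graph never reaches a stop
            # condition; the routing changes in graph.py below address the cause.
            raise

Raising the limit is a stopgap; the KeyError and the looping seen later in the log are what the graph.py changes in this commit address.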
app.py CHANGED
@@ -362,12 +362,12 @@ footer {visibility: hidden}
362
  }
363
 
364
  .wrap.sidebar-parent {
365
- min-height: 1500px !important;
366
- height: 1500px !important;
367
  }
368
 
369
  #main-app {
370
- height: 2000px; /* or 800px, or 100% */
371
  overflow-y: auto; /* optional if you want it scrollable */
372
  }
373
 
@@ -543,7 +543,7 @@ if __name__ == "__main__":
543
  type="messages",
544
  scale=0,
545
  show_copy_button=True,
546
- height=150,
547
  editable="all",
548
  elem_classes="main-chatbox"
549
  )
 
362
  }
363
 
364
  .wrap.sidebar-parent {
365
+ min-height: 1400px !important;
366
+ height: 1400px !important;
367
  }
368
 
369
  #main-app {
370
+ height: 1600px; /* or 800px, or 100% */
371
  overflow-y: auto; /* optional if you want it scrollable */
372
  }
373
 
 
543
  type="messages",
544
  scale=0,
545
  show_copy_button=True,
546
+ height=400,
547
  editable="all",
548
  elem_classes="main-chatbox"
549
  )
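For reference, a minimal sketch of how the adjusted heights fit together in a Gradio layout. The component arguments mirror the values introduced in this commit; the surrounding Blocks/Column structure and the element that carries the main-app id are assumptions, not the app's actual layout.

    import gradio as gr

    css = """
    footer {visibility: hidden}
    .wrap.sidebar-parent { min-height: 1400px !important; height: 1400px !important; }
    #main-app { height: 1600px; overflow-y: auto; }
    """

    with gr.Blocks(css=css) as demo:
        with gr.Column(elem_id="main-app"):  # assumed target of the #main-app rule
            chatbot = gr.Chatbot(
                type="messages",
                scale=0,
                show_copy_button=True,
                height=400,  # raised from 150 in this commit
                editable="all",
                elem_classes="main-chatbox",
            )

    if __name__ == "__main__":
        demo.launch()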
graph.py CHANGED
@@ -137,14 +137,14 @@ class GraphProcessingState(BaseModel):
137
 
138
 
139
 
140
-
141
-
142
-
143
  async def guidance_node(state: GraphProcessingState, config=None):
144
- print("\n--- Guidance Node (Debug via print) ---") # Added a newline for clarity
145
-
146
 
147
  print(f"Prompt: {state.prompt}")
148
  for message in state.messages:
149
  if isinstance(message, HumanMessage):
150
  print(f"Human: {message.content}")
@@ -163,8 +163,6 @@ async def guidance_node(state: GraphProcessingState, config=None):
163
  elif isinstance(message, ToolMessage):
164
  print(f"Tool: {message.content}")
165
 
166
-
167
-
168
 
169
  # Log boolean completion flags
170
  # Define the order of stages
@@ -174,9 +172,7 @@ async def guidance_node(state: GraphProcessingState, config=None):
174
  completed = [stage for stage in stage_order if getattr(state, f"{stage}_complete", False)]
175
  incomplete = [stage for stage in stage_order if not getattr(state, f"{stage}_complete", False)]
176
 
177
- print(f"Tools Enabled: {state.tools_enabled}")
178
- print(f"Search Enabled: {state.search_enabled}")
179
- print(f"Next Stage: {state.next_stage}")
180
 
181
  # Determine the next stage
182
  if not incomplete:
@@ -189,34 +185,65 @@ async def guidance_node(state: GraphProcessingState, config=None):
189
  else:
190
  # Set the next stage to the first incomplete stage
191
  next_stage = incomplete[0]
 
 
192
  return {
193
  "messages": [],
194
  "next_stage": next_stage,
195
  "pending_approval_stage": None,
196
  }
 
 
197
 
198
- async def brainstorming_node(state: GraphProcessingState, config=None):
199
- print("\n--- brainstorming Node (Debug via print) ---") # Added a newline for clarity
 
200

201
 
202
- print(f"Prompt: {state.prompt}")
203
- for message in state.messages:
204
- if isinstance(message, HumanMessage):
205
- print(f"Human: {message.content}")
206
- elif isinstance(message, AIMessage):
207
- # Check if content is non-empty
208
- if message.content:
209
- # If content is a list (e.g., list of dicts), extract text
210
- if isinstance(message.content, list):
211
- texts = [item.get('text', '') for item in message.content if isinstance(item, dict) and 'text' in item]
212
- if texts:
213
- print(f"AI: {' '.join(texts)}")
214
- elif isinstance(message.content, str):
215
- print(f"AI: {message.content}")
216
- elif isinstance(message, SystemMessage):
217
- print(f"System: {message.content}")
218
- elif isinstance(message, ToolMessage):
219
- print(f"Tool: {message.content}")
220
 
221
  print(f"Tools Enabled: {state.tools_enabled}")
222
  print(f"Search Enabled: {state.search_enabled}")
@@ -259,63 +286,59 @@ async def brainstorming_node(state: GraphProcessingState, config=None):
259
  # THIS LINE DEFINES THE VARIABLE
260
  proposed_next_stage = incomplete[0]
261
 
262
- print(f"Proposed next stage: {proposed_next_stage}")
263
-
264
- status_summary = f"Completed stages: {completed}\nIncomplete stages: {incomplete}"
265
-
266
  guidance_prompt_text = (
267
  """
268
- You are a creative and helpful AI assistant acting as a **DIY Project Brainstorming Facilitator**. Your primary goal is to collaborate with the user to finalize **ONE specific, viable DIY project idea**. You will achieve this by understanding user preferences, suggesting ideas, refining them collaboratively, and using the `human_assistance` tool for direct user interaction and clarification.
269
-
270
- **Critical Criteria for the Final DIY Project Idea (MUST be met):**
271
- 1. **Buildable:** Achievable by an average person with basic DIY skills.
272
- 2. **Common Materials/Tools:** Uses only materials (e.g., wood, screws, glue, paint, fabric, cardboard) and basic hand tools (e.g., screwdrivers, hammers, saws, drills) commonly available in general hardware stores, craft stores, or supermarkets worldwide.
273
- 3. **Avoid Specializations:** Explicitly AVOID projects requiring specialized electronic components, 3D printing, specific brand items not universally available, or complex machinery.
274
- 4. **Tangible Product:** The final result must be a physical, tangible item.
275
-
276
- **Your Process for Each Brainstorming Interaction Cycle:**
277
-
278
- 1. **THOUGHT:**
279
- * First, clearly state your understanding of the user's current input or the state of the brainstorming (e.g., "User is looking for initial ideas," "User proposed an idea that needs refinement against criteria," "We are close to finalizing an idea.").
280
- * Outline your plan for this interaction turn. This usually involves:
281
- * Engaging with the user's latest message.
282
- * Proposing a new idea or refining an existing one to meet the **Critical Criteria**.
283
- * Identifying if a question to the user is needed.
284
- * **Tool Identification (`human_assistance`):** If you need to ask the user a question to:
285
- * Understand their interests or initial thoughts.
286
- * Clarify their preferences or skill level (gently).
287
- * Get feedback on a proposed idea.
288
- * Refine an idea to meet criteria.
289
- You MUST state your intention to use the `human_assistance` tool and clearly formulate the question you will pass as the `query` argument.
290
- * **Idea Finalization Check:** If you believe a current idea, discussed with the user, clearly meets ALL **Critical Criteria** and the user seems positive, note your intention to output the `IDEA FINALIZED` signal.
291
-
292
- 2. **TOOL USE (`human_assistance` - If Necessary):**
293
- * If your plan requires asking the user a question, you will then invoke the `human_assistance` tool with your formulated query.
294
- * (Agent Builder Note: The LLM will output a tool call here. The system executes it.)
295
-
296
- 3. **RESPONSE SYNTHESIS / IDEA FINALIZATION:**
297
- * After any necessary tool use (or if no tool was needed for this turn), synthesize your response.
298
- * **If an idea is finalized:** When you determine that a specific project idea meets ALL **Critical Criteria** and the user has positively engaged with it, your response for this turn MUST BE *ONLY* the exact phrase:
299
- `IDEA FINALIZED: [Name of the Idea]`
300
- (Example: `IDEA FINALIZED: Simple Wooden Spice Rack`)
301
- Do not add any other text before or after this phrase if you use it. This signals the end of brainstorming.
302
- * **If brainstorming continues (no finalization yet):**
303
- * Provide your conversational response, suggestions, or refinements.
304
- * If you didn't use a tool in step 2 but are now responding, ensure your response is engaging and moves the brainstorming forward.
305
- * If you just made a tool call for `human_assistance`, your main output here might be the tool call itself, or a very brief lead-in text if the system allows. Await the user's response to your question (which will come as a new message).
306
-
307
- **General Guidelines for Your Interaction:**
308
- * **Collaborative & Iterative:** Work *with* the user. It's a conversation.
309
- * **Criteria Focused:** Always gently guide ideas towards meeting all **Critical Criteria**. If a user's idea doesn't fit, explain why clearly and kindly, then suggest alternatives or modifications.
310
- * **One Main Idea at a Time:** To avoid confusion, try to focus the discussion on one main project idea or a small set of comparable alternatives at any given time.
311
- * **User-Centric:** Your goal is to help the user find a project *they* will be happy and successful with.
312
- * **Clarity:** Be clear in your suggestions and questions.
313
- * **Tool Protocol:** When you decide to use `human_assistance`, formulate the tool call correctly. Do not try to answer the question you intend to ask the user.
314
-
315
- ---
316
- **Examples of How You Should Operate in Brainstorming Mode:** (Include examples as before)
317
- ... (rest of the prompt) ...
318
- """
319
 
320
  )
321
 
@@ -333,8 +356,6 @@ You are a creative and helpful AI assistant acting as a **DIY Project Brainstorm
333
  ]
334
  )
335
 
336
- assistant_model = model.bind_tools([human_assistance])
337
-
338
  # Tools allowed for brainstorming
339
  node_tools = [human_assistance]
340
  if state.search_enabled and tavily_search_tool: # only add search tool if enabled and initialized
@@ -348,16 +369,6 @@ You are a creative and helpful AI assistant acting as a **DIY Project Brainstorm
348
 
349
  response = await chain.ainvoke({"messages": filtered_messages}, config=config)
350
 
351
- for msg in response:
352
- if isinstance(msg, HumanMessage):
353
- print("Human:", msg.content)
354
- elif isinstance(msg, AIMessage):
355
- if isinstance(msg.content, list):
356
- ai_texts = [part.get("text", "") for part in msg.content if isinstance(part, dict)]
357
- print("AI:", " ".join(ai_texts))
358
- else:
359
- print("AI:", msg.content)
360
-
361
  if hasattr(response, "tool_calls"):
362
  for tool_call in response.tool_calls:
363
  tool_name = tool_call['name']
@@ -366,9 +377,10 @@ You are a creative and helpful AI assistant acting as a **DIY Project Brainstorm
366
  print(f"Human input needed: {query}")
367
 
368
  updates = {"messages": [response]}
369
- print('response from brainstorm', response)
370
 
371
  if isinstance(response, AIMessage) and response.content:
 
372
  if isinstance(response.content, str):
373
  content = response.content.strip()
374
  elif isinstance(response.content, list):
@@ -377,13 +389,15 @@ You are a creative and helpful AI assistant acting as a **DIY Project Brainstorm
377
  else:
378
  content = str(response.content).strip()
379
 
380
- if content.startswith("IDEA FINALIZED:"):
 
381
  print('✅ final idea')
382
  updates.update({
383
  "brainstorming_complete": True,
384
  "tool_call_required": False,
385
  "loop_brainstorming": False,
386
  })
 
387
 
388
  else:
389
  tool_calls = getattr(response, "tool_calls", None)
@@ -413,7 +427,7 @@ You are a creative and helpful AI assistant acting as a **DIY Project Brainstorm
413
  updates["tool_call_required"] = False
414
  updates["loop_brainstorming"] = True
415
 
416
- print("\n--- End Brainstorming Node Debug ---")
417
  return updates
418
  except Exception as e:
419
  print(f"Error in guidance node: {e}")
@@ -457,31 +471,7 @@ def brainstorming_routing(state: GraphProcessingState) -> str:
457
  # else:
458
  # return "guidance_node"
459
 
460
- def guidance_routing(state: GraphProcessingState) -> str:
461
-
462
- print("\n--- Guidance Routing (Debug via print) ---") # Added a newline for clarity
463
- print(f"Prompt: {state.prompt}")
464
- # print(f"Message: {state.messages}")
465
- print(f"Tools Enabled: {state.tools_enabled}")
466
- print(f"Search Enabled: {state.search_enabled}")
467
- print(f"Next Stage: {state.next_stage}")
468
-
469
-
470
- next_stage = state.next_stage
471
- if next_stage == "brainstorming":
472
- return "brainstorming_node"
473
-
474
- elif next_stage == "planning":
475
- print('\n may day may day may day may day may day')
476
- # return "planning_node"
477
- # elif next_stage == "drawing":
478
- # return "drawing_node"
479
- # elif next_stage == "product_searching":
480
- # return "product_searching"
481
- # elif next_stage == "purchasing":
482
- # return "purchasing_node"
483
 
484
- return END
485
 
486
 
487
 
 
137
 
138
 
139
 
 
 
 
140
  async def guidance_node(state: GraphProcessingState, config=None):
141
+ print("\n--- Guidance Node (Debug via print) ---\n") # Added a newline for clarity
 
142
 
143
  print(f"Prompt: {state.prompt}")
144
+ print(f"Prompt: {state.prompt}")
145
+ # print(f"Message: {state.messages}")
146
+ print(f"Tools Enabled: {state.tools_enabled}")
147
+ print(f"Search Enabled: {state.search_enabled}")
148
  for message in state.messages:
149
  if isinstance(message, HumanMessage):
150
  print(f"Human: {message.content}")
 
163
  elif isinstance(message, ToolMessage):
164
  print(f"Tool: {message.content}")
165
 
 
 
166
 
167
  # Log boolean completion flags
168
  # Define the order of stages
 
172
  completed = [stage for stage in stage_order if getattr(state, f"{stage}_complete", False)]
173
  incomplete = [stage for stage in stage_order if not getattr(state, f"{stage}_complete", False)]
174
 
175
+
 
 
176
 
177
  # Determine the next stage
178
  if not incomplete:
 
185
  else:
186
  # Set the next stage to the first incomplete stage
187
  next_stage = incomplete[0]
188
+ print(f"Next Stage: {state.next_stage}")
189
+ print("\n--- End of Guidance Node Debug ---\n")
190
  return {
191
  "messages": [],
192
  "next_stage": next_stage,
193
  "pending_approval_stage": None,
194
  }
195
+
196
+ def guidance_routing(state: GraphProcessingState) -> str:
197
 
198
+ print("\n--- Guidance Routing (Debug via print) ---\n") # Added a newline for clarity
199
+
200
+ print(f"Next Stage: {state.next_stage}\n")
201
 
202
+ print(f"Brainstorming complete: {state.brainstorming_complete}\n")
203
+ print(f"3D prompt: {state.planning_complete}\n")
204
+ print(f"Drawing 3D model: {state.drawing_complete}\n")
205
+ print(f"Finding products: {state.next_stage}\n")
206
 
207
+
208
+
209
+ next_stage = state.next_stage
210
+ if next_stage == "brainstorming":
211
+ return "brainstorming_node"
212
+
213
+ elif next_stage == "planning":
214
+ print('\n may day may day may day may day may day')
215
+ # return "planning_node"
216
+ # elif next_stage == "drawing":
217
+ # return "drawing_node"
218
+ # elif next_stage == "product_searching":
219
+ # return "product_searching"
220
+ # elif next_stage == "purchasing":
221
+ # return "purchasing_node"
222
+
223
+ return END
224
+
225
+ async def brainstorming_node(state: GraphProcessingState, config=None):
226
+ print("\n--- brainstorming Node (Debug via print) ---\n") # Added a newline for clarity
227
+
228
+
229
+ # print(f"Prompt: {state.prompt}")
230
+ # for message in state.messages:
231
+ # if isinstance(message, HumanMessage):
232
+ # print(f"Human: {message.content}")
233
+ # elif isinstance(message, AIMessage):
234
+ # # Check if content is non-empty
235
+ # if message.content:
236
+ # # If content is a list (e.g., list of dicts), extract text
237
+ # if isinstance(message.content, list):
238
+ # texts = [item.get('text', '') for item in message.content if isinstance(item, dict) and 'text' in item]
239
+ # if texts:
240
+ # print(f"AI: {' '.join(texts)}")
241
+ # elif isinstance(message.content, str):
242
+ # print(f"AI: {message.content}")
243
+ # elif isinstance(message, SystemMessage):
244
+ # print(f"System: {message.content}")
245
+ # elif isinstance(message, ToolMessage):
246
+ # print(f"Tool: {message.content}")
247
 
248
  print(f"Tools Enabled: {state.tools_enabled}")
249
  print(f"Search Enabled: {state.search_enabled}")
 
286
  # THIS LINE DEFINES THE VARIABLE
287
  proposed_next_stage = incomplete[0]
288

289
  guidance_prompt_text = (
290
  """
291
+ You are a creative and helpful AI assistant acting as a **DIY Project Brainstorming Facilitator**. Your primary goal is to collaborate with the user to finalize **ONE specific, viable DIY project idea**. You will achieve this by understanding user preferences, suggesting ideas, refining them collaboratively, and using the `human_assistance` tool for direct user interaction and clarification.
292
+
293
+ **Critical Criteria for the Final DIY Project Idea (MUST be met):**
294
+ 1. **Buildable:** Achievable by an average person with basic DIY skills.
295
+ 2. **Common Materials/Tools:** Uses only materials (e.g., wood, screws, glue, paint, fabric, cardboard) and basic hand tools (e.g., screwdrivers, hammers, saws, drills) commonly available in general hardware stores, craft stores, or supermarkets worldwide.
296
+ 3. **Avoid Specializations:** Explicitly AVOID projects requiring specialized electronic components, 3D printing, specific brand items not universally available, or complex machinery.
297
+ 4. **Tangible Product:** The final result must be a physical, tangible item.
298
+
299
+ **Your Process for Each Brainstorming Interaction Cycle:**
300
+
301
+ 1. **THOUGHT:**
302
+ * First, clearly state your understanding of the user's current input or the state of the brainstorming (e.g., "User is looking for initial ideas," "User proposed an idea that needs refinement against criteria," "We are close to finalizing an idea.").
303
+ * Outline your plan for this interaction turn. This usually involves:
304
+ * Engaging with the user's latest message.
305
+ * Proposing a new idea or refining an existing one to meet the **Critical Criteria**.
306
+ * Identifying if a question to the user is needed.
307
+ * **Tool Identification (`human_assistance`):** If you need to ask the user a question to:
308
+ * Understand their interests or initial thoughts.
309
+ * Clarify their preferences or skill level (gently).
310
+ * Get feedback on a proposed idea.
311
+ * Refine an idea to meet criteria.
312
+ You MUST state your intention to use the `human_assistance` tool and clearly formulate the question you will pass as the `query` argument.
313
+ * **Idea Finalization Check:** If you believe a current idea, discussed with the user, clearly meets ALL **Critical Criteria** and the user seems positive, note your intention to output the `IDEA FINALIZED` signal.
314
+
315
+ 2. **TOOL USE (`human_assistance` - If Necessary):**
316
+ * If your plan requires asking the user a question, you will then invoke the `human_assistance` tool with your formulated query.
317
+ * (Agent Builder Note: The LLM will output a tool call here. The system executes it.)
318
+
319
+ 3. **RESPONSE SYNTHESIS / IDEA FINALIZATION:**
320
+ * After any necessary tool use (or if no tool was needed for this turn), synthesize your response.
321
+ * **If an idea is finalized:** When you determine that a specific project idea meets ALL **Critical Criteria** and the user has positively engaged with it, your response for this turn MUST BE *ONLY* the exact phrase:
322
+ `IDEA FINALIZED: [Name of the Idea]`
323
+ (Example: `IDEA FINALIZED: Simple Wooden Spice Rack`)
324
+ Do not add any other text before or after this phrase if you use it. This signals the end of brainstorming.
325
+ * **If brainstorming continues (no finalization yet):**
326
+ * Provide your conversational response, suggestions, or refinements.
327
+ * If you didn't use a tool in step 2 but are now responding, ensure your response is engaging and moves the brainstorming forward.
328
+ * If you just made a tool call for `human_assistance`, your main output here might be the tool call itself, or a very brief lead-in text if the system allows. Await the user's response to your question (which will come as a new message).
329
+
330
+ **General Guidelines for Your Interaction:**
331
+ * **Collaborative & Iterative:** Work *with* the user. It's a conversation.
332
+ * **Criteria Focused:** Always gently guide ideas towards meeting all **Critical Criteria**. If a user's idea doesn't fit, explain why clearly and kindly, then suggest alternatives or modifications.
333
+ * **One Main Idea at a Time:** To avoid confusion, try to focus the discussion on one main project idea or a small set of comparable alternatives at any given time.
334
+ * **User-Centric:** Your goal is to help the user find a project *they* will be happy and successful with.
335
+ * **Clarity:** Be clear in your suggestions and questions.
336
+ * **Tool Protocol:** When you decide to use `human_assistance`, formulate the tool call correctly. Do not try to answer the question you intend to ask the user.
337
+
338
+ ---
339
+ **Examples of How You Should Operate in Brainstorming Mode:** (Include examples as before)
340
+ ... (rest of the prompt) ...
341
+ """
342
 
343
  )
344
 
 
356
  ]
357
  )
358
 
 
 
359
  # Tools allowed for brainstorming
360
  node_tools = [human_assistance]
361
  if state.search_enabled and tavily_search_tool: # only add search tool if enabled and initialized
 
369
 
370
  response = await chain.ainvoke({"messages": filtered_messages}, config=config)
371

372
  if hasattr(response, "tool_calls"):
373
  for tool_call in response.tool_calls:
374
  tool_name = tool_call['name']
 
377
  print(f"Human input needed: {query}")
378
 
379
  updates = {"messages": [response]}
380
+ print('response from brainstorm', response)
381
 
382
  if isinstance(response, AIMessage) and response.content:
383
+ print(' 💀💀 came inside the loop', response)
384
  if isinstance(response.content, str):
385
  content = response.content.strip()
386
  elif isinstance(response.content, list):
 
389
  else:
390
  content = str(response.content).strip()
391
 
392
+ print('content for idea finalizing:', content)
393
+ if "IDEA FINALIZED:" in content: # Use 'in' instead of 'startswith'
394
  print('✅ final idea')
395
  updates.update({
396
  "brainstorming_complete": True,
397
  "tool_call_required": False,
398
  "loop_brainstorming": False,
399
  })
400
+ return updates
401
 
402
  else:
403
  tool_calls = getattr(response, "tool_calls", None)
 
427
  updates["tool_call_required"] = False
428
  updates["loop_brainstorming"] = True
429
 
430
+ print("\n--- End Brainstorming Node Debug ---\n")
431
  return updates
432
  except Exception as e:
433
  print(f"Error in guidance node: {e}")
 
471
  # else:
472
  # return "guidance_node"
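One behavioural change above is that brainstorming completion is now detected with a substring test ("IDEA FINALIZED:" in content) instead of startswith. The helper below is illustrative only, not part of the commit; it normalises an AIMessage's content, which may be a string or a list of content parts as handled in brainstorming_node, before applying that test.

    from langchain_core.messages import AIMessage

    def extract_text(message: AIMessage) -> str:
        # AIMessage.content can be a plain string or a list of content-part dicts.
        if isinstance(message.content, str):
            return message.content.strip()
        if isinstance(message.content, list):
            parts = [p.get("text", "") for p in message.content if isinstance(p, dict)]
            return " ".join(parts).strip()
        return str(message.content).strip()

    def idea_finalized(message: AIMessage) -> bool:
        # 'in' rather than startswith(): tolerates leading THOUGHT text or whitespace
        # before the marker, which the original startswith check would miss.
        return "IDEA FINALIZED:" in extract_text(message)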
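The KeyError: '__end__' tracebacks in app.log come from a conditional edge whose router returned a target that was not present in the edge's path map. The sketch below shows one way to wire guidance_node and brainstorming_node so that END is always a legal routing target; the state model and node bodies are heavily simplified placeholders, not the commit's real implementations.

    from pydantic import BaseModel
    from langgraph.graph import StateGraph, START, END

    class GraphProcessingState(BaseModel):
        next_stage: str = ""
        brainstorming_complete: bool = False

    async def guidance_node(state: GraphProcessingState, config=None):
        # Simplified: keep routing to brainstorming until it is marked complete.
        next_stage = "" if state.brainstorming_complete else "brainstorming"
        return {"next_stage": next_stage}

    async def brainstorming_node(state: GraphProcessingState, config=None):
        # Placeholder for the real node; here we just mark the stage complete.
        return {"brainstorming_complete": True}

    def guidance_routing(state: GraphProcessingState) -> str:
        if state.next_stage == "brainstorming":
            return "brainstorming_node"
        return END  # END must be resolvable by the conditional edge

    builder = StateGraph(GraphProcessingState)
    builder.add_node("guidance_node", guidance_node)
    builder.add_node("brainstorming_node", brainstorming_node)
    builder.add_edge(START, "guidance_node")
    # Listing END among the possible destinations (or passing a dict that maps every
    # router return value, including END) avoids the KeyError: '__end__'.
    builder.add_conditional_edges(
        "guidance_node",
        guidance_routing,
        ["brainstorming_node", END],
    )
    builder.add_edge("brainstorming_node", "guidance_node")
    graph = builder.compile()

With this wiring, returning END from the router terminates the run instead of raising, and the brainstorming_complete flag is what stops the guidance/brainstorming loop before the recursion limit is hit.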