ai-courseware/langchain-project/logs/langchain_bot_20251016.log


2025-10-16 14:00:06,683 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:00:06,782 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:00:06,783 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:00:06,783 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:00:07,390 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:00:07,396 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:00:07,396 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:00:07,396 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:00:19,273 | INFO | workflow | info:107 | 🚀 Workflow started: dd827309-6213-4f5b-938f-d8a6069ff6b2
2025-10-16 14:00:19,276 | ERROR | workflow | error:115 | ❌ Workflow execution failed: 'async for' requires an object with __aiter__ method, got coroutine
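[Editor's note] The failure above is the standard TypeError raised when `async for` is applied to a coroutine object instead of an async iterator, which happens when a coroutine that returns an async generator is consumed without `await`. A minimal reproduction (illustrative names only, not taken from the project code):

```python
import asyncio

async def get_stream():
    # A coroutine that RETURNS an async generator: the wrapper shape
    # that commonly triggers this error when consumed without `await`.
    async def gen():
        for i in range(3):
            yield i
    return gen()

async def consume_broken():
    items = []
    async for item in get_stream():  # BUG: iterating the coroutine itself
        items.append(item)
    return items

async def consume_fixed():
    items = []
    async for item in await get_stream():  # FIX: await first, then iterate
        items.append(item)
    return items

try:
    asyncio.run(consume_broken())
except TypeError as e:
    print(f"broken: {e}")  # matches the message logged above

print(f"fixed: {asyncio.run(consume_fixed())}")
```

The same error appears if a node implemented as `async def` is passed where LangGraph expects an async generator; awaiting the producer before iterating is the usual fix.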
2025-10-16 14:04:05,200 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:04:05,318 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:04:05,319 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:04:05,322 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:04:05,839 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:04:05,849 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:04:05,849 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:04:05,850 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:04:19,639 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:04:19,804 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:04:19,805 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:04:19,805 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:04:20,188 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:04:20,196 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:04:20,196 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:04:20,196 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:04:20,197 | INFO | workflow | info:107 | 🚀 Workflow started: 697c4ab0-b4ff-4ccb-bdbe-f4055150928c
2025-10-16 14:04:20,197 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:04:20,197 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:04:20,345 | ERROR | workflow | error:115 | ❌ Node error: intent_analysis - All connection attempts failed
2025-10-16 14:04:20,345 | ERROR | workflow | exception:123 | Node intent_analysis exception details
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/services/ollama_service.py", line 122, in analyze_intent
response = await self._router_model.ainvoke(messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 417, in ainvoke
llm_result = await self.agenerate_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1036, in agenerate_prompt
return await self.agenerate(
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 994, in agenerate
raise exceptions[0]
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1164, in _agenerate_with_cache
result = await self._agenerate(
^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 996, in _agenerate
final_chunk = await self._achat_stream_with_aggregation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 783, in _achat_stream_with_aggregation
async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 920, in _aiterate_over_stream
async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 728, in _acreate_chat_stream
async for part in await self._async_client.chat(**chat_params):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/ollama/_client.py", line 736, in inner
async with self._client.stream(*args, **kwargs) as r:
File "/usr/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1583, in stream
response = await self.send(
^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
with map_httpcore_exceptions():
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed
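[Editor's note] `httpcore.ConnectError: All connection attempts failed` here means the TCP connection to the Ollama server never succeeded; the model call itself was never reached. A small stdlib-only preflight check can confirm reachability before the workflow starts. The endpoint below is the Ollama default; the project's actual `base_url` is not visible in this log and may differ:

```python
import socket
from urllib.parse import urlparse

OLLAMA_BASE_URL = "http://localhost:11434"  # assumed Ollama default, not confirmed by the log

def ollama_reachable(base_url: str = OLLAMA_BASE_URL, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the Ollama host/port succeeds."""
    parsed = urlparse(base_url)
    host = parsed.hostname or "localhost"
    port = parsed.port or (443 if parsed.scheme == "https" else 80)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if not ollama_reachable():
        print("Ollama is not reachable; start it with `ollama serve` "
              "or point base_url at the right host/port.")
```

Under WSL (the `/mnt/e/...` paths suggest it), a server running on the Windows host is not always visible as `localhost`; checking connectivity explicitly separates that configuration issue from application bugs.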
2025-10-16 14:04:20,354 | INFO | workflow | info:107 | ✅ Node complete: intent_analysis
2025-10-16 14:04:20,354 | ERROR | workflow | error:115 | ❌ Workflow execution failed: "WorkflowState" object has no field "next_node"
2025-10-16 14:04:20,354 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
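[Editor's note] `"WorkflowState" object has no field "next_node"` is the error Pydantic raises when code assigns an attribute the model never declared. The real `WorkflowState` fields are not shown in this log; the sketch below uses a placeholder field and shows the declared-field fix:

```python
from typing import Optional

from pydantic import BaseModel

class WorkflowState(BaseModel):
    user_input: str = ""  # placeholder; real fields are not in the log

state = WorkflowState()
try:
    state.next_node = "natural_chat"  # undeclared field -> ValueError
except ValueError as e:
    print(e)

class WorkflowStateFixed(BaseModel):
    user_input: str = ""
    next_node: Optional[str] = None  # declare the field the router writes

state = WorkflowStateFixed()
state.next_node = "natural_chat"  # now allowed
```

The fix is declaring every attribute the nodes write as a model field, rather than relying on dynamic assignment.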
2025-10-16 14:04:55,965 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:04:56,077 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:04:56,077 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:04:56,077 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:04:56,517 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:04:56,523 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:04:56,524 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:04:56,524 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:04:56,524 | INFO | workflow | info:107 | 🚀 Workflow started: 95e78714-72f4-4a5c-80da-03d4d64635aa
2025-10-16 14:04:56,524 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:04:56,525 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:04:56,579 | ERROR | workflow | error:115 | ❌ Node error: intent_analysis - All connection attempts failed
2025-10-16 14:04:56,579 | ERROR | workflow | exception:123 | Node intent_analysis exception details
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/services/ollama_service.py", line 122, in analyze_intent
response = await self._router_model.ainvoke(messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 417, in ainvoke
llm_result = await self.agenerate_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1036, in agenerate_prompt
return await self.agenerate(
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 994, in agenerate
raise exceptions[0]
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1164, in _agenerate_with_cache
result = await self._agenerate(
^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 996, in _agenerate
final_chunk = await self._achat_stream_with_aggregation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 783, in _achat_stream_with_aggregation
async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 920, in _aiterate_over_stream
async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 728, in _acreate_chat_stream
async for part in await self._async_client.chat(**chat_params):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/ollama/_client.py", line 736, in inner
async with self._client.stream(*args, **kwargs) as r:
File "/usr/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1583, in stream
response = await self.send(
^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
with map_httpcore_exceptions():
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed
2025-10-16 14:04:56,588 | INFO | workflow | info:107 | ✅ Node complete: intent_analysis
2025-10-16 14:04:56,588 | INFO | workflow | info:107 | 💬 Routing to natural chat (intent: IntentType.GENERAL_CHAT, confidence: 0.3)
2025-10-16 14:04:56,588 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:04:56,597 | INFO | workflow | info:107 | 💬 Building chat messages: 2 messages
2025-10-16 14:04:56,598 | ERROR | workflow | error:115 | ❌ Node error: natural_chat - LangSmithService.log_model_call() got an unexpected keyword argument 'input_messages'
2025-10-16 14:04:56,598 | ERROR | workflow | exception:123 | Node natural_chat exception details
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/workflows/nodes/natural_chat.py", line 50, in natural_chat_node
langsmith_service.log_model_call(
TypeError: LangSmithService.log_model_call() got an unexpected keyword argument 'input_messages'
2025-10-16 14:04:56,601 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
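[Editor's note] The `TypeError ... unexpected keyword argument 'input_messages'` above means the call site in `natural_chat.py` and the signature of `LangSmithService.log_model_call` disagree on a parameter name. The project's real signature is not visible in this log; the sketch below reproduces the failure shape and one defensive option, a `**extra` catch-all that tolerates a renamed keyword instead of crashing the node:

```python
class LangSmithService:
    """Illustrative stand-in; the real class's signature is not in the log."""

    def log_model_call(self, model: str, messages=None, **extra):
        # Tolerate the old keyword a stale caller may still pass,
        # but keep unknown keywords visible for debugging.
        if messages is None and "input_messages" in extra:
            messages = extra.pop("input_messages")
        return {"model": model, "messages": messages, "extra": extra}

svc = LangSmithService()
# The old call site no longer raises TypeError:
record = svc.log_model_call(model="qwen3:8b", input_messages=["hi"])
```

The cleaner long-term fix is simply renaming the keyword at the call site to match the service; the `**extra` shim only buys time during a refactor.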
2025-10-16 14:05:55,496 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:05:55,595 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:05:55,595 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:05:55,595 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:05:56,031 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:05:56,037 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:05:56,037 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:05:56,037 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:05:56,038 | INFO | workflow | info:107 | 🚀 Workflow started: d7f3b229-e66b-4b24-a14b-c655227ea69e
2025-10-16 14:05:56,038 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:05:56,038 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:05:56,099 | ERROR | workflow | error:115 | ❌ Node error: intent_analysis - All connection attempts failed
2025-10-16 14:05:56,100 | ERROR | workflow | exception:123 | Node intent_analysis exception details
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/services/ollama_service.py", line 122, in analyze_intent
response = await self._router_model.ainvoke(messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 417, in ainvoke
llm_result = await self.agenerate_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1036, in agenerate_prompt
return await self.agenerate(
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 994, in agenerate
raise exceptions[0]
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1164, in _agenerate_with_cache
result = await self._agenerate(
^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 996, in _agenerate
final_chunk = await self._achat_stream_with_aggregation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 783, in _achat_stream_with_aggregation
async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 920, in _aiterate_over_stream
async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 728, in _acreate_chat_stream
async for part in await self._async_client.chat(**chat_params):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/ollama/_client.py", line 736, in inner
async with self._client.stream(*args, **kwargs) as r:
File "/usr/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1583, in stream
response = await self.send(
^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
with map_httpcore_exceptions():
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed
2025-10-16 14:05:56,109 | INFO | workflow | info:107 | ✅ Node complete: intent_analysis
2025-10-16 14:05:56,110 | INFO | workflow | info:107 | 💬 Routing to natural chat (intent: IntentType.GENERAL_CHAT, confidence: 0.3)
2025-10-16 14:05:56,110 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:05:56,113 | INFO | workflow | info:107 | 💬 Building chat messages: 2 messages
2025-10-16 14:05:56,113 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:05:56,143 | ERROR | workflow | error:115 | ❌ Node error: chat_generation - All connection attempts failed
2025-10-16 14:05:56,144 | ERROR | workflow | exception:123 | Node chat_generation exception details
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/services/ollama_service.py", line 191, in generate_chat_response_streaming
async for chunk in self._chat_model.astream(langchain_messages):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 615, in astream
async for chunk in self._astream(
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 981, in _astream
async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 920, in _aiterate_over_stream
async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 728, in _acreate_chat_stream
async for part in await self._async_client.chat(**chat_params):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/ollama/_client.py", line 736, in inner
async with self._client.stream(*args, **kwargs) as r:
File "/usr/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1583, in stream
response = await self.send(
^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
with map_httpcore_exceptions():
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed
2025-10-16 14:05:56,157 | INFO | workflow | info:107 | ✅ Node complete: natural_chat
2025-10-16 14:05:56,157 | ERROR | workflow | error:115 | ❌ Workflow execution failed: LangSmithService.log_conversation_end() missing 2 required positional arguments: 'total_duration' and 'message_count'
2025-10-16 14:05:56,158 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
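[Editor's note] The recurring `CLI chat error: EOF when reading a line` is the `EOFError` that `input()` raises when stdin is closed, e.g. when the bot is launched non-interactively or the user presses Ctrl-D. A CLI loop can treat that as a normal exit rather than logging it as an error. This is an illustrative loop, not the project's actual CLI code; the `reader` parameter is an illustration-only seam that makes the behavior testable:

```python
def read_user_line(prompt: str = "> ", reader=input):
    """Return the next user line, or None once stdin is exhausted."""
    try:
        return reader(prompt)
    except EOFError:
        return None  # closed stdin is an exit signal, not a crash

def chat_loop(handle_message, reader=input):
    """Run the REPL, exiting quietly on EOF or an explicit quit command."""
    while True:
        line = read_user_line(reader=reader)
        if line is None or line.strip() in {"exit", "quit"}:
            break
        handle_message(line)
```

With this shape the three `EOF when reading a line` entries in this log would become clean shutdowns instead of ERROR-level events.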
2025-10-16 14:06:36,129 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:06:36,229 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:06:36,229 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:06:36,229 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:06:36,645 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:06:36,653 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:06:36,653 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:06:36,653 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:06:36,654 | INFO | workflow | info:107 | 🚀 Workflow started: f814ffd7-e96e-4c5b-906e-d52d8ba10197
2025-10-16 14:06:36,654 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:06:36,654 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:06:36,755 | ERROR | workflow | error:115 | ❌ Node error: intent_analysis - All connection attempts failed
2025-10-16 14:06:36,755 | ERROR | workflow | exception:123 | Node intent_analysis exception details
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/services/ollama_service.py", line 122, in analyze_intent
response = await self._router_model.ainvoke(messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 417, in ainvoke
llm_result = await self.agenerate_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1036, in agenerate_prompt
return await self.agenerate(
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 994, in agenerate
raise exceptions[0]
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1164, in _agenerate_with_cache
result = await self._agenerate(
^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 996, in _agenerate
final_chunk = await self._achat_stream_with_aggregation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 783, in _achat_stream_with_aggregation
async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 920, in _aiterate_over_stream
async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 728, in _acreate_chat_stream
async for part in await self._async_client.chat(**chat_params):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/ollama/_client.py", line 736, in inner
async with self._client.stream(*args, **kwargs) as r:
File "/usr/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1583, in stream
response = await self.send(
^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
with map_httpcore_exceptions():
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed
2025-10-16 14:06:36,771 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:06:36,772 | INFO | workflow | info:107 | 💬 Routed to natural chat (intent: IntentType.GENERAL_CHAT, confidence: 0.3)
2025-10-16 14:06:36,772 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:06:36,780 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:06:36,781 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:06:36,796 | ERROR | workflow | error:115 | ❌ Node error: chat_generation - All connection attempts failed
2025-10-16 14:06:36,797 | ERROR | workflow | exception:123 | Exception details for node chat_generation
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/services/ollama_service.py", line 191, in generate_chat_response_streaming
async for chunk in self._chat_model.astream(langchain_messages):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 615, in astream
async for chunk in self._astream(
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 981, in _astream
async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 920, in _aiterate_over_stream
async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 728, in _acreate_chat_stream
async for part in await self._async_client.chat(**chat_params):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/ollama/_client.py", line 736, in inner
async with self._client.stream(*args, **kwargs) as r:
File "/usr/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1583, in stream
response = await self.send(
^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
with map_httpcore_exceptions():
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed
2025-10-16 14:06:36,809 | INFO | workflow | info:107 | ✅ Node completed: natural_chat
2025-10-16 14:06:36,809 | INFO | workflow | info:107 | ✅ Workflow completed: f814ffd7-e96e-4c5b-906e-d52d8ba10197
2025-10-16 14:06:36,810 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
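Both tracebacks above bottom out in `httpx.ConnectError: All connection attempts failed`: no Ollama server was reachable, so every model-touching node failed in turn while the workflow still reported "completed". A stdlib preflight check at startup would turn these deep tracebacks into one actionable line. This is a sketch under assumptions: the default Ollama endpoint `http://localhost:11434` and the helper name are not from the project.

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers on base_url.

    Uses only the stdlib so it can run before any LangChain imports.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            # Ollama's root endpoint is expected to answer 200 ("Ollama is running").
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Calling this once before building the workflow, and exiting with a clear message when it returns False, would make the failure mode explicit instead of surfacing it per-node.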
2025-10-16 14:19:51,074 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:19:51,198 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:19:51,198 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:19:51,198 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:19:52,010 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:19:52,025 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:19:52,025 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:19:52,026 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:19:54,898 | INFO | workflow | info:107 | 🚀 Workflow started: f00e128c-0371-445a-a839-94e56af47e64
2025-10-16 14:19:54,898 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:19:54,899 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:19:55,087 | ERROR | workflow | error:115 | ❌ Node error: intent_analysis - All connection attempts failed
2025-10-16 14:19:55,087 | ERROR | workflow | exception:123 | Exception details for node intent_analysis
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/services/ollama_service.py", line 122, in analyze_intent
response = await self._router_model.ainvoke(messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 417, in ainvoke
llm_result = await self.agenerate_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1036, in agenerate_prompt
return await self.agenerate(
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 994, in agenerate
raise exceptions[0]
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1164, in _agenerate_with_cache
result = await self._agenerate(
^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 996, in _agenerate
final_chunk = await self._achat_stream_with_aggregation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 783, in _achat_stream_with_aggregation
async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 920, in _aiterate_over_stream
async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 728, in _acreate_chat_stream
async for part in await self._async_client.chat(**chat_params):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/ollama/_client.py", line 736, in inner
async with self._client.stream(*args, **kwargs) as r:
File "/usr/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1583, in stream
response = await self.send(
^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
with map_httpcore_exceptions():
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed
2025-10-16 14:19:55,107 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:19:55,107 | INFO | workflow | info:107 | 💬 Routed to natural chat (intent: IntentType.GENERAL_CHAT, confidence: 0.3)
2025-10-16 14:19:55,108 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:19:55,129 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:19:55,130 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:19:55,141 | ERROR | workflow | error:115 | ❌ Node error: chat_generation - All connection attempts failed
2025-10-16 14:19:55,142 | ERROR | workflow | exception:123 | Exception details for node chat_generation
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/services/ollama_service.py", line 191, in generate_chat_response_streaming
async for chunk in self._chat_model.astream(langchain_messages):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 615, in astream
async for chunk in self._astream(
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 981, in _astream
async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 920, in _aiterate_over_stream
async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 728, in _acreate_chat_stream
async for part in await self._async_client.chat(**chat_params):
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/ollama/_client.py", line 736, in inner
async with self._client.stream(*args, **kwargs) as r:
File "/usr/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1583, in stream
response = await self.send(
^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
with map_httpcore_exceptions():
File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/mnt/e/code/courseware/langchain-project/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed
2025-10-16 14:19:55,150 | INFO | workflow | info:107 | ✅ Node completed: natural_chat
2025-10-16 14:19:55,150 | INFO | workflow | info:107 | ✅ Workflow completed: f00e128c-0371-445a-a839-94e56af47e64
2025-10-16 14:22:42,866 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:22:42,969 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:22:42,969 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:22:42,970 | INFO | ollama_service | info:107 | Chat model: qwen3:8b
2025-10-16 14:22:43,427 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:22:43,436 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:22:43,436 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:22:43,437 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:22:53,645 | INFO | workflow | info:107 | 🚀 Workflow started: 6702168e-ec95-4fb7-b6f2-19604956f42a
2025-10-16 14:22:53,647 | ERROR | workflow | error:115 | ❌ Workflow execution failed: 'async for' requires an object with __aiter__ method, got coroutine
2025-10-16 14:23:52,834 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:23:52,929 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:23:52,929 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:23:52,930 | INFO | ollama_service | info:107 | Chat model: qwen3:8b
2025-10-16 14:23:53,379 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:23:53,388 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:23:53,388 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:23:53,388 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:24:34,108 | INFO | workflow | info:107 | 🚀 Workflow started: fc689362-a5a0-47c3-8d10-cc44bb61f24a
2025-10-16 14:24:34,110 | ERROR | workflow | error:115 | ❌ Workflow execution failed: 'async for' requires an object with __aiter__ method, got coroutine
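`'async for' requires an object with __aiter__ method, got coroutine` is Python's exact message for iterating a coroutine instead of an async iterator: typically an `async def` streaming helper that *returns* a generator rather than yielding from it. A self-contained reproduction and fix (all names here are illustrative, not from the project):

```python
import asyncio

async def numbers():
    # A real async generator: `yield` inside `async def` gives it __aiter__.
    for i in range(3):
        yield i

async def broken_stream():
    # BUG pattern: an `async def` that returns the generator is itself a
    # coroutine, so `async for chunk in broken_stream()` raises TypeError.
    return numbers()

async def main():
    try:
        async for _ in broken_stream():  # reproduces the logged error
            pass
    except TypeError:
        pass  # this is the failure mode seen in the workflow
    # Fix: await the coroutine first, then iterate the generator it returns
    # (or make the helper a true async generator that yields directly).
    return [x async for x in await broken_stream()]

result = asyncio.run(main())
```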
2025-10-16 14:27:04,689 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:27:04,809 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:27:04,809 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:27:04,809 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:27:05,192 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:27:05,197 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:27:05,198 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:27:05,198 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:27:29,333 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:27:29,429 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:27:29,430 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:27:29,430 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:27:29,832 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:27:29,839 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:27:29,839 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:27:29,839 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:28:56,067 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:28:56,185 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:28:56,186 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:28:56,186 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:28:56,566 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:28:56,573 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:28:56,573 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:28:56,573 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:28:56,573 | INFO | workflow | info:107 | 🚀 Workflow started: fa0fc0e9-6a5d-40fe-960b-7ea45243645b
2025-10-16 14:28:56,574 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:28:56,574 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:29:08,536 | WARNING | ollama_service | warning:111 | Intent analysis response failed JSON parsing: <think>
OK, the user's input is "你好" ("hello"). First, I need to determine the user's intent type. Based on the given categories, the possible intents are order inquiry, general chat, greeting, or farewell.
The user's message is a simple "你好", which is clearly a greeting. Per the examples, when the user says "你好" the intent is greeting with a confidence of 0.9. The user mentions nothing order-related and is not saying goodbye, so the other intents are ruled out.
Next, check whether there are entities to extract. Since the user is only saying hello and mentions no order number, product name, or issue type, extracted_entities should be empty.
Confirm the confidence: this is a typical greeting, so the confidence is high; set it to 0.9. The reasoning is that the user sent a simple greeting with no other context.
Finally, return the result in JSON format, making sure the key names are correct and there are no syntax errors. For example, intent is greeting, confidence is 0.9, reasoning explains the situation, and extracted_entities is an empty object.
</think>
{
"intent": "greeting",
"confidence": 0.9,
"reasoning": "Simple greeting",
"extracted_entities": {}
}
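The warning above shows the model did return valid JSON, but qwen3 prefixed it with a `<think>…</think>` reasoning block, so a direct `json.loads` on the raw text fails and the router falls back to IntentType.GENERAL_CHAT at low confidence. A tolerant parser that strips the think block and extracts the first JSON object would recover the `greeting` result at 0.9 (a sketch, not the project's actual parser):

```python
import json
import re

def parse_intent_response(raw: str) -> dict:
    """Parse model output that may wrap JSON in <think>...</think> chatter."""
    # Remove any reasoning block the model emitted before the JSON payload.
    cleaned = re.sub(r"<think>.*?</think>", "", raw, flags=re.DOTALL)
    # Grab the JSON object from whatever text remains.
    match = re.search(r"\{.*\}", cleaned, flags=re.DOTALL)
    if not match:
        raise ValueError("no JSON object found in model response")
    return json.loads(match.group(0))
```

An alternative with the same effect would be disabling the model's thinking output where the Ollama model supports it, so the router receives bare JSON.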
2025-10-16 14:29:08,537 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:29:08,537 | ERROR | workflow | error:115 | ❌ Workflow execution failed: "WorkflowState" object has no field "next_node"
2025-10-16 14:29:08,538 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
2025-10-16 14:29:36,334 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:29:36,454 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:29:36,454 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:29:36,454 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:29:36,830 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:29:36,836 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:29:36,836 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:29:36,836 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:29:36,836 | INFO | workflow | info:107 | 🚀 Workflow started: f13ded17-de28-4c4f-985b-0855526e5d58
2025-10-16 14:29:36,836 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:29:36,837 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:29:44,532 | WARNING | ollama_service | warning:111 | Intent analysis response failed JSON parsing: <think>
OK, the user's input is "你好" ("hello"). First, I need to determine the user's intent type. Based on the given categories, the possible intents are order inquiry, general chat, greeting, or farewell.
The user's message is a simple "你好", which is clearly a greeting. Per the examples, when the user says "你好" the intent is greeting with a confidence of 0.9. The user mentions nothing order-related and is not saying goodbye, so the other intents are ruled out.
Next, check whether there are entities to extract. Since the user is only saying hello and mentions no order number, product name, or issue type, extracted_entities should be empty.
Confirm the confidence: this is a typical greeting, so the confidence is high; set it to 0.9. The reasoning is that the user sent a simple greeting with no other context.
Finally, return the result in JSON format, making sure the key names are correct and there are no syntax errors. Check that no required fields are missing, such as intent, confidence, and reasoning. After confirming everything meets the requirements, generate the final JSON response.
</think>
{
"intent": "greeting",
"confidence": 0.9,
"reasoning": "Simple greeting",
"extracted_entities": {}
}
2025-10-16 14:29:44,532 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:29:44,532 | INFO | workflow | info:107 | 💬 Routed to natural chat (intent: IntentType.GENERAL_CHAT, confidence: 0.5)
2025-10-16 14:29:44,532 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:29:44,535 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:29:44,535 | ERROR | workflow | error:115 | ❌ Node error: natural_chat - LangSmithService.log_model_call() got an unexpected keyword argument 'input_messages'
2025-10-16 14:29:44,535 | ERROR | workflow | exception:123 | Exception details for node natural_chat
Traceback (most recent call last):
File "/mnt/e/code/courseware/langchain-project/src/workflows/nodes/natural_chat.py", line 50, in natural_chat_node
langsmith_service.log_model_call(
TypeError: LangSmithService.log_model_call() got an unexpected keyword argument 'input_messages'
2025-10-16 14:29:44,538 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
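`natural_chat` fails here on another caller/callee drift: the node passes `input_messages=` to `LangSmithService.log_model_call()`, which does not accept that keyword. Updating the call site is the real fix; as a stopgap while the service API is in flux, keyword arguments can be filtered against the callee's actual signature (a defensive sketch, not project code):

```python
import inspect

def call_with_supported_kwargs(func, /, **kwargs):
    """Invoke func with only the keyword arguments its signature accepts."""
    params = inspect.signature(func).parameters
    accepts_var_kw = any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )
    if accepts_var_kw:
        return func(**kwargs)  # a **kwargs catch-all takes everything
    supported = {k: v for k, v in kwargs.items() if k in params}
    return func(**supported)
```

Note the trade-off: this wrapper silently drops typo'd keyword names too, so it belongs in debugging sessions rather than in the final service API.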
2025-10-16 14:30:44,848 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:30:44,987 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:30:44,988 | INFO | ollama_service | info:107 | Router model: qwen3:8b
2025-10-16 14:30:44,988 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:30:45,365 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:30:45,373 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:30:45,373 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:30:45,374 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:30:45,374 | INFO | workflow | info:107 | 🚀 Workflow started: 104977cf-8c8f-4ac5-af61-b49837e622dd
2025-10-16 14:30:45,374 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:30:45,374 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:30:50,704 | WARNING | ollama_service | warning:111 | Intent analysis response JSON parsing failed: <think>
OK, the user input is "你好". First, I need to determine the intent type. Based on the given categories, the possible intents are order inquiry, general chat, greeting, or farewell.
The message is a simple "你好", which is clearly a greeting. Per the examples, when the user says "你好" the correct intent is greeting with a fairly high confidence, say 0.9. The reasoning should be "simple greeting", and no entity information was extracted, so the corresponding JSON structure should be returned. Need to confirm whether any other intent applies, but nothing order- or product-related is mentioned here, so the other types are ruled out. With the intent determined, output the result in the required format.
</think>
{
"intent": "greeting",
"confidence": 0.9,
"reasoning": "simple greeting",
"extracted_entities": {}
}
2025-10-16 14:30:50,704 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:30:50,705 | INFO | workflow | info:107 | 💬 Routing to natural chat (intent: IntentType.GENERAL_CHAT, confidence: 0.5)
2025-10-16 14:30:50,705 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:30:50,711 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:30:50,712 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:30:54,424 | INFO | workflow | info:107 | ✅ Node completed: natural_chat
2025-10-16 14:30:54,424 | ERROR | workflow | error:115 | ❌ Workflow execution failed: LangSmithService.log_conversation_end() missing 2 required positional arguments: 'total_duration' and 'message_count'
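Here the failure moved from the node into workflow teardown: `log_conversation_end()` requires `total_duration` and `message_count`, so the workflow must track both and pass them explicitly. A hypothetical sketch of the call-site fix, under the assumption that those are the only required fields (class and field names below are illustrative, not the project's real API):

```python
import time

class LangSmithServiceSketch:
    """Stand-in for the real LangSmithService, showing only the
    signature the traceback complains about."""
    def log_conversation_end(self, session_id, total_duration, message_count):
        return {"session": session_id,
                "duration_s": round(total_duration, 3),
                "messages": message_count}

service = LangSmithServiceSketch()
start = time.monotonic()          # captured when the workflow starts
messages = ["你好", "assistant reply"]  # turns accumulated during the run

# At workflow end, compute the two required arguments and pass them.
record = service.log_conversation_end(
    "1b70dff8",
    total_duration=time.monotonic() - start,
    message_count=len(messages))
print(record["messages"])  # 2
```

Tracking `start` in the workflow state (or a context manager) keeps the duration measurement in one place instead of each caller re-deriving it.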
2025-10-16 14:30:54,425 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
2025-10-16 14:31:32,407 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:31:32,530 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:31:32,530 | INFO | ollama_service | info:107 | Routing model: qwen3:8b
2025-10-16 14:31:32,530 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:31:32,976 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:31:32,982 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:31:32,983 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:31:32,983 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:31:32,983 | INFO | workflow | info:107 | 🚀 Workflow started: 1b70dff8-397e-4c81-9ff8-76e462a6782a
2025-10-16 14:31:32,983 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:31:32,984 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:31:38,232 | WARNING | ollama_service | warning:111 | Intent analysis response JSON parsing failed: <think>
OK, the user input is "你好". First, I need to determine the intent type. Based on the given categories, the possible intents are order inquiry, general chat, greeting, or farewell.
The message is a simple "你好", which is clearly a greeting. Per the examples, when the user says "你好" the correct intent is greeting with a fairly high confidence, say 0.9. The reasoning should be "simple greeting", and no entity information was extracted, so the corresponding JSON structure should be returned. Need to confirm whether any other intent applies, but nothing order- or product-related is mentioned here, so the other types are ruled out. With the intent determined, output the result in the required format.
</think>
{
"intent": "greeting",
"confidence": 0.9,
"reasoning": "simple greeting",
"extracted_entities": {}
}
2025-10-16 14:31:38,233 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:31:38,233 | INFO | workflow | info:107 | 💬 Routing to natural chat (intent: IntentType.GENERAL_CHAT, confidence: 0.5)
2025-10-16 14:31:38,233 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:31:38,236 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:31:38,236 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:31:39,377 | INFO | workflow | info:107 | ✅ Node completed: natural_chat
2025-10-16 14:31:39,378 | INFO | workflow | info:107 | ✅ Workflow completed: 1b70dff8-397e-4c81-9ff8-76e462a6782a
2025-10-16 14:31:39,378 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
2025-10-16 14:34:41,876 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:34:41,983 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:34:42,361 | INFO | ollama_service | info:107 | Routing model: qwen3:8b
2025-10-16 14:34:42,361 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:34:42,762 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:34:42,768 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:34:42,768 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:34:42,769 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:34:47,146 | INFO | workflow | info:107 | 🚀 Workflow started: 5f77da0e-b73f-4357-b0ea-14487fe69849
2025-10-16 14:34:47,146 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:34:47,147 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:34:52,629 | WARNING | ollama_service | warning:111 | Intent analysis response JSON parsing failed: <think>
OK, the user input is "你好". First, I need to determine the intent type. Based on the given categories, the possible intents are order inquiry, general chat, greeting, or farewell.
The message is a simple "你好", which is clearly a greeting. Per the examples, when the user says "你好" the correct intent is greeting with a fairly high confidence, say 0.9. The reasoning should be "simple greeting", and no entity information was extracted, so the corresponding JSON structure should be returned. Need to confirm whether any other intent applies, but nothing order- or product-related is mentioned here, so the other types are ruled out. With the intent determined, output the result in the required format.
</think>
{
"intent": "greeting",
"confidence": 0.9,
"reasoning": "simple greeting",
"extracted_entities": {}
}
2025-10-16 14:34:52,629 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:34:52,630 | INFO | workflow | info:107 | 💬 Routing to natural chat (intent: IntentType.GENERAL_CHAT, confidence: 0.5)
2025-10-16 14:34:52,630 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:34:52,638 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:34:52,639 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:34:54,005 | INFO | workflow | info:107 | ✅ Node completed: natural_chat
2025-10-16 14:34:54,006 | INFO | workflow | info:107 | ✅ Workflow completed: 5f77da0e-b73f-4357-b0ea-14487fe69849
2025-10-16 14:37:50,531 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:37:50,635 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:37:50,636 | INFO | ollama_service | info:107 | Routing model: qwen3:8b
2025-10-16 14:37:50,636 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:37:51,033 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:37:51,048 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:37:51,049 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:37:51,049 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:37:54,876 | INFO | workflow | info:107 | 🚀 Workflow started: d4d6c825-bc4c-40f0-8681-d88b3a6ce27c
2025-10-16 14:37:54,876 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:37:54,877 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:37:56,370 | INFO | workflow | info:107 | 🎯 Intent analysis: '你好' -> greeting (confidence: 0.90)
2025-10-16 14:37:56,371 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:37:56,371 | INFO | workflow | info:107 | 💬 Routing to natural chat (intent: IntentType.GREETING, confidence: 0.9)
2025-10-16 14:37:56,371 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:37:56,375 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:37:56,375 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:37:57,861 | INFO | workflow | info:107 | ✅ Node completed: natural_chat
2025-10-16 14:37:57,861 | INFO | workflow | info:107 | ✅ Workflow completed: d4d6c825-bc4c-40f0-8681-d88b3a6ce27c
2025-10-16 14:40:43,646 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:40:43,762 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:40:43,762 | INFO | ollama_service | info:107 | Routing model: qwen3:8b
2025-10-16 14:40:43,762 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:40:44,151 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:40:44,157 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:40:44,157 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:40:44,157 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:40:44,158 | INFO | workflow | info:107 | 🚀 Workflow started: b15a4eb7-8ebc-440e-a279-37003f2da53b
2025-10-16 14:40:44,158 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:40:44,158 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:40:45,704 | INFO | workflow | info:107 | 🎯 Intent analysis: '你好' -> greeting (confidence: 0.90)
2025-10-16 14:40:45,705 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:40:45,706 | INFO | workflow | info:107 | 💬 Routing to natural chat (intent: IntentType.GREETING, confidence: 0.9)
2025-10-16 14:40:45,706 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:40:45,707 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:40:45,707 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:41:01,467 | INFO | workflow | info:107 | ✅ Node completed: natural_chat
2025-10-16 14:41:01,468 | INFO | workflow | info:107 | ✅ Workflow completed: b15a4eb7-8ebc-440e-a279-37003f2da53b
2025-10-16 14:41:01,468 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
2025-10-16 14:42:17,003 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:42:17,093 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:42:17,094 | INFO | ollama_service | info:107 | Routing model: qwen3:8b
2025-10-16 14:42:17,094 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:42:17,468 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:42:17,474 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:42:17,474 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:42:17,474 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:42:27,736 | INFO | workflow | info:107 | 🚀 Workflow started: 90315e23-b247-47d7-83a0-970f6d76aeaa
2025-10-16 14:42:27,736 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:42:27,737 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:42:29,393 | INFO | workflow | info:107 | 🎯 Intent analysis: 'go语言的优势有哪些' -> general_chat (confidence: 0.95)
2025-10-16 14:42:29,393 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:42:29,393 | INFO | workflow | info:107 | 💬 Routing to natural chat (intent: IntentType.GENERAL_CHAT, confidence: 0.95)
2025-10-16 14:42:29,394 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:42:29,394 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:42:29,394 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:42:34,289 | INFO | workflow | info:107 | ✅ Node completed: natural_chat
2025-10-16 14:42:34,290 | INFO | workflow | info:107 | ✅ Workflow completed: 90315e23-b247-47d7-83a0-970f6d76aeaa
2025-10-16 14:58:42,473 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:58:42,588 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:58:42,588 | INFO | ollama_service | info:107 | Routing model: qwen3:8b
2025-10-16 14:58:42,588 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:58:43,018 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:58:43,024 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:58:43,025 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:58:43,025 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:58:43,026 | INFO | workflow | info:107 | 🚀 Workflow started: 3189b64b-e643-43ca-82e2-fb9197003758
2025-10-16 14:58:43,026 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:58:43,026 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:58:48,598 | INFO | workflow | info:107 | 🎯 Intent analysis: '你好' -> greeting (confidence: 0.90)
2025-10-16 14:58:48,599 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:58:48,599 | INFO | workflow | info:107 | 💬 Routing to natural chat (intent: IntentType.GREETING, confidence: 0.9)
2025-10-16 14:58:48,599 | INFO | workflow | info:107 | 🚀 Node started: natural_chat
2025-10-16 14:58:48,599 | INFO | workflow | info:107 | 💬 Built conversation messages: 2 messages
2025-10-16 14:58:48,600 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:58:51,149 | INFO | workflow | info:107 | ✅ Node completed: natural_chat
2025-10-16 14:58:51,149 | INFO | workflow | info:107 | ✅ Workflow completed: 3189b64b-e643-43ca-82e2-fb9197003758
2025-10-16 14:58:51,149 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
2025-10-16 14:59:01,147 | INFO | langsmith_service | info:107 | ✅ LangSmith monitoring enabled
2025-10-16 14:59:01,241 | INFO | ollama_service | info:107 | ✅ Ollama models initialized successfully
2025-10-16 14:59:01,241 | INFO | ollama_service | info:107 | Routing model: qwen3:8b
2025-10-16 14:59:01,241 | INFO | ollama_service | info:107 | Chat model: deepseek-v3.1:671b-cloud
2025-10-16 14:59:01,613 | INFO | workflow | info:107 | 🔗 SimpleChatChain initialized
2025-10-16 14:59:01,619 | INFO | workflow | info:107 | 📊 Workflow graph built
2025-10-16 14:59:01,619 | INFO | workflow | info:107 | 🚀 ChatWorkflow initialized
2025-10-16 14:59:01,619 | INFO | workflow | info:107 | 🤖 Using LangGraph workflow mode
2025-10-16 14:59:01,619 | INFO | workflow | info:107 | 🚀 Workflow started: c0b9d7cf-447c-4cc4-b958-c6e79a55654f
2025-10-16 14:59:01,619 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:59:01,620 | INFO | workflow | info:107 | 🚀 Node started: intent_analysis
2025-10-16 14:59:03,645 | INFO | workflow | info:107 | 🎯 Intent analysis: '我的订单12345有问题' -> order_inquiry (confidence: 0.95)
2025-10-16 14:59:03,645 | INFO | workflow | info:107 | ✅ Node completed: intent_analysis
2025-10-16 14:59:03,646 | INFO | workflow | info:107 | 🎯 Routing to order diagnosis (confidence: 0.95)
2025-10-16 14:59:03,646 | INFO | workflow | info:107 | 🚀 Node started: order_diagnosis
2025-10-16 14:59:04,149 | INFO | workflow | info:107 | 📦 Fetching order info: 12345
2025-10-16 14:59:04,651 | INFO | workflow | info:107 | 🚀 Node started: diagnosis_thinking
2025-10-16 14:59:40,685 | INFO | workflow | info:107 | 🔍 Generated diagnosis result: 12345
2025-10-16 14:59:40,685 | INFO | workflow | info:107 | 🚀 Node started: chat_generation
2025-10-16 14:59:57,255 | INFO | workflow | info:107 | 💬 Generated final reply: 12345
2025-10-16 14:59:57,257 | INFO | workflow | info:107 | ✅ Node completed: order_diagnosis
2025-10-16 14:59:57,258 | INFO | workflow | info:107 | ✅ Workflow completed: c0b9d7cf-447c-4cc4-b958-c6e79a55654f
2025-10-16 14:59:57,258 | ERROR | workflow | error:115 | CLI chat error: EOF when reading a line
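Nearly every run above ends with an `EOF when reading a line` error: `input()` raises `EOFError` once stdin closes (Ctrl-D, or a piped session ending), and the CLI logs it as an error instead of exiting cleanly. A minimal sketch of a read loop that treats EOF as a normal shutdown (function names are illustrative, not the project's actual CLI code):

```python
def cli_loop(read=input):
    """Read user turns until EOF or an explicit exit command.
    EOFError here means stdin closed, so it is a clean shutdown,
    not something to log at ERROR level."""
    turns = []
    while True:
        try:
            line = read("you> ")
        except EOFError:
            break  # stdin closed: exit quietly
        if line.strip().lower() in {"exit", "quit"}:
            break
        turns.append(line)
    return turns

def make_reader(lines):
    """Test helper: feed canned lines, then simulate stdin closing."""
    it = iter(lines)
    def read(_prompt):
        try:
            return next(it)
        except StopIteration:
            raise EOFError
    return read

print(cli_loop(read=make_reader(["hi", "thanks"])))  # ['hi', 'thanks']
```

Injecting the `read` callable keeps the loop unit-testable without a real terminal, and the same `except EOFError` branch is where a goodbye message or final `log_conversation_end` call would belong.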