Description
Environment
- Python Version: 3.12.12
- ms-agent Version: Installed from source (main branch)
- Model Provider: ModelScope API Inference
- Model Name: Qwen/Qwen3-235B-A22B-Instruct-2507
- OS: macOS (darwin)
Bug Description
When running the generate_script stage of the singularity_cinema project, the LLM automatically creates a semantically-named subdirectory based on the user's input topic (e.g., spongebob_birthday) and writes script.txt, title.txt, etc. into that subdirectory. However, the on_task_end method checks for files at output_video/script.txt (without the subdirectory), causing an assertion failure.
Steps to Reproduce
1. Navigate to the project directory:

   ```
   cd projects/singularity_cinema
   ```

2. Run the command:

   ```
   PYTHONPATH=../.. ms-agent run --config . \
     --query "SpongeBob and Patrick celebrate birthday together" \
     --trust_remote_code true \
     --llm.service modelscope \
     --llm.modelscope_api_key <your_api_key> \
     --llm.modelscope_base_url https://api-inference.modelscope.cn/v1/ \
     --llm.model Qwen/Qwen3-235B-A22B-Instruct-2507
   ```

3. Wait for the generate_script stage to complete
Expected Behavior
Task completes successfully and proceeds to the next workflow stage.
Actual Behavior
Throws AssertionError and terminates the task.
Error Logs
# LLM creates subdirectory
[INFO:ms_agent] [generate_script] [tool_calling]:
{
"arguments": {"path": "spongebob_birthday"},
"tool_name": "file_system---create_directory"
}
[INFO:ms_agent] [generate_script] Directory: <output_video/spongebob_birthday> was created.
# LLM writes files to subdirectory
[INFO:ms_agent] [generate_script] [tool_calling]:
{
"arguments": {
"path": "spongebob_birthday/script.txt",
"content": "Bikini Bottom is especially lively today..."
},
"tool_name": "file_system---write_file"
}
[INFO:ms_agent] [generate_script] Save file <.../output_video/spongebob_birthday/script.txt> successfully.
# Assertion fails
[WARNING:ms_agent] Traceback (most recent call last):
File ".../ms_agent/agent/llm_agent.py", line 1081, in run_loop
await self.on_task_end(messages)
File ".../projects/singularity_cinema/generate_script/agent.py", line 39, in on_task_end
assert os.path.isfile(script)
AssertionError
Full Stack Trace
Traceback (most recent call last):
File ".../ms-agent/.venv/bin/ms-agent", line 10, in <module>
sys.exit(run_cmd())
File ".../ms_agent/cli/cli.py", line 31, in run_cmd
cmd.execute()
File ".../ms_agent/cli/run.py", line 150, in execute
return self._execute_with_config()
File ".../ms_agent/cli/run.py", line 209, in _execute_with_config
asyncio.run(engine.run(self.args.query))
File ".../ms_agent/workflow/chain_workflow.py", line 102, in run
outputs = await engine.run(inputs)
File ".../projects/singularity_cinema/generate_script/agent.py", line 53, in run
inputs = await super().run(messages, **kwargs)
File ".../ms_agent/agent/llm_agent.py", line 1118, in run
async for chunk in self.run_loop(messages=messages, **kwargs):
File ".../ms_agent/agent/llm_agent.py", line 1099, in run_loop
raise e
File ".../ms_agent/agent/llm_agent.py", line 1081, in run_loop
await self.on_task_end(messages)
File ".../projects/singularity_cinema/generate_script/agent.py", line 39, in on_task_end
assert os.path.isfile(script)
AssertionError
Root Cause Analysis
| Expected Path | Actual Path |
|---|---|
| output_video/script.txt | output_video/spongebob_birthday/script.txt |
| output_video/title.txt | output_video/spongebob_birthday/title.txt |
The on_task_end method has hardcoded file paths:
```python
def on_task_end(self, messages: List[Message]):
    script = os.path.join(self.work_dir, 'script.txt')  # output_video/script.txt
    assert os.path.isfile(script)  # Fails! The file is in a subdirectory
```
Proposed Fix (Recommended)
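The mismatch can be reproduced in isolation. This is a minimal sketch using a temporary directory; the subdirectory name mirrors the one from the logs, but the layout is illustrative:

```python
import os
import tempfile

# Simulate the layout the LLM actually produced under work_dir
work_dir = tempfile.mkdtemp()
subdir = os.path.join(work_dir, 'spongebob_birthday')
os.makedirs(subdir)
with open(os.path.join(subdir, 'script.txt'), 'w') as f:
    f.write('Bikini Bottom is especially lively today...')

# on_task_end checks the root path, but the file sits one level down
expected = os.path.join(work_dir, 'script.txt')
actual = os.path.join(subdir, 'script.txt')
print(os.path.isfile(expected))  # False -> the assertion fails
print(os.path.isfile(actual))    # True
```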
The optimal solution is to combine both prompt modification and code-level fallback:
1. Update the system prompt in agent.yaml to explicitly instruct the LLM not to create subdirectories:

   **IMPORTANT: Create these files directly in the root directory. Do NOT create any subdirectories.**

2. Add a recursive file search in agent.py as a fallback mechanism:

   ```python
   import glob

   def on_task_end(self, messages: List[Message]):
       # Try the direct paths first
       script = os.path.join(self.work_dir, 'script.txt')
       title = os.path.join(self.work_dir, 'title.txt')

       # Fall back to a recursive search if not found
       if not os.path.isfile(script):
           script_files = glob.glob(
               os.path.join(self.work_dir, '**/script.txt'), recursive=True)
           assert len(script_files) > 0, f"script.txt not found in {self.work_dir}"
           script = script_files[0]

       if not os.path.isfile(title):
           title_files = glob.glob(
               os.path.join(self.work_dir, '**/title.txt'), recursive=True)
           assert len(title_files) > 0, f"title.txt not found in {self.work_dir}"
           title = title_files[0]

       return super().on_task_end(messages)
   ```
This approach ensures:
- The prompt guides the LLM to follow the expected behavior
- The code gracefully handles cases where the LLM doesn't follow instructions
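The fallback behavior can be verified standalone. In this sketch the file name matches the project (`script.txt`) while the directory layout is illustrative; it shows `glob` with `recursive=True` locating the file one level down when the direct path misses:

```python
import glob
import os
import tempfile

work_dir = tempfile.mkdtemp()
os.makedirs(os.path.join(work_dir, 'spongebob_birthday'))
with open(os.path.join(work_dir, 'spongebob_birthday', 'script.txt'), 'w') as f:
    f.write('Bikini Bottom is especially lively today...')

# The direct path misses, but the recursive pattern finds the file
assert not os.path.isfile(os.path.join(work_dir, 'script.txt'))
matches = glob.glob(os.path.join(work_dir, '**/script.txt'), recursive=True)
print(len(matches))  # 1
```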
Related Files
- projects/singularity_cinema/agent.yaml
- projects/singularity_cinema/generate_script/agent.py
I'm willing to work on a PR for this. Please let me know if this approach works, or if you would recommend a different solution.