Bedrock LLM Error With Browser Extract
Introduction
Large Language Models (LLMs) have changed the way we interact with technology, but like any other software, the systems built around them are not immune to errors. In this article, we will look at the Bedrock LLM error with browser extract, explain the root cause of the issue, and provide a step-by-step guide to resolve it.
Bug Description
OpenManus, an open-source AI agent framework, has stopped working with Amazon Bedrock when its browser content-extraction tool is used, failing with an unexpected error. The error message reads:
"Unexpected error in ask_tool: An error occurred (ValidationException) when calling the Converse operation: A conversation must start with a user message. Try again with a conversation that starts with a user message."
This error occurs because the message list sent to Bedrock contains only a system prompt and no user message, while Bedrock's Converse operation requires the conversation to start with a user message. To resolve this issue, we need to add a user message alongside the system prompt.
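To see the constraint in isolation, here is a minimal, hedged sketch of a direct call to Bedrock's Converse API. The model ID and region are placeholders (not taken from the OpenManus configuration), and valid AWS credentials with access to the model are assumed; a request whose conversation does not begin with a user message is rejected with the ValidationException quoted above.
import boto3

# Minimal sketch of a Bedrock Converse call. The model ID and region are
# placeholders, not values taken from OpenManus; valid AWS credentials and
# model access are assumed.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    # The system prompt travels separately from the conversation turns...
    system=[{"text": "Extract the relevant information from the page."}],
    # ...and the conversation itself must begin with a "user" message.
    messages=[{"role": "user", "content": [{"text": "Page content: ..."}]}],
)
print(response["output"]["message"]["content"][0]["text"])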
Bug Solved Method
The solution to this bug lies in modifying the browser_use_tool.py file in the app/tool directory. Specifically, we need to change the messages variable so that it also includes a user message.
diff --git a/app/tool/browser_use_tool.py b/app/tool/browser_use_tool.py
index 449e8e5..78bea9e 100644
--- a/app/tool/browser_use_tool.py
+++ b/app/tool/browser_use_tool.py
@@ -386,11 +386,9 @@ class BrowserUseTool(BaseTool, Generic[Context]):
prompt = f"""\
Your task is to extract the content of the page. You will be given a page and a goal, and you should extract all relevant information around this goal from the page. If the goal is vague, summarize the page. Respond in json format.
Extraction goal: {goal}
-
-Page content:
-{content[:max_content_length]}
"""
- messages = [{"role": "system", "content": prompt}]
+ messages = [{"role": "system", "content": prompt},
+ {"role": "user", "content": f"Page content: {content[:max_content_length]}"}]
# Define extraction function schema
extraction_function = {
As you can see, the messages variable now includes a user message with the content of the page. This change resolves the error and allows OpenManus to work with Bedrock again.
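For reference, here is roughly what the patched logic looks like as standalone Python. The helper name build_extraction_messages and the default max_content_length value are illustrative, not taken from the repository; only the message structure mirrors the diff above.
def build_extraction_messages(goal: str, content: str, max_content_length: int = 2000) -> list:
    """Mirror the patched browser_use_tool.py: instructions go in the system
    message, the (truncated) page content goes in a separate user message."""
    prompt = (
        "Your task is to extract the content of the page. You will be given a "
        "page and a goal, and you should extract all relevant information "
        "around this goal from the page. If the goal is vague, summarize the "
        f"page. Respond in json format.\nExtraction goal: {goal}\n"
    )
    return [
        {"role": "system", "content": prompt},
        {"role": "user", "content": f"Page content: {content[:max_content_length]}"},
    ]

# The returned list now starts the actual conversation with a user message,
# which is what Bedrock's Converse operation requires.
print(build_extraction_messages("find the article title", "<html>...</html>"))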
Environment Information
To troubleshoot the issue, it's essential to have the following information:
- System version: This includes the operating system, version, and architecture.
- Python version: This includes the version of Python installed on your system.
- OpenManus version or branch: This includes the version or branch of OpenManus you are using.
- Installation method: This includes the method used to install OpenManus, such as pip install -r requirements.txt or pip install -e .
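A quick way to gather most of these details is a short Python snippet like the one below; the distribution name "openmanus" is an assumption and may differ depending on how your checkout was installed.
import platform
import sys
from importlib import metadata

# Prints the system, Python, and (if installed as a package) OpenManus versions.
print("System :", platform.platform(), platform.machine())
print("Python :", sys.version)
try:
    # "openmanus" is an assumed distribution name; adjust it if your
    # installation uses a different package name.
    print("OpenManus:", metadata.version("openmanus"))
except metadata.PackageNotFoundError:
    print("OpenManus: not installed as a package (likely running from source)")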
Extra Information
If you are still experiencing issues after applying the solution, please provide the following information:
- Error message: Include the exact error message you are seeing.
- System logs: Include any relevant system logs that may help diagnose the issue.
- Configuration files: Include any relevant configuration files that may be related to the issue.
By following this guide, you should be able to resolve the Bedrock LLM error with browser extract and get OpenManus working again. If you have any further questions or issues, please don't hesitate to reach out.
Conclusion
In conclusion, the Bedrock LLM error with browser extract is a common issue that can be resolved by modifying the browser_use_tool.py file. By including a user message in the messages variable, we satisfy Bedrock's requirement and get OpenManus working again. We hope this guide has been helpful in resolving your issue. If you have any further questions or need additional assistance, please don't hesitate to reach out.
Troubleshooting Tips
If you are still experiencing issues after applying the solution, here are some additional troubleshooting tips:
- Check the system logs for any relevant errors or warnings.
- Verify that the OpenManus version or branch is up-to-date.
- Try reinstalling OpenManus using a different installation method.
- Check the configuration files for any relevant settings that may be causing the issue.
Frequently Asked Questions
Q: What is the Bedrock LLM error with browser extract?
A: The Bedrock LLM error with browser extract is an issue that occurs when using OpenManus with Bedrock, resulting in an unexpected error message. This error is caused by the message log using only a system prompt, without a user prompt.
Q: What is the error message I see when I encounter this issue?
A: The error message you see when encountering this issue is:
"Unexpected error in ask_tool: An error occurred (ValidationException) when calling the Converse operation: A conversation must start with a user message. Try again with a conversation that starts with a user message."
Q: How do I resolve the Bedrock LLM error with browser extract?
A: To resolve the Bedrock LLM error with browser extract, you need to modify the browser_use_tool.py file in the app/tool directory. Specifically, you need to change the messages variable to include a user prompt.
Q: What changes do I need to make to the browser_use_tool.py file?
A: You need to change the messages variable to include a user prompt, like this:
diff --git a/app/tool/browser_use_tool.py b/app/tool/browser_use_tool.py
index 449e8e5..78bea9e 100644
--- a/app/tool/browser_use_tool.py
+++ b/app/tool/browser_use_tool.py
@@ -386,11 +386,9 @@ class BrowserUseTool(BaseTool, Generic[Context]):
prompt = f"""\
Your task is to extract the content of the page. You will be given a page and a goal, and you should extract all relevant information around this goal from the page. If the goal is vague, summarize the page. Respond in json format.
Extraction goal: {goal}
-
-Page content:
-{content[:max_content_length]}
"""
- messages = [{"role": "system", "content": prompt}]
+ messages = [{"role": "system", "content": prompt},
+ {"role": "user", "content": f"Page content: {content[:max_content_length]}"}]
# Define extraction function schema
extraction_function = {
Q: What is the role of the user prompt in resolving the Bedrock LLM error with browser extract?
A: The user prompt is essential in resolving the Bedrock LLM error with browser extract. By including a user prompt in the messages variable, you are providing the necessary information for the conversation to start with a user message, which resolves the error.
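One way to picture this: clients that speak to Bedrock typically split a chat-style message list into Converse's separate system parameter and its messages list, so a log that contains only a system prompt yields an empty conversation. The mapping below is an illustrative sketch of that split, not the actual OpenManus client code.
def to_converse_request(chat_messages):
    """Illustrative split of an OpenAI-style message list into the two
    arguments Bedrock's Converse operation expects."""
    system = [{"text": m["content"]} for m in chat_messages if m["role"] == "system"]
    messages = [
        {"role": m["role"], "content": [{"text": m["content"]}]}
        for m in chat_messages
        if m["role"] != "system"
    ]
    return {"system": system, "messages": messages}

# Before the fix: only a system prompt, so "messages" is empty and Bedrock
# rejects the request. After the fix: the page content arrives as a user
# message and the conversation starts correctly.
print(to_converse_request([{"role": "system", "content": "instructions"}]))
print(to_converse_request([
    {"role": "system", "content": "instructions"},
    {"role": "user", "content": "Page content: ..."},
]))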
Q: What are some additional troubleshooting tips if I am still experiencing issues after applying the solution?
A: If you are still experiencing issues after applying the solution, here are some additional troubleshooting tips:
- Check the system logs for any relevant errors or warnings.
- Verify that the OpenManus version or branch is up-to-date.
- Try reinstalling OpenManus using a different installation method.
- Check the configuration files for any relevant settings that may be causing the issue.
Q: Can I get further assistance if I am still experiencing issues after applying the solution?
A: Yes, you can get further assistance by reaching out to the OpenManus support team or by posting on the OpenManus community forum. They will be happy to help you troubleshoot the issue and provide additional assistance if needed.
Conclusion
In conclusion, the Bedrock LLM error with browser extract is a common issue that can be resolved by modifying the browser_use_tool.py file. By including a user prompt in the messages variable, you can resolve the error and get OpenManus working again. We hope this Q&A article has been helpful in resolving your issue. If you have any further questions or need additional assistance, please don't hesitate to reach out.