Proxy Settings Not Working With Anthropic


Introduction

When working with the beeai-framework, users may find that custom proxy settings are not applied for Anthropic models. This article describes the problem, shows how to reproduce it, explains the expected behavior, and covers possible solutions and workarounds.

Describe the Bug

When initializing an AnthropicChatModel with custom proxy settings in the beeai-framework, the baseURL configuration is not being correctly applied. The proxy configuration is ignored, and the API calls are still being made directly to Anthropic instead of through the specified proxy URL.

To Reproduce

To reproduce this behavior, follow these steps:

Step 1: Install beeai-framework v0.1.9

First, install version 0.1.9 of the beeai-framework (the version in which this behavior was observed) by running the following command in your terminal:

npm install beeai-framework@0.1.9

Step 2: Initialize an AnthropicChatModel with custom baseURL

Next, create a new instance of the AnthropicChatModel with custom baseURL as shown below:

import { AnthropicChatModel } from 'beeai-framework';

const apiKey = "your-api-key";
const apiUrl = "https://your-proxy-server.com";

const model = new AnthropicChatModel("claude-3-7-sonnet-20250219", {}, {
  apiKey: apiKey || process.env.IRIS_API_KEY,
  baseURL: `${apiUrl || process.env.IRIS_API_URL}/api/proxy/anthropic`,
});
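The fallback and path-joining logic used for baseURL above can be isolated into a small helper so it is easy to test on its own. This helper is purely illustrative and is not part of the beeai-framework API:

```typescript
// Illustrative helper (not part of beeai-framework): resolve the proxy
// base URL from an explicit value or an environment variable, and append
// the Anthropic proxy path without doubling slashes.
function resolveAnthropicProxyURL(
  apiUrl: string | undefined,
  envUrl: string | undefined,
): string {
  const root = apiUrl ?? envUrl;
  if (!root) {
    throw new Error("No proxy URL configured (set apiUrl or IRIS_API_URL)");
  }
  return `${root.replace(/\/+$/, "")}/api/proxy/anthropic`;
}
```

For example, `resolveAnthropicProxyURL("https://your-proxy-server.com/", undefined)` yields `https://your-proxy-server.com/api/proxy/anthropic`, with the trailing slash normalized away.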

Step 3: Make a request using this model instance

Make a request using the model instance to observe the behavior:

const response = await model.invoke("Hello, how are you?");

Step 4: Observe that the request is sent to Anthropic's default API endpoint

Instead of being routed through the proxy URL specified in the baseURL parameter, the request is still being sent to Anthropic's default endpoint.
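To confirm where requests actually go, one diagnostic (a sketch, assuming the framework's HTTP stack goes through a fetch-compatible function; it may use a different transport) is to wrap the fetch implementation and record every outgoing URL before installing it with `globalThis.fetch = withUrlLogging(globalThis.fetch, log)`:

```typescript
type FetchLike = (input: string | URL | Request, init?: RequestInit) => Promise<Response>;

// Wrap a fetch implementation so every outgoing URL is recorded before the
// request is forwarded unchanged.
function withUrlLogging(fetchImpl: FetchLike, log: string[]): FetchLike {
  return (input, init) => {
    const url =
      typeof input === "string" ? input
      : input instanceof URL ? input.toString()
      : input.url;
    log.push(url); // recorded synchronously, before the network call
    return fetchImpl(input, init);
  };
}
```

If the log shows `api.anthropic.com` rather than your proxy host, the baseURL setting was dropped somewhere between your configuration and the underlying client.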

Expected Behavior

The API request should be routed through the proxy URL specified in the baseURL parameter, directing all traffic to ${apiUrl}/api/proxy/anthropic instead of Anthropic's default endpoint.

Code Snippets

Below are some code snippets that demonstrate the issue:

Current implementation that doesn't work

// Current implementation that doesn't work
import { AnthropicChatModel } from 'beeai-framework';

const apiKey = "your-api-key";
const apiUrl = "https://your-proxy-server.com";

const model = new AnthropicChatModel("claude-3-7-sonnet-20250219", {}, {
  apiKey: apiKey || process.env.IRIS_API_KEY,
  baseURL: `${apiUrl || process.env.IRIS_API_URL}/api/proxy/anthropic`,
});

// When using this model, requests still go to Anthropic's default endpoint
const response = await model.invoke("Hello, how are you?");

Set-up

The following set-up is required to reproduce this issue:

  • beeai-framework version: v0.1.9
  • Model provider: Anthropic

Additional Context

This issue appears to be specific to the Anthropic integration in the beeai-framework. Other model providers may be correctly handling the proxy settings. The problem might be in how the framework's AnthropicChatModel class processes the client configuration options, possibly ignoring the baseURL parameter or not passing it properly to the underlying API client.
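A hypothetical illustration of that failure mode (the names below are invented for this example and do not appear in the framework source): a configuration-mapping function that forwards only apiKey and silently drops the rest of the client settings.

```typescript
// Hypothetical sketch of the suspected bug; these functions are invented
// for illustration and are not beeai-framework code.
interface ClientSettings {
  apiKey?: string;
  baseURL?: string;
}

// Buggy: only apiKey is forwarded, so baseURL never reaches the API client.
function buildClientConfigBuggy(settings: ClientSettings): ClientSettings {
  return { apiKey: settings.apiKey };
}

// Fixed: spread the whole settings object so baseURL (and any other
// option) survives the mapping.
function buildClientConfigFixed(settings: ClientSettings): ClientSettings {
  return { ...settings };
}
```

If the framework's mapping resembles the buggy variant, the fix is simply to forward the full settings object to the underlying client constructor.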

Possible Solutions and Workarounds

To resolve this issue, you can try the following solutions and workarounds:

  • Update to the latest version of the beeai-framework: Ensure you have the latest version of the framework installed, as this issue may have been resolved in a newer version.
  • Check the AnthropicChatModel documentation: Review the documentation for the AnthropicChatModel class to ensure you are using it correctly and that the baseURL parameter is being passed correctly.
  • Use a different model provider: If the issue is specific to the Anthropic integration, try using a different model provider to see if the issue persists.
  • Customize the AnthropicChatModel class: If you have access to the source code of the AnthropicChatModel class, you can try customizing it to pass the baseURL parameter correctly.
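One further workaround worth trying, with a caveat: it only helps if the framework constructs the official @anthropic-ai/sdk client without an explicit baseURL, in which case the SDK falls back to the ANTHROPIC_BASE_URL environment variable. Set that variable before the model is constructed:

```typescript
// Workaround sketch: set ANTHROPIC_BASE_URL so the underlying Anthropic SDK
// can pick up the proxy URL even if the framework drops the baseURL option.
// Verify this behavior against your framework and SDK versions.
const proxyRoot = process.env.IRIS_API_URL ?? "https://your-proxy-server.com";
process.env.ANTHROPIC_BASE_URL = `${proxyRoot}/api/proxy/anthropic`;

// ...then construct the AnthropicChatModel as usual.
```

This does nothing if the framework passes its own (ignored) baseURL through to the SDK, since an explicit option takes precedence over the environment variable.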

Q: What is the issue with proxy settings not working with Anthropic in beeai-framework?

A: The issue is that the baseURL configuration is not being correctly applied when initializing an AnthropicChatModel with custom proxy settings in the beeai-framework. The proxy configuration is ignored, and the API calls are still being made directly to Anthropic instead of through the specified proxy URL.

Q: How do I reproduce this issue?

A: To reproduce this issue, follow these steps:

  1. Install beeai-framework v0.1.9
  2. Initialize an AnthropicChatModel with custom baseURL as shown below:
import { AnthropicChatModel } from 'beeai-framework';

const apiKey = "your-api-key";
const apiUrl = "https://your-proxy-server.com";

const model = new AnthropicChatModel("claude-3-7-sonnet-20250219", {}, {
  apiKey: apiKey || process.env.IRIS_API_KEY,
  baseURL: `${apiUrl || process.env.IRIS_API_URL}/api/proxy/anthropic`,
});
  3. Make a request using this model instance
  4. Observe that the request is sent to Anthropic's default API endpoint instead of the specified proxy URL

Q: What is the expected behavior?

A: The API request should be routed through the proxy URL specified in the baseURL parameter, directing all traffic to ${apiUrl}/api/proxy/anthropic instead of Anthropic's default endpoint.

Q: Why is this issue specific to the Anthropic integration?

A: This issue appears to be specific to the Anthropic integration in the beeai-framework. Other model providers may be correctly handling the proxy settings. The problem might be in how the framework's AnthropicChatModel class processes the client configuration options, possibly ignoring the baseURL parameter or not passing it properly to the underlying API client.

Q: How can I resolve this issue?

A: To resolve this issue, you can try the following solutions and workarounds:

  • Update to the latest version of the beeai-framework
  • Check the AnthropicChatModel documentation to ensure you are using it correctly and that the baseURL parameter is being passed correctly
  • Use a different model provider to see if the issue persists
  • Customize the AnthropicChatModel class to pass the baseURL parameter correctly

Q: What are the possible causes of this issue?

A: The possible causes of this issue include:

  • Ignoring the baseURL parameter in the AnthropicChatModel class
  • Not passing the baseURL parameter correctly to the underlying API client
  • A bug in the beeai-framework or AnthropicChatModel class

Q: How can I prevent this issue in the future?

A: To prevent this issue in the future, you can:

  • Regularly update to the latest version of the beeai-framework
  • Review the documentation for the AnthropicChatModel class to ensure you are using it correctly
  • Use a different model provider if the issue persists
  • Customize the AnthropicChatModel class to pass the baseURL parameter correctly

By following these steps and trying the possible solutions and workarounds, you should be able to resolve the issue with proxy settings not working with Anthropic in the beeai-framework.