AWS Bedrock Integration - "Selected model is not supported" Error

Hi Cursor Team and Community,

I’m having trouble getting AWS Bedrock to work properly in Cursor and would appreciate any guidance.

My Setup

  • Cursor Version: latest
  • AWS Region: us-east-1
  • Model I’m trying to use: global.anthropic.claude-opus-4-6-v1
  • AWS credentials: Access Key ID + Secret Access Key (configured directly in Cursor’s Bedrock settings)

What Works
I can successfully call the same model (Claude Opus 4.6) via Python using the boto3 SDK with the same AWS credentials. The model access is confirmed in the AWS Bedrock console.
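For reference, a minimal sketch of that boto3 check, assuming the Converse API and the region/model ID from this post. The request builder below is pure data; the commented lines show the actual call, which needs boto3 installed and valid AWS credentials:

```python
MODEL_ID = "global.anthropic.claude-opus-4-6-v1"  # the ID from this post


def build_converse_kwargs(model_id: str, prompt: str) -> dict:
    """Build the kwargs for a bedrock-runtime Converse call (no network)."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 256},
    }


# Actually invoking requires boto3 and configured credentials, e.g.:
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   resp = client.converse(**build_converse_kwargs(MODEL_ID, "Hello"))
#   print(resp["output"]["message"]["content"][0]["text"])
```

If this call succeeds with the same credentials Cursor is using, the problem is on Cursor's side rather than with IAM or model access.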

The Problem
When I enable AWS Bedrock in Cursor’s settings and enter the model ID, I receive the following error:

“Selected model is not supported by bedrock, please use a different model”

I have already tried the following model ID formats without success:

  • global.anthropic.claude-opus-4-6-v1
  • us.anthropic.claude-opus-4-6-v1:0
  • anthropic.claude-opus-4-6-v1
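For context on the formats above: IDs with a region keyword prefix such as `global.` or `us.` are cross-region inference profile IDs, while a bare `anthropic.` prefix is a foundation model ID. An illustrative helper to tell them apart (this is not Cursor's actual validation logic):

```python
def classify_bedrock_id(model_id: str) -> str:
    """Classify a Bedrock model ID by its prefix: region keywords mark
    cross-region inference profile IDs; anything else is treated as a
    foundation model ID. Illustrative only, not Cursor's validator."""
    region_prefixes = {"global", "us", "eu", "apac"}
    prefix = model_id.split(".", 1)[0]
    return "inference-profile" if prefix in region_prefixes else "foundation-model"


print(classify_bedrock_id("global.anthropic.claude-opus-4-6-v1"))  # inference-profile
print(classify_bedrock_id("anthropic.claude-opus-4-6-v1"))         # foundation-model
```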

I also confirmed that:

  • IAM permissions include bedrock:InvokeModel, bedrock:InvokeModelWithResponseStream, and bedrock:ListInferenceProfiles
  • Model access for Claude Opus 4.6 is granted in the AWS Bedrock console
  • The same credentials work perfectly via Python/boto3
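For anyone checking their own setup, a minimal IAM policy sketch covering the three actions listed above (the wildcard resource is an assumption for brevity; you would normally scope it to specific model and inference-profile ARNs):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
        "bedrock:ListInferenceProfiles"
      ],
      "Resource": "*"
    }
  ]
}
```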

My Questions

  1. Which exact model ID format does Cursor’s native Bedrock integration support for Claude Opus 4.6?
  2. Does Cursor currently support Claude Opus 4.6 via the native Bedrock integration, or is it not yet available?
  3. Is there a recommended workaround (e.g., using Bedrock Access Gateway as an OpenAI-compatible endpoint)?
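On question 3, a rough sketch of what the Bedrock Access Gateway workaround would look like. The gateway URL below is hypothetical (substitute your deployed endpoint), and the exact route and auth header depend on how the gateway is configured; the payload builder just produces a standard OpenAI-style chat completions body:

```python
import json
import urllib.request  # used in the commented request below

# Hypothetical endpoint; replace with your deployed Access Gateway URL.
GATEWAY_URL = "http://localhost:8000/api/v1/chat/completions"


def build_chat_payload(model_id: str, prompt: str) -> dict:
    """OpenAI-compatible chat completions request body."""
    return {
        "model": model_id,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }


# Sending the request needs a running gateway, e.g.:
#   req = urllib.request.Request(
#       GATEWAY_URL,
#       data=json.dumps(build_chat_payload("anthropic.claude-opus-4-6-v1", "Hello")).encode(),
#       headers={"Content-Type": "application/json", "Authorization": "Bearer <key>"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

In Cursor you would then point the OpenAI base URL override at the gateway rather than using the native Bedrock integration.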

Hey! This is a known issue. The Bedrock validator in Cursor doesn’t recognize some newer model IDs yet, like Claude Opus 4.6.

A workaround is to add the model as a custom model instead of relying on the built-in Bedrock model list. A user in this thread confirmed that global.anthropic.claude-opus-4-6-v1 (without :0) works if you add it as a custom model.

The team is aware of the model name validation issue. Your report helps us prioritize it.

Let me know if the custom model approach works for you.

I'm facing the same issue. Is there a fix yet?

It’s quite frustrating that the AWS Bedrock option is explicitly offered in the settings yet consistently fails to work.