Bedrock Application Inference Profile Support

Does anyone know if Application Inference Profiles are supported when routing AI requests through Bedrock? Fundamentally I'm looking to segment AWS Bedrock costs by team. Being able to create two separate inference profiles for anthropic.claude-sonnet-4-5-20250929-v1:0 and have both options show up in Cursor would be ideal.
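For context, the setup I have in mind looks roughly like the following AWS CLI sketch: one application inference profile per team, both wrapping the same base model, with cost-allocation tags so spend splits by team in Cost Explorer. (Profile names, region, and tag values here are placeholders, not anything Cursor-specific.)

```shell
# Team A's profile, tagged for cost allocation.
aws bedrock create-inference-profile \
  --inference-profile-name team-a-sonnet \
  --model-source copyFrom=arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-sonnet-4-5-20250929-v1:0 \
  --tags key=team,value=team-a

# Team B's profile, wrapping the same base model with a different tag.
aws bedrock create-inference-profile \
  --inference-profile-name team-b-sonnet \
  --model-source copyFrom=arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-sonnet-4-5-20250929-v1:0 \
  --tags key=team,value=team-b
```

Each command returns a profile ARN, and that ARN is what would need to be usable as the model ID in Cursor for this to work.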

I'm guessing this is not currently supported, given that Cursor maintains the list of supported Bedrock models and there's probably no way to propagate an inference profile ARN to the model select window.

Hey, good question.

Application Inference Profiles aren’t supported yet. Cursor uses the standard Bedrock base model IDs, so you can’t pass a custom inference profile ARN through the model picker.

As a workaround, you can set up a separate Cursor workspace for each team, each with its own IAM role and Bedrock config, so costs can be tracked at the AWS role level. It's not as clean as inference profiles, but it works.