Does anyone know if Application Inference Profiles are supported when routing AI requests through Bedrock? Fundamentally, I’m looking to segment AWS Bedrock costs by team. Being able to create two separate inference profiles for anthropic.claude-sonnet-4-5-20250929-v1:0 and have both options show up in Cursor would be ideal.
I’m guessing this isn’t currently supported, given that Cursor maintains the list of supported Bedrock models and there’s probably no way to propagate an inference profile’s ARN to the model-select window.
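For context, here’s a sketch of how I’d set up the per-team profiles on the AWS side (using boto3’s `create_inference_profile`; the team names, tag key, and region in the ARN are just placeholders for illustration). The point is that each profile copies the same foundation model but carries a cost-allocation tag, so spend can be split by team in Cost Explorer:

```python
def profile_request(team: str, model_arn: str) -> dict:
    """Build the CreateInferenceProfile request for a given team.

    Each profile wraps the same foundation model but carries a
    cost-allocation tag, so Bedrock spend can be segmented per team.
    """
    return {
        "inferenceProfileName": f"{team}-claude-sonnet-4-5",
        "modelSource": {"copyFrom": model_arn},
        "tags": [{"key": "team", "value": team}],  # placeholder tag key
    }


if __name__ == "__main__":
    import boto3  # requires AWS credentials to actually run

    model_arn = (
        "arn:aws:bedrock:us-east-1::foundation-model/"
        "anthropic.claude-sonnet-4-5-20250929-v1:0"
    )
    bedrock = boto3.client("bedrock")
    for team in ("team-a", "team-b"):
        resp = bedrock.create_inference_profile(**profile_request(team, model_arn))
        print(team, resp["inferenceProfileArn"])
```

Each call returns a distinct profile ARN; the question is whether Cursor could accept those ARNs as model identifiers instead of (or alongside) the base model ID.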