Faster Copilot++: We’ve made Copilot++ ~2x faster! This speedup comes from a new model and faster inference. ~50% of users are already on this model, and it will roll out to everyone over the next few days. If you’d like to enable the model immediately, you can switch models in the bottom bar of the editor.
Stable Claude Support: All the newest Claude models are available for Pro and API key users. Head to Settings > Models to toggle them on. Pro users get 10 requests / day for free and can keep using Claude at API-key prices for subsequent requests.
Does that mean we need to have an API key entered to continue using Claude, or do you bill us from your end? If it’s the latter, how can we track usage and know when the 10 requests have been used?
You just need to hit one button to opt in to usage-based pricing once you hit that limit. You’ll get charged the same amount that Anthropic would have charged you for those subsequent requests.
Let us know if you have feedback. We thought granting 10 Claude requests / day (i.e. ~300 requests / month) plus usage-based pricing after that struck the right balance of sustainably covering costs while giving Pro users freedom.
This broke WSL & remote servers again; it’s the 404 on the code-server download issue. I think I made an issue on GitHub for it, and I may have found a fix last time. The application is currently unusable.
@truell20 … perhaps instead of charging 20 USD for Pro users, you could charge 30 and give faster and better access to models like Claude Opus. Users are always willing to pay a little more provided they get access to much better models; I think the added 10 USD would easily cover your difference in cost.
Good joke tangjun… there are only 3 plans now: Free, Pro, and Business. A plan named something like Advanced could bridge the gap between Pro and Business in price point, so a Free - Pro - Pro+ - Business lineup does not seem all that bad…
Thank you so much for the update! I’m getting an “invalid API key” error for Anthropic. Cursor version is 0.30.3. I checked that the same key works in another API call, and I tried other keys and they all returned the same error.
I am having the same problem as @sangmin, but the steps mentioned to correct it did not work for me. I keep getting a message saying my API key is invalid, even though I have generated 3 different keys.
@truell20 can we please also have support for using our AWS creds, so we can access Claude via AWS Bedrock (claude-3)?
from anthropic import AnthropicBedrock

client = AnthropicBedrock(
    # Authenticate either by providing the keys below or by using the default AWS credential providers, such as
    # ~/.aws/credentials or the "AWS_SECRET_ACCESS_KEY" and "AWS_ACCESS_KEY_ID" environment variables.
    aws_access_key="",
    aws_secret_key="",
    # Temporary credentials can be used with aws_session_token.
    # Read more at "Temporary security credentials in IAM" in the AWS Identity and Access Management docs.
    aws_session_token="<session_token>",
    # aws_region changes the AWS region to which the request is made. By default, we read AWS_REGION,
    # and if that's not present, we default to us-east-1. Note that we do not read ~/.aws/config for the region.
    aws_region="us-west-2",
)
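For reference, a minimal Messages API call with that client might look like the sketch below; the Bedrock model ID shown is just an illustrative Claude 3 identifier, so substitute whichever model is enabled in your AWS account.

# Minimal sketch: send one message through Bedrock with the client configured above.
message = client.messages.create(
    model="anthropic.claude-3-sonnet-20240229-v1:0",  # example Bedrock model ID; use one enabled in your account
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, Claude"}],
)
print(message.content)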
Cursor is useless at the moment. Why are you nerfing the responses? Claude will not respond with more than 1200 tokens, and there’s no continue button anymore. I’m back to using Visual Studio after supporting you all for many, many months (original Discord). Why not lift the max_tokens cap on the API and let it do its thing? 1200 tokens is nothing.