Would you like to play a game?
I am vibing through an idea that came to me walking in the park today… What do you think of the following vibing context?
Let me know what you think (this is a super cheesy Claude-produced statement, but I've been building out the project in Cursor… and I think it's a nifty idea, and I'd like your input/hate on the idea).
============
Introducing CursorX Challenge: The Future of AI-Augmented Coding Competitions
Hey Cursor community!
I'm excited to share a new competition format I've been developing that showcases what makes our community special: CursorX Challenge!
What is it?
CursorX Challenge is a series of coding competitions where teams solve problems by leveraging Cursor's AI agents. Unlike traditional hackathons, these challenges specifically focus on how effectively teams can collaborate with AI to build amazing solutions.
The secret sauce? Teams must document their interactions with Cursor's AI, sharing prompts and strategies that led to breakthroughs. It's not just about the end product - it's about the journey of human-AI collaboration.
How it works:
- Challenge repos are created with specific goals across various domains (web, data science, games, etc.)
- Teams register and get access to these repos
- Collaborative development happens using Cursorβs AI capabilities
- Solutions are submitted as PRs to the challenge repo
- Public + expert judging determines winners based on functionality, innovation, and AI collaboration quality
Why this is different:
- Celebrates both technical skill AND effective AI collaboration
- Creates a library of proven AI interaction patterns
- Builds community knowledge around "AI-native" development
- Pushes the boundaries of what's possible with current tools
First challenge coming soon!
I'm finalizing the framework and will announce our first challenge domain next week. Categories we're considering include:
- Web app development
- Data visualization
- Game mechanics
- AI systems
- Developer tools
Who's interested in participating? Would love to gauge interest and hear which challenge domains would excite you most!
Also looking for volunteers to help judge, provide feedback on the framework, or sponsor prizes for winning teams.
Let's show the world what the Cursor community can build when human creativity meets AI assistance!
#CursorXChallenge #AICoderCompetition #CursorIDE
CursorX Challenge - Development Diary
Project Overview
This diary tracks the implementation of the CursorX Challenge repository - a framework for AI-powered coding competitions using Cursor IDE agents.
Development Log
[2023-04-04] - Project Initialization
- Created development_diary.md
- Planned SQLite database structure for challenge metrics tracking
- Designed database schema for comprehensive challenge management
Database Architecture Decision
The SQLite database will serve as the control plane for the CursorX Challenge, tracking:
- Challenges and their lifecycle
- Teams and participants
- Submissions and evaluations
- AI agent usage metrics
- Judging scores and feedback
- Timeline events
This approach allows for comprehensive metrics and analytics throughout the challenge lifecycle.
[2023-04-04] - Database and Scripts Implementation
- Created SQLite database schema with tables for the following (a minimal DDL sketch follows the list):
  - Challenges: stores information about each coding competition
  - Teams: participant teams and their members
  - Participants: individual competitor details
  - Challenge_Teams: many-to-many relationship between challenges and teams
  - Submissions: team solution submissions
  - Judging_Criteria: evaluation metrics for each challenge
  - Scores: judge evaluations of submissions
  - AI_Usage_Metrics: detailed tracking of AI agent usage
  - Events: timeline and activity tracking for challenges
- Implemented PowerShell scripts for database operations:
  - `scripts/db/initialize_database.ps1`: Sets up the SQLite database and schema
  - `scripts/db/generate_metrics.ps1`: Generates reports on challenge participation and AI usage
  - `scripts/db/track_ai_usage.ps1`: Tracks AI agent usage during challenges
- Created challenge management script (sketched below):
  - `scripts/setup_challenge.ps1`: Sets up a new challenge with database entries and file structure
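The core of such a setup script might look roughly like this; the parameter names, database path, and directory layout are illustrative assumptions, not the actual script.

```powershell
# Sketch of the heart of setup_challenge.ps1; names are illustrative.
param(
    [Parameter(Mandatory)] [string]$Title,
    [string]$Category = "web",
    [string]$StartDate,
    [string]$EndDate
)

$dbPath = "control_plane/cursorx.db"   # hypothetical location

# Escape single quotes for the SQL string literal, then insert the row.
$safeTitle = $Title -replace "'", "''"
$sql = "INSERT INTO challenges (title, category, start_date, end_date, status)
        VALUES ('$safeTitle', '$Category', '$StartDate', '$EndDate', 'draft');"
$sql | sqlite3 $dbPath

# Create the on-disk structure for the new challenge.
$slug = ($Title.ToLower() -replace '[^a-z0-9]+', '-').Trim('-')
New-Item -ItemType Directory -Force -Path "challenges/$slug/docs" | Out-Null
```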
Database Schema Details
Core Tables
- challenges: Stores information about each coding competition
  - Primary fields: challenge_id, title, description, category, start_date, end_date, status
- teams: Records teams participating in challenges
  - Primary fields: team_id, team_name
- participants: Stores individual competitor information
  - Primary fields: participant_id, github_username, email, display_name, team_id, role
- submissions: Tracks solution submissions from teams
  - Primary fields: submission_id, challenge_id, team_id, repository_url, commit_hash, submission_time, status
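The challenge_teams junction table from the earlier list is not detailed here. As an assumption inferred from the two tables it links, its DDL might look like this:

```powershell
# Illustrative DDL for the challenge_teams junction table; column
# names are assumptions based on the challenges and teams tables.
$junction = @"
CREATE TABLE IF NOT EXISTS challenge_teams (
    challenge_id  INTEGER NOT NULL REFERENCES challenges(challenge_id),
    team_id       INTEGER NOT NULL REFERENCES teams(team_id),
    registered_at TEXT DEFAULT (datetime('now')),
    PRIMARY KEY (challenge_id, team_id)
);
"@
$junction | sqlite3 "control_plane/cursorx.db"
```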
Evaluation Tables
- judging_criteria: Defines evaluation metrics for challenges
  - Primary fields: criteria_id, challenge_id, name, description, weight, max_score
- scores: Records judge evaluations of submissions
  - Primary fields: score_id, submission_id, criteria_id, judge_id, score, feedback
Metrics Tables
- ai_usage_metrics: Tracks AI agent usage during challenges
  - Primary fields: metric_id, team_id, challenge_id, timestamp, prompt_count, tokens_used, feature_used, context_size, action_taken
- events: Timeline and activity tracking for challenges
  - Primary fields: event_id, challenge_id, team_id, event_type, description, timestamp, metadata
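To make the metrics tables concrete: recording a single AI usage sample (as `scripts/db/track_ai_usage.ps1` presumably does) might reduce to an insert like the one below. The values and database path are made up for illustration.

```powershell
# Record one AI usage sample; all values are illustrative.
$dbPath = "control_plane/cursorx.db"
$sql = @"
INSERT INTO ai_usage_metrics
    (team_id, challenge_id, timestamp, prompt_count, tokens_used,
     feature_used, context_size, action_taken)
VALUES
    (1, 1, datetime('now'), 1, 1250, 'agent_chat', 8000, 'generated_component');
"@
$sql | sqlite3 $dbPath
```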
[2023-04-04] - Challenge Management Workflows Implementation
- Implemented team management script:
  - `scripts/register_team.ps1`: Registers teams for challenges and adds participants
- Created submission and evaluation scripts:
  - `scripts/record_submission.ps1`: Records team solution submissions
  - `scripts/score_submission.ps1`: Manages the scoring of submissions by judges
- Key features of the workflow scripts:
  - Comprehensive logging to track all actions
  - Event-based tracking for a complete timeline
  - Interactive and batch modes for different usage scenarios
  - Validation to ensure data integrity
  - JSON metadata for flexible data storage (see the sketch after this list)
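A helper along these lines could back the event-tracking and JSON-metadata features; the function name, database path, and example values are hypothetical, not part of the shipped scripts.

```powershell
# Hypothetical helper showing event logging with flexible JSON metadata,
# using the events table columns listed earlier.
function Write-ChallengeEvent {
    param(
        [int]$ChallengeId,
        [int]$TeamId,
        [string]$EventType,
        [string]$Description,
        [hashtable]$Metadata = @{}
    )
    # Serialize metadata to JSON and escape single quotes for SQL.
    $json = ($Metadata | ConvertTo-Json -Compress) -replace "'", "''"
    $desc = $Description -replace "'", "''"
    $sql  = "INSERT INTO events (challenge_id, team_id, event_type, description, timestamp, metadata)
             VALUES ($ChallengeId, $TeamId, '$EventType', '$desc', datetime('now'), '$json');"
    $sql | sqlite3 "control_plane/cursorx.db"
}

# Example: record a submission event with arbitrary metadata.
Write-ChallengeEvent -ChallengeId 1 -TeamId 3 -EventType "submission" `
    -Description "Initial PR opened" -Metadata @{ pr = 42; commit = "abc123" }
```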
Challenge Workflow Architecture
The implemented scripts support the complete challenge lifecycle:
1. Challenge Setup
   - Create the challenge in the database
   - Generate directory structure and configuration files
   - Define judging criteria
2. Team Registration
   - Register teams in the system
   - Add team members with roles (leader/member)
   - Track team participation across challenges
3. Competition Phase
   - Monitor AI usage during development
   - Track prompts, tokens, and AI features utilized
   - Record events for timeline analysis
4. Submission Phase
   - Record solution submissions with metadata
   - Track repository URLs and commit hashes
   - Maintain submission status
5. Evaluation Phase
   - Support an interactive judging workflow
   - Score submissions based on defined criteria
   - Provide weighted scoring with feedback
   - Generate evaluation reports
6. Analysis Phase
   - Generate comprehensive metrics
   - Analyze team performance and AI utilization
   - Create reports in various formats (console, CSV, JSON; sketched below)
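For the analysis phase, the three report formats might be produced along these lines. The query, file paths, and use of the `sqlite3 -json` flag (available in SQLite 3.33+) are assumptions, not the actual `generate_metrics.ps1`.

```powershell
# Sketch of exporting per-team AI usage in the three report formats.
$dbPath = "control_plane/cursorx.db"
$query  = "SELECT team_id, SUM(prompt_count) AS prompts, SUM(tokens_used) AS tokens
           FROM ai_usage_metrics WHERE challenge_id = 1 GROUP BY team_id;"

# sqlite3 -json emits rows as a JSON array; join the output lines
# before parsing so ConvertFrom-Json sees one document.
$rows = ($query | sqlite3 -json $dbPath | Out-String) | ConvertFrom-Json

New-Item -ItemType Directory -Force -Path "reports" | Out-Null
$rows | Format-Table                                           # Console
$rows | Export-Csv "reports/ai_usage.csv" -NoTypeInformation   # CSV
$rows | ConvertTo-Json | Set-Content "reports/ai_usage.json"   # JSON
```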
[2023-04-04] - Documentation and Example Data
- Created comprehensive README documentation:
  - `README.md`: Main project documentation
  - `control_plane/README.md`: Control plane documentation
- Added example data and logs:
  - `logs/setup_example.log`: Example log file showing system operations
  - `control_plane/metrics/ai_usage_example.json`: Sample AI usage metrics
  - `control_plane/metrics/scoring_example.json`: Sample scoring data
- Created placeholder directories for the project structure
Summary of Accomplishments
We have successfully implemented a comprehensive framework for running AI-augmented coding competitions using the CursorX Challenge system. Key accomplishments include:
- Database Control Plane: Designed and implemented a SQLite database that serves as the central management system for the entire challenge lifecycle.
- Script Infrastructure: Created a suite of PowerShell scripts that handle all aspects of challenge management, from setup to evaluation.
- AI Metrics Tracking: Implemented detailed tracking of AI agent usage, allowing for comprehensive analysis of how teams leverage AI assistance.
- Evaluation System: Developed a flexible scoring system that allows judges to evaluate submissions based on customizable criteria.
- Reporting Capabilities: Created tools to generate detailed metrics and reports in various formats for challenge analysis.
- Event Timeline: Implemented an event-driven system to track all activities throughout a challenge.
- Documentation: Provided comprehensive documentation for all aspects of the system.
The CursorX Challenge framework is now ready for use and can be extended with additional features as needed for specific competition requirements.
Next Steps
- Develop a user interface for challenge administrators
- Add visualization components for metrics and analytics
- Develop example challenges demonstrating the framework
Interactive Dashboard Challenge
Overview
Build a responsive, interactive web dashboard that visualizes and analyzes data from multiple sources. This challenge tests your ability to create a modern web application with a focus on data visualization, interactivity, and user experience.
Challenge Details
- Category: Web Development
- Difficulty: Medium
- Duration: 14 days
- Start Date: 2023-06-01
- End Date: 2023-06-15
Objectives
Create a web dashboard that:
- Fetches data from at least two different APIs or data sources
- Visualizes the data in at least three different chart types
- Allows users to filter and interact with the data
- Provides insights based on the data analysis
- Presents a polished, responsive user interface
- Implements proper error handling and loading states
Requirements
Functional Requirements
1. Data Retrieval
   - Fetch data from at least two public APIs or provided datasets
   - Implement error handling for API failures
   - Include loading indicators during data fetching
2. Data Visualization
   - Create at least three different chart types (e.g., bar, line, pie, scatter plot)
   - Ensure visualizations are interactive (tooltips, zooming, filtering)
   - Include a data table view with sorting and filtering capabilities
3. User Interface
   - Design a responsive layout that works on both desktop and mobile devices
   - Create an intuitive navigation system
   - Implement theme switching (light/dark mode)
   - Ensure accessibility compliance (WCAG Level AA)
4. User Interactions
   - Enable filtering data by multiple parameters
   - Allow saving dashboard configurations
   - Implement data export functionality (CSV, JSON)
   - Add search functionality within the dashboard
Technical Requirements
- Use one of the following frontend frameworks:
  - React
  - Vue
  - Angular
  - Svelte
- Implement state management using an appropriate library for your chosen framework
- Use a modern charting library (e.g., D3.js, Chart.js, Highcharts)
- Include comprehensive test coverage (unit and integration tests)
- Write well-documented, modular, and maintainable code
- Apply appropriate performance optimizations (e.g., code splitting, lazy loading, memoization)
Constraints
- You must build the frontend application using one of the specified frameworks
- All data visualization must be implemented on the client side
- You may use CSS frameworks, but custom styling will be viewed favorably in judging
- Your solution must be deployed to a publicly accessible URL
- The application must function without errors in the latest versions of Chrome, Firefox, and Safari
Evaluation Criteria
| Criteria | Description | Weight |
|---|---|---|
| Functionality | Does the dashboard successfully fetch, display, and allow interaction with data? | 30% |
| Code Quality | Is the code well-structured, documented, and maintainable? | 25% |
| User Experience | Is the interface intuitive, responsive, and visually appealing? | 25% |
| AI Utilization | How effectively was AI used in the development process? | 20% |
For detailed evaluation guidelines, see `evaluation.md`.
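As a worked example of how these weights might combine into a final score (the raw scores out of 10 are invented; the authoritative formula lives in `evaluation.md`):

```powershell
# Illustrative weighted total using the weights above; raw scores
# (out of 10) are made up for the example.
$weights = @{ Functionality = 0.30; CodeQuality = 0.25; UserExperience = 0.25; AIUtilization = 0.20 }
$scores  = @{ Functionality = 8;    CodeQuality = 7;    UserExperience = 9;    AIUtilization = 6 }

$total = 0.0
foreach ($k in $weights.Keys) { $total += $scores[$k] * $weights[$k] }
"{0:N2} / 10" -f $total   # 8(0.30) + 7(0.25) + 9(0.25) + 6(0.20) = 7.60
```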
Getting Started
- Register your team for the challenge
- Set up your development environment:
  - Install Cursor IDE
  - Clone this repository
  - Install required dependencies
- Read through the challenge documentation
- Start coding!
Submission Guidelines
- Your final submission must be pushed to this repository before 2023-06-15 at 23:59:59.
- Ensure your repository includes:
  - Source code
  - Documentation
  - AI collaboration log (see below)
  - Installation/deployment instructions
AI Collaboration Log
Throughout the challenge, maintain a log (`AI_COLLABORATION.md`) documenting how you collaborated with AI tools. Include:
- Which AI features you used
- Examples of effective prompts
- How AI suggestions were incorporated
- Instances where AI assistance was particularly helpful
- Challenges faced when working with AI
Resources
- Cursor IDE Documentation
- AI Prompting Best Practices
- Public APIs List
- Data Visualization Best Practices
Support
If you have questions or need assistance:
- Open an issue in this repository
- Contact the challenge administrators at [email protected]
License
This challenge and all materials are provided under the MIT License.