I built an AI that “sees” UI/UX like a human — entirely with Cursor (no coding!)

Hey everyone :waving_hand:

I wanted to share something I’m genuinely excited about — and it wouldn’t exist without Cursor.

A few weeks ago, I was vibing with @cursor_ai, telling the AI:

> “Your UI looks off.”

The problem? Cursor checked the code, but not the *actual screen*. That’s when it hit me:

> What if AI could *see* the UI like a human — and give feedback?

So instead of just giving it screenshots manually, I thought… why not build a package that:

- **Understands context** (not just pixels)

- Flags UX issues

- Runs full WCAG accessibility checks

- Works locally and is open source
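
The WCAG part is the most mechanical piece of a pipeline like this. For a flavor of what one such automated check involves, here is the standard WCAG 2.x contrast-ratio formula in plain JavaScript (this is the published spec formula, not jpglens's actual code):

```javascript
// WCAG 2.x relative luminance of an sRGB color (channels 0-255).
function relativeLuminance([r, g, b]) {
  const lin = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05), range 1:1 to 21:1.
function contrastRatio(fg, bg) {
  const [l1, l2] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}

// Black on white hits the maximum 21:1 ratio.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(2)); // "21.00"

// Grey #777 on white is ~4.48:1 — just under the 4.5:1 AA minimum for body text.
console.log(contrastRatio([119, 119, 119], [255, 255, 255]).toFixed(2)); // "4.48"
```

A tool only needs to walk the rendered DOM, resolve each text node's computed colors, and flag pairs below the 4.5:1 (AA) or 7:1 (AAA) thresholds.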

I didn’t even touch the code — I just guided Cursor step-by-step and reviewed its output.

The result is jpglens :rocket:

Now, AI agents can *see beauty* and automatically run UI/UX tests.

This means AI-powered workflows in Cursor can catch both functionality and design issues — without manual review.
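
The core of "seeing like a human" is just screenshot-in, structured-feedback-out. A minimal sketch of that step, assuming an OpenAI-style vision payload with a data-URL image (the function name and prompt are illustrative, not jpglens's actual API):

```javascript
// Sketch: turn a screenshot buffer into a vision-model request asking for UI/UX review.
// Payload shape follows the OpenAI-style "image_url + data URL" convention;
// everything here (names, model, prompt) is an illustrative assumption.
function buildReviewRequest(screenshotPng, context) {
  const dataUrl = `data:image/png;base64,${screenshotPng.toString("base64")}`;
  return {
    model: "gpt-4o", // any vision-capable model
    messages: [
      {
        role: "user",
        content: [
          {
            type: "text",
            text: `Review this ${context} screen for UX and WCAG issues. ` +
              "Return a JSON list of { severity, element, problem, fix }.",
          },
          { type: "image_url", image_url: { url: dataUrl } },
        ],
      },
    ],
  };
}

// e.g. with bytes captured by Playwright's page.screenshot():
const req = buildReviewRequest(Buffer.from([0x89, 0x50, 0x4e, 0x47]), "checkout");
console.log(req.messages[0].content[1].image_url.url.slice(0, 22)); // "data:image/png;base64,"
```

Asking for structured JSON back is what lets an agent in Cursor act on the findings instead of just reading prose.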

:small_blue_diamond: License: MIT (free forever)

---

I’d love to hear:

- How would *you* integrate jpglens into your Cursor workflows?

- Any feature requests or crazy ideas?

Cursor literally made this project possible.

Thanks for building such a powerful dev companion :raising_hands: