Next 15 pattern confusion guardrails

Hey Cursor,

I realize this is a Claude thing and not a you thing but is there a way to automate guard rails in Cursor?

This issue of Claude reverting to Next 14 patterns is so consistent and pervasive that I keep an MD file just to correct it, and I point the AI at it several times a day when it tries to overwrite my functional work. The .cursorrules file doesn’t seem to prevent it.

Is there a feature of Cursor that lets me say “this stuff is important, don’t let the agent overwrite it without flagging me”?

Hey, you can try adding comments to important sections of code, like // DO NOT EDIT. Then, in your rules, you can specify that it should never delete the marked sections of code. This should work, but it’s not a 100% reliable solution, as the model might ignore this rule over time.

You know what would be great: NOTEPAD default buckets, just like the YOLO blacklist of commands.

How about a CHECK BOX in the IDE for the file that says “DO NOT EDIT” (read only, knock-the-tab-out-of-the-floppy style)…

&

Highlight, right-click “Send code to [NOTEPAD_FREEZER]” <<-- any functions, specific code segments, files, scripts, etc. can be sent to the freezer and the AI can’t modify anything in there, && they can still be called as variables in other files.


Could you share the SSE.md file?

Or sanitize it if need be and share the intent?

Thanks

Sure, this was a quick hack of a file but I refer Cursor to it all the time. It’s just a bunch of my messy notes crunched through ChatGPT and output to MD.

How SSE Has Changed in Next.js 15 and Node.js 22

In Next.js 15 and Node.js 22, modern runtime improvements and API design updates influence Server-Sent Events (SSE) implementation. Here’s a breakdown of key changes and considerations.


Key Changes in Node.js 22

Native Fetch API in Node.js

  • Simplifies HTTP handling:
    • Node.js 22 introduces built-in support for the Fetch API, which simplifies making HTTP requests and responses.
    • While this doesn’t directly impact SSE, it removes reliance on libraries like axios for server-side HTTP requests.
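For illustration, a minimal sketch of calling the built-in fetch from server-side code (the URL and response shape are placeholders, not part of the original notes):

// Node.js 22: fetch is available globally, so no axios or node-fetch import is needed.
const res = await fetch('https://example.com/api/health'); // placeholder endpoint
if (!res.ok) {
  throw new Error(`Upstream returned ${res.status}`);
}
const body: { status?: string } = await res.json();
console.log('upstream status:', body.status);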

Improved Streams API

  • Enhanced streaming response support:
    • Updates to Node.js 22’s ReadableStream API make handling streaming responses (like SSE) more efficient.
    • These enhancements allow better integration with modern SSE implementations.

Better Compatibility with Edge Runtimes

  • Optimized for modern deployments:
    • Node.js 22 improves compatibility with edge runtimes like Vercel Edge Functions.
    • However, long-lived connections like SSE are typically unsupported in edge environments. Alternatives such as WebSockets or polling are recommended.

Enhancements in Next.js 15

App Router with Server Actions

  • Server-first logic and organization:
    • The App Router (app/) and Server Actions introduce a new structure for server-side logic.
    • SSE is implemented via custom API routes (app/api/).

SSE API Route Example:

// app/api/sse/route.js
export async function GET(req) {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      const intervalId = setInterval(() => {
        const message = `data: ${JSON.stringify({ time: new Date() })}\n\n`;
        controller.enqueue(encoder.encode(message));
      }, 1000);

      req.signal.addEventListener('abort', () => {
        clearInterval(intervalId);
        controller.close();
      });
    },
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    },
  });
}


Edge Runtime Support

  • Runtime choice for SSE:
    • Edge runtimes have limitations on long-lived connections. Use the Node.js runtime for SSE by specifying:

      export const runtime = 'nodejs';
      
      

Improved Streaming APIs

  • Leverages React’s concurrent features:
    • Next.js 15 integrates React Streaming Rendering (e.g., <Suspense /> boundaries and React Server Components).
    • These features do not replace SSE but enhance server-rendered content delivery.

File-Based API Routes

  • Simplified organization:
    • API routes for SSE remain largely unchanged. However, the App Router provides better organization under app/api/.

Potential Gotchas

  • Edge Runtime Restrictions:
    • Long-lived SSE connections are unsupported on Edge runtimes. Stick to the Node.js runtime.
  • Streaming Limitations:
    • Ensure proper response flushing (res.flush()) for older client compatibility.
  • Concurrent React Features:
    • While React Concurrent mode changes data-fetching structure, it does not directly enhance SSE.

Best Practices for SSE in Next.js 15 and Node.js 22

  1. Use app/api/ for SSE Routes:
    • Place SSE logic in API routes to separate it from rendering logic.
  2. Stick to Node.js Runtime:
    • Long-lived connections work best in Node.js runtime.
  3. Leverage Improved Streaming APIs:
    • Use Node.js 22’s ReadableStream for efficient SSE handling.
  4. Handle Client Disconnects:
    • Use req.signal to manage disconnects and prevent memory leaks.
  5. Use HTTP/1.1:
    • Ensure SSE connections use HTTP/1.1, as HTTP/2 can break SSE behavior.
  6. CORS Configuration:
    • Configure appropriate CORS headers for browser clients.
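Point 6 in practice is just extra headers on the streaming response. A minimal sketch (the allowed origin is a placeholder; lock it down to your real front-end domain):

// Headers for an SSE route that browsers call from a different origin.
const sseHeaders: HeadersInit = {
  'Content-Type': 'text/event-stream',
  'Cache-Control': 'no-cache',
  Connection: 'keep-alive',
  'Access-Control-Allow-Origin': 'https://app.example.com', // placeholder origin
};

// Then return the stream with these headers, e.g.:
// return new Response(stream, { headers: sseHeaders });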

Updated SSE Example for Next.js 15 and Node.js 22

Server (Next.js API Route):

// app/api/sse/route.js
export async function GET(req) {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      const intervalId = setInterval(() => {
        const message = `data: ${JSON.stringify({ time: new Date() })}\n\n`;
        controller.enqueue(encoder.encode(message));
      }, 1000);

      req.signal.addEventListener('abort', () => {
        clearInterval(intervalId);
        controller.close();
      });
    },
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    },
  });
}


Client:
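(This snippet belongs in a client component marked 'use client', with useEffect imported from 'react'.)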

useEffect(() => {
  const eventSource = new EventSource('/api/sse');
  eventSource.onmessage = (event) => {
    console.log('Update:', JSON.parse(event.data));
  };
  eventSource.onerror = (error) => {
    console.error('SSE error:', error);
  };
  return () => {
    eventSource.close();
  };
}, []);


This implementation aligns with the latest capabilities and best practices in Next.js 15 and Node.js 22, ensuring a robust SSE setup for modern applications.

A source-of-truth feature would go a long way. AI loves to spam as many versions of a file into the app as possible. The behavior is mostly harmless when it’s 5 versions of {cn} but dangerous as ■■■■ when it’s 2 versions of a critical sync controller.

Cursor could pretty easily build locking guardrails around this.

IMO, we are in the pre-tsunami of Agentic Swarms. Right now the sea is receding (our belief that Agents are “really that capable”? “I mean, c’mon, it keeps losing context”), but I can see where we are going.

I am so excited to get better and better at wielding Agents.

I know that I am super weak in this ability ATM, but I know I can build cool things. And if even I can see cool things to build, then the things that other super smart people with actual access to FAANG-level resources MUST have ALREADY been building must be nuts.

I just need to figure out how to point the Agents at filling up my bank account…

But imagine Agents that register incredible code snippets and functions and archetypes, and personas and all sorts of Agentic Tooling into vast, instantaneous libraries that can be called.

We need a new package manager that is an AI Agent Persona’s Toolbox – whereby when I say “build a thing that can do all the things, have payment DB, log, blah blah blah…”

It can yank already successfully created AI Agent manifestations and just check out complete modules that are plug and PL-AI.

If that makes sense…

Hey, I updated this to be mindful of Vercel’s limitations:

Below is a friendly, comprehensive guide to Server-Sent Events (SSE) in Next.js 15 and Node.js 22—including how you can still make it work on Vercel without the guide yelling at you. We’ll cover recent SSE changes, potential pitfalls, IDE/tooling considerations, and how to stream logs from a serverless environment while respecting Vercel’s constraints.

Table of Contents

  • Overview: How SSE Has Changed in Next.js 15 and Node.js 22
  1. Key Changes in Node.js 22
  2. Enhancements in Next.js 15
  3. Potential Gotchas
  4. Best Practices for SSE in Next.js 15 / Node.js 22
  5. Example: SSE in Next.js 15 and Node.js 22
  6. Serving SSE on Vercel
  7. IDE Considerations & Tooling
  8. Final Recommendations
  • Appendix: Reference Articles & Documentation

How SSE Has Changed in Next.js 15 and Node.js 22

In Next.js 15 and Node.js 22, modern runtime improvements and API updates have made SSE more convenient while introducing a few new caveats. Below is a deeper look at what’s changed, especially around file operations, streaming logs, and IDE configurations.


1. Key Changes in Node.js 22

1.1 Native Fetch API in Node.js

  • Direct fetch in Node
    Node.js 22 includes built-in support for the WHATWG fetch API, simplifying server logic for data requests.
    • If your SSE route needs to fetch from external services (e.g., logging or metrics), you can now do so natively.
    • Note that SSE itself is about writing to a response stream, whereas fetch is typically used for reading from other endpoints.

1.2 Improved Streams API

  • Web Streams vs. Node.js Streams
    Node.js 22 offers better support for the Web Streams API, but many Node libraries still expect the classic stream module.
    • If your SSE code or underlying libraries (like those accessing S3) require a Node.js Readable, you may see errors like stream.on is not a function if you pass them a Web ReadableStream.
    • Fix: Convert Web Streams to Node.js streams using Readable.fromWeb() or use a Node.js Readable from the start.
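A minimal sketch of that conversion (the URL is a placeholder; the cast is only there because the DOM ReadableStream type and the node:stream/web type are declared separately in TypeScript):

import { Readable } from 'node:stream';
import type { ReadableStream as NodeWebReadableStream } from 'node:stream/web';

// A Web ReadableStream, e.g. the body of a response from Node 22's built-in fetch.
const res = await fetch('https://example.com/export.csv'); // placeholder URL

if (res.body) {
  // Bridge to a classic Node.js Readable so `.on('data')`-style consumers
  // (older SDKs, stream.pipeline, etc.) keep working.
  const nodeStream = Readable.fromWeb(res.body as unknown as NodeWebReadableStream<Uint8Array>);

  nodeStream.on('data', (chunk: Buffer) => console.log('received', chunk.length, 'bytes'));
  nodeStream.on('end', () => console.log('stream finished'));
}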

1.3 Better Edge Runtime Compatibility

  • Edge vs. Node
    Node.js 22 aligns well with edge environments, but long-lived connections (like SSE) still can be terminated by strict edge timeouts.
    • If you want stable SSE, ensure you’re on the Node.js runtime, not the Edge runtime. In Next.js, set export const runtime = 'nodejs'; in your route.

2. Enhancements in Next.js 15

2.1 App Router with Server Actions

  • New app/ directory
    Next.js 15’s App Router organizes server logic under app/api/. SSE routes live in app/api/sse/route.ts, for instance.
  • Server Actions
    These are great for form submissions and data mutations. SSE remains a custom API pattern.

2.2 React 18+ Streaming Features

  • Server Components & Suspense
    Concurrent React can progressively stream HTML to the client, which is different from SSE. If you need push-based updates, SSE remains the go-to pattern.
  • File-based API Routes
    You can place SSE logic in a dedicated route.ts under app/api/, which is simpler to maintain than older Next.js versions.

2.3 Edge Runtime Support

  • Potential Timeouts
    If you try to run SSE from the Edge runtime, you may face disconnections. Explicitly opt into runtime = 'nodejs' for more stable SSE connections.

3. Potential Gotchas

  1. Web Streams vs. Node Streams

    • Passing a Web ReadableStream to a library expecting Node.js streams triggers errors.
    • Fix: Use Readable.fromWeb() or revert to Node streams.
  2. Long-Lived Connections on Edge

    • Edge environments typically can’t support indefinite SSE. If you see abrupt connections closing, switch to the Node.js runtime.
  3. IDE/TypeScript Config

    • If using Web Streams or EventSource, add "DOM" in lib within your tsconfig.json.
    • If dealing with Node streams, add "types": ["node"].
  4. HTTP/2

    • Some SSE failures occur under HTTP/2, so ensure HTTP/1.1 is used if you see random disruptions.
  5. CORS

    • If the browser and SSE endpoint are on different domains, configure Access-Control-Allow-Origin headers.
  6. File-Based Logging

    • When streaming logs from your server’s S3 or disk operations, confirm your libraries are Node-stream-compatible (if you’re converting to/from Web streams).

4. Best Practices for SSE in Next.js 15 / Node.js 22

  1. Use app/api/ for SSE
    Keep SSE routes separate from UI, making your IDE aware of the difference between server and client code.

  2. Convert Streams Where Needed
    If your library wants a Node.js stream, but you have a Web ReadableStream, use Readable.fromWeb().

  3. Leverage req.signal for Cleanup
    Route handlers receive a standard Request whose req.signal aborts when the client disconnects, so you can close your SSE stream gracefully.

  4. Explicitly Use runtime = 'nodejs'

    export const runtime = 'nodejs';

    This keeps your SSE route off Edge, where it might get terminated prematurely.

  5. Be Mindful of Timer Usage
    Use intervals or pings in SSE, but clear them when the request ends to prevent memory leaks.

  6. Test with Real Browsers
    SSE can behave differently across Chrome, Firefox, etc. Full end-to-end testing is recommended.

  7. Type Annotations for SSE Data
    If you’re sending JSON logs, define an interface so your IDE can auto-complete your SSE event object properties.
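For point 7, a minimal sketch of what such an interface might look like (the names and fields are illustrative, matching the example payloads used in this guide):

// Illustrative shape for the SSE payloads used in the examples below.
interface SseLogEvent {
  type: 'info' | 'update' | 'end';
  timestamp?: string;
  message?: string;
}

// Typed parse helper so the IDE can auto-complete event properties.
function parseSseEvent(e: MessageEvent<string>): SseLogEvent {
  return JSON.parse(e.data) as SseLogEvent;
}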


5. Example: SSE in Next.js 15 and Node.js 22

Server Route (app/api/sse/route.ts):

export const runtime = 'nodejs'; // Force Node.js runtime

export async function GET(req: Request) {
  const encoder = new TextEncoder();

  const stream = new ReadableStream({
    start(controller) {
      // Initial event
      controller.enqueue(
        encoder.encode(`data: ${JSON.stringify({ type: 'info', message: 'Connection established' })}\n\n`)
      );

      const intervalId = setInterval(() => {
        const now = new Date();
        const payload = {
          type: 'update',
          timestamp: now.toISOString(),
          message: `Current time: ${now.toLocaleTimeString()}`,
        };
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(payload)}\n\n`));
      }, 1000);

      // If the client disconnects
      req.signal.addEventListener('abort', () => {
        clearInterval(intervalId);
        controller.close();
      });
    },
  });

  // Return streaming response
  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    },
  });
}

Client (React Component):

'use client';

import { useEffect } from 'react';

export default function SSELogs() {
  useEffect(() => {
    const eventSource = new EventSource('/api/sse');

    eventSource.onmessage = (e) => {
      const data = JSON.parse(e.data);
      console.log('[SSE] Received:', data);
    };

    eventSource.onerror = (err) => {
      console.error('[SSE] Error:', err);
    };

    return () => {
      eventSource.close();
    };
  }, []);

  return <p>Check the console for SSE updates</p>;
}


6. Serving SSE on Vercel (Without Getting Yelled At!)

Yes, you can use SSE on Vercel, but note:

  1. Serverless Function Timeouts

    • Hobby/Free plan: ~10s
    • Pro plan: ~30s
      If your SSE route runs longer, Vercel terminates it.
  2. Short-Burst SSE with Auto-Reconnect

    • Stream logs for the duration of your function’s time limit, then send an event like data: {"type": "end"}.
    • The client sees the end event, closes, and reconnects. This approximates real-time logs in short intervals.
  3. Polling / Hybrid

    • Alternatively, poll every few seconds if SSE reconnect logic feels too cumbersome.
  4. Dedicated Real-Time Services

    • If you need indefinite streaming (e.g., a continuous S3→ImageKit log pipeline), a dedicated Node server or specialized real-time platform is best.

Example: SSE with a 25-second limit that “resets”:

// app/api/logs/route.js
export const runtime = 'nodejs';

export async function GET(req) {
  const encoder = new TextEncoder();
  const start = Date.now();
  const maxDuration = 25000; // 25s
  let timer;

  const stream = new ReadableStream({
    start(controller) {
      timer = setInterval(() => {
        const now = Date.now();
        if (now - start >= maxDuration) {
          controller.enqueue(encoder.encode('data: {"type": "end"}\n\n'));
          clearInterval(timer);
          controller.close();
          return;
        }
        const logLine = { message: 'Sync log: ' + new Date().toISOString() };
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(logLine)}\n\n`));
      }, 1000);

      req.signal.addEventListener('abort', () => {
        clearInterval(timer);
        controller.close();
      });
    },
    cancel() {
      clearInterval(timer);
    },
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    },
  });
}

Then the client reconnects whenever it receives type: "end".
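For completeness, a minimal client-side sketch of that reconnect loop (it assumes the /api/logs route above and a JSON payload carrying a type field; the component name is illustrative):

'use client';

import { useEffect } from 'react';

export default function SyncLogs() {
  useEffect(() => {
    let eventSource: EventSource | null = null;
    let stopped = false;

    const connect = () => {
      eventSource = new EventSource('/api/logs');

      eventSource.onmessage = (e) => {
        const data = JSON.parse(e.data);
        if (data.type === 'end') {
          // The server hit its time limit: close and immediately reconnect.
          eventSource?.close();
          if (!stopped) connect();
          return;
        }
        console.log('[logs]', data.message);
      };

      // For transient network errors, EventSource retries on its own,
      // so no extra handling is needed here.
      eventSource.onerror = () => {};
    };

    connect();

    return () => {
      stopped = true;
      eventSource?.close();
    };
  }, []);

  return <p>Streaming sync logs. Check the console.</p>;
}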


7. IDE Considerations & Tooling

  1. TypeScript Configuration

    • "lib": ["DOM"] if you use ReadableStream or EventSource.
    • "types": ["node"] if you deal with Node.js streams or older Node APIs.
  2. ESLint & Prettier

    • Ensure your config knows about Next.js 15’s app/api/ routes.
  3. Auto-Completion

    • If you see warnings about ReadableStream, verify you have the correct "lib" entries in your tsconfig.json.
    • For SSE, the DOM EventSource type is included if "DOM" is in lib.
  4. Server vs. Browser

    • Don’t mix window or DOM APIs in your server route. Keep them separate so your IDE doesn’t complain.
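To make item 1 concrete, a sample tsconfig.json along these lines (a sketch; merge the relevant entries into your existing config rather than replacing it):

{
  "compilerOptions": {
    // "DOM" provides EventSource and the Web ReadableStream types;
    // "ES2022" covers the modern syntax used in the examples.
    "lib": ["DOM", "ES2022"],
    // Node.js typings for node:stream, process, etc.
    "types": ["node"]
  }
}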

8. Final Recommendations

  • Short-Burst SSE on Vercel is fine for sync logs—just handle timeouts gracefully.
  • For continuous streaming that lasts longer than your plan’s limit, consider a dedicated solution.
  • Always confirm you’re on runtime = 'nodejs' to avoid Edge timeouts.
  • Convert streams if you have a mismatch between Web Streams and Node.js streams.
  • Test thoroughly in real browsers to confirm SSE behaves as expected.

Appendix: Reference Articles & Documentation

  1. MDN: Server-Sent Events
    https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events
    Baseline SSE specification, explaining EventSource and data formats.

  2. MDN: Streams API
    https://developer.mozilla.org/en-US/docs/Web/API/Streams_API
    Overview of the Web Streams standard, including ReadableStream, WritableStream, etc.

  3. Node.js 22 Docs: Web Streams
    https://nodejs.org/docs/latest-v22.x/api/webstreams.html
    Details on Node’s native Web Streams support, including conversions with Readable.fromWeb() and toWeb().

  4. Node.js 22 Docs: Globals (fetch)
    https://nodejs.org/docs/latest-v22.x/api/globals.html#fetch
    Explains the built-in fetch, removing the need for external HTTP libraries.

  5. Next.js Documentation: App Router & API Routes
    https://nextjs.org/docs/app/building-your-application/routing/router-handlers
    Shows how to write streaming responses in app/api/.

  6. Next.js Docs: runtime = 'nodejs'
    https://nextjs.org/docs/app/api-reference/next-config-js#runtime
    Force a Node.js environment instead of Edge, critical for SSE.

  7. SSE vs. WebSockets vs. Polling
    https://ably.com/topic/server-sent-events
    Great overview of real-time communication approaches.

  8. GitHub Discussions: SSE with Next.js
    https://github.com/vercel/next.js/discussions/12562
    Community threads covering various SSE implementations and troubleshooting.

  9. AWS S3 / GCP Storage Docs

  10. TypeScript Configuration Handbook
    https://www.typescriptlang.org/docs/handbook/tsconfig-json.html
    Ensures your project recognizes the correct types for SSE, Node, or DOM APIs.


Wrapping Up

This guide is designed to keep SSE running smoothly in Next.js 15 and Node.js 22—including Vercel use cases—without scolding you for wanting logs in real time. Remember:

  • Short-burst SSE can work on serverless platforms.
  • Reconnect logic is easy to implement on the client side.
  • If you need unlimited streaming, consider a dedicated environment.