AI models should confront our ideas with some critical thoughts first

I've had this nagging feeling about AI copilots for quite some time: they are too agreeable. Now that I think about it, I've never felt that they know more than I do. In fact, I could only finalise tasks I already had prior experience and knowledge of. I felt like a lead developer guiding a knowledgeable but incapable junior. I don't know whether this was an intentional design choice in these systems, but it has to change.

For a week I worked on a system meant to sync posts between a Facebook group and my local ticketing system, only to find out that Facebook, which has always been quite douchebaggy about its API support, deprecated the Group API without any notice or explanation. So a colossal amount of time and effort went basically nowhere. I'm considering alternative solutions now, but what I believe should have happened is this: when I was explaining the task to Cursor, instead of happily jumping onto it, it should have confronted me with some concerns and critical thoughts first!

I'm sure we'll get there eventually, probably quite soon, but for now it's best to accept that any AI will happily tell you how to build your own nuke when all you asked for was an apple pie recipe. And yes, OMG can it be a total waste of your time. You have my sympathies. Here's to the future arriving yesterday!
