Right now we have to feed it what we're coding manually, but since these models can already understand what's happening on our screens (like sending a screenshot of an error to ChatGPT), why not have an app that actively does that and is aware of both the context of the code AND the results of the code?
I can imagine an assistant in the IDE being given constant feedback as you compile and run…
Maybe even give it full access to your local machine… For example, if the code can't find a file, the assistant can see you wrote ../../../something instead of ../../something and catch that too… or maybe npm configs… etc… (a sketch of that kind of mistake is below)
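To make that concrete, here's a minimal sketch (TypeScript/Node, with hypothetical file names and paths of my own invention) of the kind of mismatch I mean: the editor shows one relative path, the terminal prints an ENOENT for it, and an assistant that can see both could connect the dots instantly.

```ts
// Minimal sketch, hypothetical paths: a relative-path typo that only shows up
// at runtime as an ENOENT error in the terminal.
import { readFileSync } from "node:fs";

// What I meant to write:
// const config = readFileSync("../../config/settings.json", "utf8");

// What I actually wrote -- one directory level too deep:
const config = readFileSync("../../../config/settings.json", "utf8");
// => Error: ENOENT: no such file or directory, open '../../../config/settings.json'

console.log(JSON.parse(config));
```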
I guess the main thing is some sort of constant awareness of your entire screen. A simple example I had the other day with Unity and ChatGPT:
I had a button that was not changing color after a certain amount of time… I used the code it gave me and told it I needed it to change color after 2 seconds…
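The real thing was a Unity C# script ChatGPT gave me, but the gist was roughly this (sketched here in TypeScript for a web button, with a hypothetical element id): start a 2-second timer, then change the button's color. The AI can only reason about whether this *should* work; it can't see whether the button actually changes on screen.

```ts
// Rough analogue of what I was asking for (the real code was a Unity C# script);
// "myButton" is a hypothetical element id.
const button = document.getElementById("myButton") as HTMLButtonElement | null;

if (button) {
  // After 2 seconds, change the button's color.
  setTimeout(() => {
    button.style.backgroundColor = "red";
  }, 2000);
}
```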
If somehow the AI were aware of the game actually running and could translate what it saw to match it up with its understanding of the code, that would be next level, right?