

Good thing that’s not the case then.


It’s not about stupid or smart. It’s a tool, not a person. If you don’t get the same results that other people get with the same tool, then what could possibly be the problem other than how the person is using the tool?


No, it’s not. It doesn’t have intention. It’s literally just a tool. If you don’t get the results you expect with a tool when other people do get those results, then the problem isn’t the tool.


The symptoms you describe are caused by bad prompting. If an AI is producing over-complicated solutions, 9 times out of 10 it’s because you didn’t constrain your problem enough. If it’s referencing tools that don’t exist, then you either haven’t specified which tools are acceptable or you haven’t provided the context it needs to find them. You may also be expecting too much of AI. You can’t expect it to do everything for you. You still have to do almost all the thinking and engineering if you want a quality project; the AI is just there to write the code. Sure, you can use an AI to help you learn how to be a better engineer, but AIs typically don’t make good high-level decisions. Treat AI like an intern, not like a principal engineer.


Source: trust me bro


Middle managers don’t get to pocket any of the unspent budget. That’s crazy talk.


Must just be a skill issue.