Feb 8, 2025 - 14:17
Thoughts on AI while debugging

Recently, I was exploring message brokers, particularly Kafka, using Docker Compose for the setup. One of the services was Kafdrop, a dashboard for monitoring Kafka topics and partitions. The publisher worked fine, but a networking issue between the Docker services kept events from loading in the Kafdrop dashboard.

I know that a service name can be used as the host in a connection URL, both from the application and from sibling containers within the same Compose network. But that didn't work out.
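
For context, here's a minimal sketch of the kind of setup I mean (the service and image versions are illustrative, not my exact file): within the Compose network, the `kafka` service name resolves as a hostname, and Kafdrop's `KAFKA_BROKERCONNECT` variable points at it.

```yaml
# Illustrative docker-compose.yml, not my exact setup.
services:
  kafka:
    image: bitnami/kafka:latest   # any Kafka image works the same way for name resolution
    ports:
      - "9092:9092"

  kafdrop:
    image: obsidiandynamics/kafdrop:latest
    ports:
      - "9000:9000"
    environment:
      # Kafdrop reaches the broker by its Compose service name
      KAFKA_BROKERCONNECT: "kafka:9092"
    depends_on:
      - kafka
```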

So I headed directly to AI, GitHub Copilot specifically, since it's already installed in my VS Code and has access to the workspace, and therefore the context.
I spent more than an hour, but couldn't get Kafdrop to work.

Then I tried Claude, which gave me the fix: it turned out to be a matter of environment variables on the Kafka service, so that both the publisher and Kafdrop could connect to it.
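
For anyone hitting the same wall: Kafka advertises an address back to clients, and if that advertised address isn't reachable from inside the Docker network, connections fail even though the service hostname resolves. A sketch of the kind of fix involved, assuming the bitnami/kafka image in KRaft mode (the exact variable names differ per image):

```yaml
  kafka:
    image: bitnami/kafka:latest
    environment:
      # Single-node KRaft basics
      - KAFKA_CFG_NODE_ID=0
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=0@kafka:9093
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      # Two client listeners: one advertised to containers on the Compose
      # network (via the service name), one advertised to the host machine.
      - KAFKA_CFG_LISTENERS=INTERNAL://:9092,CONTROLLER://:9093,EXTERNAL://:9094
      - KAFKA_CFG_ADVERTISED_LISTENERS=INTERNAL://kafka:9092,EXTERNAL://localhost:9094
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=INTERNAL:PLAINTEXT,CONTROLLER:PLAINTEXT,EXTERNAL:PLAINTEXT
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=INTERNAL
```

With something like that in place, the publisher and Kafdrop connect via `kafka:9092`, while anything running on the host uses `localhost:9094`.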

I'm telling this story to pinpoint something: with the emergence of LLMs, we see a new one every couple of days. Now, instead of just Googling and looking for Stack Overflow answers, I spend time trying different LLMs. I was lucky that I tried Claude as my second shot; who knows, if I had tried other LLMs, would I have gotten the fix as quickly?

This brings me to a conclusion: at this stage, I'm stuck in a loop. Yesterday I was spending time looking for the answer on Google and Stack Overflow; now I'm spending time shopping around different LLMs, hoping one of them fixes my issue. With new LLMs coming out every day, I'm not sure: are we really saving time using LLMs to look for the correct answer?