Is Human Software Development Doomed With ChatGPT?

Examining how ChatGPT (Chat Generative Pre-trained Transformer) will affect software development.

Goal

Examine whether AI will take over software development, as many fear.

What's ChatGPT?

ChatGPT is a cutting-edge conversational AI model developed by OpenAI that has the ability to generate code. Its advanced deep learning algorithms enable it to write code in a variety of programming languages, including Python, Java, and C++.

Using ChatGPT for code generation can bring numerous benefits, including:

  1. Increased productivity: ChatGPT can generate code quickly and efficiently, freeing up developers' time for more complex tasks.
  2. Improved accuracy: ChatGPT's deep learning algorithms ensure that the code it generates is syntactically correct, reducing the likelihood of bugs and errors.
  3. Personalization: ChatGPT can be fine-tuned for specific use cases, allowing it to generate code that is customized to the user's needs.
  4. Automation: ChatGPT can automate repetitive tasks such as writing boilerplate code, freeing up developers' time and reducing the risk of human error.
  5. Enhanced collaboration: By using ChatGPT, developers can work more closely together, sharing code snippets and discussing code-related questions in real-time.

Overall, ChatGPT's ability to write code represents a major advancement in software development, providing the world with a powerful tool to increase productivity, improve accuracy, and automate repetitive tasks.

Here's where I come in and point out that I did not write the above explanation. It was all written by ChatGPT. "So it's good at writing paragraphs, anyone can do that. It can't possibly write code, right?" WRONG. Its coding ability is beyond any human's capability. I won't demonstrate that in this blog because you can easily see for yourself, or just watch this video by Clement (Co-founder of AlgoExpert) where ChatGPT achieves a perfect score in a coding interview.
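
If you want to see it for yourself programmatically rather than through the chat UI, here's a rough sketch of asking ChatGPT for code through the OpenAI API. It assumes the openai Python package (v1+ client interface) and an API key exported as OPENAI_API_KEY; the model name and the prompt are purely illustrative.

    # Rough sketch: request code from ChatGPT via the OpenAI API.
    # Assumptions: the `openai` Python package (v1+ interface) is installed
    # and an API key is exported as OPENAI_API_KEY. The model name and the
    # prompt are illustrative placeholders.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a coding assistant. Reply with code only."},
            {"role": "user", "content": "Write a Python function that reverses the words in a sentence."},
        ],
    )

    generated_code = response.choices[0].message.content
    print(generated_code)  # still just text until a human reviews it

You'll get a plausible-looking function back in seconds, which is exactly what makes the hype so tempting.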

It's all over. Time to be a burger flipper again.

[Meme: Pepe the Frog at McDonald's]

Not so fast. While its ability to code and answer questions is impressive, it's really not THAT impressive. If you ignore all the hype for a second and try out ChatGPT for yourself in a field where you are an EXPERT (it can even be something like cooking scrambled eggs), you'll quickly realize that it's just a distant cousin of the 'no-code tool' (the next BIG THING that will allow non-technical people to create apps/websites. HA).

While it can write code from a prompt, it can't architect a system. While it can explain what it just wrote, it can't explain your company's ancient legacy codebase. While it sounds confident ALL the time, it is quite often straight-up wrong. It's like listening to medical advice written by your average 16-year-old whose hobby is googling medical advice. It will sound very confident, it will cite sources, and it will be right a lot of the time, but you really shouldn't trust it and should instead visit an actual doctor. That's what ChatGPT is really doing: pooling a bunch of online resources and regurgitating them back to you in a very confident way. Software is no different: you can't just trust anyone to write your software for you and expect it to be of quality.

To back my claim further, I present to you a list of problems with ChatGPT, ironically written by ChatGPT.

  1. Bias: ChatGPT is trained on a large corpus of text data, which may contain biases and stereotypes. These biases can be reflected in the model's responses and output, leading to inappropriate or offensive content.
  2. Misunderstanding context: The model can struggle to understand the context of a conversation and may provide irrelevant or nonsensical responses.
  3. Lack of common sense: ChatGPT has been trained on text data and may not have a thorough understanding of the world and its workings, leading to incorrect answers to questions or misunderstandings of concepts.
  4. Limited creativity: While ChatGPT can generate text that is similar to human writing, it is limited in its ability to be truly creative. Its outputs are based on patterns it has learned from the training data, so its responses may be repetitive or predictable.
  5. Safety and security: As a machine learning model, ChatGPT can be vulnerable to attacks such as adversarial examples or data poisoning. This can result in the model generating incorrect or harmful outputs.
  6. Reliance on training data: ChatGPT's outputs are only as good as the data it was trained on. If the training data is limited in scope or quality, the model's outputs will also be limited.

It's important to use ChatGPT with caution and an understanding of its limitations. In many cases, it may be necessary to review and verify the model's outputs before using them in real-world applications.
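
As a rough illustration of that "review and verify" step, the sketch below treats a ChatGPT-generated snippet as untrusted text and only accepts it once it passes tests written by a human; the generated function and the test cases are hypothetical.

    # Sketch: verify a ChatGPT-generated snippet against hand-written tests
    # before trusting it. `generated_code` stands in for whatever the model
    # returned; the expected behaviour below is our own hand-written spec.
    import textwrap

    generated_code = """
    def reverse_words(sentence):
        return " ".join(reversed(sentence.split()))
    """

    namespace = {}
    # Only execute code you have actually read; dedent in case the snippet is indented.
    exec(textwrap.dedent(generated_code), namespace)
    reverse_words = namespace["reverse_words"]

    # Test cases written independently of the model's output.
    assert reverse_words("hello world") == "world hello"
    assert reverse_words("one") == "one"
    assert reverse_words("") == ""
    print("Generated snippet passed the hand-written tests.")

Nothing fancy, but it's the difference between using ChatGPT as a tool and trusting it as an authority.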

Conclusion

I am not saying ChatGPT is a bad tool. I do use it quite often as my senior pair programmer. But that's all it is, a tool. The claim that it would actually replace software developers is nonsense, at least in the current iteration of the tool. Please excuse me while I get back to deving.
