ChatGPT as part of the evolution of programming languages

In the 1940s, von Neumann and his colleagues created conceptual models for computer architectures that were oriented toward the engineering problems of building computing devices, not toward making it easier for humans to write programs. The lambda calculus and, later, the design of the Prolog programming language are the first real efforts I am aware of that place emphasis on how we humans think and solve problems.

I had a thought earlier today that I keep coming back to: some concise programming languages can be more difficult to write code in, but once the code is written it is more valuable, because its conciseness yields better readability.

I have been fascinated by Copilot and ChatGPT, and I use them to write code; sometimes the results work well enough. What will the effects of ChatGPT and future LLMs be on the popularity of niche languages like Prolog and APL?

All things considered, I would often rather have a concise program in Prolog or a flavor of Lisp than a much larger program written in a verbose language (e.g., Java). If I can describe a problem and generate a quality program in any language, then I will ask for generation in my favorite languages.

Because of their possible utility for so many tasks, ChatGPT and similar future systems really seem like programming environments to me.

Collecting textual context data for a problem, appending a series of questions to be answered, and sending the resulting text off to ChatGPT also seems like programming to me.
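The workflow above can be sketched in a few lines of Python. This is a minimal illustration, not any particular API: the function names and the final send step are my own placeholders, since client libraries and endpoints vary.

```python
# Sketch of "prompting as programming": gather context passages,
# append a series of questions, and assemble one prompt string.
# build_prompt is a hypothetical helper, not a library function.

def build_prompt(context_docs, questions):
    """Combine context passages and numbered questions into a prompt."""
    parts = ["Context:"]
    parts.extend(context_docs)
    parts.append("\nQuestions to answer:")
    parts.extend(f"{i}. {q}" for i, q in enumerate(questions, 1))
    return "\n".join(parts)

prompt = build_prompt(
    ["Prolog programs are often concise and declarative."],
    ["Express this idea as a Prolog rule."],
)
# The resulting string would then be sent to ChatGPT (or another
# LLM) via whatever client or HTTP API you use; that step is
# omitted here because it depends on the provider.
print(prompt)
```

The point is that the "program" here is the assembly of context and questions; the LLM call is just the final evaluation step.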

