Are programmers soon to become extinct?

by Prof Barry Dwolatzky

The discussion at this week’s JCSE “Architecture Forum” got me thinking about the future of programming. Some of the software architects present seem to believe quite strongly that the art of writing a computer program – something I’ve been doing since the early 1970s – will soon be automated. Images of the recent fossil discoveries by scientists from my own University flashed into my mind. Will palaeontologists of the future display the fossilised remains of the long-extinct “Computer Programmer”?

Programming is about abstraction. At the heart of the computer – or “programmable device” – is some low-level binary machine code. Some of us may have learnt to craft these instructions by hand at some point in our careers, but since the 1960s programmers have worked at a higher level of abstraction. The so-called “general-purpose languages” – FORTRAN, Cobol, C, Java and C# – allow the programmer to focus on relatively high-level issues when translating a design into a program. Using these languages we write our programs using assignments, loops and conditionals. Various structuring mechanisms have been introduced to help us deal with complexity. These include functions, procedures, classes and methods. Libraries of reusable functions and classes have also long been available to simplify the task of programming.

At a level of abstraction higher than these general-purpose languages are domain-specific languages. Do we still use the term “4th-generation languages” – or 4GL? These languages allow the programmer to think about the problem to be solved at a higher level. While they certainly gained some level of adoption, millions of computer programmers around the world still develop software using languages like C, C++ and Java.
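
The abstraction gap between a domain-specific language and a general-purpose one can be sketched with a small (hypothetical) example: a query that is one declarative line in SQL becomes explicit filtering and mapping steps when a programmer writes it in Java.

```java
import java.util.List;
import java.util.stream.Collectors;

public class AbstractionGap {
    // In a domain-specific language such as SQL, the intent is one line:
    //   SELECT name FROM customers WHERE balance > 1000;
    // In a general-purpose language, the programmer spells out the steps.
    record Customer(String name, double balance) {}

    static List<String> highBalanceNames(List<Customer> customers) {
        return customers.stream()
                .filter(c -> c.balance() > 1000)   // the WHERE clause
                .map(Customer::name)               // the SELECT list
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Customer> customers = List.of(
                new Customer("Ada", 1500),
                new Customer("Bob", 200));
        System.out.println(highBalanceNames(customers)); // prints [Ada]
    }
}
```

Even here the general-purpose version stays fairly high-level – yet the programmer, not a tool, chose the steps.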

In the 1990s the concept of “model-driven development” appeared. Programs were designed using a modelling language like UML, and tools were then used to automatically generate programs in the language of choice. These generated programs were in fact structured shells – a programmer still had to write a lot of C++ or Java code. The UML tools did not automate the detailed programming, but rather the high-level structuring.
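
As an illustration of what such a “structured shell” looked like (the class and method names here are hypothetical): a UML tool would emit the class skeleton – fields, accessors and empty stubs – from the class diagram, and the programmer still hand-wrote the logic inside.

```java
// What a UML tool might generate from a class diagram: the structural
// shell only. The body of settle() is the part a human still wrote.
public class Invoice {
    private double amount;
    private boolean paid;

    public double getAmount() { return amount; }
    public void setAmount(double amount) { this.amount = amount; }
    public boolean isPaid() { return paid; }

    // Generated as an empty stub -- the detailed logic below is what
    // the programmer had to fill in by hand.
    public void settle(double payment) {
        if (payment >= amount) {
            paid = true;       // fully paid
            amount = 0;
        } else {
            amount -= payment; // partial payment reduces the balance
        }
    }
}
```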

The discussion at the Architecture Forum centred on architectural representations. Some of these are at higher levels of abstraction than the detailed design expressed in UML. The issue raised was this: using modern architectural representation notations, the “business rules” of the required software can be captured. Is it possible now, or will it soon be possible, to automatically generate running software from these representations? If this is possible and becomes widespread, the computer programmer as we know him or her will become extinct.

My own view is that this will not happen – at least not in the foreseeable future. The work of writing the detailed steps of a program is best done by a skilled human programmer. The variation and complexity of detailed programming does not lend itself to machine automation. But maybe I’m just a dinosaur of the information age refusing to accept my own imminent extinction!

11 thoughts on “Are programmers soon to become extinct?”

  1. Hi,

    I don’t believe programmers will become extinct in the near future. I still see the need for them, perhaps using different tools as those tools and methodologies evolve. If it is the case that we can get tools to generate code from a specification, then programmers will probably evolve into the people who capture those specifications in the tools. As I said, the tools may change, but the “programmers” will still be required.

    What I do see, though, is that the lifespan of a programmer is already decreasing – not because of technology, but because of other issues such as finances.

    Versha

  2. This controversy has been going on since the last millennium. I’m still of the old school, “the code is the design”.

    MDA, UML, BPEL etc. are tools that I would use for a particular problem domain to get me started towards the final “design”. However sexy these tools are, every project ends up with some issue at some level – a timing dependency, say. This ripples through the levels of abstraction until you finally realise the design is flawed, and it’s not something that can be explicitly catered for in UML or whatever you’re using. By then there is no time for a redesign, so you pop open the hood on the source, put in a GOTO statement and ship it.

    It’s analogous to Gödel’s incompleteness theorem. I conjecture that there will always be a required piece of source code that cannot be auto-generated from any higher level of abstraction.

  3. “Soon” we will have fully automated code generation – we have been hearing this gospel for about 30 years. However, “soon” is not a scientifically verifiable or falsifiable hypothesis. At this moment, nobody knows how to translate a high-level system specification into machine-executable binary code at the push of a button. Our only success story so far is compilers, for which this is indeed already possible – but they translate programs to programs, not specifications to programs, which is a different thing.

    Thus we still need programmers, although the field of work for programmers is certainly going to change; new types of programming jobs will have to be done. Somebody will have to implement all those still-missing transformation tools, without which MDE in all its glory cannot work. Those tools, as long as they do not yet exist, cannot implement themselves – like the mythical Baron von Münchhausen, who pulled himself out of a swamp by his own hair.

    Also think about the finest-level micro-programming for the firmware of operating systems, where the underlying hardware changes almost every year. Show me any software architect who can generate such highly optimised micro-code for the firmware of a new hardware device from a platform-independent model through MDE, and I’ll give him a bottle of the finest Champagne.

    Finally, an analogy: some thousands of years ago, human society was an “agricultural” society through and through. For today’s most advanced societies this is no longer true; it would be wrong to call Japan or Germany “agricultural” societies, because the whole focus has shifted and most people in those societies are no longer peasants or farmers. Yet both Japan and Germany still have some farmers working their fields (however small in number), because even the most sophisticated high-tech engineer needs some bread on the table for supper.

  4. Hi,

    Funny – I was discussing this very thing this morning with a colleague while commuting to work. Programmers are a dying species, I believe. Programmers evolved into developers. What is the difference, one may ask? A programmer is someone who writes an application line by line and does not depend on any IDE to do the job – this applies both to the UI of the application and to its functionality – while a developer depends heavily on the IDE. Most developers today are only productive when they use certain tools; creating a simple ASP.NET application might be difficult for someone who is used to Visual Studio and suddenly has to use Dreamweaver to develop the same application. As these tools evolve they slowly eliminate the need for a programmer and give way to a developer, who is in turn slowly giving way to a designer.

    We will still need programmers for operating systems and direct hardware programming, but this will be a very niche market which nobody will really care about – just as in Stefan’s analogy of agricultural societies: we still need farmers, but nobody really cares about them.

  5. When the 4GLs made their debut, business people embraced them as a way to side-step the perceived ‘time-consuming discipline’ of IT. Some stand-alone applications were fine as long as the same person was around to maintain them, but as soon as that person left or integration was required, they fell apart.

    What was discovered was that the person creating an application still had to use an IT Developer’s mindset to produce a successful, efficient and maintainable application and the trained and experienced Developer was just better at it.

    My view is that the tool and the role title may change but the disciplined developer mindset will always be required. Besides, how would we maintain all the legacy stuff? It is very expensive to replace it.

  6. I was once told that software development is 80% design and 20% coding and maintenance. Although that still seems to hold true, these days prototyping, documenting and testing are critical aspects of a software engineering position. Added to that, knowledge of algorithms, security, user interfaces or other fields (e.g. any of the sciences, including mathematics, physics and psychology) is recommended.

    I can understand why the role of programmer is falling by the wayside, but it will never be totally extinct. Instead, it will be blended in with many other roles on the job.

    Let’s look back: people (and especially the press) have been screaming that programming is dead for over a decade.

    When VB was first announced THAT was said to be the end of programming. Now anyone with a computer could point and click an application together.
    The same was hailed years later with graphical design tools. The business analyst could (it was said to be coming “soon”) just put his business rules into some piece of software (who would write that software, the stories didn’t mention), press a button, and he’d have a full-blown application. No more need for those pesky programmers and software engineers.

    Ask in your average office (say the accounting department of a retail store) for someone to write some software to automate a common task and you get glassy stares. Maybe someone has a cousin who “does something with computers” who might be able to, but that’s about it. Those people all say they know MS Office, but they’ve never seen the macro editor and would be completely lost if you started it and walked out of the room (most likely they couldn’t even close it down).

  7. Programming is not only the produced software code; it is a way of thinking, and an art that cannot become extinct.
    In the last decade there has been great progress in unifying languages and ways of producing code, but it is not the code that produces the software – it is the knowledge of the programmer, and that is something that cannot be supplied by automating the art of writing software programs.

    It will be some time before there is unity amongst the different software-producing giants who have given us the .NETs, the Javas and the Cs, and until such time specialisation will remain.

  8. Some really great comments received on this post.

    Reading the comments, I think the consensus is that programmers are not about to become extinct. I liked Thomas’s comment that “programming” is a “way of thinking” – not just writing code. This got me thinking about how we write programs. How do I write programs? Am I a ‘good’ programmer? What is a ‘good’ programmer?

    Read my new post – “Are you a good programmer? I’m not!” – and tell me what you think.

  9. Depends on the “Programmer”….To be a Dodo or not to be a Dodo!

    I think the programmers of the future will have to have specific talents that extend beyond coding.
    i.e. they will have to be multi disciplined.

    Degree in medicine and an ability to pull together objects into a medical application….you may survive.
    Degree in Electronics and an ability to pull together objects into a system application…..you may survive.
    Business knowledge and an object building capability that solves business problems (not increase their complexity!)….you may survive.

    And above all an ability to communicate (in English, not gobbledygook “Acronimia”) …..you may survive.

    Become a translator between “Acronimia”(SW developers lingua franca) and English and you will thrive.

    Synergistic association is the key (it’s the unrelated talent from other industries that you can add to the coding talent that matters).

    If you’re just a coder who won’t budge out of the “I’m a software engineer so I don’t do that” bubble, I’m afraid your job will be outsourced to Bangalore, along with building cars, analysing X-rays and digging ditches.

    The minute that business logic people could talk to software people in a common dialect (pictures), the location of that software “talent” became irrelevant.
    We should not be xenophobic about the poor guy that’s left his home and crawled across the border, we should be more worried about the one that didn’t leave home but used his beer money to buy a laptop and broadband.
    It’s not perfect yet, but an experienced coder at R3500 per month, on no long-term contract, sitting in India drinking their own coffee and using their own building, is pretty attractive compared to the alternative in SA at ten times the price.
    The coder in India can build the “objects” for the business thinker to put together.
    From a business owner’s point of view it also offers me more security, as I won’t have key employees walking out the door with my IP (actually the synergistic glue between all the objects).
    The guy in India has no idea what my app does, so he won’t be contacting my clients with a copy any time soon either.
    It’s win-win.
    My suggestion to prospective SW developers is either become a very very good, efficient, fast and cheap coder of objects or draw on other talents and learn to put objects together for industries you are familiar with.

    The alternative is ..Dodo.

  10. This line of thinking usually goes along the line of creating a UML or similar model, and somehow generating an entire running system from that. I suppose that would be possible in theory, but practically it makes little sense. Modeling the detailed steps taken by a running program is expensive, both in the amount of time it takes to create such a model, and in the poor quality code that is generated from it.

    One supposedly viable option is to combine some sort of BPM tool with a rules engine and only model higher level components. If you actually manage to combine all of these components and generate a fully running system, a lot of thought will still need to be put into building the processes and rules, not to mention catering for the complexities already ‘solved’ in more traditional programming, like concurrent version control and refactoring. And at some point, the process will terminate at an endpoint where something completely unexpected has to be done like parsing a custom file, or generating an image, or opening a socket and reading some bytes. These are things that are so difficult to model and so easy to code, that the modeling argument really falls apart very quickly.

    This is why agilists are spending time on coding more effectively rather than even trying to model large running systems. They have realised that models are usually inaccurate, inadequate and cumbersome. It is far more productive to create frameworks that make programming easier, more transparent and more accessible, such as Cucumber and Spring Roo. Short iterations of try-break-fix get things done and create high-quality maintainable systems. Thus far round-trip engineering seems to be achieving the opposite, and the increasing complexity in system architectures is not improving the situation.
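
The point above about tasks that are “so difficult to model and so easy to code” can be sketched concretely. Parsing a custom file format – here a hypothetical key=value configuration syntax – takes a handful of lines in a general-purpose language, yet is awkward to express in a graphical model or a rules engine:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Parsing a hypothetical "key=value" config format: a few lines of code,
// yet hard to capture in a BPM diagram or a rules engine.
public class CustomParser {
    static Map<String, String> parse(String text) {
        Map<String, String> result = new LinkedHashMap<>();
        for (String line : text.split("\\R")) {
            line = line.trim();
            if (line.isEmpty() || line.startsWith("#")) continue; // skip blanks and comments
            int eq = line.indexOf('=');
            if (eq < 0) continue;                                 // skip malformed lines
            result.put(line.substring(0, eq).trim(),
                       line.substring(eq + 1).trim());
        }
        return result;
    }
}
```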

  11. Interesting thought.
    Will a programmer ever become extinct? I think not. There are many tools which generate programs (code) that do what the person doing the generating wants. However, a lot of careful thought needs to go into the design and coding of the generator’s template. That is an art form in itself. My dad always remarked that everything which looks simple is the result of some hard work.

    Programmers show a specific thought pattern and some specific personality traits which cannot be learnt; they are either present or not.

    I would like to believe that programmers are inquisitive, but I have been proven wrong many times.

    I would say that before we lose programmers, we might see the loss of some other IT titles:
    project administrators, business analysts, systems analysts . . . it would seem that IT has this ability to generate a need for more and more resources, many of whom add little value to the final delivery and are overhead if nothing else.
    Of course the move to servers made life on the budget front just as challenging: place two or more servers in a room and they grow in number faster than rabbits.
