The story behind software engineering.

In the 1960s programmers had to write their programs without being able to run them directly; the programs were handed to technicians and processed in batches.

The first widely used programming language was Fortran, released in 1957 for mathematical and scientific computing, followed by COBOL, released around 1960 for business applications.

One of the most important transitions was the move from batch processing, which was an industrial-style process, to the time-sharing model.

And thus...

THE SOFTWARE CRISIS

Late
Over budget
Low quality

Even though software engineering changed the world, roughly one fourth of software projects failed, so the crisis persisted throughout the 1960s, 70s, and 80s.
In 1968 NATO organized a software conference that founded the discipline of software engineering (up to that point, hardware had been the main focus).
There were 3 main problems to be addressed:

Eliminate Human Error

Apply mathematics to programming and develop formal methods. This was very expensive, and people assumed that if they did enough math they wouldn’t need tests.

Eliminate Internal Complexity

The source code of projects became really complex, BUT! Most complexity is essential complexity: not complexity in the solution, but complexity in the problem itself.

Eliminate Project Variability

Eliminate craftsmanship and create an industry with something like assembly lines. This meant standardizing the design process, which proved impossible because software development is not a defined process; it depends on context.

All of these attempts

FAILED...

So an alternative strategy was found:
software as an empirical process.

AGILE:

1. Observation
2. Hypothesis
3. Experiment

Soooo…

basically, the Scientific Method.

In the 1970s, modular pieces were introduced, which communicated via interfaces. Projects could be fragmented, which became a key idea in systems thinking.
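The idea of modules talking only through interfaces can be sketched in a few lines (the names here are hypothetical, chosen just for illustration):

```python
# A minimal sketch of modular design: the caller depends only on an
# interface, so the concrete module behind it can be swapped freely.
from abc import ABC, abstractmethod


class Storage(ABC):
    """The interface: the only thing other modules are allowed to see."""

    @abstractmethod
    def save(self, key: str, value: str) -> None: ...

    @abstractmethod
    def load(self, key: str) -> str: ...


class InMemoryStorage(Storage):
    """One interchangeable module that fulfills the interface."""

    def __init__(self) -> None:
        self._data: dict[str, str] = {}

    def save(self, key: str, value: str) -> None:
        self._data[key] = value

    def load(self, key: str) -> str:
        return self._data[key]


def record_greeting(store: Storage) -> str:
    # Written against the interface only; it never names a concrete module.
    store.save("greeting", "hello")
    return store.load("greeting")
```

Because `record_greeting` only knows about `Storage`, a file-backed or network-backed module could replace `InMemoryStorage` without touching it.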

In the 1980s object-oriented programming was introduced: objects could be independent and hold their own state. This was the abstraction that allowed programmers to interact with something almost physical, through graphical user interfaces (GUIs).
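A tiny sketch (with a made-up example class) of what "independent objects holding their own state" means:

```python
# Each instance of this class carries its own, private state.
class Counter:
    def __init__(self) -> None:
        self.count = 0  # state lives inside the object

    def click(self) -> int:
        self.count += 1
        return self.count


a = Counter()
b = Counter()
a.click()
a.click()
b.click()
# a.count is now 2 and b.count is 1: the two objects are independent.
```

A GUI button is the same idea made visible: each button object tracks its own pressed/released state and responds to its own events.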

Due to computing power increasing following Moore’s Law (the observation that the number of transistors on an integrated circuit doubles roughly every two years), engineers stopped being careful about the amount of memory and processing power used in code, and this led to “wasteful” software.
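The doubling described above compounds quickly; as a rough illustration (not an exact historical fit):

```python
# If capacity doubles every 2 years, after n years it has grown
# by a factor of 2 ** (n / 2).
def moores_law_factor(years: float) -> float:
    return 2 ** (years / 2)


# After one doubling period (2 years), capacity has doubled;
# after a decade, it has grown 32-fold (2 ** 5).
two_year_factor = moores_law_factor(2)   # 2.0
decade_factor = moores_law_factor(10)    # 32.0
```

That 32x-per-decade growth is why trading machine efficiency for programmer productivity started to look like a good deal.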

In 1989, the World Wide Web was proposed by Tim Berners-Lee, and in the 1990s browsers with graphical access were created.
Also in the 1990s, developers grew tired of having access only to executable binaries and not the actual source code, so open source rose in popularity and has since become a major driver of software engineering productivity.

Then came the development of commodity computers and the “cloud,” which allows applications to be updated and accessed in real time.

After that came mobile development, with smartphones and tablets.