The Evolution of Software Deployment

Software deployment has evolved continuously since the beginning of the IT industry. We can trace this evolution through five stages in how and where software is deployed.

Punch Cards, Paper Tapes, and Mainframes

During the early days of computing, software deployment involved the use of punch cards or paper tape containing the software’s code. The code was loaded into the computer’s memory manually and then executed by the machine. Deployment was typically carried out on physical servers or mainframes, which were large, expensive, and challenging to manage, and which required specialized skills and resources to maintain and operate.

In the 1950s and 1960s, batch-processing systems enabled more automated software deployment. Programs were submitted in batches to the computer operator, who would load them onto magnetic tape and execute them sequentially. Deployment still took place on the same large, expensive, difficult-to-manage machines.

Bare Metal Servers

In the 1970s and 1980s, the development of operating systems and application programming interfaces (APIs) led to a more standardized approach to software deployment. This made it easier to install and configure software and allowed greater compatibility between different hardware and software platforms. In this model, system administrators prepare the physical servers before software can be deployed: installing the operating system and the appropriate device drivers, ensuring enough memory, disk, and processor capacity is available, installing prerequisites, and so on. They are also responsible for hardware upgrades and ongoing maintenance.
This is called a “bare metal” environment. There is a strong coupling between the physical hardware and the deployed software, since each directly depends on the other. Here, the deployment unit is a physical server.

The Age of Virtual Machines

The concept of the virtual machine (VM) dates back to the 1960s, when IBM developed a virtualization system called CP-40 for its mainframe computers. Virtualization technology evolved in the 1970s and 1980s with the development of virtual memory systems and hypervisors. Virtual memory systems allowed programs to use more memory than was physically available by swapping data between memory and disk storage. A hypervisor, also known as a virtual machine monitor, allows multiple virtual machines to run on a single physical machine, each with its own set of resources such as CPU, memory, and storage. With virtualization, developers deploy to simulated servers rather than directly to a given piece of hardware.
This allows great flexibility in upgrading and migrating, without having to worry about minor hardware changes. In the event of a hardware failure, system administrators can migrate virtual machines to different hardware and avoid downtime.
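
To make the hypervisor’s role concrete, here is a minimal sketch that asks a local hypervisor which virtual machines it manages. It assumes a Linux host running KVM/QEMU with the libvirt-python bindings installed; the connection URI would differ for other hypervisors.

    # Minimal sketch: listing the virtual machines managed by a local hypervisor.
    # Assumes a KVM/QEMU host and the libvirt-python package ("pip install libvirt-python").
    import libvirt

    # Open a read-only connection to the system hypervisor.
    conn = libvirt.openReadOnly("qemu:///system")

    # Each "domain" is one virtual machine, with its own virtual CPUs, memory, and storage.
    for domain in conn.listAllDomains():
        state, _reason = domain.state()
        status = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "stopped"
        print(f"{domain.name()}: {status}")

    conn.close()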

Containerized Deployment

The successor to virtual machines is container deployment. Instead of virtualizing an entire machine, containerization wraps an application and its dependencies into a lightweight, portable container that can run on any machine with a compatible container runtime. These technologies allow system administrators to “section off” an operating system so that different applications can run on the same system without interfering with each other.
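
As an illustration (a minimal sketch, assuming a local Docker daemon and the docker SDK for Python are installed), the following runs a short-lived application inside an isolated container:

    # Minimal sketch: running an application in an isolated container via the Docker SDK.
    # Assumes Docker is running locally and the "docker" package is installed ("pip install docker").
    import docker

    client = docker.from_env()

    # The image bundles the application's runtime and dependencies; the container gets its
    # own filesystem and process space, sectioned off from other applications on the host.
    output = client.containers.run(
        "python:3.12-alpine",
        ["python", "-c", "print('hello from inside a container')"],
        remove=True,  # discard the container once it exits
    )
    print(output.decode())

The same image runs unchanged on any host with a compatible container runtime, which is what makes the container such a convenient deployment unit.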

Today, virtualization and container technologies are widely used in cloud computing environments, allowing organizations to run multiple applications on a pool of physical resources and scale them up or down as needed to meet changing demands.

Serverless

Serverless computing evolved from the idea of utility computing, aiming to provide more efficient and cost-effective ways to deliver computing services.

In each of the previous models there is a notion of “where” the software runs. With serverless computing, we no longer have to provision servers to run our code, which eliminates most infrastructure management and brings lower cost, faster deployment, and improved scalability.
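
As an illustration (a minimal sketch assuming AWS Lambda as the provider; other platforms follow a similar shape), a serverless deployment is little more than a function handler, and the provider supplies and scales the servers that run it:

    # Minimal sketch: an AWS Lambda-style function handler in Python.
    # The event shape is illustrative; real contents depend on the configured trigger.
    import json

    def handler(event, context):
        # The provider invokes this on demand; we never provision or manage a server.
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"hello, {name}"}),
        }

Deploying it amounts to uploading the function and wiring it to a trigger; provisioning, scaling, and per-invocation billing are handled by the provider.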

Today, serverless computing continues to mature, with cloud providers adding new services and capabilities on an ongoing basis. It will likely continue to play an important role in the future of software development and deployment, as organizations seek to build more scalable, efficient, and cost-effective applications.