When discussing cloud platforms, cloud computing, and software development for the cloud, it's important to know that, for most projects, developing for the cloud is much the same as developing a traditional multi-tier application. Your applications will still run on a traditional operating system (usually headless Linux) and use the same frameworks.
The differences in development reside mostly in the initial architecting of the product. When considering cloud technologies, it's important to decide whether to develop a traditional monolithic application or a set of microservices, or whether to forgo both entirely and go with a serverless architecture.
Monolithic applications are what you could consider the "default" architecture. All of your services and application logic run through a single, monolithic application. Monoliths are the simplest to conceptualize, maintain, and develop: they require no shared data contracts or inter-process communication layer. They do, however, have a larger footprint and heavier hardware requirements, since everything is hosted in one place, which often leads to higher costs when deploying to the cloud. Additionally, a monolith introduces a single point of failure that could take entire services offline should errant code be introduced (Kharenko, 2015).
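As a rough illustration (the route names and handler bodies here are hypothetical), a monolith in Go might look like one binary serving every feature area:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// In a monolith, every feature area lives in the same binary and
// shares one process, one deployment, and one failure domain.
func main() {
	mux := http.NewServeMux()

	// Hypothetical feature areas, all served by one application.
	mux.HandleFunc("/users", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "user logic")
	})
	mux.HandleFunc("/orders", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "order logic")
	})
	mux.HandleFunc("/billing", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "billing logic")
	})

	// A panic or bad deploy in any handler can take the whole process,
	// and therefore every feature, down with it: the single point of
	// failure noted above.
	log.Fatal(http.ListenAndServe(":8080", mux))
}
```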
Microservices, as the name implies, involve designing your application as a set of pseudo-independent services that share data contracts and often an inter-process communication layer such as RabbitMQ or Kafka. Each individual microservice endeavors to remain small, compact, and quick. Breaking up your services in this fashion allows your infrastructure management software (often Kubernetes) to selectively scale individual portions of your application, leading to an overall reduction in cost and footprint as the application scales to meet growing customer demand (Kharenko, 2015).
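By contrast, a single microservice owns one narrow responsibility. Here's a minimal sketch in Go, with an invented "orders" service and a health endpoint of the kind an orchestrator such as Kubernetes could probe before routing traffic to or scaling this one service:

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Order is this service's data contract: the JSON shape that other
// services depend on when they talk to us.
type Order struct {
	ID    string  `json:"id"`
	Total float64 `json:"total"`
}

func main() {
	mux := http.NewServeMux()

	// The service's single responsibility.
	mux.HandleFunc("/orders", func(w http.ResponseWriter, r *http.Request) {
		json.NewEncoder(w).Encode([]Order{{ID: "42", Total: 9.99}})
	})

	// Liveness endpoint: orchestrators use probes like this to decide
	// when to restart or scale individual services independently.
	mux.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})

	log.Fatal(http.ListenAndServe(":8081", mux))
}
```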
Serverless architecture takes microservices to their logical extreme. Instead of developing applications, developers endeavor to encapsulate all functionality into independent, autonomous functions that rely completely on the cloud provider's interface (What Is Serverless Architecture?, n.d.). The benefit is that you are no longer building a traditional application that must be managed, configured, and hosted in the cloud; hosting, management, and scaling are all handled at the cloud provider level. The drawbacks: existing applications must be completely rewritten and redesigned to work as serverless, an internet connection is generally required at all times during development, and it is not an intuitive way to build applications, which can lead to longer development times and developer dissatisfaction.
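To make that shape concrete, here is a minimal sketch of a serverless function in Go using AWS's Lambda runtime library; the event type and greeting logic are invented for illustration:

```go
package main

import (
	"context"
	"fmt"

	"github.com/aws/aws-lambda-go/lambda"
)

// GreetEvent is a hypothetical event type; in practice the shape is
// dictated by whatever trigger (API gateway, queue, schedule) invokes
// the function.
type GreetEvent struct {
	Name string `json:"name"`
}

// The entire "application" is this one function. Process lifecycle,
// hosting, and scaling are owned by the provider, not the developer.
func handler(ctx context.Context, event GreetEvent) (string, error) {
	return fmt.Sprintf("Hello, %s", event.Name), nil
}

func main() {
	lambda.Start(handler)
}
```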
I touched briefly on the cost of architecting for the cloud. Setting serverless architecture aside, deploying microservices or a monolithic application in the cloud rather than in a traditional in-house deployment can save significant money, depending on the organization and application.
A large web application designed to serve multiple regions would likely save a lot of money by going with a pre-built cloud platform such as AWS, which already operates data centers in multiple regions. The savings come from not having to buy equipment, hire network engineers, or acquire real estate.
However, a smaller application meant to serve a single local development studio or office would save considerable money long-term by investing in a single server and hosting it on-premises. While the upfront cost of the server and a network engineer to set it up would be higher (I'd estimate $5,000-$10,000, assuming you're paying professionals and buying a half-decent server), the ongoing costs plummet to little more than electricity, and, should an internet outage occur, the local office would still have access to its core systems (Hughes, 2018).
I believe the question "Do today's cloud-based platforms make traditional operating systems obsolete? Why or why not?" is answered by my previous points. No, cloud-based platforms do not make traditional operating systems obsolete, because cloud platforms rely on traditional operating systems to serve their content. The question has never been (and likely never will be) "cloud vs. traditional operating systems"; framing it that way implies a fundamental misunderstanding of what the cloud is. The real question is one of deployment: "cloud vs. on-prem" or "cloud vs. self-hosted" is much closer to the core concern we, as engineers, need to address.
If I were to develop "prototype" code, I would not concern myself with which operating system it's designed to run on. I'd choose a trivially cross-platform technology well-suited to the problem space and run with it. For web servers, possibly something like Golang for its native binaries and easy cross-compilation. For graphics applications, I'd use either a cross-platform game engine or a cross-platform graphics API such as Vulkan or OpenGL. I don't like the question, but answering it directly: assuming my prototype is meant to run as a server somewhere, I would ensure it runs on Linux, as Linux (and Linux containers) powers most of the servers on the internet.
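To show why the operating system barely matters at the prototype stage, here is a trivial Go server; the build commands in the comment (output file names are arbitrary) cross-compile the same source for several targets without any code changes:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// A throwaway prototype server. Go's GOOS/GOARCH environment variables
// select the compilation target, e.g. (run from the package directory):
//
//   GOOS=linux   GOARCH=amd64 go build -o proto-linux .
//   GOOS=darwin  GOARCH=arm64 go build -o proto-mac .
//   GOOS=windows GOARCH=amd64 go build -o proto.exe .
func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "prototype: OS-agnostic by default")
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```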
- Paul
References
What is Serverless Architecture? (n.d.). Twilio. https://www.twilio.com/docs/glossary/what-is-serverless-architecture
Hughes, A. (2018). On-Premise vs. Cloud: Key Differences, Benefits and Risks. Cleo. https://www.cleo.com/blog/knowledge-base-on-premise-vs-cloud
Kharenko, A. (2015, October 9). Monolithic vs. Microservices Architecture. Microservices Practitioner Articles (Medium). https://articles.microservices.com/monolithic-vs-microservices-architecture-5c4848858f59