
Implementing FinOps, Part 3.3: CFO’s Guide to Information Technology Basics

When exploring the inter-departmental dynamics of business, it often helps to draw an analogy to doing business in a multilingual environment, since the concepts and lingo of technology can feel like a different language to other groups within the organization. Many finance practitioners have watched faces go blank when they introduce terms like “EBITDA” and “net present value” to colleagues. Similarly, they may have found themselves lost when technical colleagues bring terms like “latency” or “block storage” into a conversation.

Fortunately, only a 10,000-foot view of the building blocks of IT is necessary for finance practitioners to become sufficiently conversant in technology. So, let’s hop in a metaphorical helicopter for a quick tour of IT basics! In this blog, we will help finance practitioners understand the fundamental building blocks of technology so they can better comprehend cloud computing, eventually paving the way for a smoother implementation of FinOps. Read on!

Introduction to IT Basics

Most likely, this article is being read online, suggesting that the reader is sitting in front of a Mac or a PC. This machine embodies nearly all of the concepts and components common to any IT environment, no matter how large. That means that by virtue of using a computer every day, the most tech-phobic among us are already familiar with many IT concepts without realizing it.

Understanding the Operating System

Let’s start by understanding the operating system.

When someone is asked whether they use a Mac or a Windows PC, they tend to think of the hardware (i.e., the laptop itself) and the operating system running on it (Windows or macOS) as a single item, but in reality the two are entirely independent. True, Apple sells its macOS operating system exclusively on proprietary hardware, but that barrier is purely commercial in nature, intended both to increase Apple’s profitability and to reduce its technical support costs. So, what is an operating system? The answer: “a collection of software that provides an environment in which computer programs can run.”

An analogy between an aquarium and a computer can help here. Think of an empty glass fish tank as the laptop hardware before any software is loaded. In this state, there is no way for fish to live in the aquarium: no water, no oxygenation or heating, no fish food, and so on. The programs we use on our laptops are the fish in the aquarium. Before they can survive there, the right environment needs to be created, so the operating system is analogous to the water, heat, and oxygenation that allow fish to thrive in an aquarium. Microsoft Windows and Apple’s macOS are different environments, which means that software has to be written to run in one or the other. In the aquarium analogy, these are freshwater and saltwater tanks, each able to support only freshwater or saltwater fish, respectively.

Understanding Virtualization

Normally, our laptop and desktop computers run a single operating system. In large IT environments, however, the utilization of computer hardware can be greatly increased by packing multiple operating systems onto a single computer in a process called virtualization.

For instance, consider virtualization in a home office scenario: an iMac might be used for 99% of work, but an email archive application that needs to run once a week to archive all emails only operates on Windows. Without virtualization, an entirely separate Windows computer would have to be purchased, used for a mere 10 minutes per week, and left idle the rest of the time. This is not an efficient use of resources. To avoid this waste, a program called “VirtualBox” could be installed on the iMac, allowing Microsoft Windows to run as a “program” within macOS. The email archive app would then run on that instance of Windows for a few minutes each week, eliminating the need for a separate computer.

In large IT environments, a single computer may host dozens of these virtual machines running side by side, reducing IT costs by greatly increasing the utilization of each computer and thereby reducing the number that must be purchased. As a side note, the large computers that provide services such as hosting websites in large enterprises are called servers: essentially powerful computers that provide services to other computers.
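
To make that cost effect concrete, here is a minimal, illustrative Python sketch of the consolidation arithmetic. The workload counts, VM sizes, and server capacity below are hypothetical examples, not figures from any real environment, and the model deliberately ignores overhead and redundancy:

```python
import math

# Hypothetical example: 40 small workloads, each needing 2 vCPUs and 4 GB of RAM.
workloads = 40
vcpus_per_workload = 2
ram_gb_per_workload = 4

# Hypothetical physical server: 48 vCPUs and 192 GB of RAM.
server_vcpus = 48
server_ram_gb = 192

# Without virtualization, each workload gets its own machine.
servers_without_virtualization = workloads

# With virtualization, workloads share servers up to the capacity limit,
# whichever resource (CPU or RAM) runs out first.
servers_by_cpu = math.ceil(workloads * vcpus_per_workload / server_vcpus)
servers_by_ram = math.ceil(workloads * ram_gb_per_workload / server_ram_gb)
servers_with_virtualization = max(servers_by_cpu, servers_by_ram)

print(f"Servers needed without virtualization: {servers_without_virtualization}")
print(f"Servers needed with virtualization:    {servers_with_virtualization}")
```

Even in this toy example, 40 machines collapse to 2, which is the kind of consolidation that translates directly into fewer servers to buy, power, and maintain.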

Understanding Computer Hardware

Next let’s examine computer hardware. 

  • CPUs: Every computer has at least one Central Processing Unit, or CPU, which you can think of as the brain that processes information. As an oversimplification, imagine the CPU as receiving inputs, processing them, and then producing outputs. For example, when you enter a formula into a cell in your spreadsheet, the CPU is responsible for calculating the result for the cell. 
  • Memory: Every computer also has memory, which must be kept distinct from storage because the two are commonly confused. Memory, or ‘RAM’ as it is often called, is essentially the computer’s capacity to process and ‘think about’ multiple items simultaneously. Have you ever had your computer slow down because you had too many programs and browser tabs open? Most likely the cause of the slowdown was the laptop’s RAM filling up, such that it couldn’t “think about” any more items at the same time.
  • Storage: Information that must be retained but is not currently being processed is saved to storage. For example, when you save and close a spreadsheet workbook it is retained on your computer’s “hard drive” so you can open it later. While ‘hard drive’ traditionally refers to mechanical disk drives, most modern storage now uses solid-state technology.
  • Network Bandwidth: If you have ever been on a slow WiFi connection, you have already dealt with the consequences of limited bandwidth. Think of a WiFi access point as a water pipe; there’s only so much data (water) that can flow through it at any given time. So, if too many people are using the access point, the “pipe” is at capacity and slowdowns occur. In large IT environments, most networks are wired rather than wireless, and a complex combination of software and hardware is required to direct and secure network traffic. Techies use the fancy term latency to describe delays in data transmission, which are most often caused by a particular portion of the network encountering too much traffic (see the illustrative calculation after this list).
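
To show how bandwidth and latency combine, here is a simplified Python sketch. The file size, link speeds, and latency are hypothetical examples, and the model ignores protocol overhead:

```python
# Simplified, illustrative model: transfer time ≈ latency + data size / bandwidth.
# The file size, link speeds, and latency below are hypothetical examples.

file_size_gigabytes = 1.0          # a 1 GB file to transfer
bandwidth_megabits_per_sec = 100   # a typical office connection
latency_seconds = 0.05             # 50 ms of network delay

file_size_megabits = file_size_gigabytes * 8 * 1000  # GB -> megabits (decimal units)
transfer_seconds = latency_seconds + file_size_megabits / bandwidth_megabits_per_sec

print(f"Approximate transfer time at 100 Mbps: {transfer_seconds:.1f} seconds")

# A slower 10 Mbps link makes the "pipe" ten times narrower:
print(f"Approximate transfer time at 10 Mbps:  {latency_seconds + file_size_megabits / 10:.1f} seconds")
```

The point is simply that the width of the “pipe” (bandwidth) and the delay on it (latency) are separate quantities, and either one can be the bottleneck.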

Congratulations! You have now passed “tech 101.” Armed with this knowledge, you’re now ready to delve into the basics of cloud computing.

Stay tuned to our Implementing FinOps series to learn more about a smooth collaboration between Finance and Technology. 

For more insights into FinOps and how to enhance your cloud cost management, visit our blog at ProsperOps or book a demo call to understand how our FinOps experts can help you save more on the cloud.

About the writer: Rich Hoyer is a FinOps thought leader with over 27 years of experience in IT and Finance, and he is the CEO and co-founder of FinOptik, a niche cloud financial management agency.
