Project FOCUS: A New Age of FinOps Visibility

It's easy for managers and team leaders to get caught up in the cultural scrum of FinOps. Hobbling many projects, however, is a lack of on-the-ground support for the DevOps teams that have to drive this widespread change – which is how all too many FinOps projects end up abandoned on the meeting room table. If FinOps is about democratization, engineers need more support than time- and resource-intensive certifications. Recognizing this, the FinOps Foundation has put forward a new set of guidelines – and supporting GitHub repositories – to promote immediate, actionable change at the development level. Learn more about the wider FinOps Foundation here.

This new schema is the FinOps Open Cost and Usage Specification (FOCUS). This article takes a look at the initiative, which aims to lend universal guardrails to any organization interested in achieving FinOps – no matter its cloud vendor of choice.

What is FinOps FOCUS and How Does it Work?

To see how this is achieved, let's zoom in on the Command Line Interface (CLI): this is where your network and cloud engineers interact with cloud resources. Since it's possible to manually pull cost data via your cloud vendor's command-line tool, many FinOps initiatives have begun there. Unfortunately, they quickly run into a roadblock: these reams of data need to be placed in a spreadsheet and normalized across APIs, compute resources, applications, and even different cloud providers. The time this process takes means the data is stale before it can be acted on – which is why many FinOps attempts have started and stagnated precisely at the CLI.

FOCUS places a – well, focus – on giving engineers the open source tools to efficiently pull and normalize cloud cost and usage data across multi-cloud setups. At the foundation of any FinOps project is the ability to segment data into relevant areas. FOCUS universally breaks data down into two types: 'dimensions' are qualitative (think providers, projects, and resource types), while 'metrics' are quantitative values (i.e., cost data). Each column holds exactly one of these data types, which keeps values directly comparable across providers. FOCUS builds on this basis by segmenting your cloudscape into columns, which can then be cross-referenced to form a multi-dimensional understanding of your cloud cost data.
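To make that concrete, here's a minimal sketch of a single FOCUS-style record in Python. The column names follow the spec's style but are illustrative and may differ slightly between FOCUS versions; the figures are invented.

```python
# A minimal, illustrative FOCUS-style record (column names are indicative of the
# spec's style; figures are invented).
focus_row = {
    # Dimensions: qualitative context for the charge
    "ProviderName": "AWS",
    "ServiceCategory": "Compute",
    "ResourceType": "Virtual Machine",
    "ChargeCategory": "Usage",
    # Metrics: quantitative values that can be aggregated
    "BilledCost": 12.48,
    "EffectiveCost": 9.73,
    "ConsumedQuantity": 744.0,
}

# Dimensions slice the data; metrics are what you sum within each slice.
dimensions = {k: v for k, v in focus_row.items() if isinstance(v, str)}
metrics = {k: v for k, v in focus_row.items() if isinstance(v, (int, float))}
print(dimensions)
print(metrics)
```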

The following breakdown shows how each column builds on the others to form this cohesive, vendor-agnostic view.

Usage Unit & Quantity 

Key to FOCUS’ universal implementation are these basic components of your cloud usage. A Usage Unit represents the smallest possible unit of a given resource or service used or purchased. Breaking your cloud usage into its smallest possible units promotes granular visibility. The Quantity column then simply scales up the per-unit view to the precise number of resources currently being used in your cloud. 
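As a rough illustration – with assumed column names (ConsumedUnit, ConsumedQuantity, EffectiveCost) and invented figures – separating the unit from the quantity makes per-unit rates trivial to derive:

```python
# Illustrative only: assumed column names and invented values.
charge = {
    "ConsumedUnit": "GB-Hours",   # smallest unit the service is metered in
    "ConsumedQuantity": 1_500.0,  # how many of those units were consumed
    "EffectiveCost": 37.50,       # total cost for the charge period
}

# With unit and quantity in separate columns, a per-unit rate falls out directly.
per_unit_cost = charge["EffectiveCost"] / charge["ConsumedQuantity"]
print(f'{per_unit_cost:.4f} per {charge["ConsumedUnit"][:-1]}')  # ~0.0250 per GB-Hour
```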

Availability Zone

Building on those precise usage metrics, the Availability Zone column compares usage against deployment location. 

SKUs

When scaling up a resource within your cloud, there's the underlying cloud vendor's product to contend with: FOCUS deals with this discrepancy by treating every paid resource as a SKU. Each SKU is listed with its associated cloud vendor's name and an ID that relates to the specific resource being paid for. Finally, SKUPriceID provides the price that resource is incurring at its current usage level.
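A hedged sketch of how those columns might be used: the IDs and prices below are invented, but the SkuId/SkuPriceId pairing mirrors how a charge links back to the provider's priced product.

```python
# Hypothetical price lookup keyed by SkuPriceId; IDs and prices are invented for illustration.
price_list = {
    "sku-price-001": {"ListUnitPrice": 0.096, "PricingUnit": "Hours"},
}

charge = {
    "ProviderName": "AWS",
    "SkuId": "sku-ec2-m5-large",     # the product being paid for
    "SkuPriceId": "sku-price-001",   # the specific price point applied to this charge
    "PricingQuantity": 744.0,
}

# Resolve the charge back to its list price and scale by the quantity priced.
price = price_list[charge["SkuPriceId"]]
list_cost = price["ListUnitPrice"] * charge["PricingQuantity"]
print(f'List cost: {list_cost:.2f} ({charge["PricingQuantity"]} {price["PricingUnit"]})')
```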

Billed Cost

Finally, Billed Cost connects the vendor's invoicing data to the cost each resource incurs over time, inclusive of reduced rates and discounts. With the fundamentals of your cloud cost and usage identified, FOCUS then offers further guardrails to break each charge down into accessible parts.
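Because Billed Cost already reflects what lands on the invoice, reconciling spend per provider and billing period can be a simple aggregation – sketched here with illustrative data in pandas:

```python
import pandas as pd

# Illustrative FOCUS-shaped data; in practice this would be loaded from a converted billing export.
df = pd.DataFrame([
    {"ProviderName": "AWS",   "BillingPeriodStart": "2024-05-01", "BilledCost": 1200.00},
    {"ProviderName": "Azure", "BillingPeriodStart": "2024-05-01", "BilledCost":  830.50},
    {"ProviderName": "AWS",   "BillingPeriodStart": "2024-06-01", "BilledCost": 1175.25},
])

# BilledCost already reflects negotiated rates and discounts, so a plain sum
# per provider and period reconciles directly against the invoice.
invoice_view = df.groupby(["ProviderName", "BillingPeriodStart"])["BilledCost"].sum()
print(invoice_view)
```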

Charge Category

Charge Category establishes which resources fall under usage-based cost models and which have been purchased outright. While nothing special on its own, it's complemented by the charge subcategory column, which adds a deeper layer of context.

When a resource falls under the 'usage' category, its subcategory can take one of three values: On-Demand, Used Commitment, or Unused Commitment. Resources in the 'purchased' category are those that have already been bought up-front. The final charge category ('Adjustment') switches focus to how the billing has changed after the fact: a Refund value shows money transferred back to your company (when exchanging credits, for instance), while a Credit value records credits issued by the provider (in the case of usage discounts).
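For example, surfacing unused commitment spend becomes a one-line filter once the data follows this shape. The rows below are illustrative, and value spellings may vary by spec version.

```python
import pandas as pd

# Illustrative rows following the categories described above.
df = pd.DataFrame([
    {"ChargeCategory": "Usage",      "ChargeSubcategory": "On-Demand",         "EffectiveCost": 420.0},
    {"ChargeCategory": "Usage",      "ChargeSubcategory": "Used Commitment",   "EffectiveCost": 310.0},
    {"ChargeCategory": "Usage",      "ChargeSubcategory": "Unused Commitment", "EffectiveCost":  55.0},
    {"ChargeCategory": "Adjustment", "ChargeSubcategory": "Credit",            "EffectiveCost": -40.0},
])

# Unused commitment is money already committed but not consumed -- a common first
# optimization target once data is in FOCUS form.
wasted = df[(df["ChargeCategory"] == "Usage") &
            (df["ChargeSubcategory"] == "Unused Commitment")]["EffectiveCost"].sum()
print(f"Unused commitment this period: {wasted:.2f}")
```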

Pricing Category

While the last column distinguishes between pre-purchased and ongoing purchase models, Pricing Category drills deeper into the precise pricing model behind every purchase. On-Demand covers resources purchased on an as-needed basis. Dynamic covers resources with variable charges, such as those whose unit prices the provider can change rapidly – think spot instances. Commitment-based pricing covers your reserved instances (identified by the provider's pre-established Commitment Discount ID).
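A quick, illustrative breakdown of spend share by pricing category – the category values below follow the description above rather than any one provider's export:

```python
import pandas as pd

# Illustrative data; category values mirror the description above.
df = pd.DataFrame([
    {"PricingCategory": "On-Demand",  "BilledCost": 640.0},
    {"PricingCategory": "Dynamic",    "BilledCost": 120.0},  # e.g. spot instances
    {"PricingCategory": "Commitment", "BilledCost": 440.0},  # reserved / committed use
])

# Share of spend by purchase model -- a quick signal of how much on-demand usage
# could still be moved onto commitments.
share = df.groupby("PricingCategory")["BilledCost"].sum()
print((share / share.sum() * 100).round(1))
```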

Effective Cost

Combining pricing, usage, and timing data, the Effective Cost column is the macroscopic view of your cloud’s cost. Note that the language of the FOCUS schema is angled to match accounting terms, so when the Effective Cost column takes into account the amortization of one-time or recurring purchases, don’t be thrown off. Amortization simply refers to how upfront costs have been used or distributed over time.
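Here's a small, invented example of how the two cost columns diverge for an up-front commitment: Billed Cost follows the invoice, while Effective Cost spreads that charge across the periods it actually covers.

```python
import pandas as pd

# Illustrative data: a commitment billed up front in January, then consumed over
# the following months. Figures are invented for the example.
df = pd.DataFrame([
    {"ChargePeriodStart": "2024-01", "BilledCost": 1200.0, "EffectiveCost": 100.0},
    {"ChargePeriodStart": "2024-02", "BilledCost":    0.0, "EffectiveCost": 100.0},
    {"ChargePeriodStart": "2024-03", "BilledCost":    0.0, "EffectiveCost": 100.0},
])

# BilledCost follows the invoice (one large up-front charge); EffectiveCost amortizes
# that charge across the periods in which the commitment is consumed.
print(df.set_index("ChargePeriodStart")[["BilledCost", "EffectiveCost"]])
print("Totals:", df["BilledCost"].sum(), "billed vs", df["EffectiveCost"].sum(), "effective to date")
```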

Charge Frequency

Charge Frequency indicates how often a charge will occur. Along with the charge-period columns, Charge Frequency is commonly used to understand recurrence periods (e.g., monthly, yearly), forecast upcoming charges, and differentiate between one-time and recurring fees for purchases.

Finally, you'll need to convert all of this vital data into usable reports and actionable projects. Here's how FOCUS supports that.
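Before moving on, here's a quick sketch of the forecasting use Charge Frequency enables – the rows and value spellings below are assumptions, not provider data:

```python
import pandas as pd

# Illustrative data; ChargeFrequency values follow the spec's intent and may be
# spelled differently in a given FOCUS version.
df = pd.DataFrame([
    {"ChargeDescription": "Support plan",      "ChargeFrequency": "Recurring",   "BilledCost":  150.0},
    {"ChargeDescription": "Reserved instance", "ChargeFrequency": "One-Time",    "BilledCost": 1200.0},
    {"ChargeDescription": "Object storage",    "ChargeFrequency": "Usage-Based", "BilledCost":   87.3},
])

# Recurring charges are the easiest to forecast: they repeat at a known cadence,
# so next period's floor is simply their sum.
recurring_floor = df.loc[df["ChargeFrequency"] == "Recurring", "BilledCost"].sum()
print(f"Known recurring charges next period: {recurring_floor:.2f}")
```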

Service Category

Keeping a handle on which resource serves which function can be difficult. The Service Category column provides a high-level classification for each resource, based on its broad function. Within FOCUS, each resource carries only one service category, which is commonly used to identify the provider or broad architecture it's based on.

Sub Account ID

Drilling slightly deeper into each resource's purpose is the Sub Account: this provider-supported construct connects a resource to a specific billing account via the Sub Account ID. This can help break a mass of resources down into the broader strokes of accounts.
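Put together, Service Category and Sub Account ID make a simple showback matrix possible – sketched here with invented data:

```python
import pandas as pd

# Illustrative data combining the two columns above: which broad service class is
# driving cost in which billing sub account.
df = pd.DataFrame([
    {"SubAccountId": "team-payments", "ServiceCategory": "Compute", "EffectiveCost": 530.0},
    {"SubAccountId": "team-payments", "ServiceCategory": "Storage", "EffectiveCost": 110.0},
    {"SubAccountId": "team-search",   "ServiceCategory": "Compute", "EffectiveCost": 290.0},
])

# A simple pivot turns the two dimensions into a showback matrix.
allocation = df.pivot_table(index="SubAccountId", columns="ServiceCategory",
                            values="EffectiveCost", aggfunc="sum", fill_value=0.0)
print(allocation)
```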

Tags

Tags are an essential part of FinOps: the FOCUS schema recognizes this and builds them into wider resource identification. To support engineers, tagging is mandatory – each tag is finalized when the engineer selects from a number of pre-set options. This allows resources to be identified rapidly even as they're set up, and can highlight which project, team, and customer a resource is serving.
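With consistent tags in place, per-project cost becomes a simple filter – a sketch with invented tag names and figures:

```python
import pandas as pd

# Illustrative data: tags carried as a dictionary per row, keyed by the pre-set
# tag names an organization standardizes on (the names here are invented).
df = pd.DataFrame([
    {"ResourceName": "api-gateway",  "EffectiveCost": 75.0, "Tags": {"project": "checkout",  "team": "payments"}},
    {"ResourceName": "batch-runner", "EffectiveCost": 40.0, "Tags": {"project": "reporting", "team": "data"}},
])

# Per-project cost is a one-line filter once tags are consistent.
checkout_cost = df[df["Tags"].apply(lambda t: t.get("project") == "checkout")]["EffectiveCost"].sum()
print(f"Checkout project cost: {checkout_cost:.2f}")
```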

FOCUS’ open standard allows for a no-stone-unturned approach that directly supports engineers and builds a fantastic foundation for FinOps optimization. All executed at the CLI via open-source code, the application of FOCUS promises a new wave of FinOps success.

Who is FinOps FOCUS For? 

The FOCUS project places a key focus on streamlining engineers' workflows. With this in mind, it's vital to note FOCUS' potential for cross-pollination: the common schema and terminology are designed to slot in not only with just about any cloud vendor, but also with pre-existing accounting and financial discussions. The initial spec also aims to be extendable to other SaaS products.

The human approach built into FOCUS extends to the clean, readable display names across every data point and column. This promotes the easy and rapid flow of data into the continuous improvement pipeline, rather than wasting your engineers' time and energy on obscure names. Furthermore, thanks to the segmentation of cloud cost into columns (rather than bulky actual-vs-amortized datasets), the storage requirements for the total dataset are far smaller – and the data requires less compute power to sort through, thanks to the schema's higher degree of precision.

Crucially, FOCUS also champions the involvement of cloud providers themselves. With Microsoft, AWS, and Google Cloud all supporting the project, FOCUS aims to guide providers toward billing datasets that are widely accessible. The companies involved recognize that transparent cloud billing enhances the innovation and experimentation that the cloud facilitates. Furthermore, FOCUS simplifies the process of developing and refining applications on Azure, AWS, and Oracle by providing a clear understanding of billing. Fundamentally, these cloud providers recognize that improved cooperation among business, technical, and finance teams boosts overall productivity.

The widespread implementation of FOCUS will streamline the process of allocating, analyzing, monitoring, and optimizing expenses across multiple providers, making it as straightforward as dealing with a single provider and enabling more efficient resource usage. Gone are the days of aggressive cloud product isolation: as FinOps expertise becomes increasingly transferable, so should the providers themselves. This benefits practitioners, vendors, and consultants as they transition to organizations utilizing various cloud or SaaS products. By eliminating the need to navigate proprietary data formats, organizations can concentrate on enhancing FinOps functions that offer tangible benefits.

Getting Started & Growing with FOCUS

FOCUS was built to be as rapidly implementable as possible. Given this, many practitioners' first step in getting started with FOCUS is the converter. This CLI utility converts billing data from cloud providers such as AWS, Azure, and Oracle into the datasets outlined above. The converter is key to ingesting the large files and varied formats of different cloud providers – especially when the provider's data files don't innately adhere to FOCUS' data requirements.

The importance of the open-source converter cannot be overstated: the conversion process carves FOCUS’ columns and specifications into your cloud data, encoding a data-deep FinOps understanding. Where the appropriate data doesn’t exist in your provider’s data files, a best-effort conversion takes place. For a technical dive into the provider-specific conversion rules, each rule can be examined in the open source ‘conversion_configs’ directory.
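To give a feel for what a conversion rule does, here's a minimal sketch of mapping provider-native columns onto their FOCUS equivalents on a best-effort basis. This is not the converter's actual configuration format, and the source column names are used only as examples.

```python
import pandas as pd

# Example source-to-FOCUS column mapping (illustrative, not the converter's config format).
AWS_CUR_TO_FOCUS = {
    "lineItem/UnblendedCost": "BilledCost",
    "product/region": "RegionId",
    "lineItem/UsageAmount": "ConsumedQuantity",
}

def best_effort_convert(provider_df: pd.DataFrame, mapping: dict[str, str]) -> pd.DataFrame:
    """Rename the columns we can map; leave gaps where the provider export has no match."""
    present = {src: dst for src, dst in mapping.items() if src in provider_df.columns}
    converted = provider_df.rename(columns=present)[list(present.values())]
    for missing in set(mapping.values()) - set(converted.columns):
        converted[missing] = None  # best effort: the FOCUS column exists, values unknown
    return converted
```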

As an open-source project, FOCUS places a key focus on incremental iterations. Not only is this reflected in the modularity of the converter and the regular updates, it further plays a huge part in FOCUS’ future. As you get started with FOCUS, you may find yourself in a niche use case that the specification doesn’t yet have in mind. 

This is why FOCUS promotes practitioners' own contributions – a few key mission statements help keep everything consistent:

  • Identify the essential FinOps capability that your practitioners need to perform. This helps strip the issue down to the precise dimensions, metrics, and cost/usage attributes required.
  • Be scenario-driven. Contributing practitioners are asked to define columns on the basis of real scenario requirements – meaning each column demands its own use case.
  • Simplify. Just because dimensions or metrics can be added doesn't mean they need to be.
  • Consider the data that already exists in major providers' datasets, and try to align the FinOps use case with what those providers already record.
  • While simplicity should be sought, prioritize accuracy and consistency.
  • Data and column names should be presented in a legible, easily understandable form, with no unnecessary jargon.
  • If a term with an unclear definition or multiple definitions must be used, it should be clarified in the glossary.

Overseeing the final implementation of each change is the FOCUS group – a community of FinOps practitioners, providers, and FinOps vendors. With these tenets and this steering group in hand, FOCUS is poised to transform cloud cost visibility.

To get started with FOCUS, clone the converter’s repository from GitHub or download the schema directly, making sure to reference the JSON schema file in your pre-existing data validation scripts. 
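If your validation scripts are in Python, wiring in the downloaded schema can look something like the sketch below (the file name and the sample record are assumptions for illustration):

```python
import json

from jsonschema import validate, ValidationError  # pip install jsonschema

# Load the FOCUS JSON schema downloaded from the project (file name is an assumption).
with open("focus_schema.json") as fh:
    focus_schema = json.load(fh)

# A sample record to check against the schema (values are invented).
record = {"ProviderName": "AWS", "BilledCost": 12.48, "ChargeCategory": "Usage"}

try:
    validate(instance=record, schema=focus_schema)
    print("Record conforms to the FOCUS schema")
except ValidationError as err:
    print(f"Schema violation: {err.message}")
```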

FOCUS Alone Won’t Do It

While FOCUS is revolutionary in its own right, it’s important to take a step back and evaluate how its new streams of data are ingested and actioned within your organization’s current processes. Without the support of dedicated FinOps practitioners and processes, the data collected by FOCUS simply reaches a bottleneck. GlobalDots’ expertise in the FinOps field helps support upstream FinOps transformations through a varied and dedicated suite of tools. These can fit both before and after FOCUS implementation: for instance, by automating reserved instance purchasing, your engineers are granted the free time to implement FOCUS and see the changes happen in real time. 

No matter where you are in your FinOps journey, GlobalDots’ Cloud Cost Optimization Solution grants your team the decades of experience and strategy to jumpstart your next step.
