Sleepless nights over spiking cloud costs? We have a free and open-source solution!



"Footwear, Jacket and Cap paired up with Sleeves on

In this article, we'll delve into setting up a hackathon using Google Cloud's unique Generative AI and Machine Learning capabilities. We'll do this in a safe and secure manner by employing the powers of Budget Alerts and API Quotas. Using this easy solution, you'll be able to:

✔️ Focus on innovation and groundbreaking new features
✔️ Develop at hyper speed
✔️ Keep all the right resources at your fingertips

And prevent the nasty bits like:

❌ Spending thousands of dollars on Google Cloud for 24 hours of usage
❌ Training ML models on massive unfiltered data sets
❌ Inadvertently launching a DDoS attack and funding the receiving side of it

Try it out: a Hackathon using secured sandbox projects

Hackathons are dynamic events where teams come together to create innovative projects or features within a limited timeframe. However, amidst the excitement of ideation and development, there's a looming concern for project organizers: the potential for runaway cloud costs.

But why is this so important? While Google Cloud Platform (GCP) offers a plethora of powerful services and resources, it's easy for enthusiastic teams to unintentionally rack up expenses, especially when experimenting with advanced features. To avoid such budgetary nightmares and ensure a smooth hackathon experience, we'll explore the intricate details of configuring budget alerts and dealing with service quotas through Terraform.

Along the way, we'll address the challenges faced when integrating service quotas with Terraform, shedding light on the cumbersome aspects of this process and providing insights into how to navigate them effectively. By the end of this article, you'll be well-equipped to create a cost-effective and secure GCP environment for your hackathon, empowering teams to innovate without the fear of unexpected financial consequences.

Creating Google Cloud Projects and Roles (Manually?)

Manually creating multiple projects in GCP, especially when they require distinct user access and roles, is time-consuming and prone to human error. This is particularly challenging when setting up projects for multiple hackathon teams. Infrastructure-as-code (IaC) tools like Terraform simplify project provisioning, user management, and role assignments, ensuring consistency and scalability while reducing administrative overhead. To address these challenges, ZENsoftware offers an open-source project, GitHub - zensoftwarenl/google-cloud-hackathon, that includes essential features such as budget alerts, user roles, and service quotas.
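To give an idea of what this looks like in practice, here is a minimal Terraform sketch of a per-team sandbox project with a role binding and a budget alert. The names and variables are illustrative, not the exact configuration used in the repository, and it assumes a billing account and folder you control:

```hcl
# Minimal sketch: one sandbox project per team, a role binding, and a budget
# alert. Variable names (var.billing_account_id, var.hackathon_folder_id) are
# hypothetical, not the repository's actual schema.

resource "google_project" "team" {
  name            = "hackathon-team-1"
  project_id      = "hackathon-team-1"      # must be globally unique
  folder_id       = var.hackathon_folder_id
  billing_account = var.billing_account_id
}

# Give a team member access scoped to their own sandbox project only.
resource "google_project_iam_member" "member" {
  project = google_project.team.project_id
  role    = "roles/editor"
  member  = "user:team1-member@example.com"
}

# Budget with alert thresholds so organisers hear about runaway costs early.
resource "google_billing_budget" "team_budget" {
  billing_account = var.billing_account_id
  display_name    = "hackathon-team-1-budget"

  budget_filter {
    projects = ["projects/${google_project.team.number}"]
  }

  amount {
    specified_amount {
      currency_code = "EUR"
      units         = "100"
    }
  }

  threshold_rules {
    threshold_percent = 0.5
  }
  threshold_rules {
    threshold_percent = 0.9
  }
}
```

Applied once per team (for example with for_each over a map of teams), this keeps both access and spending isolated per sandbox project.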




Setting up Quotas in GCP (Manually?)

Setting up quotas for GCP resources is an essential step in managing your cloud infrastructure efficiently. In an ideal world, this process would be straightforward and hassle-free, but in reality it is anything but. Although you would expect setting quotas to be easy, given that it is a vital part of cost management, it turned out to be unexpectedly challenging to handle with Terraform. These difficulties stem from a combination of factors: inadequate documentation, confusing terminology, a lack of clarity about unit formats, and the inherently manual nature of the process. We'll explore these challenges in detail, shedding light on the complexities that can arise when working with GCP quotas through Terraform.

Read The ‘Fine’ Documentation

Firstly, the documentation is insufficient and disjointed. GCP and Terraform use different terminology when referring to quotas and limits: Terraform consistently employs the term "limit," while GCP labels the same values as "units." This disparity in terminology can cause confusion, making it challenging to align Terraform code with GCP's quota management.

Furthermore, the format of the "unit" can be cryptic. GCP introduces various "unit" formats, such as /d/project in one context and /min/project/user in another; other examples include /project/region and /min/project. Deciphering these variations and comprehending which format applies to a specific quota can become a source of bewilderment.



Moreover, the precise unit format for a particular quota is not readily available in the documentation. This vital information often remains hidden, requiring users to delve into GCP's command-line tools or web interfaces to unearth the specifics. For example, the GCP documentation states that the "unit" values should be written with curly braces ({}), and even the requests sent out by console.cloud.google.com write the unit with {}, yet in Terraform a unit containing these brackets simply will not work.
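For reference, here is what a single override looks like with the google-beta provider's google_service_usage_consumer_quota_override resource; the metric and value follow the provider's documented example, and the project id is a placeholder. Note that the limit is the slash-based unit, URL-encoded, rather than the curly-brace style (such as 1/min/{project}) shown in the GCP documentation:

```hcl
# Sketch of one quota override. The limit is the slash-based unit
# ("/min/project"), URL-encoded -- not the curly-brace form that the GCP
# documentation and Console requests use.
resource "google_service_usage_consumer_quota_override" "default_requests" {
  provider       = google-beta
  project        = "hackathon-team-1"   # illustrative project id
  service        = "servicemanagement.googleapis.com"
  metric         = urlencode("servicemanagement.googleapis.com/default_requests")
  limit          = urlencode("/min/project")
  override_value = "95"
  force          = true
}
```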

In summary, while Terraform excels in automating various aspects of infrastructure management, the intricacies surrounding GCP quota management remain a challenging and less streamlined facet of cloud resource provisioning. Navigating unclear documentation, inconsistent terminology, and non-standardized unit formats can transform what should be a straightforward procedure into a formidable task, highlighting the need for improvements in this area to enhance user efficiency and experience.

Getting Stuff Done Using Code

To demystify the complex process of setting up GCP quotas using Terraform, we embarked on a journey of exploration, uncovering the intricate details of each quota. Here's how we navigated through the labyrinth of quota management.

Our journey began by inspecting the outgoing requests generated by the GCP Console. This exercise allowed us to peer into the internal format and structure that GCP uses for managing quotas. However, the "fun" part is that the API used by Terraform is different from the one used by the console, so the format we saw did not apply, and the exercise ended up being useless.

Moving on, we turned to the user interface (UI) for further insights. Enabling a specific service in the GCP Console and listing its associated quotas in the UI revealed not only the quota Metric but also started to unveil the underlying format of the unit. Here the documentation inconsistency strikes again: in this UI the value for the unit is called "Limit name" and is not written with slashes but as plain text, for example WritesPerMinutePerProjectPerRegion, subscriberPerMinutePerProjectPerRegion, or default/USER-100s. For some quotas you could guess the correct slash-based "unit" value from this, but for others (like that last example) it didn't work.

Armed with the knowledge of quota names and some formatted units, we harnessed the power of the gcloud command-line tool. Specifically, we employed the command:

gcloud alpha services quota list --service=<service> --consumer=projects/<projectid> --format=json > output.json

This command allowed us to export the list of quotas in a more readable JSON format, consolidating crucial information about each quota. The output file was our treasure trove of quota insights: here we could discern the format for each service-quota combination. This knowledge was invaluable, as it provided a foundation for understanding how quotas are structured. However, while we could gather some unit formats from this output, not all of them were correct or complete, and they were written in the {} notation that Terraform does not support. These discrepancies meant that we still had to engage in a meticulous process of trial and error, systematically exploring variations of the unit format until we found the correct one for each specific quota.

Excerpt of the output of the gcloud command showing possible unit notation

Using this process, we were able to generate a list of quotas that we included in our open-source project GitHub - zensoftwarenl/google-cloud-hackathon. Besides letting you use this quota list and project to start managing your quotas through Terraform, I also hope that knowing the steps I took to find the correct unit format for each quota helps you manage your own projects better.
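To give a feel for how such a list can drive Terraform, here is a minimal sketch that feeds a map of quotas into the override resource with for_each. The variable shape and project id are hypothetical and will differ from the repository's actual schema:

```hcl
# Hypothetical quota-list shape; the real schema in the
# google-cloud-hackathon repository may differ.
variable "quota_overrides" {
  type = map(object({
    service = string
    metric  = string
    limit   = string   # slash-based unit, e.g. "/min/project"
    value   = string
  }))
  default = {
    service_management_default_requests = {
      service = "servicemanagement.googleapis.com"
      metric  = "servicemanagement.googleapis.com/default_requests"
      limit   = "/min/project"
      value   = "95"
    }
  }
}

# One override per entry in the map; the URL-encoding happens in one place.
resource "google_service_usage_consumer_quota_override" "overrides" {
  provider       = google-beta
  for_each       = var.quota_overrides
  project        = "hackathon-team-1"   # illustrative project id
  service        = each.value.service
  metric         = urlencode(each.value.metric)
  limit          = urlencode(each.value.limit)
  override_value = each.value.value
  force          = true
}
```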

Using our free and open-source project

Our open-source project is designed to streamline the process of setting up GCP quotas using Terraform. You can run it locally or in Terraform Cloud, making it easier than ever to manage your cloud resources efficiently.

Using this tool is as simple as updating the values of variables to align with your specific team and role format requirements for your hackathon projects. Whether you need to set up quotas for multiple teams with different access levels or define specific roles for each participant, this open-source solution offers the customisation you need to tailor the quotas to your exact specifications.
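As an illustration, the input might end up looking something like the terraform.tfvars sketch below; the variable names here are hypothetical, so check the repository's variables for the actual schema:

```hcl
# Hypothetical terraform.tfvars -- names and structure are illustrative only.
billing_account_id = "012345-6789AB-CDEF01"

teams = {
  "team-alpha" = {
    members = ["user:alice@example.com", "user:bob@example.com"]
    roles   = ["roles/editor", "roles/aiplatform.user"]
    budget  = 100
  }
  "team-beta" = {
    members = ["user:carol@example.com"]
    roles   = ["roles/editor"]
    budget  = 100
  }
}
```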

But our commitment to your cloud success doesn't stop there. If you find yourself in need of expert guidance and support for your cloud services, we also offer a comprehensive consultancy service. Our team of experienced professionals can assist you in optimising your cloud infrastructure, ensuring cost-effectiveness, security, and performance. Contact us for more information.

So, whether you're just starting with our open-source project or seeking assistance to supercharge your cloud services, we've got you covered. Reach out to us today and discover how we can empower your hackathon teams and enhance your cloud operations.

Tell me more about setting up a GenAI Hackathon and Cloud Landing Zones