3 Use Cases for GCP Bigtable

Bigtable was the solution Google created when it needed a database that could deliver real-time access to the petabytes of data Google Search generated. Having initially used it to power the search engine and core services such as Gmail and Google Maps, Google launched Bigtable as a Google Cloud Platform (GCP) service for its customers in 2015. With the global cloud database and Database as a Service (DBaaS) market size forecast to more than double, from $12 billion in 2020 to $24.8 billion by 2025, companies worldwide are evaluating a range of cloud-based options for accessing their data.

In this article, we discuss GCP Bigtable’s specific characteristics to help you decide whether it is appropriate for your application, and we outline three use cases: adtech, financial data, and IoT data.

What Is Bigtable?

GCP Bigtable is Google Cloud’s petabyte-scale NoSQL database service for demanding, data-driven workloads that need low latency, high throughput, and room to scale. Bigtable offers massive scalability, simple administration, and cluster resizing without downtime. It can scale to billions of rows and thousands of columns, letting you store petabytes of data, and it sustains high read and write throughput at low latency. You can use GCP Bigtable to store and query data such as the following:

· Time-series data (e.g. CPU and memory usage over time for multiple servers); a minimal write sketch follows this list

· Adtech data (e.g. customer preferences and buying histories)

· Financial data (e.g. exchange rates, stock prices)

· Internet of Things (IoT) data (e.g. energy usage for appliances)
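
To make the time-series case concrete, here is a minimal write sketch using the google-cloud-bigtable Python client. The project, instance, table, and column-family names are hypothetical; the point is the row-key design, which puts an entity ID before the timestamp so related readings sort together.

```python
from google.cloud import bigtable

# Hypothetical project, instance, and table names for illustration.
client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("metrics")

# Row key design: entity ID plus timestamp, so readings for one server
# sort together and a time range becomes one contiguous scan.
row = table.direct_row(b"server1#2024-01-01T12:00")
row.set_cell("stats", b"cpu", b"71")      # column family "stats" (assumed)
row.set_cell("stats", b"memory", b"38")
row.commit()                              # one atomic write per row
```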

What Should You Use GCP Bigtable For?

Put simply, GCP Bigtable is best suited for applications that require fast access to big data. In Gartner reviews, it earns a top rating for high-speed, high-volume processing. Apart from its fit for datasets in the terabytes and beyond, here are some other characteristics that help determine whether Bigtable suits a particular application:

· It is a strong option if you need to store and query data over long periods of time. For irregular or short-term storage and querying of data, opt for an alternative analytical storage system.

· Bigtable is also suitable for use cases requiring extremely high throughput: tens to hundreds of thousands of queries per second. For workloads of just a few queries per second, choose an alternative system.

· Bigtable offers basic access to data in the form of lookups and simple scans across row keys (sketched in the example after this list). If you need secondary indexes or more complex access patterns, use a relational database.

· Performance suffers if you store individual data elements that exceed 10 megabytes in size. To store larger, unstructured objects (e.g. video files), use Cloud Storage.
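
The lookup-and-scan access model from the list above looks like this in practice. This is a minimal sketch with the Python client, reusing the hypothetical metrics table from the earlier example; note that both operations are driven entirely by the row key, with no secondary indexes involved.

```python
from google.cloud import bigtable

client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("metrics")

# Point lookup: fetch one row by its exact key.
row = table.read_row(b"server1#2024-01-01T12:00")
if row is not None:
    print(row.cells["stats"][b"cpu"][0].value)   # latest cell for stats:cpu

# Simple scan: iterate a contiguous range of row keys.
for row in table.read_rows(start_key=b"server1#", end_key=b"server2#"):
    print(row.row_key)
```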

Best Bigtable Use Cases

One of the most common use cases for Bigtable is as the storage layer for large-scale data processing systems that perform MapReduce-style operations. It is also an excellent storage option for users of Cloud Dataflow or Cloud Dataproc.
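
As an illustration of the Dataflow pairing, here is a hedged sketch of an Apache Beam pipeline that writes to Bigtable through the built-in WriteToBigTable connector. The in-memory source and all resource names are hypothetical; a real pipeline would typically read from Pub/Sub or Cloud Storage instead.

```python
import datetime

import apache_beam as beam
from apache_beam.io.gcp.bigtableio import WriteToBigTable
from google.cloud.bigtable.row import DirectRow

def to_bigtable_row(element):
    """Convert a (key, cpu_value) pair into a Bigtable row mutation."""
    key, cpu = element
    row = DirectRow(row_key=key.encode())
    row.set_cell("stats", b"cpu", str(cpu).encode(),
                 timestamp=datetime.datetime.utcnow())
    return row

with beam.Pipeline() as pipeline:
    (pipeline
     | beam.Create([("server1#2024-01-01T12:00", 71),   # stand-in source
                    ("server2#2024-01-01T12:00", 44)])
     | beam.Map(to_bigtable_row)
     | WriteToBigTable(project_id="my-project",
                       instance_id="my-instance",
                       table_id="metrics"))
```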

Bigtable’s best fit, however, may be real-time analytics: applications that need to analyze events as they happen. This is a common requirement in financial services and the Internet of Things. (For interactive analytics, where analysts run ad hoc SQL queries against a data warehouse, BigQuery is usually the better choice.)

In short, Bigtable on GCP is best for analytical workloads with intensive read/write operations, such as those common with adtech, financial, and IoT data.

1. Bigtable for Adtech

Using GCP Bigtable, you can integrate very large volumes of raw data from multiple sources, for example to build a consistent view of customer activity across channels. You can also use it to gather and compare large volumes of data on buying behavior to identify common patterns that could help you increase sales.
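
Here is a hedged sketch of that ingestion pattern, with hypothetical table, family, and key names: events from multiple channels are written in one batched call, keyed so that a single customer’s cross-channel history stays in one contiguous key range.

```python
import datetime

from google.cloud import bigtable

client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("ad-events")

now = datetime.datetime.utcnow()
rows = []
# Key each event by user, then timestamp, then channel, so all of one
# customer's activity across channels lives in a single key range.
for channel, event_type in [("web", "click"), ("mobile", "impression")]:
    key = f"user123#{now:%Y%m%d%H%M%S}#{channel}".encode()
    row = table.direct_row(key)
    row.set_cell("events", b"type", event_type.encode(), timestamp=now)
    rows.append(row)

# Batch the mutations into a single RPC; each row gets its own status.
for status in table.mutate_rows(rows):
    if status.code != 0:   # non-zero gRPC code means that row's write failed
        print("write failed:", status.message)
```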

2. Bigtable for Financial Data

The modern financial world is too complex for traditional financial data pipelines. Multiple financial exchanges and global user demand mean these pipelines must be ultrafast, dependable, and secure. GCP Bigtable can help.

You can use it to construct models based on historical behavior to predict financial trends. It is also useful for storing and consolidating market data, trade activity, and other trading-related data.
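
For example, consolidated market data stored under a reversed-timestamp row key lets you fetch the most recent tick for an instrument with a single cheap read. A sketch, with hypothetical table, column-family, and key-design assumptions:

```python
from google.cloud import bigtable
from google.cloud.bigtable import row_filters

client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("market-data")

# Row keys are assumed to be "SYMBOL#reversed_timestamp", so the newest
# tick for a symbol sorts first within that symbol's key range.
rows = table.read_rows(
    start_key=b"EURUSD#",
    end_key=b"EURUSD#\xff",                         # end of the EURUSD range
    limit=1,                                        # first row = newest tick
    filter_=row_filters.CellsColumnLimitFilter(1),  # newest cell version only
)
for row in rows:
    print(row.row_key, row.cells["quote"][b"price"][0].value)
```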

3. Bigtable for IoT Data

When it comes to IoT, Bigtable can help you monitor both normal and abnormal behavior by ingesting and analyzing huge volumes of time-series data from sensors in real time. This allows customers to build dashboards and drive analytics on their data as it is generated.
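
A sketch of the read side of that pattern, assuming hypothetical device, table, and column names: keying readings by device and timestamp turns a dashboard’s “last day for this sensor” query into one contiguous scan.

```python
from google.cloud import bigtable

client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("sensor-readings")

# With row keys like "device_id#YYYYMMDDHHMM", a dashboard's time window
# for one device is a single contiguous range scan.
start = b"thermostat-42#202401010000"
end = b"thermostat-42#202401012359"
for row in table.read_rows(start_key=start, end_key=end):
    temp = row.cells["metrics"][b"temp_c"][0].value
    print(row.row_key.decode(), float(temp.decode()))
```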

Conclusion

These use cases illustrate how GCP Bigtable can be a great fit for applications that require fast access to very large datasets and sustained, extremely high throughput. For these reasons, it can be an ideal solution for adtech, financial, and IoT data.

Stone Door Group is a member of the Google Partner Advantage program and retains a bench of Google Certified Cloud Architects and Data Engineers. To learn more about how we can help you with your next big data project, talk to Stone Door Group’s experts.


About the Author

RJ Daskevich, DCM, is a senior consultant and training instructor for Stone Door Group’s AI and Machine Learning practice. He is both a Hadoop Certified Developer and a Google Certified Data Engineer. Stone Door Group helps customers transition to an AI- and ML-enabled digital enterprise in a consumable way. To speak with RJ and our team of experts, send us an email at letsdothis@stonedoorgroup.com.