Google has launched its Bigtable NoSQL database onto its Cloud Platform. Announcing the launch at the Strata + Hadoop World conference in London, Cory O’Connor, Product Manager at Google, said Google Cloud Bigtable would offer users “twice the price performance and single digit latencies”, with typical latencies of 6ms.
Bigtable is the NoSQL database that underpins Google’s largest systems, driving its Search, Analytics and Gmail services. Until now, if you wanted to use it you had to set up your own systems and build the services yourself.
According to a blog post by O’Connor, the launch is partly driven by the growing Internet of Things (IoT) market, and by enterprises and data-driven organisations that need to crunch huge data sets quickly and that “must become adept at efficiently deriving insights from their data. In this environment, any time spent building and managing infrastructure rather than working on applications is a lost opportunity.”
O’Connor goes on to say that the service will be “fully managed, high-performance, extremely scalable” – well, he would, wouldn’t he? The service is aimed at organisations that need to handle huge volumes of data, including businesses in the financial services, AdTech, energy, biomedical, and telecommunications industries.
As O’Connor explained at the conference, Cloud Bigtable can be accessed through the open-source HBase API, is natively integrated with much of the existing big data and Hadoop ecosystem, and supports Google’s big data products. Additionally, data can be imported from or exported to existing HBase clusters through simple bulk ingestion tools using industry-standard formats.
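Because Cloud Bigtable exposes the standard HBase API, existing HBase client code should work largely unchanged. As a minimal sketch, the snippet below writes and reads a cell using the familiar HBase `Put`/`Get` classes; it assumes Google’s `bigtable-hbase` client library is on the classpath, and the project ID, instance name, table, and column names are hypothetical placeholders for a provisioned cluster.

```java
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class BigtableHBaseSketch {
    public static void main(String[] args) throws Exception {
        // Connect to Cloud Bigtable instead of an HBase cluster.
        // "my-project" and "my-cluster" are placeholders -- substitute your own.
        try (Connection connection = BigtableConfiguration.connect("my-project", "my-cluster");
             Table table = connection.getTable(TableName.valueOf("sensor-readings"))) {

            // Write a cell exactly as you would against plain HBase.
            Put put = new Put(Bytes.toBytes("device-42"));
            put.addColumn(Bytes.toBytes("readings"), Bytes.toBytes("temp"),
                          Bytes.toBytes("21.5"));
            table.put(put);

            // Read the cell back.
            Result result = table.get(new Get(Bytes.toBytes("device-42")));
            System.out.println(Bytes.toString(
                result.getValue(Bytes.toBytes("readings"), Bytes.toBytes("temp"))));
        }
    }
}
```

The only Bigtable-specific line is the `BigtableConfiguration.connect(...)` call; everything else is ordinary HBase client code, which is what makes migrating an existing HBase application, or bulk-loading its data, relatively straightforward.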
Google also claims a Cloud Bigtable cluster can be up and running in less than 10 seconds, and that data is scaled automatically, “so there’s no need to do complicated estimates of capacity requirements.”
The service is available initially as a beta release in multiple locations worldwide.