Google Bigtable vs BigQuery for storing large number of events

Johan · Dec 23, 2015 · Viewed 16k times

Background

We'd like to store our immutable events in a (preferably) managed service. The average size of one event is less than 1 KB, and we produce between 1 and 5 events per second. The main reason for storing these events is to be able to replay them (perhaps using table scanning) once we create future services that might be interested in them. Since we're on Google Cloud, we're naturally looking at Google's services first.

I suspect that Bigtable would be a good fit for this but according to the price calculator it'll cost us more than 1400 USD per month (which to us is a big deal):

(screenshot: Bigtable pricing calculator estimate)

Looking at something like BigQuery instead yields an estimate of about 3 USD per month (if I'm not missing something essential):

(screenshot: BigQuery pricing calculator estimate)

Even though a schema-less database would suit us better, we'd be fine with essentially storing each event as a blob plus some metadata.
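For illustration, a blob-plus-metadata layout could translate into a BigQuery table schema along these lines (the field names here are our own guesses, not anything prescribed):

```python
import json

# Hypothetical BigQuery table schema: the event body is kept as an opaque
# BYTES blob, with a few queryable metadata columns alongside it.
schema = [
    {"name": "event_id", "type": "STRING", "mode": "REQUIRED"},
    {"name": "occurred_at", "type": "TIMESTAMP", "mode": "REQUIRED"},
    {"name": "event_type", "type": "STRING", "mode": "NULLABLE"},
    {"name": "payload", "type": "BYTES", "mode": "REQUIRED"},
]

# This JSON form can be saved to a file and passed to `bq mk --table`.
print(json.dumps(schema, indent=2))
```

Keeping the metadata columns (timestamp, type) outside the blob means future replay jobs can filter without deserializing every payload.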

Questions

Could we use BigQuery instead of Bigtable to reduce costs? For example, BigQuery has a feature called streaming inserts, which seems like something we could use. Is there anything that'll bite us in the short or long term that I might not be aware of if we go down this route?
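For reference, streaming inserts with the `google-cloud-bigquery` Python client could look roughly like the sketch below. The project, dataset, and table names are made up, and the actual API call is shown commented out since it needs credentials; only the row-building part runs as-is:

```python
import base64
import datetime
import json

def event_to_row(event_id: str, payload: bytes, event_type: str) -> dict:
    """Turn one immutable event into a streaming-insert row.

    BYTES columns are sent base64-encoded in the JSON insert payload.
    """
    return {
        "event_id": event_id,
        "occurred_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "event_type": event_type,
        "payload": base64.b64encode(payload).decode("ascii"),
    }

rows = [event_to_row("evt-0001", json.dumps({"user": 42}).encode(), "signup")]

# With credentials configured, the insert itself would be along these lines:
# from google.cloud import bigquery
# client = bigquery.Client()
# errors = client.insert_rows_json("my-project.events.raw_events", rows)
# if errors:
#     raise RuntimeError(errors)
```

At 1 to 5 events per second this is far below the streaming-insert quota, so batching isn't strictly needed, though grouping a few rows per call reduces request overhead.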

Answer

Solomon Duskis · Dec 23, 2015

Bigtable is great for large (>= 1TB) mutable data sets. It has low latency under load and is managed by Google. In your case, I think you're on the right track with BigQuery.
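Replaying the events later would then be a plain full-scan query. A sketch (table name hypothetical; with the `google-cloud-bigquery` client the query string would be passed to `client.query(...)`):

```python
# Build the replay query as a string. The table name is made up, and the
# client call is shown commented out because it requires credentials.
TABLE = "my-project.events.raw_events"

replay_query = f"""
SELECT event_id, occurred_at, event_type, payload
FROM `{TABLE}`
ORDER BY occurred_at
"""

# With credentials:
# from google.cloud import bigquery
# for row in bigquery.Client().query(replay_query).result():
#     handle(row)  # `handle` stands in for the consumer's own replay logic
print(replay_query.strip())
```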