Questions tagged [google-bigquery]
Google BigQuery is a Google Cloud Platform product providing serverless queries of petabyte-scale data sets using SQL. BigQuery provides multiple read-write pipelines, and enables data analytics that transform how businesses analyze data.
I have been having difficulties with this issue for a few days and I cannot figure it out.
(Running this in BigQuery.)
For a marketing client, I am trying to obtain impressions and clicks per ...
I have a Shiny app deployed on GCP and I am using the gcloud SDK to run queries on BigQuery. Is it somehow possible to add multi-user authentication to the gcloud SDK? I am using the bq SDK because it is the only way ...
I am working with an OpenStreetMap dataset in BigQuery. I wrote a query that returns a list like this;
here is the query I am using
I want to use Google BigQuery as a source from Oracle Data Integrator.
The Simba drivers were downloaded, but I am having difficulty connecting to GBQ from ODI.
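For questions like this, the usual starting point is the JDBC connection URL that ODI's data server definition needs. The sketch below shows the general shape of a Simba BigQuery JDBC URL; the exact property names should be verified against the Simba driver guide, and every concrete value here (project id, service-account email, key path) is a hypothetical placeholder.

```python
# Hypothetical Simba BigQuery JDBC URL for an ODI data server.
# Property names follow the Simba driver's documented key=value style;
# verify them against the driver guide for your driver version.
JDBC_URL = (
    "jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;"
    "ProjectId=my-project;"                                        # placeholder
    "OAuthType=0;"                                                 # 0 = service-account auth
    "OAuthServiceAcctEmail=sa@my-project.iam.gserviceaccount.com;" # placeholder
    "OAuthPvtKeyPath=/path/to/key.json;"                           # placeholder
)

print(JDBC_URL)
```

With a URL of this shape, the remaining ODI-side work is registering the driver JAR on the agent's classpath and pointing the data server at this URL.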
I'm missing a project that I was working on a couple of months ago.
The datasets are still there in BigQuery, and the project is still in BQ. It's just no longer a project in my Cloud Platform....
Would someone be able to explain how to create a date-partitioned table while using a load job in Google BigQuery with JobConfig?
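One way to answer this: a load job becomes date-partitioned when its configuration carries a `timePartitioning` block. The sketch below shows the JSON body the BigQuery `jobs.insert` REST API accepts; the `google-cloud-bigquery` client's `LoadJobConfig` mirrors these fields (`time_partitioning=bigquery.TimePartitioning(type_="DAY", field="event_date")`). All table, bucket, and column names here are hypothetical.

```python
import json

# Sketch of a load-job configuration that creates a day-partitioned
# destination table. Field names follow the BigQuery REST API; the
# Python client's LoadJobConfig exposes the same options.
load_config = {
    "load": {
        "destinationTable": {
            "projectId": "my-project",   # hypothetical project
            "datasetId": "my_dataset",   # hypothetical dataset
            "tableId": "events",         # hypothetical table
        },
        "sourceUris": ["gs://my-bucket/events.json"],  # hypothetical URI
        "sourceFormat": "NEWLINE_DELIMITED_JSON",
        "autodetect": True,
        # The key part: partition the destination table by day on a column.
        "timePartitioning": {"type": "DAY", "field": "event_date"},
        "writeDisposition": "WRITE_APPEND",
    }
}

print(json.dumps(load_config["load"]["timePartitioning"]))
```

Omitting `"field"` would instead partition by ingestion time, which is the other common variant of this setup.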
I was wondering how I could calculate the difference between elements in arrays that have only two elements, by subtracting the lesser from the greater.
I started by referring to the code sample from ...
will result in an error.
I also tried a more sophisticated approach, with a few variations of:
import google.datalab.bigquery as bq
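For the two-element-array question above, BigQuery Standard SQL can express "greater minus lesser" directly with `GREATEST`/`LEAST` over the array offsets. The sketch below shows that expression as a query string (table and column names `tbl`/`arr` are hypothetical) alongside a plain-Python analogue of the same logic.

```python
# Hypothetical BigQuery Standard SQL: positive difference between the two
# elements of a two-element array column `arr` in table `tbl`.
QUERY = """
SELECT GREATEST(arr[OFFSET(0)], arr[OFFSET(1)])
     - LEAST(arr[OFFSET(0)], arr[OFFSET(1)]) AS diff
FROM tbl
"""

def diff_two_element_array(arr):
    """Local analogue of the SQL above: greater element minus lesser."""
    a, b = arr
    return max(a, b) - min(a, b)

print(diff_two_element_array([3, 10]))  # → 7
```

The same result can also be written as `ABS(arr[OFFSET(0)] - arr[OFFSET(1)])` for numeric elements.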
I am not able to update a BigQuery table from a storage file. I have the latest data file and the transfer runs successfully, but it says "8:36:01 AM Detected that no changes will be made to the destination ...
I am trying to rename BigQuery rows in an Apache Beam pipeline in Python, as in the following example:
having one PCollection with the full data and one other with only 3 fields renamed, col1 in col1.2, col2 ...
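For a renaming step like the one described, the rename itself can be an ordinary Python function applied with `beam.Map`; only the pipeline glue is Beam-specific. The sketch below uses the field names mentioned in the question (`col1` → `col1.2`); everything else is hypothetical, and the Beam application is shown only as a comment so the logic stays runnable on its own.

```python
def rename_fields(row, mapping):
    """Return a copy of a row dict with selected keys renamed."""
    return {mapping.get(k, k): v for k, v in row.items()}

# Field mapping from the question; extend with col2, col3 as needed.
MAPPING = {"col1": "col1.2"}

# In an Apache Beam pipeline this would be applied as:
#   renamed = pcoll | "Rename" >> beam.Map(rename_fields, MAPPING)
print(rename_fields({"col1": 1, "other": 2}, MAPPING))  # → {'col1.2': 1, 'other': 2}
```

Note that if the renamed PCollection is later written back to BigQuery, the new names must be legal BigQuery column names.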
I want to compare hard-coded values with table metadata (columns, data_types). How can the following standard SQL logic be realized in BigQuery?
(select 'complaint_type' as column_name, 'STRING' as data_type
job_config = bigquery.LoadJobConfig()
# job_config.autodetect = True
# job_config.source_format = bigquery.SourceFormat.NEWLINE_DELIMITED_JSON
job_config = bigquery.LoadJobConfig(schema=[
I want to compare table metadata (columns, data_types) with some hard-coded values.
How can the following standard SQL logic be realized in BigQuery?
select column_name, data_type
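One way to realize this in BigQuery: pull live column metadata from `INFORMATION_SCHEMA.COLUMNS` (a real BigQuery view) and compare it against the hardcoded expectations, either in SQL with `EXCEPT DISTINCT` or client-side as a set difference, as sketched below. The dataset, table, and column names here are hypothetical.

```python
# Hypothetical query against BigQuery's INFORMATION_SCHEMA.COLUMNS view.
METADATA_QUERY = """
SELECT column_name, data_type
FROM `my_dataset.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = 'complaints'
"""

# Hardcoded expected (column_name, data_type) pairs -- placeholders.
EXPECTED = {("complaint_type", "STRING"), ("created_date", "TIMESTAMP")}

def schema_mismatches(actual_rows, expected=EXPECTED):
    """Return (missing, unexpected) column/type pairs vs. the expectation."""
    actual = set(actual_rows)
    return expected - actual, actual - expected

# Simulated query result standing in for the real rows:
missing, unexpected = schema_mismatches(
    [("complaint_type", "STRING"), ("created_date", "DATE")]
)
print(missing, unexpected)
```

The pure-SQL equivalent is the hardcoded `SELECT ... UNION ALL ...` rows `EXCEPT DISTINCT` the `INFORMATION_SCHEMA` query (and vice versa for unexpected columns).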
I am using BigQuery to query an external data source (also known as a federated table), where the source data is a Hive-partitioned Parquet table stored in Google Cloud Storage. I used this guide to ...
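For reference, the key piece of such a setup is the external data configuration. The sketch below shows the `externalDataConfiguration` shape the BigQuery `tables.insert` REST API accepts for a Hive-partitioned Parquet source on GCS; the Python client's `ExternalConfig`/`HivePartitioningOptions` mirror these fields. The bucket and prefix are hypothetical.

```python
import json

# Sketch of an externalDataConfiguration for a Hive-partitioned Parquet
# table on GCS, following the BigQuery REST API field names.
external_config = {
    "sourceUris": ["gs://my-bucket/tables/events/*"],  # hypothetical bucket
    "sourceFormat": "PARQUET",
    "hivePartitioningOptions": {
        "mode": "AUTO",  # infer partition-key types from the path layout
        "sourceUriPrefix": "gs://my-bucket/tables/events/",
    },
}

print(json.dumps(external_config["hivePartitioningOptions"]))
```

A common pitfall with this configuration is a `sourceUriPrefix` that does not end exactly where the `key=value` partition directories begin, which makes partition detection fail.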
There is a lot of documentation on GCP about querying sharded/wildcard tables, but I can't seem to figure out how to create or insert data into such a table.
Here's a trivial and mostly ...
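The short answer to questions like this: sharded tables are just ordinary tables whose names share a prefix plus a `YYYYMMDD` suffix. You create and insert into each shard by its full name; only reads use the wildcard form (`FROM \`my_dataset.events_*\``). The sketch below shows the naming convention with hypothetical dataset and prefix names.

```python
from datetime import date

def shard_name(prefix, day):
    """Table name for one daily shard, e.g. events_20240101."""
    return f"{prefix}_{day.strftime('%Y%m%d')}"

# Each day's load or INSERT targets its own shard by name:
target = shard_name("events", date(2024, 1, 1))
insert_sql = f"INSERT INTO `my_dataset.{target}` (id) VALUES (1)"  # hypothetical dataset

print(target)  # → events_20240101
```

Worth noting as a design choice: date-partitioned tables are generally recommended over sharding for new work, since one partitioned table replaces the whole family of shards.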