Influence Looker’s Roadmap and join the BI Customer Council (BICC)
Would you like to influence Google Cloud’s BI roadmap more? I’d like to invite you to join the Google Cloud BI...
I am trying to create an SA360 data transfer into BigQuery using the UI. In the Custom Columns field I am using...
Is it possible to assign a Dataproc cluster (server or serverless) to the Spark connection in BigQuery? I ...
Say that I committed some code that has not yet been pushed to production. Is it possible to: 1. See the commits...
Is there any way to print the responses of a Google Form as a summary? We are trying to print the responses to...
Talking about the data export from GA4 to BigQuery, in logs explorer I see an InsertJob for the events_yyyymmd...
In the workflow_settings.yaml I want to add a var that is an array. Example of workflow_settings.yaml: defaultPro...
Hi team, can I assign BQ slots to a particular user for querying tables? I already have reservations for the sa...
We are facing two issues in a specific Dataflow pipeline. First, the pipeline is not generating logs at any st...
Hello, I'm looking for some clarification regarding usage of the Global PubSub Endpoint. "According to this docu...
When creating a Looker instance, I face the issue "You do not have quota available for this option." Howev...
Hello, I'm new to GCP and I'm trying to use Data Profiling from Dataplex. I have a couple of questions: Is ther...
Hello, we are using Google Ads Data Transfer to export campaign information and gclid data from Google Ads to B...
Hello Team,Stuck between A & B. Please help out.Your developers have been thoroughly logging everything that h...
Hi there, I have a CSV file of around 10 GB stored in a GCP bucket. I perform basic operations on this file, suc...
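One common way to handle a CSV of that size is to load it into BigQuery and do the filtering there instead of processing the file directly. A minimal sketch with the google-cloud-bigquery client, assuming hypothetical bucket, dataset, and table names:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical source file and destination table -- adjust to your project.
uri = "gs://my-bucket/large_file.csv"
table_id = "my-project.my_dataset.raw_csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # let BigQuery infer the schema
)

# Load straight from Cloud Storage; the 10 GB file never touches local disk.
load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # wait for the load job to finish

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```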
Here is my setup: Kinesis --> Pub/Sub ingestion --> topic "T" --> BigQuery Subscription "S". The destination table...
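For readers unfamiliar with this pattern, a BigQuery subscription makes Pub/Sub write messages into the destination table itself, with no intermediate pipeline. A minimal sketch with the google-cloud-pubsub client, using placeholder project, dataset, and table names that mirror the topic "T" / subscription "S" setup (BigQueryConfig requires a reasonably recent client library version):

```python
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()

# Placeholder resource names mirroring the setup above.
topic_path = "projects/my-project/topics/T"
subscription_path = "projects/my-project/subscriptions/S"

# Pub/Sub writes rows directly into this table; use_topic_schema maps
# topic schema fields onto table columns.
bigquery_config = pubsub_v1.types.BigQueryConfig(
    table="my-project.my_dataset.events",
    use_topic_schema=True,
)

subscription = subscriber.create_subscription(
    request={
        "name": subscription_path,
        "topic": topic_path,
        "bigquery_config": bigquery_config,
    }
)
print(f"Created BigQuery subscription: {subscription.name}")
```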
Hello, maybe I'm missing something, but I cannot get PubSub Subscription to BigQuery sink to work using directi...
The daily report has not appeared in BigQuery since 2024-11-12. This stream has been operational since 2023 wi...
Good day, I am trying to set up a Google Cloud Function with Eventarc as a trigger (Dataflow event), everything i...
Hello! I've set up a repository in Google Dataform and successfully connected it to GitHub. After creating a de...
I have a limited understanding of Google Cloud. I'm seeking assistance in finding a suitable solution for implem...
Hello Community, is there a way to trigger the workflow config in Dataform only when a commit is merged in the ...
Hi everyone, I created a table from a Python script that collects data from an API and stores it in BigQuery. ...
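As an illustration of that kind of setup, here is a hedged sketch of a script pushing API results into BigQuery with insert_rows_json; the table name and row shape are placeholders, not the poster's actual code:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical destination table; the real table and schema come from the poster's script.
table_id = "my-project.my_dataset.api_data"

# Stand-in for whatever the upstream API returned.
rows = [
    {"id": 1, "fetched_at": "2024-11-01T00:00:00Z", "value": "a"},
    {"id": 2, "fetched_at": "2024-11-01T00:05:00Z", "value": "b"},
]

# Streaming insert; returns a list of per-row errors (empty on success).
errors = client.insert_rows_json(table_id, rows)
if errors:
    raise RuntimeError(f"Insert failed: {errors}")
```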
Hey there, I'm new to GCP and I'm trying to find the most fitting solution in my project. The project consists ...
So I have a gsheet with 1.5k rows and a few simple formulas that is not heavy at all to run. I'm consuming this ...
I noticed https://cloud.google.com/bigquery/docs/partitioned-tables page was modified recently and now it stat...
Hi there, I have started using BI Engine for one of the projects and when I ran a query recently I noticed that...
Dear All, I have a GA4 dataset on BigQuery and because of slow-arriving events going to the same event (portio...
Hi Team, how do I create a UDF in BigQuery that takes n parameters of different datatypes? For example ...
Hello, the error in the title occurs every time and I would like to know how to solve it. My code is below. SELECT ...
Our team has several data workflows where we first read the start date from one table in BigQuery, and then us...
In part one of this series, we explored how to achieve high availability for the SAP application layer using Windows Server Failover Clustering (WSFC) on Google Cloud Platform (GCP). Now, let's dive into the crucial aspect of database layer high availability, focusing on MS SQL Server's Always On Availability Groups.
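As a companion to that discussion, here is a small health-check sketch (not from the original article): it uses pyodbc to query the standard SQL Server HADR DMVs and print each replica's role and synchronization health for an availability group. The connection string is illustrative:

```python
import pyodbc

# Illustrative connection -- point it at the AG listener or any replica.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-ag-listener;DATABASE=master;"
    "Trusted_Connection=yes;"
)

# Standard HADR DMVs: one row per replica with its role and sync health.
query = """
SELECT ag.name AS availability_group,
       ar.replica_server_name,
       rs.role_desc,
       rs.synchronization_health_desc
FROM sys.dm_hadr_availability_replica_states AS rs
JOIN sys.availability_replicas AS ar ON rs.replica_id = ar.replica_id
JOIN sys.availability_groups  AS ag ON rs.group_id = ag.group_id
"""

for row in conn.execute(query):
    print(row.availability_group, row.replica_server_name,
          row.role_desc, row.synchronization_health_desc)
```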
This two-part series focuses on achieving high availability for your SAP deployments on Google Cloud Platform (GCP) using Windows Server Failover Clustering (WSFC). In part one, we'll delve into the intricacies of ensuring SAP application layer high availability.
Healthcare Data Engine (HDE) is a popular GCP-based solution built on the Cloud Healthcare API (CHC API) that helps healthcare stakeholders transition to FHIR and promotes interoperability between multiple data origination sources. HDE gives users the ability to run mapping pipelines, which convert healthcare data into FHIR, and reconciliation pipelines, which help form a longitudinal patient record view.
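Because HDE ultimately lands data as FHIR resources in a Cloud Healthcare API FHIR store, one quick way to inspect the output is to query the store's FHIR REST endpoint. A minimal sketch assuming Application Default Credentials and hypothetical location, dataset, and FHIR store names:

```python
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Application Default Credentials with the cloud-platform scope.
credentials, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

# Hypothetical FHIR store produced by an HDE mapping pipeline.
fhir_store = (
    f"projects/{project}/locations/us-central1/"
    "datasets/hde_dataset/fhirStores/longitudinal_record"
)
url = f"https://healthcare.googleapis.com/v1/{fhir_store}/fhir/Patient"

# FHIR search: the response body is a FHIR Bundle of Patient resources.
response = session.get(url, params={"_count": 10})
response.raise_for_status()
bundle = response.json()

for entry in bundle.get("entry", []):
    patient = entry["resource"]
    print(patient["id"], patient.get("name"))
```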