Our data is in the PSA, and we have everything we need to build out a dimensional model. In this video we will define a conceptual model and use the LeapFrogBI Platform to automate the development of all required ETL. Finally, we will deploy our data to Power BI.
We have a persistent staging area loaded with our YouTube Analytics data. Now we need to do a bit of data discovery to better understand how our requirements will best be met. With this information in hand we will be able to create a suitable dimensional model.
Our YouTube data has been downloaded from the Reporting API, and we created a simple LeapFrogBI Platform project which will parse the downloaded flat files and load a persistent staging area. Now it’s time to deploy our project and load the target schema.
We are on the road to getting our YouTube Analytics data into Power BI. So far we have created a simple console application that collects reports from the YouTube Reporting API. Now it is time to load the downloaded data into a persistent staging area. The LeapFrogBI Platform will be used to automate all ETL […]
Now that we know a little bit about the YouTube Reporting API and OAuth 2.0, it is time to build a .NET console application that can interact with the Reporting API and automate the ongoing requirement to download our YouTube Analytics data.
YouTube’s Reporting API uses OAuth 2.0. In this video, we will use the Google Developer Console to create a project enabling our upcoming application to authenticate and gain authorization. We will also review the Reporting API quotas.
Would you like to use Power BI to analyze your YouTube Analytics data? Same here. In this first video, we briefly review the YouTube Reporting API which will be used to collect YouTube data. In subsequent videos, we will download YouTube data, create a dimensional model, and ultimately use Power BI to visualize our YouTube […]
Your project has a single dimension data flow defined. Now it’s time to complete an iterative deployment and load our target dimension. Success is only minutes away!
So far we’ve set up a project and created a stage component. Now it’s time to define the rest of our dimension data flow. To accomplish this at lightning speed we will solicit the help of a design pattern. Prepare to be transformed into an ETL ninja!
Your project is set up. Now it is time to begin building a dimension data flow, and the first step in our data flow is a stage process. In this video, we will create a process which will collect data from a sample CRM data source, move the data to our destination environment, and load […]
Welcome to the Getting Started video series. In this short series, you are going to become familiar with the LeapFrogBI Platform as we step through the process of creating a single dimension data flow. In this first video, we will start by setting up a project.
The field [dbo].[Precedence].[JobNotify] in the LeapFrogBI console database is used by the poll job to determine which email to send and when to send it. The package named “BI_Nofity.dtsx” (step 2 of the poll agent job) consumes and sets these values. Here’s what the values represent.
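As a minimal sketch of how this flag can be inspected (the table and column names come from above; the database name and the idea of a parameterized reset are assumptions, since the actual values and their meanings are described separately):

```sql
-- Sketch only: inspect the notification flag the poll job reads
-- before deciding which email to send. Assumes you are connected
-- to the LeapFrogBI console database.
SELECT [JobNotify]
FROM   [dbo].[Precedence];

-- The BI_Nofity.dtsx package (step 2 of the poll agent job) both
-- consumes and sets this value. A manual reset would look roughly
-- like the following, where @ResetValue is a hypothetical placeholder
-- for whatever value represents "no notification pending":
-- UPDATE [dbo].[Precedence] SET [JobNotify] = @ResetValue;
```

This is a read-only diagnostic query plus a commented-out update; in normal operation the package itself manages the flag, so manual updates should only be used when troubleshooting.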