Our data is in the PSA, and we have everything we need to build out a dimensional model. In this video we will define a conceptual model and use the LeapFrogBI Platform to automate the development of all required ETL. Finally,
We have a persistent staging area loaded with our YouTube Analytics data. Now we need to do a bit of data discovery to better understand how our requirements will best be met. With this information in hand we will be
Our YouTube data has been downloaded from the Reporting API, and we created a simple LeapFrogBI Platform project which will parse the downloaded flat files and load a persistent staging area. Now it’s time to deploy our project and load
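To make the persistent staging step concrete, here is a minimal Python sketch (the series itself builds this with the LeapFrogBI Platform, not hand-written code) of the PSA pattern: parse a downloaded report file and tag every row with lineage columns, because a persistent staging area is insert-only. The function and column names (`parse_report`, `psa_source_file`, `psa_load_time`) and the sample data are illustrative assumptions, not the project's actual schema.

```python
import csv
import io
from datetime import datetime, timezone

def parse_report(report_text, source_file):
    """Parse one downloaded YouTube report (CSV text) into rows ready for a
    persistent staging area. A PSA is insert-only: rows are never updated or
    deleted, so each row carries its source file name and a load timestamp."""
    load_time = datetime.now(timezone.utc).isoformat()
    rows = []
    for record in csv.DictReader(io.StringIO(report_text)):
        record["psa_source_file"] = source_file   # lineage: which file it came from
        record["psa_load_time"] = load_time       # lineage: when it was loaded
        rows.append(record)
    return rows

# Hypothetical sample resembling a channel report extract.
sample = "date,video_id,views\n20240101,abc123,42\n20240101,def456,7\n"
rows = parse_report(sample, "channel_basic_20240101.csv")
```

Because nothing is ever overwritten, the PSA preserves full history and lets downstream models be rebuilt at any time from the raw loads.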
We are on the road to getting our YouTube Analytics data into Power BI. So far we have created a simple console application that collects reports from the YouTube Reporting API. Now it is time to load the downloaded data
Now that we know a little bit about the YouTube Reporting API and OAuth 2.0, it is time to build a .NET console application that can interact with the Reporting API and automate the recurring task of downloading our YouTube
YouTube’s Reporting API uses OAuth 2.0. In this video, we will use the Google Developer Console to create a project that enables our upcoming application to authenticate and obtain authorization. We will also review the Reporting API quotas.
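For readers who want to see the shape of the OAuth 2.0 flow before the video, here is a small Python sketch (the series builds a .NET app, but the protocol is the same) of the refresh-token exchange: an installed app trades its long-lived refresh token for a short-lived access token at Google's documented token endpoint. The placeholder credential strings are illustrative only; the real `client_id` and `client_secret` come from the Google Developer Console project, and the refresh token from the one-time consent flow.

```python
import urllib.parse
import urllib.request

# Google's OAuth 2.0 token endpoint for installed applications.
TOKEN_ENDPOINT = "https://oauth2.googleapis.com/token"

def build_token_request(client_id, client_secret, refresh_token):
    """Build (but do not send) the POST request that exchanges a long-lived
    refresh token for a short-lived access token."""
    payload = urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }).encode("ascii")
    return urllib.request.Request(TOKEN_ENDPOINT, data=payload, method="POST")

# Placeholder credentials for illustration only.
req = build_token_request("my-client-id", "my-client-secret", "my-refresh-token")
```

Sending this request (with real credentials) returns a JSON body containing an `access_token`, which the application then presents as a bearer token on each Reporting API call until it expires and must be refreshed again.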
Would you like to use Power BI to analyze your YouTube Analytics data? Same here. In this first video, we briefly review the YouTube Reporting API which will be used to collect YouTube data. In subsequent videos, we will download
At LeapFrogBI we use the term data solution to refer to the portion of the overall analytics system that acquires data and makes it report-ready. The data solution (not the reporting software) is the most important factor in determining what
(This is the final part of a three-part series illuminating the challenges of data analytics and describing a proven solution to the problem.)
In part one, Measuring the Cost of Doing Business with Insufficient Analytics, we learned that, on
In part one of this series, Measuring the Cost of Doing Business with Insufficient Analytics, I referenced a series of studies conducted over the past five years showing that data analysts spend about 80% of their time organizing data.
A recent survey conducted by CrowdFlower and summarized on Forbes found that data scientists spend most of their time massaging data rather than modeling or mining it for insights. It seems 79% of their time is spent either accessing or preparing data,
The “data lake”, a catchy new buzzword in analytics circles, has many people wondering if they still need a data warehouse. You may have heard that you can run analysis directly against the data lake, and that’s true. This quickly