Archive for the ‘google analytics’ Category

Dec 09

For many years now, E-Nor has witnessed and documented the dominance of Google Analytics in the Fortune 500. Also, as the digital consulting agency for many United States government entities, we’ve noticed our government taking steps toward digital measurement, relying on Google Analytics as its tool of choice.

We wondered how the rest of the world is doing! With E-Nor’s global footprint, we’ve seen a significant adoption of Google Analytics and Google Analytics Premium in regions throughout the world, such as the Middle East and Europe.

Here are our findings:


References (we excluded the US in the spreadsheet since we covered that in the previous post).

More to come on Latin America and Asia Pacific.

Dec 03

In today’s world, a visitor may touch your digital presence from a variety of devices and locations – and multiple times before they buy your product or contact you. They may visit your site from their desktop at home, then go to work and visit your site from their mobile app. Finally, their conversion may happen days later when they come home from work and then log into your system.

Conversion doesn’t start with conversion. There’s a whole process that preludes it – including research, awareness, and interest. Different segments of your traffic may have different behaviors and a different path to conversion. To truly optimize your site and understand your data, you need a full view of how users move through the decision-making funnel toward conversion and really be able to dissect and understand them. In this post, we’d like to show you how!

What is Session Stitching?

“Stitching” in this context basically means grouping visits/sessions (that otherwise wouldn’t be connected) using one or more unifying keys. There are a couple of “keys” we can use in Google Analytics to do this.

For example, already native to Google Analytics is the User-ID feature, which lets you group or “stitch” together sessions based on IDs provided by your backend system when users log into your website or app.

But do you, as an internet surfer, log in every single time you research a product? Most likely not – only when you’re ready to buy (or if you want to see if something is Prime eligible :) ). So for marketers to really be able to connect the dots of the entire path of their visitors, we need to somehow include the visits where the user may not have logged in.

In this post, we’d like to talk about some advanced stitching techniques within GA using a combination of a couple different keys – User-ID (native to GA) and Custom Dimensions (clientIDs and userIDs stored in custom dimensions).

Note: When implementing these methods, there’s potential to store user identifiable information that violates your privacy policy (as well as the general privacy of your visitors). Check with your legal team and company privacy policy and ensure compliance.

Basic Session Stitching in GA – Native User-ID and Session Unification

As mentioned, out of the box, Google Analytics allows you to group a user’s sessions based on login using the “User-ID” feature. It even allows you to go backwards and associate otherwise disconnected portions of these sessions (since the user may not have been logged in during the whole visit) using session unification.

Without session unification, session 2 is separated:

With session unification, session 2 is connected:

(images from Session Unification Help Article)

What about those sessions where these visitors never logged in? We’re still missing that data. The clue/key that can provide more insight there is the clientID.

Intermediate Session Stitching – Associating Anonymous and Logged-in Users (clientIDs and userIDs)

GA identifies each device-plus-browser combination with a unique clientID. If a session contains both a clientID (from a device) and a userID (the user on that device logged in), we’re going to assume that any session with that particular clientID – even where the visitor isn’t logged in and a userID is absent – belongs to that same user. This is a powerful inference, because you now have a whole new batch of sessions that you can potentially associate with the user and group together – more pieces of your funnel.



Advanced Session Stitching – Users Across Multiple Devices

A user visits your site at home on their desktop, then at work on their work smartphone. There are now two clientIDs for the same user. They don’t log in on the first visit, but log in on the second, and log in again on the third after getting back home.

You can probably see where we’re going with this. The moment we can associate a clientID and a userID, we can assume they belong to the same person/user. So if you can associate 2 different clientIDs with the same userID, those clientIDs belong to the same person/user as well. Once again, we’ve opened up more visits that can be grouped as the same user. Now, anywhere you see either clientID or the userID, you can group the sessions as one user and potentially one funnel!
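The cross-device merge above can be sketched the same way – again with hypothetical session rows – by collecting every clientID seen alongside each userID and then inverting the mapping:

```javascript
// A sketch only: hypothetical session rows. Any two clientIDs observed with
// the same userID are collapsed into one person.
function stitchAcrossDevices(sessions) {
  // Collect every clientID seen alongside each userID.
  const userToClients = {};
  for (const s of sessions) {
    if (!s.userId) continue;
    (userToClients[s.userId] = userToClients[s.userId] || new Set()).add(s.clientId);
  }
  // Invert: every clientID in a user's set maps back to that userID.
  const clientToUser = {};
  for (const [userId, clients] of Object.entries(userToClients)) {
    for (const c of clients) clientToUser[c] = userId;
  }
  return clientToUser;
}

const sessions = [
  { clientId: 'desktop-1', userId: null },  // visit 1: home desktop, anonymous
  { clientId: 'phone-1',   userId: 'u42' }, // visit 2: work phone, logged in
  { clientId: 'desktop-1', userId: 'u42' }  // visit 3: home desktop, logged in
];
// both 'desktop-1' and 'phone-1' map to 'u42'
```

Combined with the previous technique, visit 1 can now also be attributed to user u42, even though it was anonymous and on a different device than the first login.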



How to Implement Advanced Session Stitching

As discussed, there is already a native User-ID feature within GA, but to truly enable advanced session stitching and take full advantage of Google Analytics’ features and reporting, you will need to store the clientID as well as the userID in Custom Dimensions.

  • Create the 2 Custom Dimensions in GA.
    • userID (or UUID or AccountID or MemberID, etc) – User Scope
    • clientID – Session Scope
  • Since the clientID is the ID that GA uses internally, we’ll need to read it from the Universal Analytics cookie or ask analytics.js for the value using the .get method. Once that’s done, we can store it in a custom dimension.
  • The userID is any ID that you assign to a user when they authenticate (as long as it isn’t PII). This will be determined by your organization, and once that’s done, store that into a custom dimension.
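Reading the clientID from the cookie can be sketched as below. The _ga cookie value has the form "GA1.2.&lt;random&gt;.&lt;timestamp&gt;", and the clientID is the last two dot-separated fields; in the browser you could equally ask analytics.js for it via a tracker callback, as noted in the comments:

```javascript
// A sketch only: pull the clientID out of the Universal Analytics _ga cookie.
// In the browser you could instead ask analytics.js directly:
//   ga(function(tracker) { var cid = tracker.get('clientId'); });
function clientIdFromGaCookie(cookieValue) {
  const parts = cookieValue.split('.');
  if (parts.length < 4) return null; // not a well-formed _ga value
  return parts.slice(-2).join('.');
}

// clientIdFromGaCookie('GA1.2.1234567890.1609459200') === '1234567890.1609459200'
// Once read, store it in your session-scoped custom dimension (the index is
// whatever you created above), e.g.:
// ga('set', 'dimension2', clientIdFromGaCookie(/* _ga cookie value */));
```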

Now you have the flexibility of segmenting and filtering based on these two parameters since they are custom dimensions! Using custom dimension session stitching to connect single users to multiple sessions and multiple devices to single users provides a customer-centric approach. We can home in on the actions of individual customers to identify patterns of movement through content and devices/browsers. Knowing this information can help us refine the sales funnel and identify new types of users for advanced segmentation through further analysis outside of GA.


Stitching sessions through userID and clientID custom dimensions certainly provides more meaningful information than GA’s native User-ID and session unification, but it isn’t perfect.

Some of you advanced analytics ninjas may already have thought, “We’re assuming that once a user logs into a device, anytime you see that device it’s that user? What if someone else is using that same device? Or what if you actually do find multiple users using the same device/browser? What if the same user has multiple login credentials?”

Also, any device that the user never authenticates on can’t be stitched, so you could be missing some pieces.

While this is all true, this is the nature of identifying trends. Sure, you can take it with a grain of salt, but stitching surfaces broad trends that give us direction, clarity, and unique perspectives on the same data. This can lead to optimizations that may not have been thought of before.


Analyzing and Stitching Outside of GA

Another consideration of stitching sessions through custom dimensions is that the really useful analysis needs to be performed outside of GA. While the custom reports in this blog post show examples of clientID and userID custom dimensions within the GA interface, they simply identify the sessions to be analyzed. To understand the behavior in the sessions that lead to conversion, much more data will need to be exported from GA and analyzed in analysis and BI tools.

What cases have you run into that you’d like to see stitched? Comment below!

Nov 17


Google Analytics’ Bot & Spider Filtering

Google Analytics recently released a feature called “Bot and Spider Filtering”. You might have read about it already but haven’t had a chance to implement it or process how it might affect your organization or even your effectiveness as a digital analyst. E-Nor recommends turning on Google Analytics’ Bot & Spider Filter as a best practice for all Google Analytics accounts (except for raw data Views – which we’ll explain at the end of the post).*

As an added bonus, it is super easy to do!

What is Bot & Spider Filtering?

Bot and Spider Filtering filters out sessions from bots and spiders (as defined by Google) from your Google Analytics data. It’s important to note though that like traditional Google Analytics filters, the Bot & Spider filter works on a “going forward” basis; in other words, past data will be unaffected. It also does not actually prevent bots or spiders from visiting your site (only filters them out from your data).

Typically, traffic from bots and spiders will be negligible and barely noticeable in your Google Analytics data. For example, in E-Nor’s recent test using the new filter, bots and spiders accounted for less than 0.5% of all sessions.

So then why should you care? There are cases where bot and spider activity can seriously skew results in Google Analytics. Unexpected spikes over short durations can throw data for small time frames out of whack and result in significant time spent diagnosing spikes that turn out to be bot or spider traffic.

How E-Nor Diagnosed Spikes in Google Analytics Sessions

The keys that trigger our investigations are:

  1. Was the activity out of the norm? Did it stand out compared to the same metrics in prior periods?
  2. Could we assign a reasonable explanation to it? For example, was content changed on the site, or could some outside event have led to the activity?

When the answers are: 1. Yes, the activity is abnormal; and 2. No, we can’t figure out a reasonable explanation as to why – we need to go on a Google Analytics fishing expedition…


Case 1: We observed a Top Pages report with several new, unexpected entries. None of the typical reasons why the report would have new top pages had occurred – like an actual new page, a very popular campaign linked to a page or sudden public interest in the topic of a particular page. The organization this report belonged to primarily provided informational content, so their concern was the level of engagement with that information. These weird occurrences were a big deal.

E-Nor examined metrics for the pages in question (and for sessions including those pages). We observed the following anomalies compared to past data and to other pages during the same time period, and subsequently concluded that bot activity was probably responsible – even though this activity at first seemed atypical for bots!

We checked many metrics, and the ones that stood out were:

  • Pageviews and entrances increased by atypical amounts compared to prior time periods for several pages.
  • The bounce rate dropped dramatically, e.g. from 74% to 42%.
  • These same pages were also the top landing pages when they had not been in prior periods.
  • Sessions involving these pages included very high pageview counts for the page in question.
  • Session duration for sessions including the pages increased abnormally, e.g. from 4:36 to 28:37 minutes.
  • Pages per Session jumped from 2 to 13.
  • Browser and browser version were unusual, with sessions including these pages coming primarily from Internet Explorer versions 7, 8, and 6, rather than the typical IE 11, IE 9, and Chrome for this site.
  • Locations were primarily Russia, Indonesia, Argentina, Thailand, Mexico, and other countries atypical for this site, whose sessions typically occur mostly in the United States.


Case 2: For a B2B high tech company, we observed, again, a deviation from the prior period visible across many pages in a management report. Our report to senior management indicated huge interest from a major customer in jobs, rather than products, and huge interest from another customer who hadn’t been in the “top customers” report in the past. For the latter, we learned that a news item caused the unusual activity from that customer, so that was explained. For the former, we did not discover a solid reason for the change, so we went on a Google Analytics fishing expedition.

Our expedition across many, many metrics revealed these anomalies compared to prior periods:

  • The Source/Medium for the affected sessions was “(direct) / (none)”, which was suspicious because the Landing Page was not one someone would bookmark or type into the browser.
  • All sessions had the same Landing Page.
  • The Bounce Rate for that page skyrocketed.
  • The City and Region were both “(not set)” while the Country was the United States.
  • The Operating System was also “(not set)”.
  • The Browser was also “(not set)”.

With its high bounce rate and frequent occurrences of “(none)” and “(not set)”, Case 2 was an example of atypical activity likely to be indicative of bots.

References & More

For more information, always go to the source: Google’s announcement about Bot & Spider Filtering

And please don’t forget to annotate your Google Analytics data and please don’t apply the Bot & Spider Filter to your Raw Data View.*

*What is a Raw Data View?

A Raw Data View is a Google Analytics View with no configuration: no Filters are applied and no Goals are set. The Raw Data View acts as a back-up – first, in case we need to validate the configuration of other Views (it might be easier to compare against a Raw Data View), and second, in case a View becomes too complex to reverse-engineer, in which case it might be easier to copy the Raw Data View and apply configuration, like Filters and Goals, fresh.

Nov 07

This is the video companion to our sampled data post, for those of you who prefer visual over reading!
For sites with heavy traffic, Google Analytics might sample your data, which may make it hard to gain insights (the data might even become unusable). Don’t worry though – there are plenty of other options to work around that challenge. The video explains a couple.

Do you have any solutions? Tell us in the comments.

Oct 29

Fortune 500 Google Analytics Adoption Rate in 2014

See last year’s (2013) Fortune 500 Adoption.

Google Analytics – Still the One

It’s that time of the year. We’re taking a look at Analytics use among the Fortune 500. Once again, all you see is orange. Google Analytics is the leader – a whopping 67% of the Fortune 500 websites are using GA and/or a combination of GA and other web analytics tools.

Key Highlights:

  • Orange is still the new black. In 2014, a 6% increase in the use of GA over 2013.
  • Not a bad year for Adobe. 3.9% growth in the use of Adobe Analytics (Omniture/SiteCatalyst) with a 26.4% usage among the Fortune 500.
  • 14.9% decline in the use of Webtrends from the past year and 12.6% usage this year. :(
  • Finally, we see a 42.9% decline in the use of IBM/Coremetrics and a total of 4% usage this year. :( :(

Note: The percentages do not mean that the tools were adopted in 2014; they could have been adopted prior to or during 2014. Also, the “other” category includes Yahoo Analytics, Piwik, AT Internet, and Chartbeat.

Expect the use of Google Analytics in the enterprise to increase following the increased adoption of Google Analytics Premium across many verticals and the emergence of Google-sized enhancements such as Enhanced eCommerce, Data Import, BigQuery Integration, enhanced Mobile Analytics, and DoubleClick integration, among many others!

Oct 21


We’re so proud of the success of our first Evolve with Google Analytics Conference. Special thanks to Rising Media for partnering with us on it. It was the meeting of the minds of the foremost thought leaders in the industry. Everyone was super-engaged, shared great tips and strategies, and had a lot of fun! Can’t wait till next time.

Here are some highlights.







Oct 16

Google Analytics Jeopardy board

The Evolve with Google Analytics conference rolled into Boston last week and proved to be an event to remember: great speakers across the spectrum of conversion optimization, the inside scoop from Justin Cutroni, dynamite attendees who came from near and far, and more high-impact networking than you could shake a stick at.

Among the highlights was Google Analytics Jeopardy!, hosted by the congenial Judah Phillips, who heightened the learning and the fun with off-the-cuff anecdotes and insights. Make no mistake, though: the competition was fierce!

Ready to take the Google Analytics Jeopardy! challenge?

Stay tuned for round two in an upcoming blog post.

Hope to see you at the next Evolve with Google Analytics conference – don’t miss the action!

Oct 14


Bill Gates once said,

“The most meaningful way to differentiate your company from your competitors, the best way to put distance between you and the crowd is to do an outstanding job with information. How you gather, manage and use information will determine whether you win or lose.”

With the announcement of Wave, the SalesForce Analytics Cloud, it’s now more important than ever to measure customers’ 360-degree journey and connect the dots between business user data (e.g. sales data in SalesForce) and web and mobile behavioral data (e.g. user interaction data in Google Analytics).

In this post, we’re going to show you how to do that!

Our Story

To improve profitability, every company (regardless of size) requires multiple tools to understand their customers and potential customers.

In our organization, we use Google Analytics to improve the effectiveness of our web and mobile presence as well as to track and understand the behavior of our online visitors and how they interact with our digital properties (Website, Mobile App, YouTube Channel, Facebook page, Twitter Profile,…). When our (online) visitors convert to leads, we rely on our CRM platform, SalesForce, to track the history of our prospects’ and customers’ offline and online interactions. We track every single conversation possible – emails, calls, meetings, and documentation.

While Google Analytics itself gives us the ability to measure the effectiveness of our marketing campaigns in terms of what channels bring visitors to our “front door”, we were missing the link between real live sales conversions and the original (online) lead generation/traffic sources.

Our sales team handles a flood of leads and sales, talking to many prospects and meeting with many customers. We need to link the valuable information they gather with what brought these leads to us in the first place. For many years, this missing link led to inaccurate conversion attribution, poor budget allocation, and misplaced strategic business focus.

Corrective Action

In order to make better-informed decisions on marketing, budget allocation, and strategy, we need to have clear visibility into the sales cycle not only from the point of live contact, but from the possible origin of online inquiry.

To do that, we integrated the two tools that tell us this (SalesForce and Google Analytics) in the following way:

  1. Pass as much relevant Google Analytics visitor behavioral information as possible to SalesForce with every form submission.
  2. Pass the final lead status and offline activities from SalesForce back to GA.

Technical Challenge: New integration method with Universal Analytics

Marrying data from different sources is nothing new. But when marriage gets a little stale, sometimes you have to find new ways to spice it up! :) Our legacy solutions were heavily dependent on the Google Analytics _utm cookies that are generated by the classic ga.js tracking code, in which session and campaign information are stored in the tracking cookies.

In the new Universal Analytics, session and campaign processing happens at the server side, and the single _ga cookie generated by the new analytics.js does not contain any session or campaign information.


We adjusted our classic integration method with a solution that works with all GA tracking code versions, including the latest Universal Analytics tracking code. All the needed code is bundled in one JavaScript file called GASalesforce.js.

Passing GA data to SalesForce

By following the steps below, you’ll be able to pass users’ campaign and session information into SalesForce every time a form is submitted:

1. Setup SalesForce Custom Fields
Create the following 8 fields in SalesForce:
Watch this video to learn how to create a custom field in SalesForce.

  • Visitor ID: a unique, persistent, and non-personally identifiable ID string representing a user.
  • Medium: The marketing channel. Possible mediums include: “organic”, “cpc”, “referral”, and “email”.
  • Source: The referral origin. Possible sources include: “”, “”, and “direct”.
  • Campaign: Name of the marketing campaign.
  • Content: Additional campaign identifier.
  • Term: The words that users searched.
  • Count of Sessions: The number of visits to our site including the current session.
  • Count of Pageviews: The number of browsed pages prior to the form submission.

2. Add Hidden Fields to Forms
Add the 8 variables created in the previous step as hidden input fields to all forms on your website that you want to track in Salesforce.

3. Setting Visitor ID
The value of the Visitor ID hidden input field can be read from the backend system (this is the same value that we will use later to set a Google Analytics custom dimension).

4. Setting Campaign and Session Information
The values of the hidden fields can be read from the JavaScript file referenced in step 7. The code will generate GA cookies, parse the campaign variables out of the cookies, and make them available in the following 7 variables: source, medium, term, content, campaign, visit_count, and pageview_count.

5. Pass Values to Hidden Fields
Logic should be put in place within the form to pass the values of the hidden fields to their corresponding fields within SalesForce when the form is submitted.
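As a hedged sketch of that logic – the field names below are hypothetical and should match whatever hidden inputs your forms actually use – the 8 values can be shaped into one payload per submission:

```javascript
// A sketch only: hypothetical field names. Takes the parsed campaign/session
// variables (plus the visitor ID) and shapes them into the 8 hidden-field
// values to be submitted with the form.
function buildHiddenFieldValues(ga) {
  return {
    visitor_id: ga.visitorId,
    medium: ga.medium,
    source: ga.source,
    campaign: ga.campaign,
    content: ga.content,
    term: ga.term,
    count_of_sessions: ga.visit_count,
    count_of_pageviews: ga.pageview_count
  };
}

// In the browser, copy each value into its hidden <input> before submit, e.g.:
// document.querySelector('input[name="medium"]').value = values.medium;
```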

6. Download GASalesforce.js
Download it from here to your local environment.

7. Cross Domain Tracking
If cross-domain tracking is needed, update GASalesforce.js with a comma-separated list of domains to set up automatic cross-domain link tracking.


var domains = ["", ""];

8. Reference the GASalesforce.js file.
Place the following code before the closing tag on every page of the site.

<script src="" type="text/javascript"></script>

9. Make sense of the data!
Now that we’ve successfully collected all campaign and session data in Salesforce, we can make more sense of it and extract intelligence.



Passing Salesforce Data to GA

In this section, we’ll utilize a really cool new Google Analytics feature called Data Import to pass customers’ lead information from Salesforce into GA. With Data Import, data can be uploaded from different data sources and combined with the existing Google Analytics data to create one powerful and robust unified report.

The “key” that we’re going to use to marry the two data sets will be the Visitor ID. If Google Analytics finds matching keys, it will join the data in that row of imported data with the existing GA data.



Step One: Create Custom Dimensions
Since “Visitor ID” and “Lead Status” don’t exist as dimensions in Google Analytics, you’ll need to create them as Custom Dimensions.

Custom Dimension Name   Scope
Visitor Id              User
Lead Status             User

You must pass your own visitor ID to Google Analytics as a custom dimension to represent each user who submitted one of your forms in question.

When you have a unique, persistent, and non-personally identifiable Visitor ID, set it directly on the tracker as in the following example:

ga('set', 'dimension1', 'vid20140930-005');

Step Two: Create the Data Set
1. In Admin, go to the account and web property to which you want to upload data.
2. Click Data Import under PROPERTY.
3. Click New Data Set.
4. Select “User Data” as the Type.
5. Name the Data Set: “Lead Status”
6. Pick one or more views in which you want to see this data.
7. Define the Schema:
Key: Custom Dimensions > Visitor ID
Imported Data: Custom Dimensions > Lead Status
Overwrite hit data: Yes
Click Save.

Step Three: Export Salesforce data
Export the lead status data from Salesforce into a CSV file.


Step Four: Upload the data
1. In the Data Set table, click “Lead Status”. That will display the schema page.

2. Click Get schema. You’ll see something like the following:

CSV header

This is the header you should use as the first line of your uploaded CSV file. The table below identifies the columns:

Visitor ID      Lead Status
ga:dimension1   ga:dimension2

3. Update the exported CSV file to follow the format below. The first (header) row of your spreadsheet should use the internal names (e.g. ga:dimension1 instead of Visitor ID). The columns beneath each header cell should contain the corresponding data.

The CSV file should look something like this:
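A hypothetical sketch of the finished file – the Visitor ID values and Lead Status labels below are made up for illustration; yours will come from your backend and Salesforce export:

```
ga:dimension1,ga:dimension2
vid20140930-005,Qualified
vid20141002-017,Unqualified
```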


4. In the Manage Uploads table, click Choose Action > Upload files. Choose the CSV file you created.


Step Five: Create Custom Report
Since custom dimensions don’t appear in standard reports, create a Custom Report with our two dimensions (Visitor ID and Lead Status) and your desired metrics.

1. Uploaded data needs to be processed before it can show up in reports. Once processing is complete, it may take up to 24 hours before the imported data will begin to be applied to incoming hit data.
2. Data Import also now supports a new Query Time mode that allows linking data with historical GA data. Query Time mode is currently in whitelist release for Premium users.


And now you can analyze the online user behavior of only those who are marked offline as qualified leads.


Wow, this was a long post! But there you have it – very common use cases for those of us ready to move our marketing optimization up a notch. I welcome your comments and other use cases our readers come across.

Oct 09


Google Analytics is a very powerful tool, but I think we’d all agree it might be a bit much to expect it to process monster amounts of data on demand and return reports instantaneously without a tiny tradeoff. Some heavily trafficked sites expecting instant reports from their oceans and oceans of data (which obviously take time to generate) instead find themselves running into sampling issues in their reports – where GA is forced to make its calculations based on a smaller sample of the overall data in order to return a report instantly. The problem is, sometimes that sample might not be statistically significant or sufficiently representative of the data, so any insights drawn from it aren’t…well…accurate.

In general, sampling isn’t an issue if all you’re looking at are the standard out-of-box reports, because they are all unsampled. However, when leveraging GA’s segmentation capabilities (which is where the real beauty of deep insights resides), whenever a data set is greater than 250,000 or 500,000 sessions within a selected time period, sampling might come into play.

Sampled data is just that – it’s sampled. It’s not fully representative of the actual data. While Google Analytics has an intelligent algorithm to ensure that sampling minimizes adverse effects on the data, the reality is that a dataset that is a 5% sample of your actual data really isn’t usable. How you determine what is usable and what is not depends on the nature of your data and the type of analysis being performed, but in general, it’s best to keep the sample size as high as possible. These reports will undoubtedly be used as a reference point for marketing decisions, so it’s important that they’re accurate and provide actionable insights.

Is the Core Reporting API a solution to this dilemma? Not entirely. Sampling isn’t solved with just this API, even if you have GA Premium, because the API has the same sampling thresholds applied to it as GA Standard.

So what to do?

Hold tight, the following are 4 solutions to help you get clean data and clean insights again!

1. Reduce the date range.

The first solution is to reduce the date range. When looking at a report for which you’ve met or crossed the sampling threshold, the interface displays that the report is being sampled. Instead of looking at the entire month all at once, it may help to look at a smaller timeframe, such as a week. This way, only a subset of the data is being viewed and the report that is pulled contains fewer sessions, which keeps us under the sampling threshold. You would have to look at subsequent weeks one at a time, which is a bit tedious, but once this is done, you can aggregate this and other date ranges of the same report outside of GA into a single report. Read on to the next solution to find out what to do with all those reports.

Note: The only way to export unsampled data directly is to be a Google Analytics Premium customer. There are some third-party tools available for non-GA premium users discussed below. These tools are designed to reduce but not eliminate the effects of sampling.

2. Query Partitioning.

One way of reducing the effects of sampling is to break the timeframe up into smaller timeframes. For example, a year of data can be pulled as 12 separate months (12 separate queries), or a month of data as 4 separate weeks (4 separate queries). So instead of pulling data for all of 2014, I can pull Jan. 2014, then Feb. 2014, and so on. Obviously, we all have better things to do… A feature called query partitioning, available in tools such as ShufflePoint and Analytics Canvas (more details below), does the above for you in an automated fashion. The tools partition the query, programmatically loop through the desired timeframe, and aggregate the reports back together once done. When you pull the report, the tool appears to make one query, but in reality it makes many queries behind the scenes, based on how granularly you configure the partitioning. It may take some experimenting to find a balance between speed and accuracy (sample size).
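The partitioning idea itself is simple date arithmetic; here is a minimal sketch that splits a date range into month-sized chunks, leaving the per-chunk querying and aggregation to whichever API client or tool you use:

```javascript
// A sketch only of query partitioning: split an inclusive date range into
// month-sized chunks that can each be queried separately and aggregated
// afterwards. Dates are handled in UTC to avoid timezone drift.
function monthlyChunks(startIso, endIso) {
  const chunks = [];
  let start = new Date(startIso + 'T00:00:00Z');
  const end = new Date(endIso + 'T00:00:00Z');
  while (start <= end) {
    // Day 0 of the next month is the last day of the current month.
    const monthEnd = new Date(Date.UTC(start.getUTCFullYear(), start.getUTCMonth() + 1, 0));
    const chunkEnd = monthEnd < end ? monthEnd : end;
    chunks.push({
      start: start.toISOString().slice(0, 10),
      end: chunkEnd.toISOString().slice(0, 10)
    });
    // Advance to the first day of the next month.
    start = new Date(Date.UTC(start.getUTCFullYear(), start.getUTCMonth() + 1, 1));
  }
  return chunks;
}

// monthlyChunks('2014-01-01', '2014-03-15') yields three chunks,
// the last one ending on '2014-03-15'.
```

Weekly chunks would work the same way with a 7-day stride; finer chunks mean more queries but a larger effective sample per query.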

More detail about the tools:

  • ShufflePoint has a drag-and-drop interface that supports Google Analytics and a few other Google products. The nice thing about ShufflePoint is that it uses Excel’s web-querying capability, so you can write SQL-like queries to retrieve your data, make built-in calculations and display the data essentially any way you want.
  • Analytics Canvas is another tool that allows you to connect to the Google Analytics API without coding. Analytics Canvas uses a “canvas” on which you construct a visual flowchart of the query and the subsequent transformations and joins of your data, showing what series of modifications will take place. It also allows for automating data extraction from BigQuery. If you use Google Sheets for your data, Analytics Canvas has a Chrome add-on that lets you create dashboards within Sheets.

Both of these tools can extract your data from Google Analytics, analyze it, and create reports.

3. Download Unsampled Reports.

If you are a Google Analytics Premium user, you can download unsampled reports (you will have to export them). Google Analytics just announced an exciting new feature available on Premium accounts called Custom Tables, which allows you to create a custom table with metrics and dimensions of your choice (although there are some limitations). In other words, you can essentially designate a report that would otherwise be sampled as a “Custom Table”, which is then available to you as an unsampled report, similar to the out-of-box reports. You can create up to 100 Custom Tables. This is awesome because you won’t have to worry about sampled data in the reports you use often.

4. BigQuery.

If you have Google Analytics Premium, it integrates with Google BigQuery, which supports moving massive datasets and running super-fast SQL-like queries. It runs on Google's cloud infrastructure and can process data on the order of billions of rows. GA Premium exports your data into BigQuery daily. (In the Core Reporting API, data is still sampled at the same threshold as in GA Standard.) BigQuery gives you access to unsampled, hit-level data instead of the aggregated data in the user interface, which in turn opens the door to very powerful, previously impossible analysis!

Here are a few examples of the types of analysis possible with BigQuery, to help illustrate its use:

  • What is the average amount of money spent by users per visit?
  • What is the sequence of hits (sequence of clicks, pages, events, etc)?
  • What other products are purchased by those customers who purchased a specific product?
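To sketch how the first question might be approached, here is a Python helper that builds a query against the daily export tables. Treat it as an illustration only: the dataset path is hypothetical, and the table/field names (`ga_sessions_*`, `fullVisitorId`, `totals.transactionRevenue` stored as the value × 10^6) follow the classic GA Premium BigQuery export schema, so verify them against your own export before relying on them.

```python
def avg_revenue_query(dataset, start_suffix, end_suffix):
    """Build a standard-SQL query estimating average revenue per visit, per user,
    over the GA Premium daily export tables (hypothetical dataset path)."""
    return (
        "SELECT fullVisitorId, "
        "SUM(totals.transactionRevenue) / 1e6 / COUNT(DISTINCT visitId) AS avg_revenue_per_visit "
        f"FROM `{dataset}.ga_sessions_*` "
        f"WHERE _TABLE_SUFFIX BETWEEN '{start_suffix}' AND '{end_suffix}' "
        "GROUP BY fullVisitorId"
    )

sql = avg_revenue_query("myproject.ga_dataset", "20140101", "20141231")
```

You would then run `sql` with the BigQuery client of your choice (for example, `google.cloud.bigquery.Client().query(sql)` in Python).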

For more details, see Google's BigQuery Export for Google Analytics documentation.

There you have it, four solutions to help you deal with sampling! Happy Reporting!

Sep 19

We used to hear “mobile is the future.” Then, we started to hear the phrase, “Mobile Now.” Finally, “Mobile First” is the reality. While this is great for consumers, as we have access to great apps and services literally at our fingertips, marketers are now challenged to keep up with new user experiences, new platforms, mobile development, and yes, you guessed it, new mobile analytics! But, hey, no one said life in the fast lane was easy ;) And, when you need mobile insights in a jiffy, you're naturally going to look at real-time reporting. Don't get ahead of yourself though – it might not be what you expect.

Mobile Analytics vs. Standard Desktop Analytics

Just as mobile user experience and mobile marketing have pushed beyond the familiar features and functionality of the traditional web, so has mobile analytics. General analysis concepts such as segmentation, acquisition, behavior, conversion, etc. still very much apply to mobile analytics. But there are foundational mobile concepts that marketers should adopt, and do so quickly.

Don’t waste any time before getting comfortable with these:

  • Tracking activity in a mobile app, in Google Analytics for example, requires a mobile SDK (and yes, you can use the Google Tag Manager for mobile).
  • Screens. Don’t get caught saying pageviews. Pages are for browsers.
  • Events and more Events for all your user interactions. No browser, no hyperlinks. Explicitly define every Event.
  • Crash and Exception reports. You won’t need these? Yeah, right.
  • Metrics like Installs (App installs) and IAP (In App Purchases).
  • Mixed web and mobile data for similar user interactions – like when a user logs in at Amazon.com to shop and later uses the Amazon iPhone app to make a purchase. It's a multi-screen world, folks. Let's track it.
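The “screens, not pages” distinction shows up directly in how app hits are sent to GA. As one illustration, here is how a screenview hit could be assembled with the Universal Analytics Measurement Protocol (the parameter names are from that protocol; the helper function and example values are ours, and in practice the mobile SDK builds these hits for you):

```python
from urllib.parse import urlencode

def screenview_payload(tracking_id, client_id, app_name, screen_name):
    """Build a Measurement Protocol 'screenview' hit body: apps send screens
    and an app name instead of pageviews and page paths."""
    return urlencode({
        "v": "1",            # protocol version
        "tid": tracking_id,  # property ID, e.g. "UA-XXXXX-Y"
        "cid": client_id,    # anonymous client ID
        "t": "screenview",   # hit type: a screen, not a page
        "an": app_name,      # application name (required for app hits)
        "cd": screen_name,   # screen name
    })
```

The resulting string would be POSTed to the GA collection endpoint (https://www.google-analytics.com/collect); note `t=screenview` where a website hit would use `t=pageview`.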

These aspects will be the subject of another post, but I'll share a taste of what's going on in mobile analytics.

Google Analytics recently made a User Interface change and updated some metrics to work across desktop and mobile. Visitors are now Users and Visits are now Sessions. A needed switch as we begin to use the internet in browserless ways.

Real Time Reports

What about reporting? Take a look at Real-Time reports. In GA for web, Real-Time reports show you data about your users as they traverse the site (after a few seconds of delay). How do Real-Time reports work for apps? Slightly differently. We've isolated the data in the following examples to highlight the differences.

In the Overview Report, you see how many active users there are, how many screen views the app is getting “Per Minute” and “Per Second”. The metrics work in the following ways:

Scenario one:

  • The app is started; the user navigates through a number of screens and taps icons and links. Events are generated in GA.
  • After waiting, nothing shows up in the “Per second” window.
  • You escalate to your developers; they check the code and the GA view configuration, and it's all solid.
  • You run the test again and watch closely. Two minutes in, activity appears in the “Per minute” window, showing activity that happened two minutes earlier (see the first snapshot below). Huh? What happened?
  • This is as real-time as you are going to get. Not good? Sorry :( It's by design. Stay with me as we learn why.

Data Dispatch
In mobile analytics, and in our specific GA example, there is a concept called data dispatching. As Google defines it: “As your app collects GA data, that data is added to a queue and periodically dispatched to Google Analytics. Periodic dispatch can occur either when your app is running in the foreground or the background.” Dispatching is used to reduce overhead, extend battery life, etc.

On iOS, the default dispatch interval is 2 minutes (and you can adjust it to your liking).

On Android, the default dispatch interval is 30 minutes (and it can be adjusted as well).

Other analytics platforms such as Flurry have similar concepts.

real time sdk google analytics screenshot 2

Scenario two:

  • The app was started; a number of screens were viewed and events triggered.
  • The app was killed – all within less than 2 minutes. The activities showed up right away in the Per Second window.
  • After a minute or two, the activities showed up in the Per Minute window, as shown below.
  • Why did the activity show up in less than 2 minutes in the Per Second window and (pretty much less than 2 minutes) in the Per Minute window?

real time sdk google analytics screenshot 2

The way data dispatch works is that there is a set delay before queued activity is transmitted. However, if the app is terminated before that delay elapses, the data is submitted immediately. Thus, in this scenario, because the app was terminated before the 2-minute mark, the data was submitted and (after some processing time) showed up in the Per Second window. In the Per Minute window, after the processing time and minute intervals, you'll then see the real-time activity.
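To make the timing concrete, here is a toy Python model of the dispatch behavior described in the two scenarios above. It is purely illustrative – not the actual SDK – but it captures the two rules: hits queue up until the dispatch interval elapses, and killing the app flushes the queue immediately.

```python
class DispatchQueue:
    """Toy model of SDK data dispatching: hits are queued and sent either when
    the dispatch interval has elapsed or immediately when the app terminates."""

    def __init__(self, dispatch_interval=120):  # iOS default: 2 minutes
        self.dispatch_interval = dispatch_interval
        self.queue = []   # hits collected but not yet sent
        self.sent = []    # hits that have reached Google Analytics
        self.last_dispatch = 0.0

    def track(self, hit, now):
        """Record a hit at time `now` (seconds); dispatch if the interval elapsed."""
        self.queue.append(hit)
        if now - self.last_dispatch >= self.dispatch_interval:
            self._flush(now)

    def terminate(self, now):
        """Scenario two: killing the app submits the queued data right away."""
        self._flush(now)

    def _flush(self, now):
        self.sent.extend(self.queue)
        self.queue = []
        self.last_dispatch = now
```

Tracking two hits at 10 s and 60 s leaves `sent` empty (scenario one: nothing appears yet), while calling `terminate(90)` flushes both immediately (scenario two).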

real time sdk google analytics screenshot 3

An active user is registered when you start the app, as seen below. The snapshot was taken within 1 minute of starting the app. Just remember that 5 minutes of inactivity will drop the user from the Active Users count, even though their activity will still be present in the Per Minute report.

real time sdk google analytics screenshot 4


So be careful – the present world of mobile Google Analytics Real-Time reports is different from what you might expect from desktop Real-Time. What you see isn't necessarily exactly live, but rest assured that the hits are still making their way to GA and will appear in the standard (non-Real-Time) reports (and in some new reports, too) in only a few hours.

Share with us your thoughts and comments!