Archive for the ‘google analytics’ Category

Nov 17
2014


Google Analytics’ Bot & Spider Filtering

Google Analytics recently released a feature called “Bot and Spider Filtering”. You might have read about it already but haven’t had a chance to implement it or process how it might affect your organization or even your effectiveness as a digital analyst. E-Nor recommends turning on Google Analytics’ Bot & Spider Filter as a best practice for all Google Analytics accounts (except for raw data Views – which we’ll explain at the end of the post).*

As an added bonus, it is super easy to do!

What is Bot & Spider Filtering?

Bot and Spider Filtering filters out sessions from bots and spiders (as defined by Google) from your Google Analytics data. It’s important to note though that like traditional Google Analytics filters, the Bot & Spider filter works on a “going forward” basis; in other words, past data will be unaffected. It also does not actually prevent bots or spiders from visiting your site (only filters them out from your data).

Typically, traffic from bots and spiders will be negligible and barely noticeable in your Google Analytics data. For example, in E-Nor’s recent test using the new filter, bots and spiders accounted for less than 0.5% of all sessions.

So then why should you care? There are cases where bot and spider activity can seriously skew results in Google Analytics. Unexpected, short-lived spikes can throw data for small time-frames out of whack and result in significant time spent diagnosing spikes that turn out to be bot or spider traffic.

How E-Nor Diagnosed Spikes in Google Analytics Sessions

The key questions that trigger our investigations are:

  1. Was the activity out of the norm? Did it stand out compared to the same metrics in prior periods?
  2. Could we assign a reasonable explanation to it? For example, was content changed on the site or could some outside event have led to the activity?

When the answers are: 1. Yes, the activity is abnormal; and 2. No, we can’t figure out a reasonable explanation as to why – we need to go on a Google Analytics fishing expedition…

Examples

Case 1: We observed a Top Pages report with several new, unexpected entries. None of the typical reasons why the report would have new top pages had occurred – like an actual new page, a very popular campaign linked to a page or sudden public interest in the topic of a particular page. The organization this report belonged to primarily provided informational content, so their concern was the level of engagement with that information. These weird occurrences were a big deal.

E-Nor examined metrics for the pages in question (and for sessions including those pages). We observed the following anomalies compared to past data and to other pages during the same time period. We subsequently concluded that bot activity was probably responsible – even though this activity at first seemed atypical for bots!

We checked many metrics, and the ones that stood out were:

  • Pageviews and entrances increased by atypical amounts compared to prior time periods for several pages.
  • The bounce rate dropped dramatically, e.g. from 74% to 42%.
  • These same pages were also the top landing pages when they had not been in prior periods.
  • Sessions involving these pages included very high pageview counts for the page in question.
  • Session duration for sessions including the pages increased abnormally, e.g. from 4:36 to 28:37 minutes.
  • Pages per Session increased dramatically, from 2 to 13 pages per session.
  • Browser and Browser Version were unusual, with sessions including these pages coming primarily from Internet Explorer versions 7, 8 and 6, rather than the typical IE 11, IE 9 and Chrome for this site.
  • Locations were primarily Russia, Indonesia, Argentina, Thailand, Mexico and other countries atypical for this site, where sessions typically come mostly from the United States.


Case 2: For a B2B high tech company, we observed, again, a deviation from the prior period visible across many pages in a management report. Our report to senior management indicated huge interest from a major customer in jobs, rather than products, and huge interest from another customer who hadn’t been in the “top customers” report in the past. For the latter, we learned that a news item caused the unusual activity from that customer, so that was explained. For the former, we did not discover a solid reason for the change, so we went on a Google Analytics fishing expedition.

Our expedition across many, many metrics revealed these anomalies compared to prior periods:

  • The Source/Medium for the affected sessions was “(direct) / (none)”, which was suspicious because the Landing Page was not one someone would bookmark or type into the browser.
  • All sessions had the same Landing Page.
  • The Bounce Rate for that page skyrocketed.
  • The City and Region were both “(not set)” while the Country was the United States.
  • The Operating System was also “(not set)”.
  • The Browser was also “(not set)”.

With its high bounce rate and frequent occurrence of “(none)” and “(not set)”, Case 2 was an example of atypical activity likely indicative of bots.

References & More

For more information, always go to the source: Google’s announcement about Bot & Spider Filtering

And please don’t forget to annotate your Google Analytics data and please don’t apply the Bot & Spider Filter to your Raw Data View.*

*What is a Raw Data View?

A Raw Data View is a Google Analytics View with no configuration – no Filters applied and no Goals set. The Raw Data View acts as a back-up: first, in case we need to validate the configuration in other Views (it might be easier to compare against the Raw Data View), and second, in case a View becomes too complex to reverse engineer, it might be easier to simply copy the Raw Data View and then apply configuration, like Filters and Goals, fresh.

Nov 07
2014

This is the video companion to our sampled data post, for those of you who prefer watching over reading!
For sites with heavy traffic, Google Analytics might sample your data, which may make it hard to gain insights (the data might even become unusable). Don’t worry though, there are plenty of other options to work around that challenge. The video explains a couple.

Do you have any solutions? Tell us in the comments.

Oct 29
2014

Fortune 500 Google Analytics Adoption Rate in 2014

See last year’s (2013) Fortune 500 Adoption.

Google Analytics – Still the One

It’s that time of the year. We’re taking a look at Analytics use among the Fortune 500. Once again, all you see is orange. Google Analytics is the leader – a whopping 67% of the Fortune 500 websites are using GA and/or a combination of GA and other web analytics tools.

Key Highlights:

  • Orange is still the new black. In 2014, we saw a 6% increase in the use of GA over 2013.
  • Not a bad year for Adobe: 3.9% growth in the use of Adobe Analytics (Omniture/SiteCatalyst), now at 26.4% usage among the Fortune 500.
  • Webtrends saw a 14.9% decline from the past year, down to 12.6% usage this year. :(
  • Finally, we see a 42.9% decline in the use of IBM/Coremetrics, down to a total of 4% usage this year. :( :(

Note: The percentages do not mean that the tools were adopted in 2014; they could have been adopted prior to or during 2014. Also, the “other” category includes Yahoo Analytics, Piwik, AT Internet and Chartbeat.

Expect the use of Google Analytics in the enterprise to grow further, following the increased adoption of Google Analytics Premium across many verticals and the emergence of Google-sized enhancements such as Enhanced eCommerce, Data Import, BigQuery Integration, enhanced Mobile Analytics, and DoubleClick integration, among many others!

Oct 21
2014


We’re so proud of the success of our first Evolve with Google Analytics Conference. Special thanks to Rising Media for partnering with us on it. It was the meeting of the minds of the foremost thought leaders in the industry. Everyone was super-engaged, shared great tips and strategies, and had a lot of fun! Can’t wait till next time.

Here are some highlights.


Oct 16
2014

[Photo: Google Analytics Jeopardy board]

The Evolve with Google Analytics conference rolled into Boston last week and proved to be an event to remember: great speakers across the spectrum of conversion optimization, the inside scoop from Justin Cutroni, dynamite attendees who came from near and far, and more high-impact networking than you could shake a stick at.

Among the highlights was Google Analytics Jeopardy!, hosted by the congenial Judah Phillips, who heightened the learning and the fun with off-the-cuff anecdotes and insights. Make no mistake, though: the competition was fierce!

Ready to take the Google Analytics Jeopardy! challenge?
PLAY GA JEOPARDY!

Stay tuned for round two in an upcoming blog post.

Hope to see you at the next Evolve with Google Analytics conference – don’t miss the action!

Oct 14
2014


Bill Gates once said,

“The most meaningful way to differentiate your company from your competitors, the best way to put distance between you and the crowd is to do an outstanding job with information. How you gather, manage and use information will determine whether you win or lose.”

With the announcement of Wave, the Salesforce Analytics Cloud, it’s now more important than ever to measure clients’ 360-degree journey and connect the dots between business user data (e.g. sales data in Salesforce) and web and mobile behavioral data (e.g. user interaction data in Google Analytics).

In this post, we’re going to show you how to do that!

Our Story

To improve profitability, every company (regardless of size) requires multiple tools to understand their customers and potential customers.

In our organization, we use Google Analytics to improve the effectiveness of our web and mobile presence as well as to track and understand the behavior of our online visitors and how they interact with our digital properties (Website, Mobile App, YouTube Channel, Facebook page, Twitter Profile,…). When our (online) visitors convert to leads, we rely on our CRM platform, Salesforce, to track the history of our prospects’ and customers’ offline and online interactions. We track every single conversation possible – emails, calls, meetings, and documentation.

While Google Analytics itself gives us the ability to measure the effectiveness of our marketing campaigns in terms of what channels bring visitors to our “front door”, we were missing the link between real live sales conversions and the original (online) lead generation/traffic sources.

Our sales team handles a flood of leads and sales, talking to many prospects and meeting with many customers. We need to link the valuable information they gather with what brought these leads to us in the first place. For many years, this missing link led us to inaccurate conversion attribution, poor budget allocation, and misguided strategic business focus.

Corrective Action

In order to make better-informed decisions on marketing, budget allocation, and strategy, we need to have clear visibility into the sales cycle not only from the point of live contact, but from the possible origin of online inquiry.

To do that, we integrated the two tools that tell us this (SalesForce and Google Analytics) in the following way:

  1. Pass as much relevant Google Analytics visitor behavioral information as possible to Salesforce with every form submission.
  2. Pass the final lead status and offline activities from SalesForce back to GA.

Technical Challenge: New integration method with Universal Analytics

Marrying data from different sources is nothing new. But when marriage gets a little stale, sometimes you have to find new ways to spice it up! :) Our legacy solutions were heavily dependent on the Google Analytics _utm cookies that are generated by the classic ga.js tracking code, in which session and campaign information are stored in the tracking cookies.

In the new Universal Analytics, session and campaign processing happens on the server side, and the single _ga cookie generated by the new analytics.js does not contain any session or campaign information.
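To illustrate the difference, here’s a rough sketch of the kind of cookie parsing the legacy approach relied on. The cookie names and value formats are standard (__utmz for classic campaign data, _ga for Universal’s client ID), but the helper function below is only an illustration – it is not the actual GASalesforce.js code.

// Classic GA (ga.js) stored campaign data client-side in the __utmz cookie, e.g.:
//   __utmz=12345678.1415000000.1.1.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=analytics
// Universal Analytics (analytics.js) only stores a client ID in the _ga cookie, e.g.:
//   _ga=GA1.2.1234567890.1415000000
// so campaign and session details can no longer be read directly from a cookie.

// Illustrative helper (not the actual GASalesforce.js code): read the __utmz
// cookie, if present, and pull out source, medium, campaign and term.
function getLegacyCampaignData() {
  var match = document.cookie.match(/__utmz=([^;]+)/);
  if (!match) return null; // no classic GA cookie present

  var fields = {};
  // The campaign fields sit after the 4th dot-separated value, joined by "|"
  match[1].split('.').slice(4).join('.').split('|').forEach(function (pair) {
    var parts = pair.split('=');
    fields[parts[0]] = parts[1] || '';
  });

  return {
    source:   fields.utmcsr,  // e.g. "google"
    medium:   fields.utmcmd,  // e.g. "organic"
    campaign: fields.utmccn,  // e.g. "(organic)"
    term:     fields.utmctr   // e.g. the search keyword, when available
  };
}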

Solution

We adjusted our classic integration method with a solution that works with all GA tracking code versions, including the latest version of Universal Google Analytics tracking code. All needed code was bundled in one JavaScript file called GASalesforce.js

Passing GA data to SalesForce

By following the steps below, you’ll be able to pass users’ campaign and session information into SalesForce every time a form is submitted:

1. Setup SalesForce Custom Fields
Create the following 8 fields in SalesForce:
[Screenshot: creating custom fields in Salesforce]
Watch this video to learn how to create a custom field in Salesforce.

  • Visitor ID: a unique, persistent, and non-personally identifiable ID string representing a user.
  • Medium: The marketing channel. Possible mediums include: “organic”, “cpc”, “referral”, and “email”.
  • Source: The referral origin. Possible sources include: “google.com”, “facebook.com”, and “direct”.
  • Campaign: Name of the marketing campaign.
  • Content: Additional campaign identifier.
  • Term: The words that users searched.
  • Count of Sessions: The number of visits to our site including the current session.
  • Count of Pageviews: The number of browsed pages prior to the form submission.

2. Add Hidden Fields to Forms
Add the 8 fields created in the previous step as hidden input fields to all forms on your website that you want to track in Salesforce.

3. Setting Visitor ID
The value of the Visitor ID hidden input field can be read from your backend system (this is the same value that we will use later to set a Google Analytics custom dimension).

4. Setting Campaign and Session Information
The values of the remaining hidden fields can be read from the GASalesforce.js JavaScript file (see steps 6–8 below). The code will generate GA cookies, parse out the campaign variables from the cookies, and make them available in the following 7 variables: source, medium, term, content, campaign, visit_count, and pageview_count.

5. Pass Values to Hidden Fields
Logic should be put in place within the form to pass the values of the hidden fields to their corresponding fields within Salesforce when the form is submitted (a rough sketch follows below).
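To make steps 4 and 5 more concrete, here is a minimal sketch that writes the parsed values into the hidden inputs just before the form is submitted. The form ID and the hidden input names are assumptions for illustration, and we assume GASalesforce.js exposes the step 4 values as global variables with those names – adjust everything to match your actual form and code.

// Illustrative only – the form ID, hidden input names, and global variable
// names are assumptions; match them to your own form and to GASalesforce.js.
// Note: plain globals can collide with built-in window properties (e.g.
// window.content in Firefox), so namespacing them in your own code is safer.
function populateHiddenLeadFields(form) {
  // Map each hidden input name to the corresponding value parsed from the GA
  // cookies (the 7 variables listed in step 4).
  var gaValues = {
    medium:             typeof medium         !== 'undefined' ? medium         : '',
    source:             typeof source         !== 'undefined' ? source         : '',
    campaign:           typeof campaign       !== 'undefined' ? campaign       : '',
    content:            typeof content        !== 'undefined' ? content        : '',
    term:               typeof term           !== 'undefined' ? term           : '',
    count_of_sessions:  typeof visit_count    !== 'undefined' ? visit_count    : '',
    count_of_pageviews: typeof pageview_count !== 'undefined' ? pageview_count : ''
  };

  for (var name in gaValues) {
    var input = form.querySelector('input[type="hidden"][name="' + name + '"]');
    if (input) { input.value = gaValues[name]; }
  }
  // The Visitor ID hidden field is populated by the backend (step 3), so it is
  // not touched here.
}

// Fill the hidden fields right before the lead form is submitted.
document.getElementById('lead-form').addEventListener('submit', function () {
  populateHiddenLeadFields(this);
});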

6. Download GASalesforce.js
Download it from here to your local environment.
Download GASalesforce.js

7. Cross Domain Tracking
If cross-domain tracking is needed, update GASalesforce.js with a comma-separated list of domains to set up automatic cross-domain link tracking.

Example:

var domains = ["mydomain1.com", "mydomain2.com"];

8. Reference the GASalesforce.js file
Place the following code before the closing tag on every page of the site.

<script src="http://www.mydomain.com/scripts/gasalesforce.js" type="text/javascript"></script>

9. Make sense of the data!
Now that we’ve successfully collected all campaign and session data in Salesforce, we can make sense of it and extract intelligence.


Passing Salesforce Data to GA

In this section, we’ll utilize a really cool new Google Analytics feature called Data Import to pass customers’ lead information from Salesforce into GA. With Data Import, data can be uploaded from different data sources and combined with the existing Google Analytics data to create one powerful and robust unified report.

The “key” that we’re going to use to marry the two data sets will be the Visitor ID. If Google Analytics finds matching keys, it will join the data in that row of imported data with the existing GA data.


Step One: Create Custom Dimensions
Since “Visitor ID” and “Lead Status” don’t exist as dimensions in Google Analytics, you’ll need to create them as Custom Dimensions.

Custom Dimension Name | Scope
Visitor ID | User
Lead Status | User

Note:
You must pass your own visitor ID to Google Analytics as a custom dimension to represent each user who submitted one of your forms in question.

When you have a unique, persistent, and non-personally identifiable Visitor ID, set it directly on the tracker as in the following example:

ga('set', 'dimension1', 'vid20140930-005');
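For context, here’s roughly where that call would sit in a standard analytics.js page tag. The property ID, dimension index, and visitor ID below are placeholders – the important part is that the custom dimension is set before the pageview hit is sent:

// Placeholder property ID and visitor ID – substitute your own values.
ga('create', 'UA-XXXXXX-Y', 'auto');
ga('set', 'dimension1', 'vid20140930-005'); // Visitor ID custom dimension
ga('send', 'pageview');                     // the dimension is attached to this hit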

Step Two: Create the Data Set
1. In Admin, go to the account and web property to which you want to upload data.
2. Click Data Import under PROPERTY.
3. Click New Data Set.
4. Select “User Data” as the Type.
5. Name the Data Set: “Lead Status”
6. Pick one or more views in which you want to see this data.
7. Define the Schema:
Key: Custom Dimensions > Visitor ID
Imported Data: Custom Dimensions > Lead Status
Overwrite hit data: Yes
Click Save.

Step Three: Export Salesforce data
Export the lead status data from Salesforce into a CSV file.


Step Four: Upload the data
1. In the Data Set table, click “Lead Status”. That will display the schema page.

2. Click Get schema. You’ll see something like the following:

CSV header
ga:dimension1,ga:dimension2

This is the header you should use as the first line of your uploaded CSV file. The table below identifies the columns:

Visitor ID | Lead Status
ga:dimension1 | ga:dimension2

3. Update the exported CSV file to follow the below format. The first (header) row of your spreadsheet should use the internal names (e.g. ga:dimension1 instead of Visitor ID). The columns beneath each header cell should include the corresponding data for each header.

The CSV file should look something like this:

ga:dimension1,ga:dimension2
vid20140930-001,Qualified
vid20140930-002,Qualified
vid20140930-003,Qualified
vid20140930-004,Unqualified
vid20140930-005,Unqualified
vid20140930-006,Qualified
vid20141001-001,Unqualified
vid20141001-002,Qualified
vid20141001-003,Qualified
vid20141001-004,Unqualified

4. In the Manage Uploads table, click Choose Action > Upload files. Choose the CSV file you created.


Step Five: Create Custom Report
Since custom dimensions don’t appear in standard reports, create a Custom Report with our two dimensions (Visitor ID and Lead Status) and your desired metrics.

Note:
1. Uploaded data needs to be processed before it can show up in reports. Once processing is complete, it may take up to 24 hours before the imported data will begin to be applied to incoming hit data.
2. Data Import also now supports a new Query Time mode that allows linking data with historical GA data. Query Time mode is currently in whitelist release for Premium users.


And now you can analyze the online user behavior of only those who are marked offline as qualified leads.


Wow, this was a long post! But there you have it – a very common use case for those of us ready to move our marketing optimization up a notch. I welcome your comments and any other use cases our readers come across.

Oct 09
2014


Google Analytics is a very powerful tool, but I think we’d all agree it might be a bit much to expect it to process monster amounts of data on-demand and return reports instantaneously without a tiny tradeoff. Some heavily trafficked sites expecting instant reports from their oceans and oceans of data (which obviously take time to generate) instead find themselves running into sampling issues in their reports – where GA is forced to make its calculations based on a smaller sample of the overall data in order to return a report instantly. The problem is, sometimes that sample might not be statistically significant or sufficiently representative of the data, so any insights contained in the data aren’t…well…accurate.

In general, sampling isn’t an issue if all you’re looking at are the standard out-of-the-box reports, because they are all unsampled. However, when leveraging GA’s segmentation capabilities (which is where the real beauty of deep insights resides), whenever a data set exceeds the sampling threshold of 250,000 or 500,000 sessions (depending on the report and GA version) within a selected time period, sampling might come into play.

Sampled data is just that… it’s sampled. It’s not fully representative of the actual data. While Google Analytics has an intelligent algorithm to ensure that sampling minimizes adverse effects on the data, the reality is that a dataset that is a 5% sample of your actual data really isn’t usable. How you determine what is usable and what is not really depends on the nature of your data and the type of analysis being performed, but in general, it’s best to keep the sample size as high as possible. These reports will undoubtedly be used as a reference point for marketing decisions, so it’s important that they’re accurate and provide actionable insights.

Is the Core Reporting API a solution to this dilemma? Not entirely. Sampling isn’t solved with just this API, even if you have GA Premium, because the API has the same sampling thresholds applied to it as GA Standard.

So what to do?

Hold tight, the following are 4 solutions to help you get clean data and clean insights again!

1. Reduce the date range.

The first solution is to reduce the date range. When looking at a report (for which you’ve met or crossed the sampling threshold), the interface displays that the report is being sampled. Instead of looking at the entire month all at once, it may help to look at a smaller timeframe, such as a week. This way, only a subset of the data is being viewed and thus, the report that is pulled contains fewer sessions, which keeps us under the sampling threshold. You would have to look at subsequent weeks one at a time, which is a bit mundane, but once this is done, you can aggregate this and other date ranges of the same report outside of GA into a single report. Read on to the next solution to find out what to do with all those reports.

Note: The only way to export unsampled data directly is to be a Google Analytics Premium customer. There are some third-party tools available for non-GA premium users discussed below. These tools are designed to reduce but not eliminate the effects of sampling.

2. Query Partitioning.

One way of reducing the effects of sampling is to break up the timeframe into smaller timeframes. For example, a year of data can be pulled as 12 separate months (12 separate queries), or a month of data can be pulled as 4 separate weeks (4 separate queries). So instead of pulling data for all of 2014, I can pull Jan. 2014, then Feb. 2014, and so on. Obviously, we all have better things to do… A feature called query partitioning, available in tools such as ShufflePoint and Analytics Canvas (more details below), does the above for you in an automated fashion. The tools partition the query and programmatically loop through the desired timeframe, aggregating the report back together once done. This way, when you pull the report, the tool appears to be making one query, but in reality it’s making many queries behind the scenes, based on how granularly you configure the query partitioning. It may take some experimenting to find a balance between speed and accuracy (sample size).
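If you’d rather script this yourself than use one of the tools below, here’s a rough sketch of the idea using the Core Reporting API through the Google APIs JavaScript client. It assumes the client library is already loaded and authorized; the view ID, date ranges, and segment are placeholders:

// Query-partitioning sketch: pull one month at a time and add up the results,
// so each individual query stays as far under the sampling threshold as possible.
// Assumes gapi.client.analytics is loaded and the user is already authorized.
var VIEW_ID = 'ga:12345678'; // placeholder – use your own view (profile) ID

var months = [
  ['2014-01-01', '2014-01-31'],
  ['2014-02-01', '2014-02-28'],
  ['2014-03-01', '2014-03-31']
  // ...and so on for the rest of the year
];

var totalSessions = 0;

function queryMonth(index) {
  if (index >= months.length) {
    console.log('Total sessions for the full period:', totalSessions);
    return;
  }
  gapi.client.analytics.data.ga.get({
    'ids': VIEW_ID,
    'start-date': months[index][0],
    'end-date': months[index][1],
    'metrics': 'ga:sessions',
    'segment': 'gaid::-1' // placeholder built-in segment; swap in your own
  }).execute(function (response) {
    if (!response.error) {
      // response.containsSampledData tells you whether this partition was still sampled
      totalSessions += parseInt(response.totalsForAllResults['ga:sessions'], 10);
    }
    queryMonth(index + 1); // move on to the next partition
  });
}

queryMonth(0);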

More detail about the tools:

  • ShufflePoint has a drag-and-drop interface that supports Google Analytics and a few other Google products. The nice thing about ShufflePoint is that it uses Excel’s web-querying capability, so you can write SQL-like queries to retrieve your data, make built-in calculations and display the data essentially any way you want.
  • Analytics Canvas is another tool which allows you to connect to the Google Analytics API without coding. Analytics Canvas uses a “canvas” on which you can construct a visual flowchart of the query and subsequent transformations and joins of your data, to show what series of modifications will take place. It also allows for automating data extraction from BigQuery. If you are using Google Sheets for your data, Analytics Canvas has a Chrome add-on that allows you to create dashboards within Sheets.

Both of these tools can extract your data from Google Analytics, analyze it, and create reports.

3. Download Unsampled Reports.

If you are a Google Analytics Premium user, you can download unsampled reports (you will have to export them). Google Analytics just announced an exciting new feature available on Premium accounts called Custom Tables, which allows you to create a custom table with metrics and dimensions of your choice (although there are some limitations). In other words, you can essentially designate a report that would otherwise be sampled as a “Custom Table”, which is then available to you as an unsampled report, similar to the out-of-the-box reports. You can create up to 100 Custom Tables. This is awesome because you won’t have to worry about sampled data for the reports you use often.

4. BigQuery.

If you have Google Analytics Premium, it integrates with Google BigQuery, which allows for moving massive datasets and super-fast SQL-like querying. It works on Google’s cloud infrastructure and is able to process data on the order of billions of rows. GA Premium allows your data to be exported daily into BigQuery. (In the Core Reporting API, the data is sampled at the same threshold as in GA Standard.) BigQuery allows you to access unsampled hit-level data instead of the aggregate-level data available within the user interface, which in turn opens doors for very powerful and previously impossible analysis!

Here are some examples of the types of analysis possible with BigQuery, to help illustrate its use:

  • What is the average amount of money spent by users per visit?
  • What is the sequence of hits (sequence of clicks, pages, events, etc)?
  • What other products are purchased by those customers who purchased a specific product?

For more details, visit here.

There you have it, four solutions to help you deal with sampling! Happy Reporting!

Sep 19
2014

We used to hear “mobile is the future.” Then, we started to hear the phrase, “Mobile Now.” Finally, “Mobile First” is the reality. While this is great for consumers, as we have access to great apps and services literally at our fingertips, marketers are now challenged to keep up with new user experiences, new platforms, mobile development, and yes, you guessed it, new mobile analytics! But, hey, no one said life in the fast lane was easy ;) And, when you need mobile insights in a jiffy, you’re naturally going to look at real-time reporting. Don’t get ahead of yourself though – it might not be what you expect.

Mobile Analytics vs. Standard Desktop Analytics

Just as mobile user experience and mobile marketing have pushed beyond the familiar features and functionality of the traditional web, so has mobile analytics. General analysis concepts such as segmentation, acquisition, behavior, conversion, etc. still very much apply to mobile analytics. But there are foundational mobile concepts that marketers should adopt, and do so quickly.

Don’t waste any time before getting comfortable with these:

  • Tracking activity in a mobile app, in Google Analytics for example, requires a mobile SDK (and yes, you can use the Google Tag Manager for mobile).
  • Screens. Don’t get caught saying pageviews. Pages are for browsers.
  • Events and more Events for all your user interactions. No browser, no hyperlinks. Explicitly define every Event.
  • Crash and Exception reports. You won’t need these? Yeah, right.
  • Metrics like Installs (App installs) and IAP (In App Purchases).
  • Mixed Web and Mobile data for similar user interactions - like when a user logs in at amazon.com to shop and later uses their Amazon iPhone app to make a purchase. It’s a multi-screen world, folks. Let’s track it.

These aspects will be the subject of another post, but I’ll share a taste of what is going on in Mobile Analytics.

Google Analytics recently made a User Interface change and updated some metrics to work across desktop and mobile. Visitors are now Users and Visits are now Sessions. A needed switch as we begin to use the internet in browserless ways.

Real Time Reports

What about reporting? Take a look at Real-Time reports. In GA for web, Real-Time reports show you data about your users as they traverse the site (after a few seconds of delay). How do Real-Time reports work for apps? Slightly differently. We’ve isolated the data in the following examples to highlight the differences.

In the Overview report, you see how many active users there are and how many screen views the app is getting “Per Minute” and “Per Second”. The metrics work in the following ways:

Scenario one:

  • The app is started; the user navigates through a number of screens, icons, and links. Events are generated in GA.
  • After waiting, nothing shows up in the “Per second” window.
  • You escalate to your developers, they check the code and the GA View configuration and it’s all solid.
  • You run the test again. You watch closely. In two minutes, activities appear in the “Per minute” window showing activities that happened two minutes earlier (see first snapshot below). Huh? What happened?
  • This is as Real-Time as you are going to get. Not good? Sorry :( It’s by design. Stay with me as we learn why.

Data Dispatch
In Mobile Analytics, and in our specific GA example, there is a concept called data dispatching, defined by Google as follows: “As your app collects GA data, that data is added to a queue and periodically dispatched to Google Analytics. Periodic dispatch can occur either when your app is running in the foreground or the background.” Dispatching is used to reduce overhead, increase battery life, etc.

On iOS, the default dispatch interval is 2 minutes (and you can adjust it to your liking).
https://developers.google.com/analytics/devguides/collection/ios/v2/dispatch

For Android, the default dispatch interval is 30 minutes (and it can be adjusted as well).
https://developers.google.com/analytics/devguides/collection/android/v4/dispatch

Other analytics platforms such as Flurry have similar concepts.

[Screenshot: Google Analytics SDK Real-Time report]

Scenario two:

  • The app was started; the user tapped through a number of screens and triggered events.
  • The app was killed – all within less than 2 minutes. The activities showed up right away in the Per Second window.
  • After a minute or two, the activities showed up in the Per Minute window as shown below.
  • Why did the activity show up in less than 2 minutes in the Per Second window and (pretty much within 2 minutes) in the Per Minute window?

[Screenshot: Google Analytics SDK Real-Time report]

The way Data Dispatch works is that there is a set interval at which activity is transmitted. However, if the app is terminated before that interval elapses, the data is submitted immediately. Thus, in this scenario, because the app was terminated before the 2-minute mark, the data was submitted and (after some processing time) showed up in the Per Second window. In the Per Minute window, of course, after the processing time and minute intervals, you’ll then see the real-time activity.

[Screenshot: Google Analytics SDK Real-Time report]

An active user is registered when you start the app, as seen below. The snapshot was taken within 1 minute of starting the app. Just remember: 5 minutes of inactivity will drop the user from the Active Users report, even though their activity will still be present in the Per Minute report.

[Screenshot: Google Analytics SDK Real-Time report]

Conclusion

So be careful – the present world of mobile Google Analytics Real-Time reports is different from what you might expect from desktop Real-Time. What you see isn’t necessarily exactly live, but rest assured that the hits are still making their way to GA and will appear in the standard (non-Real-Time) reports (and in some new reports, too) in only a few hours.

Share with us your thoughts and comments!

Sep 16
2014

The following is an excerpt from a post we wrote on the Evolve Conference Blog. For the full article, please click here.


Universal Google Analytics is all the rage right now. We hope you’re planning to attend our Evolve conference in October, because we are going to cover it and cover it well!

Not only do we have the sessions, but we’re also having a Fireside Chat with Justin Cutroni, Google Analytics Evangelist, and he’ll be providing answers to all your “burning questions”. You all should know Justin, an avid speaker, blogger and thought leader in our industry. So bring on your questions!

If you’re still not sure about all that’s involved with Universal Analytics and the impact it has on your organization, follow these 5 tips and you’ll be in good shape.

Tip #1. Don’t panic! You don’t have to rush, you still have time.

Legacy (or Classic) Google Analytics isn’t going away soon. According to Google’s Universal Analytics Upgrade Center, “Data collected from the deprecated features will be processed for a minimum of 2 years”. So if you are using ga.js, urchin.js, custom variables, etc., you’ll have data for a couple of years from when Universal was announced out of beta (April 2014).
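For reference, here’s what the difference looks like in the on-page code – the classic ga.js syntax on top and its Universal (analytics.js) equivalent below (the property ID is a placeholder, and the library loader snippets are omitted):

// Classic Google Analytics (ga.js) – the deprecated _gaq command queue:
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-Y']);
_gaq.push(['_trackPageview']);

// Universal Analytics (analytics.js) – the equivalent ga() commands:
ga('create', 'UA-XXXXXX-Y', 'auto');
ga('send', 'pageview');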

For the complete list of tips, read the full article here.

Sep 02
2014

Labor Day just passed, which pretty much marks the end of summer vacation. The party’s over, as they say, and it’s time to get back to work. Children get back into the routine of school, students gear up for another semester, people reminisce over summer vacations and plan out the rest of the year, and of course businesses begin to buckle down for Q4 so they can finish out the year on a strong note. It’s a time to re-evaluate, tweak and re-establish goals. There’s also that iPhone 6 announcement coming next week, but I digress. :)

One of my goals for this year was to make significant strides towards being healthier. As with anything else, this requires physical effort, but more so than that, it requires mental discipline.

I’ve noticed, over the last couple of years, a huge influx of wearable fitness trackers that measure everything from steps taken, to distance, to elevation, to heart rate, to sleep. It’s impressive what data these little devices can capture and the exhaustive reports that can be produced.

Being a data geek combined with this goal of being healthier, I decided to invest in my own tracker. While there are many options available, they are all very similar and you basically have to find one that you’ll use. Features and functionality aside, if they sit at home instead of on your person, they’re a waste of money. So my main requirement was comfort. After an exhaustive search I decided to try out the Jawbone UP24. It’s an amazing little device, and pumps out quite the stream of data. Color me impressed!

I was talking to my colleague Farid the other day about these fitness trackers, prior to making the decision to purchase. To be honest, this rarely happens, but on this auspicious day, Farid had some words of wisdom which echoed in my mind for days [Edited and rejected by Farid Alhadi]. He said, in reference to the large selection of devices available, “You know, they’re essentially all the same, and they don’t do much unless you use them, but what they do provide is that little extra motivation.” Ok, he didn’t quite make the discovery of the century, and I’m probably being a bit dramatic about his comment, but you get the point. [Edited and rejected by Farid Alhadi]

“That little extra motivation”? Maybe that’s the difference between being healthy and not quite there? Could the line in the sand really be that narrow? Would data be all I need to help me run that extra mile, or do those extra reps? Really? Could it be that a dashboard of my fitness stats would help me take that step that I’ve been trying to take for so long? Wait a second… dashboards, metrics, data, insights??? This is starting to sound a little too familiar.

Here are some samples of the types of reports that this little thing produces:
[Screenshot: fitness tracker dashboard]

More importantly, this got me thinking. (Yes I know I’m a geek…you don’t need to remind me). Doesn’t a fitness tracker essentially do what Analytics does for a business? Just as a fitness tracker captures the performance of an individual based on pre-defined metrics, so does analytics capture the performance of a business based on similar pre-defined metrics.

1. Use it or lose it. (And I don’t mean the weight).

The fitness tracker is only as good as our usage of it. The same principle can be applied to analytics. It only works if you use it. If I leave the fitness tracker at home, or forget to wear it, or if I don’t calibrate it, the data is going to be useless. The same goes for your analytics tool. Organizations sometimes spend thousands on analytics tools and even strategies, but they don’t put in a legitimate effort to use them, or to use them right! Your organization needs to adopt a metrics-driven mindset and put value in the tool you spent so much money on.

If you’re intimidated by it, there are plenty of consultants and trainings that can help.

2. Plan your meals and workouts carefully with a clear goal in mind.

How can you measure progress if you don’t know where you’re going and how you’re going to get there? If our measurement strategy isn’t zeroing in on the things that are key to the business, it needs to be adjusted, but you won’t be able to know that without a clear-cut plan. Analytics works the same way. Have a clear goal and a strategy as to how you plan to measure your goals. Once again, this might be a daunting task, but that’s what your certified partners are there for!

3. Garbage in equals garbage out.

That doesn’t only apply to your diet, it applies to your data. If you don’t eat the right food, you won’t have the right body. If I don’t enter my calories consumed correctly, I won’t really be able to measure my progress. Similarly, if you don’t measure your website or mobile app data correctly, your data may be garbage and thus, your insights may be garbage. The tool needs good food/data to be able to do its job correctly. Make sure you’re tracking your data properly, using the correct methods and code. Track the right pages and the right events, and create clean, proper funnels and goals.

4. Review your progress.

If I don’t review the data at periodic intervals and adjust my exercise routine as a result, what benefit am I getting out of it? No one gets it right the first time. It takes optimization. Maybe your diet isn’t right. Maybe your exercises aren’t burning enough calories. Maybe your body has hit a plateau and it’s time to switch things up. Similarly, with analytics, if we’re not reacting to what the data is telling us, then what good is our analytics implementation? Your measurement strategy should include periodic review to see your progress and make the adjustments necessary to truly optimize the results of your site. Just as the reports above tell the wearer of the fitness device to move, Analytics reports tell us which marketing campaigns need attention, which landing pages need to be tweaked, what’s working well, and what’s not. In other words, an entire action plan can be derived just by periodic review of key performance indicators.

Conclusion

Yes people, an individual is similar to a business in this sense. While that won’t get you on the Fortune 500 list, it’s still something to ponder :) It’s an interesting comparison, but one that resonated in my mind and helps me apply the concept of “Measure, Analyze, Optimize” we learn in the world of analytics to my daily life as well. The thought did cross my mind that perhaps I should import this data into Tableau and go all nerdy with it, but that would be a bit too over-zealous. I’ll leave that for another day, or maybe I’ll have “a little extra motivation” once I start hitting my goals. :)

One key takeaway from this little lesson I learned: Performance is an attitude, not just a device.