Thursday, 27 February 2014

8 Custom Reports from the Google Analytics Solutions Gallery

The following is a guest post from Rachelle Maisner, who recently transitioned from Sr Analyst to Account Manager at Digitaria, a Google Analytics Certified Partner.
New analysts have it easy these days. Back in my day, we had to walk uphill in the snow both ways to get decent web reporting. My first brush with web analytics came several years ago, when I was a marketing coordinator at an advertising agency. I got the hand-me-down grunt work of pulling stats for one of our clients' websites using server logs. Server logs, people. It was painfully slow and gut-wrenchingly inefficient. So for the sake of my sanity, I started looking into better solutions, and I knew that if I could help the client with more meaningful reporting, it would make me look really good. When I found a solution I liked, I needed to pitch the client for their buy-in. That conversation went something like this: "I found this great tool, and it's free. We can install it on your website and try out this fast new reporting. It's called Google Analytics."
Since then, so many fantastic resources have become available to budding young analysts. From the Analysis Exchange to Avinash's own Market Motive courses, not to mention Google Analytics' recently revamped Analytics Academy, there's a wealth of quality education and training just a click away for anyone who's willing to learn. 
I'm blogging to tell you about one of my absolute favorite new resources, a tremendous goldmine of knowledge sharing unlike anything else this industry has seen: Google Analytics' very own Solutions Gallery.
The Solutions Gallery is a free, public platform that allows users to share custom reports, segments, and dashboards. It's an invaluable resource not only for those who are new to digital analytics, but also for analytics veterans looking for fresh ideas and new approaches. You can download reports and dashboards from experts all over the globe and instantly apply them to your own Google Analytics account. 

I was so excited about the Solutions Gallery that I uploaded 8 custom reports of my own to share with the community, and in about a month they had more than 1,600 imports. 
I have received wonderful feedback and gratitude for the custom reports I created, so I am thrilled to share them here on the Google Analytics blog and showcase them to a wider audience. I hope you find them helpful, and I hope they encourage you not only to get more from your data, but also to upload some of your own solutions to the Gallery.
All my custom reports are organized into four categories, based on the ABCs of analytics plus a D for bonus points: Acquisition, Behavior, Conversion, and Diagnostics.
A is for Acquisition
Visits and Goal Conversion by Traffic Source: Take your traffic source reports one step further by understanding volume and conversion for each channel. This is one way to see how your best visitors are getting to your site. I recommend setting up an "engaged visits" goal for this custom report and some of the reports that follow. When you import this custom report, change Goal One to your engaged visits goal, or to another significant KPI configured as a goal.
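If you would rather pull the same numbers programmatically, here is a minimal sketch using the Core Reporting API, assuming the google-api-python-client and oauth2client libraries and a service account with read access to your view. This sits outside the Solutions Gallery import flow, and the key file path, view ID, and goal index are placeholders to replace with your own.

import httplib2
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

# Authorize with a service account that has read access to the Analytics view.
SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
credentials = ServiceAccountCredentials.from_json_keyfile_name('service-key.json', SCOPES)
analytics = build('analytics', 'v3', http=credentials.authorize(httplib2.Http()))

# Sessions and Goal 1 completions by source/medium for the last 30 days.
# Swap goal1Completions for whichever goal number holds your "engaged visits" goal.
report = analytics.data().ga().get(
    ids='ga:12345678',            # placeholder view (profile) ID
    start_date='30daysAgo',
    end_date='today',
    metrics='ga:sessions,ga:goal1Completions',
    dimensions='ga:sourceMedium',
    sort='-ga:sessions').execute()

for source_medium, sessions, completions in report.get('rows', []):
    print(source_medium, sessions, completions)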
B is for Behavior
Page Effectiveness: Ever ponder the question, “How is my content doing?” This custom report provides page-level performance, allowing you to discover your top and bottom performing pages using various traffic and engagement metrics.
Social Sharing: A four-tab custom report chock full of site-to-social reporting. Tab 1 is the Shared Content Trend, showing how top pages are shared to social networks over time. Tab 2 is Top Shared Content by Network, a first step to discovering what content works for specific channels. Tab 3 is a report on Socially Engaged Visitors, providing a quick profile of visitors that engage in social sharing links. And finally, Tab 4 is Social Outcomes and Conversions, tying social engagement to site goals.
C is for Conversion
Simple E-Commerce Report: A starting point for trending revenue or purchases against visits, with a traffic sources breakdown.
PPC Campaign Goal Performance: Analyze paid search performance against goal conversion by search engine. Change Goal One completions to your engaged visits goal. This report filters for Google campaigns; to filter for Bing instead, change the source filter to "bing", or delete the filter to include all search engines.
PPC Keywords: Get a paid keyword report with traffic volume, CPC, goal conversions, and cost per conversion.
D is for Diagnostics
Page Timing: Use this custom report to QA page load timing and reveal problem pages. Switch from the "data" table view to the "comparison" table view and compare load time to bounce rate, so you can see each page's bounce rate against the site average.
Internal and External 404 Report: A custom report to help resolve 404 errors, with two tabs: Tab 1 covers bad inbound links and Tab 2 covers bad internal links. Be sure to change the "page title" filter to the page title used on your site's 404 page.
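The same idea can also be expressed as a Core Reporting API query if you prefer to script it. The sketch below is only a rough illustration, not part of the custom report itself: the view ID and key file are placeholders, and the page-title filter must match whatever title your site actually uses for its 404 page.

import httplib2
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
credentials = ServiceAccountCredentials.from_json_keyfile_name('service-key.json', SCOPES)
analytics = build('analytics', 'v3', http=credentials.authorize(httplib2.Http()))

# Pageviews of the 404 page, broken down by the bad URL and the referrer that linked to it.
not_found = analytics.data().ga().get(
    ids='ga:12345678',                         # placeholder view ID
    start_date='30daysAgo',
    end_date='today',
    metrics='ga:pageviews',
    dimensions='ga:pagePath,ga:fullReferrer',
    filters='ga:pageTitle==Page Not Found',    # change to your site's actual 404 page title
    sort='-ga:pageviews').execute()

for page, referrer, views in not_found.get('rows', []):
    print(views, page, '<-', referrer)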
Posted by Rachelle Maisner, Account Manager at Digitaria, a Google Analytics Certified Partner

Google joins the Global Alliance for Genomics and Health



Generating research data is easier than ever before, but interpreting and analyzing it is still hard, and getting harder as the volume increases. This is especially true of genomics. Sequencing the whole genome of a single person produces more than 100 gigabytes of raw data, and a million genomes will add up to more than 100 petabytes. In 2003, the Human Genome Project was completed after 15 years and $3 billion. Today, it takes closer to one day and $1,000 to sequence a human genome.

This abundance of new information carries great potential for research and human health -- and requires new standards, policies and technology. That’s why Google has joined the Global Alliance for Genomics and Health. The Alliance is an international effort to develop harmonized approaches to enable responsible, secure, and effective sharing of genomic and clinical information in the cloud with the research and healthcare communities, meeting the highest standards of ethics and privacy. Members of the Global Alliance include leading technology, healthcare, research, and disease advocacy organizations from around the world.

To contribute to the genomics community and help meet the data-intensive needs of the life sciences, we are introducing a preview of a simple web-based API for importing, processing, storing, and searching genomic data at scale, along with sample applications that use it.


Interoperability: One API, Many Apps
Any of the apps at the top (one graphical, one command-line, and one for batch processing) can work with information in any of the repositories at the bottom (one using cloud-based storage and one using local files). As the ecosystem grows, all developers and researchers benefit from each individual developer’s work.
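To make the interoperability idea concrete, here is a toy sketch in plain Python. The class and method names are entirely hypothetical and this is not the actual API; it only illustrates how one shared interface lets the same application code work against either a local file or a cloud-hosted repository.

# Toy illustration of "one API, many apps" -- NOT the real genomics API.
# Any client (graphical, command-line, or batch) written against the shared
# interface works with any backend that implements it.
from abc import ABC, abstractmethod
from typing import Iterator


class ReadStore(ABC):
    """Hypothetical shared interface for repositories of sequence reads."""

    @abstractmethod
    def search_reads(self, chromosome: str, start: int, end: int) -> Iterator[str]:
        """Yield read records overlapping the given genomic interval."""


class LocalFileReadStore(ReadStore):
    """Backend over a local flat file (one tab-separated read record per line)."""

    def __init__(self, path: str):
        self.path = path

    def search_reads(self, chromosome: str, start: int, end: int) -> Iterator[str]:
        with open(self.path) as handle:
            for line in handle:
                chrom, pos, record = line.rstrip('\n').split('\t', 2)
                if chrom == chromosome and start <= int(pos) < end:
                    yield record


class CloudReadStore(ReadStore):
    """Backend that would call a remote, cloud-hosted repository (stubbed here)."""

    def __init__(self, dataset_id: str):
        self.dataset_id = dataset_id

    def search_reads(self, chromosome: str, start: int, end: int) -> Iterator[str]:
        # In a real system this would issue an HTTPS request to the hosted service.
        raise NotImplementedError("network call omitted in this sketch")


def count_reads(store: ReadStore, chromosome: str, start: int, end: int) -> int:
    """Any app -- GUI, CLI, or batch job -- can be written against ReadStore."""
    return sum(1 for _ in store.search_reads(chromosome, start, end))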

With these first steps, it is our goal to support the global research community in bringing the vision of the Global Alliance for Genomics and Health to fruition. Imagine the impact if researchers everywhere had larger sample sizes to distinguish between people who become sick and those who remain healthy, between patients who respond to treatment and those whose condition worsens, between pathogens that cause outbreaks and those that are harmless. Imagine if they could test biological hypotheses in seconds instead of days, without owning a supercomputer.

We are honored to be part of the community, working together to refine the technology and evolve the ecosystem, and aligning with appropriate standards as they arise.

How you can be involved

To request access to the API for your research, please fill out this simple form to tell us about yourself and your research interests, and we will let you know when we’re ready to work with more partners.

Together with the members of the Global Alliance for Genomics and Health, we believe we are at the beginning of a transformation in medicine and basic research, driven by advances in genome sequencing and huge-scale computing. We invite you to contact us and share your ideas about how to bring data science and life science together.

Wednesday, 26 February 2014

New: AdWords Reports for App download campaigns

Today we’re happy to announce a deeper integration between AdWords and Google Analytics for Mobile Apps that will help advertisers make faster and better decisions about marketing their apps.

To put it simply: link your AdWords and Google Analytics accounts and enable auto-tagging, and you’ll start receiving a new set of detailed reports on things like day parts, destination URLs and keyword positions. These automatic reports show exactly how your search and display campaigns are performing and offer rich insights into the kind of users they’re driving to Google Play. 

Any user of both AdWords and Google Analytics can have this set of reports by just enabling auto-tagging. We handle the rest, so you can focus on optimizing rather than manually tagging AdWords campaigns.


These new reports can be found under the Acquisition menu for Google Analytics App Views. They’ll become visible to everyone over the next few days.

Insights for display & search campaigns
These new reports cover both display and search campaigns. You can: 
  • Check the Campaigns report to better understand the users being driven into your app, and to see how they use it. 
  • Find out from the Day Parts report when users are interacting with your campaigns.
  • Use Search reports to find out which keywords and search queries are acquiring the most new users.
A step towards measuring lifetime value of your customers
Ad campaigns should help you find the best customers. These new reports go a long way towards identifying them. Whether you track in-app revenue or specific goal conversions, you’ll be able to tie user quality to the campaign that brought them to your app.

One of our early Beta testers was Nubee, a Singapore-based game development studio. They shared their experience with us:

"We were satisfied that we could track which keywords attributed to sales. Using this data, we were also able to modify the download page." - Shizuka Watanbe, Head of PR, Nubee

We hope you’ll find these new reports useful. You can get them running by linking accounts and enabling auto-tagging today.

Posted by Rahul Oak, Product Manager, Google Analytics for Apps

Tuesday, 25 February 2014

Making Sense of Data with Google


In September 2013, Google announced joining forces with edX to contribute to their open source platform, Open edX. Since then we’ve been working together to expand this open education ecosystem. We’re pleased to announce our first online course built using Open edX. Making Sense of Data showcases the collaborative technology of Google and edX using cbX to run Open edX courses on Google App Engine.

The world is filled with lots of information; learning to make sense of it all helps us to gain perspective and make decisions. We’re pleased to share tools and techniques to structure, visualize, and analyze information in our latest self-paced, online course: Making Sense of Data.

Making Sense of Data is intended for anybody who works with data on a daily basis, such as students, teachers, journalists, and small business owners, and who wants to learn more about how to apply that information to practical problems. Participants will learn about the data process, create and use Fusion Tables (an experimental tool), and look for patterns and relationships in data. Knowledge of statistics or experience with programming is not required.

As in past courses, participants engage with course material through a combination of video and text lessons, activities, and projects. In this course, we will also introduce some new features that help create a more engaging participant experience. For example, participants will be able to access instant hangouts and live chats from the course web page for quick help or direct feedback. As with all of our MOOCs, you’ll learn from Google experts and collaborate with participants worldwide. You’ll also have the opportunity to complete a final project and apply the skills you’ve learned to earn a certificate.

Making Sense of Data runs from March 18 - April 4, 2014. Visit g.co/datasense to learn more and register today. We look forward to seeing you make sense of all the information out there!

Thursday, 20 February 2014

Monitoring the World's Forests with Global Forest Watch




By the time we find out about deforestation, it’s usually too late to take action.

Scientists have been studying forests for centuries, chronicling the vital importance of these ecosystems for human society. But most of us still lack timely and reliable information about where, when, and why forests are disappearing.

This is about to change with the launch of Global Forest Watch—an online forest monitoring system created by the World Resources Institute, Google and a group of more than 40 partners. Global Forest Watch uses technologies including Google Earth Engine and Google Maps Engine to map the world’s forests with satellite imagery, detect changes in forest cover in near-real-time, and make this information freely available to anyone with Internet access.

By accessing the most current and reliable information, everyone can learn what’s happening in forests around the world. Now that we have the ability to peer into forests, a number of telling stories are beginning to emerge.

Global forest loss far exceeds forest gain
Pink = tree cover loss; blue = tree cover gain

According to data from the University of Maryland and Google, the world lost more than 500 million acres of forest between 2000 and 2012. That’s the equivalent of losing 50 soccer fields’ worth of forest every minute of every day for 13 years! By contrast, only 0.8 million km² (roughly 200 million acres) of forest regrew, was planted, or was restored during the same period.


The United States’ most heavily forested region is made up of production forests
Pink = tree cover loss; blue = tree cover gain

The Southern United States is home to the nation’s most heavily forested region, making up 29 percent of the total U.S. forest land. Interestingly, the majority of this region is “production forests.” The mosaic of loss (pink) and gain (blue) in the above map shows how forests throughout this region are used as crops – grown and harvested in five-year cycles to produce timber or wood pulp for paper production.

This practice of “intensive forestry” is used all over the world to provide valuable commodities and bolster regional and national economies. WRI analysis suggests that if managers of production forests embrace a “multiple ecosystem services strategy”, they will be able to generate additional benefits such as biodiversity, carbon storage, and water filtration.


Forests are protected in Brazil’s indigenous territories
Pink = tree cover loss; dark green = forest; light green = degraded land or pastures
The traditional territory of Brazil's Surui tribe is an island of green surrounded by lands that have been significantly degraded and deforested over the past 10+ years. Indigenous communities often rely on forests for their livelihoods and cultural heritage and therefore have a strong incentive to manage forests sustainably. However, many indigenous communities struggle to protect their lands against encroachment by illegal loggers, which may be seen in Global Forest Watch using annual data from the University of Maryland and Google, or monthly alerts from Imazon, a Brazilian NGO and GFW partner.


Make Your Own Forest Map

Previously, the data required to make these maps was difficult to obtain and interpret, and most people lacked the resources necessary to access, view, and analyze the information. With Global Forest Watch, this data is now open to anyone with Internet access. We encourage you to visit Global Forest Watch and make your own forest map. There are many stories to tell about what is happening to forests around the world, and your stories can lead to action to protect these special and threatened places. What story will you tell?

Wednesday, 19 February 2014

Google Award Program stimulates Journalism and CS collaboration



Last fall, Google invited academic researchers to participate in a Computational Journalism awards program focused on the intersection of Computer Science and Journalism. We solicited proposals for original research projects relevant to today’s fast evolving news industry.

As technology continues to shape and be shaped by the media landscape, applicants were asked to rethink traditional models and roles in the ecosystem, and reimagine the lifecycle of the news story in the online world. We encouraged them to develop innovative tools and open source software that could benefit readers and be game-changers for reporters and publishers. Each award includes funding of $60,000 in cash and $20,000 in computing credits on Google’s Cloud Platform.

We congratulate the recipients of these awards, whose projects are described below, and look forward to the results of their research. Stay tuned for updates on their progress.

Larry Birnbaum, Professor of Electrical Engineering and Computer Science, and Journalism, Northwestern University
Project: Thematic Characterization of News Stories
This project aims to develop computational methods for identifying abstract themes or "angles" in news stories, e.g., seeing a story as an instance of "pulling yourself up by your bootstraps," or as a "David vs. Goliath" story. In collaboration with journalism and computer science students, we will develop applications utilizing these methods in the creation, distribution, and consumption of news content.

Irfan Essa, Professor, Georgia Institute of Technology
Project: Tracing Reuse in Political Language
Our goal in this project is to research and then develop a data-mining tool that allows an online researcher to find and trace language reuse. By language reuse, we specifically mean: can we determine whether language in a current text can be traced back to some other text or script? The technical innovation in this project is aimed at (1) identifying linguistic reuse in documents as well as other material that can be converted to text, which includes political speeches and videos, and (2) tracing that linguistic reuse through the web and online social networks.

Susan McGregor, Assistant Director, Tow Center for Digital Journalism, Columbia Journalism School
Project: InfoScribe
InfoScribe is a collaborative web platform that lets citizens participate in investigative journalism projects by digitizing select data from scanned document sets uploaded by journalists. One of InfoScribe's primary research goals is to explore how community participation in journalistic activities can help improve their accuracy, transparency and impact. Additionally, InfoScribe seeks to build and expand upon understandings of how computer vision and statistical inference can be most efficiently combined with human effort in the completion of complex tasks.

Paul Resnick, Professor, University of Michigan School of Information
Project: RumorLens
RumorLens is a tool that will aid journalists in finding posts that spread or correct a particular rumor on Twitter, by exploring the size of the audiences that those posts have reached. In the collection phase, the user provides one or a few exemplar tweets and then manually classifies a few hundred others as spreading the rumor, correcting it, or unrelated. This enables automatic retrieval and classification of the remaining tweets, which are then presented in an interactive visualization that shows audience sizes.

Ryan Thornburg, Associate Professor, School of Journalism and Mass Communication, University of North Carolina at Chapel Hill
Project: Public Records Dashboard for Small Newsrooms
Building on our Knight News Challenge effort to bring data-driven journalism to readers of rural newspaper websites, we are developing an internal newsroom tool that will alert reporters and editors to potential story tips found in public data. Our project aims to lower the cost of finding, in public data sets, stories that shine light in dark places, hold powerful people accountable, and explain our increasingly complex and interconnected world. (The public-facing site for the data acquisition element of the project is at http://open-nc.org.)

Ensuring Data Accuracy with a Tag Management Policy

The following is a guest post from Michael Loban, CMO at InfoTrust, a Google Analytics Certified Partner.

The quality of an organization's website analytics data is directly related to its tag management processes. Most likely, you can remember a time when one of the following happened:
  1. You found that one or more pages on your site were missing Google Analytics, or that some pages had Google Analytics deployed twice, causing duplicate pageviews and inflated traffic.
  2. Google Analytics custom variables were inconsistent or missing on some portions of the site, leading to data quality issues.
  3. An unauthorized marketing tag was piggybacking off of another tag.
  4. One of the tags on an international site you managed did not comply with the new EU cookie laws related to privacy.
Adopting a Tag Management System like Google Tag Manager is a great way to go, but having a great tool to organize and deploy your tags is often not enough. You still need a system, a process, and ongoing review. Here are the steps for creating a tag management policy for your company:

1. Know where you are – what tags are currently firing, where and how? Whether you have a small site with a few hundred pages or an international publication with thousands of pages, it is important to assess your current tag deployment. 

Can you say, with 100% confidence, that your analytics tags are located on every page? Are you sure the cookies set by your analytics tag/tool are accurate and not overwriting each other?

Regardless of whether you are confident or not, I suggest using a tool like TagInspector.com (Tag Inspector is an InfoTrust product). It will help you locate:
  1. All the tags on your site, broken down by page, including the pages they are missing from.
  2. Cookies set by various tags and what pages they are set on.
  3. How the tag is deployed – through a tag management system or directly from a page source.
  4. Instances of tag piggybacking – one tag being loaded by another tag.
Here is a screenshot from an example scan. It shows how tags load (commonly referred to as the tag hierarchy). We have removed the website URL, but as you can see, there are instances where Google Analytics is loaded by the TMS and instances where it is loaded directly from the page source. 
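If you want a quick do-it-yourself first pass before (or alongside) a dedicated scanner, something as simple as the sketch below can catch the most common problems. It is only a rough check, with placeholder URLs: it fetches a few pages and counts how many times the Google Analytics library is referenced in the raw HTML, so it will not see tags injected later by a tag management system.

import re
import requests

PAGES = [
    'https://www.example.com/',          # placeholder URLs -- use your own page list or sitemap
    'https://www.example.com/products',
    'https://www.example.com/contact',
]

# Counts references to the Google Analytics library script in the page source
# (classic ga.js or Universal Analytics analytics.js).
GA_PATTERN = re.compile(r'google-analytics\.com/(?:ga|analytics)\.js')

for url in PAGES:
    html = requests.get(url, timeout=10).text
    hits = len(GA_PATTERN.findall(html))
    status = 'OK' if hits == 1 else ('MISSING' if hits == 0 else 'DUPLICATE?')
    print(f'{status:10} {hits} match(es)  {url}')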

2. Document all approved tags. The average enterprise website might have 25-50 marketing tags, and not all of them have to be present on every page. However, even if you are considering moving to a Tag Management System, or are already using one, it is not a bad idea to have the following documented and categorized (a minimal example follows this list):
  1. Tag name and functionality
  2. Pages or the category pages the tag needs to be on
  3. Information collected through the tag about visitors (cookies set)
  4. Firing rules
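As promised above, here is one lightweight way to keep that inventory in a form you can version-control and feed into automated checks: a plain data structure. The tag names, pages, cookies, and firing rules shown are illustrative examples only, not a recommended list.

# Illustrative only -- substitute your own tags, pages, cookies, and firing rules.
APPROVED_TAGS = [
    {
        'name': 'Google Analytics (analytics.js)',
        'functionality': 'Pageview and event tracking',
        'pages': 'All pages',
        'cookies': ['_ga'],
        'firing_rule': 'Once per page, on every page load, via the TMS',
    },
    {
        'name': 'AdWords conversion tag',
        'functionality': 'Paid search conversion tracking',
        'pages': '/order/confirmation',
        'cookies': ['Conversion cookie set by googleadservices.com'],
        'firing_rule': 'Order confirmation page only',
    },
]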

Check out Tagopedia, a wiki of tags, to learn more about the many different tag types.

3. Consider the implementation of a Tag Management System. There is a reason this is step three, and not step one or two. A lot of companies jump to this step first, thinking that a new technology will miraculously make all tagging issues disappear. The first step in moving to a TMS is knowing what tags you need to keep, and where they are or how they are loaded on your site so you can remove them from the source after the tag management system is deployed.

When considering the implementation of a tag management system, think about your team. Every TMS vendor's website says you will no longer need your IT team to make changes to your tags, thus simplifying and expediting the process. Yet I have met plenty of marketers who do not want anything to do with a TMS. Even though you will free up your IT resources, you will still need a person or team with the technical training to manage your tags. 

Naturally, your first step in evaluating Tag Management vendors should be outlining what features you really need. Google Tag Manager is free, and is one of the few TMS systems that works for both mobile websites and native mobile applications. 

NOTE: If you do decide to migrate to a TMS, or if you have already done so, you should still scan all the pages across your site to ensure that your tags fire correctly – for example, once per page for analytics tags, and only from your TMS. You certainly want to avoid having a tag both in the source of your page and inside a TMS; this will inflate your data and cause data quality issues.

4. Run ongoing site audits to ensure the correct tags are deployed across the correct pages. Ideally, this will only serve as insurance. Ongoing site scans or audits can help you avoid the moment you realize you did not capture AdWords conversions because your GA or AdWords conversion tag was removed from the conversion page. Keep in mind that certain tags might only fire when a user views your website on a mobile device, so your scan might need to simulate different user agents. Doing this manually for all the sites you manage, or across one very large site, can be quite challenging. Again, TagInspector.com can help speed up this process and dramatically reduce the effort required. Here is an example screenshot of the scanning options:
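For a quick do-it-yourself version of the user-agent check mentioned above, the sketch below fetches the same page with a desktop and a mobile User-Agent header and reports whether a Google Analytics reference appears in each response. The URL and user-agent strings are placeholders only; a hosted scanner will of course cover far more than this.

import requests

URL = 'https://www.example.com/'   # placeholder -- the page you want to spot-check

# Illustrative user-agent strings; substitute whatever devices matter to you.
USER_AGENTS = {
    'desktop': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
    'mobile': 'Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X)',
}

for device, user_agent in USER_AGENTS.items():
    html = requests.get(URL, headers={'User-Agent': user_agent}, timeout=10).text
    has_ga = 'google-analytics.com' in html
    print(f'{device:8} Google Analytics reference present: {has_ga}')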

5. Think ahead – will you be able to innovate? Complete lock down is in nobody’s best interests. What happens if there is a new platform for A/B testing that you would like to try? How long will it take you to get the tag approved, implemented on your site, verify its performance, and launch a campaign? Keep innovation in mind and make it relatively easy for marketers in your company to adopt new technologies.

One way to go about this is to have a request form that must be completed and approved before a new tag is implemented. This will help you ensure that only tags that meet company standards make it onto your site. 

At the end of the day, tag deployment and data collection will only get more complex. If you do not have any process for managing your tags, it is time to start. If you have some kind of process, perhaps it is time for optimization. Get all the stakeholders in the room, and decide who will be your tag management team, and what the first step will be to ensure tag accuracy. You can’t do analysis if the data isn’t accurate. And your data won’t be accurate if your marketing tags aren’t implemented correctly. 

If you would like to learn more about implementing a tag management policy, we invite you to attend a free webinar on March 26th at 1:00 PM EST, where we will discuss the items outlined in this post, and a lot more. 

Posted by Michael Loban, CMO at Google Analytics Certified Partner InfoTrust

Tuesday, 18 February 2014

Google Research Awards: Winter 2014



We have just completed another round of the Google Research Awards, our biannual open call for proposals on computer science-related topics including robotics, natural language processing, systems, policy, and mobile. Our grants cover tuition for a graduate student and provide both faculty and students the opportunity to work directly with Google researchers and engineers.

This round we received 691 proposals, an increase of 19% over last round, covering 46 countries on 6 continents. After expert reviews and committee discussions, we decided to fund 115 projects. The subject areas that received the highest level of support were human-computer interaction, systems, and machine learning, with 25% of the funding awarded to universities outside the U.S.

We set a new record this round with over 2000 reviews done by 650 reviewers. Each proposal is reviewed by internal committees who provide feedback on merit and relevance. In many cases, the committees include some of the foremost experts in the world. All committee members are volunteers who spend a significant amount of time making the Research Award program happen twice a year.

Congratulations to the well-deserving recipients of this round’s awards. If you are interested in applying for the next round (deadline is April 15), please visit our website for more information.

Thursday, 13 February 2014

Watchfinder clocks 1300% ROI using precision Remarketing with Google Analytics

Watchfinder is a leading UK retailer of premium, pre-owned watches. The company was founded in 2002 as an online-only store selling watches from more than 80 premier manufacturers. Today, it has an annual turnover of £25 million and has recently opened a flagship boutique in the London Royal Exchange.


Counting the hours 
With an average order value of over £3,500 on its site, Watchfinder found that buying decisions tended to take time, often spanning weeks or months. In fact, less than 1% of visitors were completing purchases on their first site visit. Watchfinder's challenge was to re-engage these visitors and maintain a conversation with them, encouraging them to return and place an order. In addition to driving customers back to its site, Watchfinder also wanted to encourage customers to visit its new physical boutique in the London Royal Exchange.

A moment to reconnect
Watchfinder’s agency Periscopix – a Google Analytics Certified Partner – suggested Remarketing with Google Analytics as a great way to reconnect with users. Remarketing with Google Analytics allows advertisers to tap into valuable insights about website visitors who show an interest in products, identify the most relevant audiences, and run ads across the Google Display Network that are tailored to that audience using the industry’s most powerful segmentation capabilities.

Periscopix created 20 highly focused lists of visitors who demonstrated intent but did not purchase. Specifically, lists were based on various aspects of user context such as location, language, and what stage of the purchase funnel they were in. On-site behavior helped establish groups that had spent a certain amount of time on the site or had viewed a certain number of pages. Other lists were created around users who had viewed a specific watch brand on the site.

Additionally, traffic performance analysis across a variety of Google Analytics dimensions revealed that certain ISPs in the London financial district yielded traffic with much higher engagement and above-average conversion rates. As a result, Periscopix designed segments around investment banks such as JPMorgan and Goldman Sachs to engage employees at these companies.


Google Analytics’ functionality enabled Periscopix to convey tailored messages to these key groups of interested consumers. For example, London-based users were retargeted with ads encouraging visits to the new London store, while visitors to the .co.uk site from France were retargeted with ads promoting the French site. 

Time well spent
Thanks to clear reporting in Google Analytics, it's been easy to see the impressive results from Watchfinder's remarketing campaign. Six months in, Periscopix reports a return on investment of 1,300%. Average order value on the site has also increased by 13%, and CPAs are 34% lower than in Watchfinder's non-brand search campaigns. 

Across all tactics used, the remarketing list that produced the highest conversion rates, both in terms of goals and transactions, was made up of visitors who browsed for 10 minutes or more on their initial site visit without purchasing.

Given Watchfinder's early success with Remarketing with Google Analytics across the Google Display Network, the brand is excited to increase investment in this area going forward.

Be sure to check out the whole case study here.

Posted by the Google Analytics Team

Monday, 10 February 2014

Zillow uses Google Analytics Premium to make data-driven decisions

Google Analytics Premium lets Zillow grow and scale their company.

“A host of functions at Zillow use Google Analytics every day… Marketing, business intelligence, design, engineering and usability are using it to drive product decisions, user experience decisions, and business decisions.”
- Jeremy Wacksman, VP Marketing at Zillow

Zillow is the home and real estate marketplace that helps people share vital information about home values, rentals, mortgages and a lot more. Zillow was founded in 2005 and now has over 110 million U.S. homes in its living database. (That name? A combination of “zillions of data points” and the pillows where happy homeowners rest their heads.)
Recently we sat down with Jeremy Wacksman, Zillow’s VP of Marketing, to learn how they’ve been using Google Analytics Premium to help them grow at such an amazing pace. Here’s what he told us:



This comment stands out: “As an Internet company that has reinvented itself as a mobile-first business, analytics across devices is a big challenge for us.”  

A lot of companies are reinventing themselves for mobile today, and we've been working hard to make sure Google Analytics Premium can help them measure all those new cross-device journeys. The goal, as always, is to help businesses gather meaningful data and easily discover insights they can act on to improve results and boost the bottom line.

Learn more about Google Analytics Premium here.

Posted by Adam Singer, Google Analytics Advocate