It’s no secret: Higher education institutions are complex. 

Between multiple campuses, multiple audiences, and a high volume of content, higher ed marketing and communications teams have a ton to juggle.

And that’s before you throw a new website into the mix.   

Not long ago, the team at Oomph partnered with the University of Colorado (CU) and Keene State College (KSC) to redesign sites for each institution. While their asks – and end products – were unique, the processes had a lot in common. So much so that we’re pulling back the curtain on our discovery process to give other higher ed institutions the tools to deliver websites that meet business goals and audience needs. 

In this article, we’re turning our lessons learned into a discovery playbook that can help higher education institutions set the stage for a successful site redesign. 

The Projects

University of Colorado Giving Platform 

The University of Colorado has an active and engaged alumni network that loves supporting all things CU. The university came to Oomph because it needed a donor funds platform that could keep up. The goals of the discovery process were to:

While CU had a gut feeling about what it would take to meet internal expectations and keep prospective donors happy, gut feelings aren’t enough to build a website. CU knew that a professional perspective and data-backed analysis would lay a firm foundation for the site redesign. 

Keene State College Main Website

KSC, a public liberal arts college in New Hampshire, wanted a refreshed main website that would resonate with prospective students, current students, and alumni alike. For KSC, key goals during discovery were to:

The team came to Oomph with ideas but wanted research validation and guidance to nurture those ideas into a strategic design plan. 

The Approach

For both projects, Oomph utilized our in-depth discovery process to validate assumptions, clarify priorities, and gain buy-in across the organizations. 

KSC and CU both had a good sense of the work that needed to be done. But having a feel for the floorplan doesn’t mean you’re ready to build your dream house. Whether it’s a home or a website, every project needs an architect: an experienced professional who can consider all the requirements and create a strategic framework that can support them. That work should happen before deciding what paint to put on the walls. 

In our initial review, Oomph found that both sites had similar challenges: They struggled to focus on one key audience and to easily guide users through the site to the desired content. Our question was: How do we solve those struggles in a new website? 

To answer it, we led KSC and CU through discovery processes that included:

  1. Conducting an intake questionnaire and live sessions with key stakeholders to understand the goals for the redesign and the challenges holding the current sites back. 
  2. Defining strengths and areas for improvement with methods like a UX audit, a content and analytics audit, and a cohort analysis.
  3. Creating user journey maps that rolled audience, website, and competitive insights into a unified vision for the new user experience. 
  4. Delivering a final set of data-backed recommendations that translated needs and wants into actionable next steps, equipping both teams to secure organizational approval and move the projects forward. 

The Insights

Discovery isn’t a one-size-fits-all process. However, our experiences with KSC and CU highlighted some common challenges that many higher ed institutions face. Our insights from these projects offer a starting point for other institutions kicking off their own website redesigns.

  1. Start with your audience’s needs.

Who is your primary audience? Figure out who they are, then really drill down on their needs, preferences, and desired actions. This can be uniquely challenging for higher education institutions because they serve such a wide range of people.

Data is how you overcome that hurdle. As part of discovery, we: 

When you do that work, you get a bird’s-eye view of what your audience really needs – and what it’ll take for your website to be up to the challenge. 

Take KSC. The existing site attempted to serve multiple audiences, creating a user journey that looked like this: 

Sample of a current user Journey Map

We identified a primary audience of prospective students, specifically local high school students and their parents/guardians. When you speak directly to those students, you get a simpler, cleaner user journey that looks like this: 

Sample of an ideal user Journey Flow
  2. Define organized and clear navigation to support user journeys. 

Your navigation is like a map. When all the right roads are in place, it should help your users get where they want to go. 

That makes your navigation the starting point for your redesign. Your goal is to define where content will live, the actions users can take upon arrival, and, equally important, the content they won’t see at first. This then informs what goes where – the header nav, the footer nav, or the utility nav – because each has unique and complementary purposes. 

With both KSC and CU, discovery was our opportunity to start building a navigation that would serve the primary audience we had already uncovered. For CU, the current navigation looked like this: 

Current CU navigation: 

During discovery, we created an updated navigation that would appeal to its primary audience of prospective donors, while still meeting the needs of secondary audiences (returning donors and giving professionals).  

Proposed CU navigation: 

Proposed information architecture map for the University of Colorado Giving site
  3. Find an optimal blend of branding, design, and content – especially for the home page and other high-visibility areas.

The design and content you choose for your site should resonate with your target audience and enhance the navigation you already defined. In that way, your home page is like your storefront. What will you put on the sign or display in the windows so people will actually walk inside? 

That’s the secret sauce behind this part of discovery: deciding what your primary audience really needs to know and how best to showcase it. 

To help KSC speak to prospective students, we recommended: 

CU wanted to connect with prospective donors, so we suggested a design that: 

  4. Probe additional areas where needed (and skip where it’s not).

Our hot take: There is such a thing as too much data. If you’re wading through pools of information that isn’t relevant to your end user, it can muddy the waters and make it harder to identify what’s worth acting on. 

With that in mind, this step will change from project to project. Ask yourself, what else does my audience need to feel like they got what they came for? 

For KSC, that involved additional strategy work – like the information architecture – that helped the institution gear up for later design phases. CU, on the other hand, needed significant technical discovery to address the level of custom code required, limited page-building capabilities, and a clunky e-commerce integration. We recommended an updated tech stack, including a new donation platform and payment gateway, that would improve security, simplify maintenance, and enhance the user experience. 

  5. Plan for a system that allows for easy updates later.

As they say, the only constant is change. This rings especially true for higher ed institution websites, where content is plentiful and multiple stakeholders need to make site updates. 

To make sure CU and KSC’s sites continued to work for them long after our projects had ended, our discovery included suggestions around:

Start Your Redesign on the Right Foot

A thorough, well-researched, and well-organized discovery is key for designing a website that meets all of your – and your audience’s – needs. 
Need a fresh perspective on your higher ed site redesign? Let’s talk about it.

There’s a new acronym on the block: MACH (pronounced “mock”) architecture. 

But like X is to Twitter, MACH is more a rebrand than a reinvention. In fact, you’re probably already familiar with the M, A, C, and H and may even use them across your digital properties. While we’ve been helping our clients implement aspects of MACH architecture for years, organizations like the MACH Alliance have recently formed in an attempt to provide clearer definition around the approach, as well as to align their service offerings with the technologies at hand. 

One thing we’ve learned at Oomph after years of working with these technologies? It isn’t an all-or-nothing proposition. There are many degrees of MACH adoption, and how far you go depends on your organization and its unique needs. 

But first, you need to know what MACH architecture is, why it’s great (and when it’s not), and how to get started. 

What Is MACH?

MACH is an approach to designing, building, and testing agile digital systems — particularly websites. It stands for microservices, APIs, cloud-native, and headless. 

Like a composable business, MACH unites a few tried-and-true components into a single, seamless framework for building modern digital systems. 

The components of MACH architecture are: 

  1. Microservices: Many online features and functions can be separated into more specific tasks, or microservices. Modern web apps often rely on specialized vendors to offer individual services, like sending emails, authenticating users, or completing transactions, rather than a single provider to rule them all. 
  2. APIs: Microservices interact with a website through APIs, or application programming interfaces. APIs let developers change the site’s architecture without impacting the applications that consume those APIs, and let vendors easily offer their services to customers (see the sketch after this list).
  3. Cloud-Native: A cloud-based environment hosts websites and applications via the Internet, ensuring scalability and performance. Modern cloud technology like Kubernetes, containers, and virtual machines keeps applications consistent while meeting the demands of your users. 
  4. Headless: Modern JavaScript frameworks like Next.js and Gatsby empower intuitive front ends that can be coupled with a variety of back-end content management systems, like Drupal and WordPress. This gives administrators the authoring power they want without impacting end users’ experience. 
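
To make the microservices and API pieces concrete, here’s a minimal sketch of how a headless front end might compose two specialized services during checkout. The endpoints, payloads, and service names are hypothetical placeholders, not any specific vendor’s API:

```ts
// Hypothetical example: one user action fans out to two specialized
// microservices over plain HTTP APIs instead of a single monolith.
type Order = { id: string; email: string; total: number };

export async function placeOrder(order: Order): Promise<void> {
  // Payment microservice handles the transaction via its API.
  await fetch("https://api.payments.example.com/v1/charges", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ orderId: order.id, amount: order.total }),
  });

  // Transactional email microservice sends the confirmation.
  await fetch("https://api.email.example.com/v1/send", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ to: order.email, template: "order-confirmation" }),
  });
}
```

Because each service sits behind its own API, either one can be swapped for a different vendor without rebuilding the front end.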

Are You Already MACHing? 

Even if the term MACH is new to you, chances are good that you’re already doing some version of it. Here are some telltale signs:

If you’re doing any of the above, you’re MACHing. But the magic of MACH is in bringing them all together, and there are plenty of reasons why companies are taking the leap. 

5 Benefits of MACH Architecture

If you make the transition to MACH, you can expect: 

  1. Choice: Organizations that use MACH don’t have to settle for one provider that’s “good enough” for the countless services websites need. Instead, they can choose the best vendor for the job. For example, when Oomph worked with One Percent for America to build a platform offering low-interest loans to immigrants pursuing citizenship, that meant leveraging the Salesforce CRM for loan approvals, while choosing “Click and Pledge” for donations and credit card transactions. 
  2. Flexibility: MACH architecture’s modular nature allows you to select and integrate individual components more easily and seamlessly update or replace those components.  Our client Leica, for example, was able to update its order fulfillment application with minimal impact to the rest of its Drupal site. 
  3. Performance: Headless applications often run faster and are easier to test, so you can deploy knowing you’ve created an optimal user experience. For example, we used a decoupled architecture for our client Wingspans to create a stable, flexible, and scalable site with lightning-fast performance for its audience of young career-seekers.     
  4. Security: Breaches are generally limited to individual features or components, keeping your entire system more secure. 
  5. Future-Proofing: A MACH system scales easily because each service is individually configured, making it easier to keep up with technologies and trends and avoid becoming out-of-date. 

5 Drawbacks of MACH Architecture

As beneficial as MACH architecture can be, making the switch isn’t always smooth sailing. Before deciding to adopt MACH, consider these potential pitfalls. 

  1. Complexity: With MACH architecture, you’ll have more vendors — sometimes a lot more — than if you run everything on one enterprise system. That’s more relationships to manage and more training needed for your employees, which can complicate development, testing, deployment, and overall system understanding. 
  2. Challenges With Data Parity: Following data and transactions across multiple microservices can be tricky. You may encounter synchronization issues as you get your system dialed in, which can frustrate your customers and the team maintaining your website. 
  3. Security: You read that right — security is a potential pro and a con with MACH, depending on your risk tolerance. While your whole site is less likely to go down with MACH, working with more vendors leaves you more vulnerable to breaches for specific services. 
  4. Technological Mishaps: As you explore new solutions for specific services, you’ll often start to use newer and less proven technologies. While some solutions will be a home run, you may also have a few misses. 
  5. Complicated Pricing: Instead of paying a single price for an enterprise system, MACH means buying multiple subscriptions that can fluctuate more in price. This, coupled with the increased overhead of operating a MACH-based website, can burden your budget. 

Is MACH Architecture Right for You? 

In our experience, most brands could benefit from at least a little bit of MACH. Some of our clients are taking a MACH-lite approach with a few services or apps, while others have adopted a more comprehensive MACH architecture. 

Whether MACH is the right move for you depends on your: 

  1. Platform Size and Complexity: Smaller brands with tight budgets and simple websites may not need a full-on MACH approach. But if you manage content across multiple sites and apps, handle a high volume of communications and transactions, and need to iterate quickly to keep up with rapid growth, MACH is often the way to go. 
  2. Level of Security: If you’re in a highly regulated industry and need things locked down, you may be better off with a single enterprise system than a multi-vendor MACH solution.  
  3. ROI Needs: If it’s time to replace your system anyway, or you’re struggling with internal costs and the diminishing value of your current setup, it may be time to consider MACH. 
  4. Organizational Structure: If different teams are responsible for distinct business functions, MACH may be a good fit. 

How To Implement MACH Architecture

If any of the above scenarios apply to your organization, you’re probably eager to give MACH a go. But a solid MACH architecture doesn’t happen overnight. We recommend starting with a technology audit: a systematic, data-driven review of your current system and its limitations.

We recently partnered with career platform Wingspans to modernize its website. Below is an example of the audit and the output: a seamless and responsive MACH architecture. 

The Audit

  1. Surveys/Questionnaires: We started with some simple questions about Wingspans’ website, including what was working, what wasn’t, and the team’s reasons for updating. They shared that they wanted to offer their users a more modern experience. 
  2. Stakeholder Interviews: We used insights from the surveys to spark more in-depth discussions with team members close to the website. Through conversation, we uncovered that website performance and speed were their users’ primary pain points. 
  3. Systems Access and Audit: Then, we took a peek under the hood. Wingspans had already shared its poor experiences with previous vendors and applications, so we wanted to uncover simpler ways to improve site speed and performance. 
  4. Organizational Structure: Understanding how the organization functions helped us design a system to meet its needs. The Wingspans team was excited about modern technology and relatively savvy, but they also needed a system that could accommodate thousands of authenticated community members. 
  5. Marketing Plan Review: We also wanted to understand how Wingspans would talk about their website. They sought an “app-like” experience with super-fast search, which gave us insight into how their MACH system needed to function. 
  6. Roadmap: Wingspans had a rapid go-to-market timeline. We simplified our typical roadmap to meet that goal, knowing that MACH architecture would be easy to update down the road. 
  7. Delivery: We recommended Wingspans deploy as a headless site (a site we later developed for them), with documentation we could hand off to their design partner. 

The Output 

We later deployed Wingspans.com as a headless site using the following components of MACH architecture:

  1. Microservices: Wingspans leverages microservices like Algolia Search for site search, Amazon AWS for email sends and static site hosting, and Stripe for managing transactions (see the search sketch after this list).
  2. APIs: Wingspans.com communicates with the above microservices through simple APIs. 
  3. Cloud-Native: The new website uses cloud-computing services like Google Firebase, which supports user authentication and data storage. 
  4. Headless: Gatsby powers the front-end design, while Cosmic JS is the back-end content management system (CMS). 
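
As an illustration of the microservices piece, here’s a minimal sketch of how a front end might query a hosted search service like Algolia, assuming the v4 JavaScript client. The app ID, API key, and index name are placeholders, not Wingspans’ actual configuration:

```ts
import algoliasearch from "algoliasearch/lite";

// Placeholder credentials and index name, for illustration only.
const client = algoliasearch("YOUR_APP_ID", "YOUR_SEARCH_ONLY_API_KEY");
const careersIndex = client.initIndex("careers");

export async function searchCareers(query: string) {
  // The search request goes straight to the search microservice,
  // bypassing the CMS entirely.
  const { hits } = await careersIndex.search(query, { hitsPerPage: 10 });
  return hits;
}
```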

Let’s Talk MACH

As MACH evolves, the conversation around it will, too. Wondering which components may revolutionize your site and which to skip (for now)? Get in touch to set up your own technology audit.

Finding yourself bogged down with digital analytics? Spending hours just collecting and organizing information from your websites and apps? Looker Studio could be the answer to all your problems (well, maybe not all of them, but at least where data analytics are concerned).

This business intelligence tool from Google is designed to solve one of the biggest headaches out there for marketers: turning mountains of website data into actionable insights. Anyone who’s ever gone down the proverbial rabbit hole scouring Google Analytics for the right metrics or manually inputting numbers from a spreadsheet into their business intelligence platform knows that organizing this data is no small task. With Looker Studio, you can consolidate and simplify complicated data, freeing up more time for actual analysis.

With so many customizable features and templates, it does take time to set up a Looker Studio report that works for you. Since Google’s recent switch from Universal Analytics to Google Analytics 4, you might also find that certain Looker Studio reports aren’t working the way they used to.

Not to worry: Our Oomph engineers help clients configure and analyze data with Looker Studio every day, and we’ve learned a few tips along the way. Here’s what to know to make Looker Studio work for your business.

The benefits of using Looker Studio for data visualization and analysis

Formerly known as Google Data Studio, Looker Studio pulls, organizes, and visualizes data in one unified reporting experience. For marketers who rely heavily on data to make informed decisions, Looker Studio can save precious time and energy, which you can then invest in analyzing and interpreting data.

Key benefits of using Looker Studio include:

How Oomph uses Looker Studio

As a digital-first company in the business of helping other digital-first companies, we’re big fans of Looker Studio. We think the platform is a great way to share trends on your websites and apps in an easy-to-digest way, making monthly or quarterly reporting much more efficient.

Whether you’re looking for basic insights or need sophisticated analysis, Looker Studio’s visualization capabilities can support smarter, more informed digital decision-making. Here’s a peek at some of the metrics we monitor for our own business, including:

Oomph Looker Studio sample dashboard

We also use the platform to drill deeper, comparing trends over time, identifying seasonal fluctuations and assessing the performance of specific campaigns. We leverage features like dashboards and filters in Looker Studio to give our clients an interactive view of their data.

How Looker Studio Works With GA4

Google Analytics, now known as GA4, is one of the primary tools we connect to Looker Studio. GA4 is the latest version of Google’s popular analytics platform and offers new features and functionality compared with its predecessor, Universal Analytics (UA), including new data visualization capabilities.

As many companies migrate over to GA4, they may be wondering whether reporting will be similar between GA4 and Looker Studio – and whether they need both.

While GA4’s built-in reports may seem to rival Looker Studio’s capabilities, Looker Studio provides a variety of features that go beyond what GA4 can do on its own. While GA4 dashboards and reports include only GA4 data, Looker Studio can import data from other sources as well. This means you can use Looker Studio to track trends in your site’s performance, regardless of the data source.

Looker Studio also connects to Looker’s modeling language, LookML, which allows users to create custom data models and transformations. This means you can tailor your data to your specific needs, rather than being limited by GA4’s built-in reporting. Finally, Looker Studio’s robust sharing and collaboration features allow teams to share data and insights easily and efficiently.

If your company set up Looker Studio before switching to GA4, you may notice a few metrics are now out of sync. Here are a few adjustments to get everything working correctly:

How To Set Up a Looker Studio Report

  1. Choose a template for your dashboard or create one from scratch. If you’re not sure, you can browse through templates to get an idea of what Looker can do.
A view into the Looker Studio template gallery
  2. Connect your data source. Looker supports a long list of sources, including Google, MySQL, AWS Redshift, and more. Don’t worry if your data isn’t in Google – Looker will likely be able to connect to it regardless.
Add data to a report using built-in Google connectors…
…or search for specific connectors, some of which are provided by partners
  3. Choose your metrics. These are the specific data points you want to track and analyze in your report. You can customize your metrics to fit your specific needs.
  4. Build your dashboard. You can add charts, tables, and other visualizations to help you understand your data. Looker makes it easy to drag and drop these elements into place.
  5. Share it with others. You can either create a share link so that others can access the dashboard directly, or you can set up automatic updates to be sent on a regular basis. This makes it easy for others to stay up-to-date on changes and progress.
Reports can be emailed to participants on a schedule using Looker’s scheduling tool

A Powerful Path To Data Insights

The digital landscape is growing more fragmented and complex by the day, but tools like Looker Studio make it infinitely easier to find your path forward. Taking the time to configure and customize the platform can deliver major ROI by helping you understand user needs, pinpoint website strengths and challenges, and craft the right digital strategy.

Crunched for time or not sure where to start? Oomph can help take the hassle out of data analysis by setting up and monitoring your Looker Studio dashboards. Get in touch to chat about your needs.

Have you ever tried to buy tickets to a concert and experienced the frustration and eventual rage of waiting for pages to load, unresponsive pages, unclear next steps, timers counting down, or submit buttons that simply don’t work — and you probably still walked away with zero tickets? Yeah, you probably had some choice words, and your keyboard and mouse might have suffered your ire in the process.

As a website owner, you strive to create a seamless user experience for your audience. Ideally, one that doesn’t involve them preparing to star in their own version of the printer scene in Office Space. Despite your best efforts, there will be times when users get frustrated due to slow page loads, broken links, navigation loops, or any other technical issues. This frustration can lead to what the industry calls “rage clicks” and “thrashed cursors.” When your users are driven to these actions, your website’s reputation, engagement, and return visits can be damaged. Let’s dig in to discuss what rage clicks and thrashed cursors are and how to deal with frustrated users.

First of all, what are Rage Clicks?

Rage clicks are when a user repeatedly clicks on a button or link that fails to respond immediately — the interface offers no feedback that their first click did anything. This bad user experience doesn’t motivate them to return for more. These clicks are often accompanied by loud and audible sighs, groans, or even yelling. “Come on, just GO!” might ring a bell if you’ve ever been in this situation. Rage clicks are one of the most frustrating things a user can experience when using a website or app.

Rage Clicks are defined technically by establishing that:

  1. At least three clicks take place
  2. These three clicks happen within a two-second time frame
  3. All clicks occur within a 100px radius
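
Putting those thresholds into practice, here’s a minimal sketch of a browser-side rage-click detector in TypeScript. The reporting call is a placeholder; in a real setup you’d forward the signal to your analytics or session-replay tool:

```ts
// Flags a possible rage click: at least 3 clicks, within 2 seconds,
// all inside a 100px radius of the first click.
type Click = { x: number; y: number; t: number };

const recent: Click[] = [];

function isRageClick(clicks: Click[]): boolean {
  if (clicks.length < 3) return false;
  const first = clicks[0];
  const last = clicks[clicks.length - 1];
  const withinTime = last.t - first.t <= 2000;
  const withinRadius = clicks.every(
    (c) => Math.hypot(c.x - first.x, c.y - first.y) <= 100
  );
  return withinTime && withinRadius;
}

document.addEventListener("click", (e) => {
  const now = performance.now();
  recent.push({ x: e.clientX, y: e.clientY, t: now });
  // Drop clicks older than the 2-second window.
  while (recent.length && now - recent[0].t > 2000) recent.shift();
  if (isRageClick(recent)) {
    console.warn("Possible rage click detected", { x: e.clientX, y: e.clientY });
    recent.length = 0; // report each episode once
  }
});
```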

Similarly, what is a Thrashed Cursor?

A thrashed cursor is when a user moves the cursor back and forth over a page or element, indicating impatience or confusion. Various issues, including slow page load times, broken links, unresponsive buttons, and unclear navigation, can cause users to exhibit these digital behaviors. It can also indicate the user is about to leave the site if they cannot find that solution quickly.

Thrashed cursors are defined technically by establishing that:

  1. There is an area on the page where a user was moving their mouse erratically
  2. An established pattern of “thrashing” occurs around or on specific elements or pages
  3. A higher rate of user exits occurs from the identified pages
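
Unlike rage clicks, there’s no single industry-standard threshold for cursor thrashing, but a rough browser-side heuristic is to count rapid reversals in pointer direction. The time window and reversal count below are illustrative assumptions, not established benchmarks:

```ts
// Rough heuristic: flag "thrashing" when the pointer reverses horizontal
// direction many times within a short window.
let lastX = 0;
let lastDirection = 0;
let reversalTimes: number[] = [];

document.addEventListener("mousemove", (e) => {
  const now = performance.now();
  const direction = Math.sign(e.clientX - lastX);
  if (direction !== 0 && lastDirection !== 0 && direction !== lastDirection) {
    reversalTimes.push(now);
  }
  lastX = e.clientX;
  if (direction !== 0) lastDirection = direction;

  // Keep only reversals from the last 1.5 seconds (illustrative window).
  reversalTimes = reversalTimes.filter((t) => now - t <= 1500);
  if (reversalTimes.length >= 6) {
    console.warn("Possible cursor thrashing near", e.target);
    reversalTimes = [];
  }
});
```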

Why do Rage Clicks and Thrashed Cursors happen?

Common reasons rage clicks and thrashed cursors happen are:

  1. Poor Design: Poor design is one of the most common reasons for rage clicks and thrashed cursors. If the website has a confusing layout or navigation structure, it can be frustrating for users to find what they’re looking for. Or, they may assume an element is clickable; when it’s not, it can be irksome. Underlined text is an excellent example, as users often associate underlines with links.
  2. Technical Issues: Technical issues such as slow loading times, broken links, or non-responsive buttons can cause rage clicks and thrashed cursors. Users expect the website to work correctly; when it doesn’t, they can become annoyed or frustrated. If they click a button, they expect the button to do something.
  3. Lack of Clarity: If the website’s content is unclear or poorly written, it can cause confusion and frustration for users. They may struggle to understand the information provided or find it challenging to complete the intended action. Content loops can be a good example of this. Content loops happen when users repeatedly go back and forth between pages or sections of a website, trying to find the information they need. Eventually, they’ll become frustrated, leading to this user leaving the website.
IT Crowd Monitor Throw

How do you resolve issues that lead to Rage Clicks and Thrashed Cursors?

Now that we know what rage clicks and thrashed cursors are and why they happen, you may be asking how to resolve them. Here are a few things an agency partner can help you with that can significantly reduce the risk of your users resorting to these behaviors.

Use Performance Measuring Tools

By employing performance measuring, you can analyze the data collected, gain valuable insights into how users interact with your platform, and identify areas for improvement. For example, if you notice a high number of rage clicks on a specific button or link, it may indicate that users are confused about its functionality or that it’s not working correctly. Similarly, if you see a high number of thrashed cursors on a particular page, it may suggest that users are struggling to navigate or find the information they need.

Tools that support Friction or Frustration measurement:

  1. Clarity (from Microsoft)
  2. ContentSquare
  3. Heap
  4. HotJar
  5. Mouseflow
  6. Quantum Metric

Conduct User Experience Exercises and Testing

Identifying the root causes of rage clicks and thrashed cursors can be done through a UX audit. An agency can examine your website design, functionality, and usability, identifying areas of improvement.

  1. User Journey Mapping: User journey mapping involves mapping the user’s journey through your website from a starting point to a goal, identifying pain points along the way, and determining where users may get stuck or frustrated.
  2. Usability Testing: Usability testing involves putting the website in front of real users and giving them tasks to complete. The tester then looks to identify issues, such as slow loading times, broken links, or confusing navigation.
  3. User Surveys: User surveys can be conducted in various ways, including online surveys, in-person interviews, and focus groups. These surveys can be designed to gather information about users’ perceptions of the website, interactions with the website, and satisfaction levels. Questions can be designed to identify areas of frustration, such as difficult-to-find information, slow page load times, or confusing navigation. It’s wise to keep surveys short, so work with your agency to select the questions to garner the best feedback.
  4. Heat Mapping: Heat mapping involves analyzing user behavior on your website, identifying where users are clicking, scrolling, and spending their time. This can identify areas of the website that are causing frustration and leading to rage clicks and thrashed cursors.

Focus on Website Speed Optimization

A digital agency can synthesize findings from UX research and performance-measuring tools and work to optimize your website for quicker page loads and buttons or links that respond immediately to user actions.

  1. Image Optimization: Optimizing images on your website will significantly improve page loading times. An agency can help you compress images and configure how they’re served to reduce file size without sacrificing quality.
  2. Minification: Minification involves reducing the size of HTML, CSS, and JavaScript files by removing unnecessary characters such as white space, comments, and line breaks. This can significantly improve page loading times.
  3. Caching: Caching involves storing frequently accessed website data on a user’s device, reducing the need for repeated data retrieval and improving website speed (see the sketch after this list).
  4. Content Delivery Network (CDN): A CDN is a network of servers distributed worldwide that store website data, improving website speed by reducing the distance between the user and the server.
  5. Server Optimization: Server optimization involves optimizing server settings and configurations, such as increasing server resources, using a faster server, and reducing request response time. Website owners frequently skip this step and don’t select the right hosting plan, which can cost more money through lost users and lower conversions.
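
As a concrete example of the caching and compression ideas above, here’s a minimal sketch of an Express server that compresses responses and sets long-lived cache headers for static assets. It assumes the express and compression packages; the paths and durations are illustrative, not a one-size-fits-all policy:

```ts
import express from "express";
import compression from "compression";

const app = express();

// Compress text responses (HTML, CSS, JS) to cut transfer size.
app.use(compression());

// Cache fingerprinted static assets aggressively on the user's device or CDN.
app.use(
  "/assets",
  express.static("public/assets", { maxAge: "365d", immutable: true })
);

// Keep HTML revalidated so content updates show up right away.
app.get("/", (_req, res) => {
  res.set("Cache-Control", "public, max-age=0, must-revalidate");
  res.sendFile("index.html", { root: "public" });
});

app.listen(3000);
```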

Resolve Technical Issues

A web agency can help resolve any technical issues that may be causing frustration for your users. These issues may include broken links or buttons, 404 errors, slow page load times, and server errors. Technical issue resolution can involve various activities, including code optimization, server maintenance, and bug fixes that work to ensure that everything is working correctly and address any issues that arise promptly. The resolution of technical issues will improve website performance, reducing the likelihood of user frustration and rage clicks.

Next Steps

User frustration can negatively impact user satisfaction and business outcomes. Partnering with a digital agency can be a valuable investment to mitigate these issues. Through the use of tools, UX audits, user surveys, website speed optimization, and technical issue resolution, a digital agency can identify and address the root causes of user frustration, improving the overall user experience — leading to an increase in user engagement, satisfaction, and loyalty, which means improved conversion rates, higher customer retention, and ultimately, increased revenue for your business.

If your customers are hulking out, maybe it’s time to call us!

More than two years after Google announced the launch of its powerful new website analytics platform, Google Analytics 4 (GA4), the final countdown to make the switch is on.

GA4 will officially replace Google’s previous analytics platform, Universal Analytics (UA), on July 1, 2023. It’s the first major analytics update from Google since 2012 — and it’s a big deal. As we discussed in a blog post last year, GA4 uses big data and machine learning to provide a next-generation approach to measurement, including:

At Oomph, we’ve learned a thing or two about making the transition seamless while handling GA4 migrations for our clients – including a few platform “gotchas” that are definitely better to know in advance. Before you start your migration, do yourself a favor and explore our GA4 setup guide.

Your 12-Step GA4 Migration Checklist

Step 1: Create a GA4 Analytics Property and Implement Tagging

The Gist: Launch the GA4 setup assistant to create a new GA4 property for your site or app. For sites that already have UA installed, Google began creating GA4 properties automatically in March 2023 (unless you opt out). If you’re migrating from UA, you can connect your UA property to your GA4 property to use the existing Google tracking tag on your site. For new sites, you’ll need to add the tag directly to your site or via Google Tag Manager.

The Gotcha: During property setup, Google will ask you which data streams you’d like to add (websites, apps, etc…). This is simple if you’re just tracking one site, but gets more complex for organizations with multiple properties, like educational institutions or retailers with individual locations. While UA allowed you to separate data streams by geography or line of business, GA4 handles this differently. This Google guide can help you choose the ideal configuration for your business model.

Step 2: Update Your Data Retention Settings

The Gist: GA4 lets you control how long you retain data on users and events before it’s automatically deleted from Google’s servers. For user-level data, including conversions, you can hang on to data for up to 14 months. For other event data, you have the option to retain the information for 2 months or 14 months.

The Gotcha: These data retention limits are much shorter than UA’s, which allowed you to keep Google-signals data for up to 26 months in some cases. The default retention setting in GA4 is 2 months for some types of data – a surprisingly short window, in our opinion – so be sure to extend it to avoid data loss.

Step 3: Initialize BigQuery

The Gist: Have a lot of data to analyze? GA4 integrates with BigQuery, Google’s cloud-based data warehouse, so you can store historical data and run analyses on massive datasets. Google walks you through the steps here.

The Gotcha: Since GA4 has tight time limits on data retention as well as data limits on reporting, skipping this step could compromise your analysis. BigQuery is a helpful workaround for storing, analyzing, and visualizing large amounts of complex data.
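
For reference, here’s a minimal sketch of querying the GA4 BigQuery export from Node using the @google-cloud/bigquery client. The project ID and dataset name are placeholders; GA4 writes daily events_YYYYMMDD tables to a dataset named after your property ID:

```ts
import { BigQuery } from "@google-cloud/bigquery";

// Placeholder project and dataset; swap in your own GA4 export location.
const bigquery = new BigQuery({ projectId: "your-project-id" });

export async function topEventsLastWeek() {
  const query = `
    SELECT event_name, COUNT(*) AS event_count
    FROM \`your-project-id.analytics_123456789.events_*\`
    WHERE _TABLE_SUFFIX BETWEEN
      FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
      AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
    GROUP BY event_name
    ORDER BY event_count DESC
    LIMIT 10`;
  const [rows] = await bigquery.query({ query });
  return rows; // e.g., [{ event_name: "page_view", event_count: 1234 }, ...]
}
```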

Step 4: Configure Enhanced Measurements

The Gist: GA4 measures much more than pageviews – you can now track actions like outbound link clicks, scrolls, and engagements with YouTube videos automatically through the platform. When you set up GA4, simply check the box for any metrics you want GA4 to monitor. You can still use Google tags to customize tracking for other types of events or use Google’s Measurement Protocol for advanced tracking.

The Gotcha: If you were previously measuring events through Google tags that GA4 will now measure automatically, take the time to review which ones to keep to avoid duplicating efforts. It may be simpler to use GA4 tracking – giving you a good reason to do that Google Tag Manager cleanup you’ve been meaning to get to.
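
For the advanced tracking case, here’s a minimal sketch of sending a server-side event through the GA4 Measurement Protocol. The measurement ID, API secret, event name, and parameters are placeholders you’d replace with values from your own property:

```ts
// Placeholders: the API secret is created under your GA4 data stream settings.
const MEASUREMENT_ID = "G-XXXXXXXXXX";
const API_SECRET = "your_api_secret";

export async function sendServerSideEvent(clientId: string): Promise<void> {
  await fetch(
    `https://www.google-analytics.com/mp/collect?measurement_id=${MEASUREMENT_ID}&api_secret=${API_SECRET}`,
    {
      method: "POST",
      body: JSON.stringify({
        client_id: clientId,
        events: [{ name: "subscription_renewed", params: { plan: "annual" } }],
      }),
    }
  );
}
```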

Step 5: Configure Internal and Developer Traffic Settings

The Gist: To avoid having employees or IT teams cloud your insights, set up filters for internal and developer traffic. You can create up to 10 filters per property.

The Gotcha: Setting up filters for these users is only the first step – you’ll also need to toggle the filter to “Active” for it to take effect (a step that didn’t exist in UA). Make sure to turn yours on for accurate reporting.

Step 6: Migrate Users

The Gist: If you were previously using UA, you’ll need to migrate your users and their permission settings to GA4. Google has a step-by-step guide for migrating users.

The Gotcha: Migrating users is a little more complex than just clicking a button. You’ll need to install the GA4 Migrator for Google Analytics add-on, then decide how to migrate each user from UA. You also have the option to add users manually.

Step 7: Migrate Custom Events

The Gist: Event tracking has fundamentally changed in GA4. While UA offered three default parameters for events (event category, event action, and event label), GA4 lets you create any custom conventions you’d like. With more options at your fingertips, it’s a great opportunity to think through your overall measurement approach and which data is truly useful for your business intelligence.

When mapping UA events to GA4, look first to see if GA4 is collecting the data as an enhanced measurement, automatically collected, or recommended event. If not, you can create your own custom event using custom definitions. Google has the details for mapping events.

The Gotcha: Don’t go overboard creating custom definitions – GA4 limits you to 50 per property.
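
To make the mapping concrete, here’s a hypothetical UA category/action/label event remapped to a single GA4 event with named parameters. The event and parameter names are illustrative; if you want the parameters available in reports, register them as custom definitions (and remember the 50-per-property cap above):

```ts
// gtag.js is assumed to be loaded on the page; this declaration just keeps
// TypeScript happy about the global helper.
declare function gtag(...args: unknown[]): void;

// UA: category "Downloads", action "Click", label "2023 Benchmark Report"
// GA4: one event with descriptive parameters instead.
gtag("event", "whitepaper_download", {
  file_name: "2023-benchmark-report.pdf",
  download_source: "resources_page",
});
```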

Step 8: Migrate Custom Filters to Insights

The Gist: Custom filters in UA have become Insights in GA4. The platform offers two types of insights: automated insights based on unusual changes or emerging trends, and custom insights based on conditions that matter to you. As you implement GA4, you can set up custom insights for Google to display on your Insights dashboard. Google will also email alerts upon request.

The Gotcha: Similar to custom events, GA4 limits you to 50 custom insights per property.

Step 9: Migrate Your Segments

The Gist: Segments work differently in GA4 than they do in UA. In GA4, you’ll only find segments in Explorations. The good news is you can now set up segments for events, allowing you to segment data based on user behavior as well as more traditional segments like user geography or demographics.

The Gotcha: Each Exploration has a limit of 10 segments. If you’re using a lot of segments currently in UA, you’ll likely need to create individual reports to see data for each segment. While you can also create comparisons in reports for data subsets, those are even more limited at just four comparisons per report.

Step 10: Migrate Your Audiences

The Gist: Just like UA, GA4 allows you to set up audiences to explore trends among specific user groups. To migrate your audiences from one platform to another, you’ll need to manually create each audience in GA4.

The Gotcha: You can create a maximum of 100 audiences for each GA4 property (starting to sense a theme here?). Also, keep in mind that GA4 audiences don’t apply retroactively. While Google will provide information on users in the last 30 days who meet your audience criteria — for example, visitors from California who donated more than $100 — it won’t apply the audience filter to users earlier than that.

Step 11: Migrate Goals to Conversion Events

The Gist: If you were previously tracking goals in UA, you’ll need to migrate them over to GA4, where they’re now called conversion events. GA4 has a goals migration tool that makes this process pretty simple.

The Gotcha: GA4 limits you to 30 custom conversion events per property. If you’re in e-commerce or another industry with complex marketing needs, those 30 conversion events will add up very quickly. With GA4, it will be important to review conversion events regularly and retire ones that aren’t relevant anymore, like conversions for previous campaigns.

Step 12: Migrate Alerts

The Gist: Using custom alerts in UA? As we covered in Step 8, you can now set up custom insights to keep tabs on key changes in user activity. GA4 will deliver alerts through your Insights dashboard or email, based on your preferences.

The Gotcha: This one is actually more of a bonus – GA4 will now evaluate your data hourly, so you can learn about and respond to changes more quickly.

The Future of Measurement Is Here

GA4 is already transforming how brands think about measurement and user insights – and it’s only the beginning. While Google has been tight-lipped about the GA4 roadmap, we can likely expect even more enhancements and capabilities in the not-too-distant future. The sooner you make the transition to GA4, the sooner you’ll have access to a new level of intelligence to shape your digital roadmap and business decisions.

Need a hand getting started? We’re here to help – reach out to book a chat with us.

Was this blog written by ChatGPT? How would you really know? And what impact would it have on Oomph’s site if it were?

Yes, we know there are some great AI-detecting tools out there. But for the typical reader, picking an AI article out of a crowd can be challenging. And with AI tools like ChatGPT delivering better-quality results than ever, many companies are struggling to decide whether to hand their content and SEO reins over to the machines.

While AI can add value to your content, companies should proceed with caution to avoid some potentially big pitfalls. Here’s why.

Quality Content Is Critical to SEO

All the way back in 1996, Bill Gates said “Content is King.” This phrase became ubiquitous in the early years of SEO. At that time, you could rank well simply by writing about a search topic, then optimizing your writing with the right keywords.

Since then, search algorithms have evolved, and the Google search engine results page (SERP) is more crowded than ever (not to mention the new continuous scroll). While ranking isn’t as easy as it used to be, content — whether it’s a video, an image, a product, a blog, or a news story — still matters. When content ranks well, it’s an ad-spend-free magnet for readers that eventually become customers and subscribers. What else on your website can do that?

That makes your content special. It also puts a premium on producing a high volume of relevant content quickly. For years, brands have done this the old-fashioned way: with copywriters and designers researching, writing, revising, creating images, and publishing ad infinitum.

Until AI.

AI-Powered Content Generation Changes How We Make Content

There’s no point in denying it: AI will impact SEO. But it’s still up for debate just how deep that impact will be.

The rise of AI-powered language processing tools like ChatGPT and Meta’s Llama makes quick content generation a reality. They can easily produce high-quality content that will likely only get better with time. ChatGPT can produce an article in minutes, not hours, and even suggest keywords for you.

For all those reasons, marketers have embraced these new tools — ChatGPT shattered records when it reached 100 million monthly active users in a mere two months. As the saying goes, though, just because we can, doesn’t mean we should — especially if using it means compromising on quality or losing what makes us, well, us.

After all, AI is a tool, not a human replacement. If you’re considering integrating AI into your martech stack, here’s what to keep in mind.

4 Things to Know Before Using AI for Content

1. Google may ding you.

In August 2022, Google released an algorithm update called the “helpful content update.” You can read more on Google’s blog, but what’s important is this: Google prioritizes content by people, for people.


In other words, using AI solely to churn out articles that feed the search engine beast won’t work. You should continue to create content that puts your audience first, is genuinely helpful, and is 100% unique to your brand.


But Google’s stance does raise the question: Will they really know if you used AI to generate your content? Well, they do have a very special set of skills…

Animated gif of Liam Neeson mouthing the words “I will find you” from the movie Taken (2008)

While AI content can be good quality and easy to read, tools like Crossplag’s AI Content Detector and OpenAI’s GPT-2 Output Detector can detect whether something is written by a human or by an AI bot. That comes down, in part, to the fact that AI lacks E-E-A-T – experience, expertise, authoritativeness, and trustworthiness – which is not just a major factor in SEO, but a major reason why people consume content in the first place.

2. AI isn’t always accurate.

AI culls information from across the web, and it’s remarkably good at it. But aggregating massive amounts of information isn’t the same as fact-checking. AI tools often get it wrong – even with easily verifiable, fact-based questions.

Take the question: What is the largest country in Central America that isn’t Mexico?

AI’s answer:

The largest country in Central America that isn’t Mexico is Guatemala. Guatemala has an area of about 108,890 square kilometers, making it the largest country in Central America by land area. It is bordered by Mexico to the north, Belize to the northeast, Honduras to the east, El Salvador to the southeast, and the Pacific Ocean to the south.

Source for AI’s answer

Google’s answer:

Nicaragua is the largest country in the subregion, with a total area of over 130 thousand square kilometers, followed by Honduras, with more than 112 thousand square kilometers.

(Hint: Google is right)

This is a problem for your business and for your SEO. Accuracy is an important part of EEAT. It’s especially critical for “your money or your life” (YMYL) content, like financial or medical advice. In these cases, the content you publish can and does impact real people’s lives and livelihoods.

Spotty accuracy has even prompted some sites, like StackOverflow, to ban AI-generated content.

3. You don’t have the rights to your AI-generated content.

AI-generated content isn’t actually copyrightable. Yes, you read that right.

As it stands, the courts have interpreted the Copyright Act to mean that only human-authored works can be copyrighted. A work is only legally defensible when it involves at least a minimal amount of human creativity.

We’re all familiar with this concept when it comes to books, TV shows, movies, and paintings, but it matters for your website, too. You want your content and your ideas to be yours. If you use AI-generated content, be aware that it isn’t subject to standard intellectual property rules and may not be protected.

4. AI-generated content can’t capture your voice.

Even if you fly under Google’s radar with your AI content, it still won’t really feel like you. You are the only you. We know that sounds like it belongs on an inspirational poster, but it’s true. Your voice is what readers will connect with, believe in, and ultimately trust.

Sure, AI may succeed at stringing together facts and keywords to create content that ranks. And that content may even drive people to your site. But it lacks the emotional intelligence to infuse your content with real-life examples and anecdotes that make readers more likely to read, share, and engage with your content and your brand.

Your voice is also what sets you apart from other brands in your industry. Without that, why would a customer choose you?

AI and SEO Is a Journey, Not a Destination

AI is not the end of human-driven SEO. In reality, AI has only just arrived. But the real opportunity lies in finding out how AI can enhance, not replace, our work to create winning SEO content.

Think about content translation. Hand translation is the most premium translation option out there. It’s also costly. While machine translation on its own can be a bit of a mess, many translation companies actually start with an automated solution, then bring in the humans to polish that first translation into a final product. If you ask us, AI and SEO will work in much the same way.

Even in a post-AI world, SEO all comes down to this guidance from Google:

“If it is useful, helpful, original, and satisfies aspects of E-E-A-T, it might do well in Search. If it doesn’t, it might not.”

If and when you do decide to leverage AI, keep these tips in mind:

At Oomph, we believe quality branded content is just one component of a digital experience that engages and inspires your audience.

Need help integrating SEO content into your company’s website? Let’s talk.

There’s a phrase often used to gauge healthcare quality: the right care, at the right time, in the right place. When those elements are out of sync, the patient experience can take a turn for the worse. Think about missed appointments, misunderstood pre-op instructions, mismanagement of medication… all issues that require clear and timely communication to ensure positive outcomes.

Many healthcare organizations are tapping into patient engagement tools that use artificial intelligence (AI) to drive better healthcare experiences. In this article, we’ll cover a number of use cases for AI within healthcare, showing how it can benefit providers, their patients, and their staff in an increasingly digital world.

Healthcare Consumers are Going Digital

Use of AI in the clinical space has been growing for years, from Google’s AI aiding diagnostic screenings to IBM’s Watson AI informing clinical decision making. But there are many other touchpoints along a patient’s continuum of care that can impact patient outcomes.

The industry is seeing a shift towards more personalized and data-driven patient engagement, with recent studies showing that patients are ready to integrate AI and other digital tools into their healthcare experiences.

For instance, healthcare consumers are increasingly comfortable with doctors using AI to make better decisions about their care. They also want personalized engagement to motivate them on their health journey, with 65% of patients agreeing that communication from providers makes them want to do more to improve their health.

At the same time, 80% of consumers prefer to use digital channels (online messaging, virtual appointments, text, etc…) to communicate with healthcare providers at least some of the time. This points to significant opportunities for digital tools to help providers and patients manage the healthcare experience.

Filling in Gaps: AI Use Cases for Healthcare

Healthcare will always need skilled, highly trained experts to deliver high quality care. But, AI can fill in some gaps by addressing staffing shortages, easing workflows, and improving communication. Many healthcare executives also believe AI can provide a full return on investment in less than three years.

Here are some ways AI can support healthcare consumers and providers to improve patients’ outcomes and experiences.

Streamline basic communications

Using AI as patients’ first line of contact for basic information enables convenient, personalized service without tying up staff resources. With tools like text-based messaging, chatbots, and automated tasks, providers can communicate with people on the devices, and at the times, that they prefer.

Examples include:

Remove barriers to access

AI algorithms are being used in some settings to conduct initial interviews that help patients determine whether they need to see a live medical professional — and then send them to the right provider.

AI can offer a bridge for patients who, for a host of reasons, are stuck on taking the first step. For instance, having the first touchpoint be a chatbot helps overcome a barrier for patients seeking care within often-stigmatized specialties, such as behavioral health. It can also minimize time wasted at the point of care communicating things like address changes and insurance providers.

Reduce no-show rates

In the U.S., patient no-show rates range from 5.5% to 50%, depending on the location and type of practice. Missed appointments not only result in lost revenue and operational inefficiencies for health systems, they can also delay preventive care, increase readmissions, and harm long-term outcomes for patients.

AI-driven communications help ensure that patients receive critical reminders at optimal times, mitigating these risks. For instance:

Close information gaps

Imagine a patient at home, alone, not feeling well, and confused about how to take their medication or how to handle post-operative care. Not having that critical information can lead to poor outcomes, including readmission.

Delivering information at the right time, in the right place, is key. But multiple issues can arise, such as:

By providing consistent, accurate, and timely information, AI-enabled tools can provide critical support for patients and care teams.

Minimize staff burnout

Burnout and low morale have contributed to severe staffing shortages in the US healthcare system. The result is an increase in negative patient outcomes, in addition to massive hikes in labor costs for hospitals and health systems.

AI can help lighten the burden on healthcare employees through automated touchpoints in the patient journey, such as self-scheduling platforms or FAQ-answering chatbots. AI can even perform triage informed by machine learning, helping streamline the intake process and getting patients the right care as quickly as possible.

This frees up staff to focus on more meaningful downstream conversations between patients and care teams. It can also reduce phone center wait times for those patients (often seniors) who still rely on phone calls with live staff members.

Maximize staff resources

When 80% of healthcare consumers are willing to switch providers for convenience factors alone, it’s crucial to communicate with patients through their preferred channels. Some people respond to asynchronous requests (such as scheduling confirmations) late at night, while others must speak to a live staff member during the day.

Using multimodal communication channels (phone, text, email, web) offers two major benefits for healthcare providers. For one, you can better engage patients who prefer asynchronous communication. You can also identify the ratio of patients who prefer live calls and staff accordingly when it’s needed most.

Leverage customer feedback

AI provides fast, seamless avenues to gather and track patient satisfaction data and create a reliable, continual customer feedback loop. Tools like chatbots and text messaging expand the number of ways patients can communicate with healthcare providers, making it easier to leave feedback. That drives a better digital customer experience and can lead to higher satisfaction scores, which may impact payment or quality ratings.

AI offers another benefit, too: the ability to identify and respond more quickly to negative feedback. The more swiftly a problem is resolved, the better the consumer experience.

A Few Tips for Getting Started

First, find a trusted technology partner who has experience with healthcare IT stacks and understands how AI fits into the landscape. The healthcare industry is distinctly different from other verticals that might use tools like chatbots and automated tasks. You need a partner who’s familiar with the nuances of the healthcare consumer experience and regulatory compliance requirements.

Next, start small. It’s best to choose your first AI applications in a strategic, coordinated manner. One approach is to identify the biggest bottlenecks for care teams and/or patients, then assess which areas present the lowest risk to the customer experience and the greatest chance of operational success.

Finally, track the progress of your first implementation. Evaluate, iterate, evaluate again, and then expand into other areas when you’re comfortable with the results.

Focal points for iteration:

Above all, remember that successful use of AI isn’t just about how well you implement the technology. It’s about the impact those digital tools have on improving patient outcomes and increasing patient satisfaction with their healthcare experience.

Interested in exploring the specific ways AI can benefit your care team and patients? We’re here to help! Contact us today.

The circular economy aims to help the environment by reducing waste, mainly by keeping goods and services in circulation for as long as possible. Unlike the traditional linear economy, in which things are produced, consumed, and then discarded, a circular economy ensures that resources are shared, repaired, reused, and recycled, over and over.

What does this have to do with your digital platform? In a nutshell: everything.

From tackling climate change to creating more resilient markets, the circular economy is a systems-level solution for global environmental and economic issues. By building digital platforms for the circular economy, your business will be better prepared for whatever the future brings.

The Circular Economy isn’t Coming. It’s Here.

With environmental challenges growing day by day, businesses all over the world are going circular. Here are a few examples:

One area where nearly every business could adopt a circular model is the creation and use of digital platforms. The process of building websites and apps, along with their use over time, consumes precious resources (both people and energy). That’s why Oomph joined 1% For the Planet earlier this year. Our membership reflects our commitment to do more collective good — and to hold ourselves accountable for our collective impact on the environment.

But, we’re not just donating profits to environmental causes. We’re helping companies build sustainable digital platforms for the circular economy.

Curious about your platform’s environmental impact? Enter your URL into this tool to get an estimate of your digital platform’s carbon footprint.

Changing Your Platform From Linear to Circular

If protecting the environment and promoting sustainability is a priority for your business, it’s time to change the way you build and operate your websites and apps. Here’s what switching to a platform for the circular economy could look like.

From a linear mindset…

When building new sites or apps, many companies fail to focus on longevity or performance. Within just a few years, their platforms become obsolete, either as a result of business changes or a desire to keep up with rapidly evolving technologies.

So, every few years, they have to start all over again — with all the associated resource costs of building a new platform and migrating content from the old one.

Platforms that aren’t built with performance in mind tend to waste a ton of energy (and money) in their daily operation. As these platforms grow in complexity and slow down, one unfortunate solution is to simply throw more computing power at them. That means new hardware to run the extra cycles, which leads to more e-waste, more mining for metals, more pollution from manufacturing, and more electricity to power the entire supply chain.

Enter the circular economy.

…to a circular approach.

Building a platform for the circular economy is about reducing harmful impacts and wasteful resource use, and increasing the longevity of systems and components. There are three main areas you can address:

1. Design out waste and pollution from the start.

At Oomph, we begin every project with a thorough and thoughtful discovery process that gets to the heart of what we’re building, and why. By identifying what your business truly needs in a platform — today and potentially tomorrow — you’ll minimize the need to rebuild again later.

It’s also crucial to build efficiencies into your backend code. Clean, efficient code loads and runs faster, requiring fewer compute cycles (and less energy) per output.
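
To make “fewer cycles per output” concrete, here’s a minimal sketch in TypeScript: it memoizes an expensive lookup so repeat requests are served from memory instead of recomputing the same result. The fetchCatalog function and the five-minute window are hypothetical stand-ins for your own slow, rarely-changing data source.

```typescript
// Minimal sketch: memoize an expensive lookup so repeat requests
// don't burn extra compute cycles. `fetchCatalog` is a hypothetical
// stand-in for any slow, rarely-changing data source.

type Catalog = { products: string[] };

const CACHE_TTL_MS = 5 * 60 * 1000; // refresh at most every 5 minutes
let cached: { value: Catalog; expires: number } | null = null;

async function fetchCatalog(): Promise<Catalog> {
  // Imagine a slow database query or third-party API call here.
  return { products: ["a", "b", "c"] };
}

export async function getCatalog(): Promise<Catalog> {
  const now = Date.now();
  if (cached && cached.expires > now) {
    return cached.value; // served from memory: no extra work
  }
  const value = await fetchCatalog();
  cached = { value, expires: now + CACHE_TTL_MS };
  return value;
}
```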

Look for existing frameworks, tools, and third-party services that provide the functions you need and will continue to stay in service for years or decades to come. And, instead of building a monolith platform that has to be upgraded every few years or requires massive computing power, consider switching to a more nimble and efficient microservices architecture.

2. Keep products and services in use.

Regular maintenance and timely patching are key to prolonging the life of your platform. So is proactively looking for performance issues. Be sure to regularly test and assess your platform’s speed and efficiency, so you can address problems early on.
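
One lightweight way to do that is a scheduled script that times a handful of key pages and flags regressions before users feel them. Here’s a minimal sketch (TypeScript on Node 18+); the URLs and the one-second budget are placeholders you’d tune for your own platform.

```typescript
// Minimal sketch: time a few key pages and flag anything slow.
// The URLs and the 1000 ms budget are placeholders — tune for your platform.

const PAGES = [
  "https://www.example.com/",
  "https://www.example.com/about",
];
const BUDGET_MS = 1000;

async function checkPage(url: string): Promise<void> {
  const start = performance.now();
  const res = await fetch(url);
  const elapsed = performance.now() - start;
  const status = res.ok && elapsed <= BUDGET_MS ? "OK" : "SLOW/ERROR";
  console.log(`${status} ${url} (${res.status} in ${elapsed.toFixed(0)} ms)`);
}

async function main(): Promise<void> {
  for (const page of PAGES) {
    await checkPage(page);
  }
}

main().catch(console.error);
```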

We advocate keeping products and services in use for as long as possible, but if your platform is built on microservices, don’t be afraid to replace an existing service with a new one. Just make sure the new service provides a benefit that outweighs the resource costs of implementing it.

3. Aim to regenerate natural systems.

The term “regenerate” describes a process that mimics the cycles of nature by restoring or renewing sources of energy and materials. It might seem like the natural world is far removed from your in-house tech, but there are a number of ways that your IT choices impact the environment.

For starters, you can factor sustainability into your decisions around vendors and equipment. Look for digital hosting companies and data centers that are green or LEED-certified. Power your hardware with renewable energy sources. Ultimately, the goal is to consider not just how to reduce your platform’s impact on the environment, but how you can create a net-positive effect by doing better with less.

Get Ready for the Future

We’ve long seen that the ways in which businesses and societies use resources can transform local and global communities. And we know that environmental quality is inextricably linked to human wellbeing and prosperity. The circular economy, then, provides a way to improve our future readiness.

Companies that invest in sustainability generally experience better resilience, improved operational performance, and longer-lasting growth. They’re also better suited to meet the new business landscape, as governments incentivize sustainable activities, customers prefer sustainable products, and employees demand sustainable leadership.

Interested in exploring how you can join the circular economy with your digital platforms? We’d love to help you explore your options; just contact us.

In our previous post, we broadly discussed the mindset of composable business. While “composable” can be a long-term, company-wide strategy, companies shouldn’t overlook the smaller-scale opportunities that exist at every level to add flexibility and longevity to their technology investments and to reduce costs.

For maximum ROI, think big, then start small

Many organizations are daunted by the prospect of shifting a legacy application or monolith to a microservices architecture. That apprehension only grows when the application is nearing end of life.

Don’t overlook the fact that a move to a microservices architecture can be made progressively over time. Replatforming a monolith, by contrast, is a huge investment of time and money whose value may not be realized for years, until the new application is finally ready to deploy.

A progressive approach allows organizations to:

Prioritizing the approach by aligning technical architecture with business objectives

As with any application development initiative, aligning business objectives with technology decisions is essential. Unlike a monolith replatform, however, a progressive microservices effort depends heavily on how you prioritize and sequence development and deployment.

Start by clearly defining your application in a requirements and feature matrix. Then evaluate each item through three lenses and watch the priorities emerge:

  1. With a current state lens, evaluate each item. Is it broken? Is it costly to maintain? Is it leveraged by multiple business units or external applications?
  2. Then, with a future state lens, evaluate each item again. Could it be significantly improved? Could it be leveraged by other business units? Could it be leveraged outside the organization (partners, etc.)? Could it be leveraged in other applications, devices, or locations?
  3. Lastly, evaluate the emerging priority items with a cost and effort lens. What is the level of effort to develop the feature as a service? What is the likely duration of that effort? (A simple scoring sketch follows this list.)
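
To make the matrix concrete, here’s a minimal sketch in TypeScript. The features and scores are made up for illustration; the point is simply that scoring each item through the three lenses and sorting the results makes the priorities visible.

```typescript
// Minimal sketch: score each feature through the three lenses
// (current state, future state, cost/effort) and rank the results.
// Features and scores are made up for illustration.

interface Feature {
  name: string;
  currentPain: number;  // 1–5: broken, costly, widely depended on?
  futureValue: number;  // 1–5: room to improve or reuse elsewhere?
  effort: number;       // 1–5: cost and duration to build as a service
}

const features: Feature[] = [
  { name: "Authentication", currentPain: 4, futureValue: 5, effort: 3 },
  { name: "Search",         currentPain: 5, futureValue: 4, effort: 4 },
  { name: "PDF export",     currentPain: 2, futureValue: 2, effort: 1 },
];

// Higher pain and value raise priority; higher effort lowers it.
const priority = (f: Feature): number => f.currentPain + f.futureValue - f.effort;

const ranked = [...features].sort((a, b) => priority(b) - priority(a));
ranked.forEach((f) => console.log(`${f.name}: ${priority(f)}`));
```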

Key considerations when planning a progressive approach

Planning is critical to any successful application development initiative, and architecting a microservices based architecture is no different. Be sure to consider the following key items as part of your planning exercises:

  1. Remember that rearchitecting a monolith feature as a service can open the door to new opportunities and new ways of thinking. It’s helpful to ask, “If this feature were a stand-alone service, we could __.” (A minimal sketch of a stand-alone service follows this list.)
  2. Be careful not to design services that are too big in scope. Work diligently to break the application down into the smallest practical parts, even if you later decide that some should be grouped together.
  3. Keep security front of mind. Where a monolith may have allowed for a straightforward security policy with everything under one roof, a services architecture creates the opportunity for more granular security policies, along with the need to define how separate services are allowed to communicate with each other and with the outside world.
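
To give a feel for what “a feature as a stand-alone service” can look like, here’s a minimal sketch in TypeScript using Node’s built-in http module. The currency-conversion endpoint and its hardcoded rates are hypothetical; a real service would add authentication and the security policies described above.

```typescript
// Minimal sketch: one monolith feature (currency conversion, hypothetical)
// exposed as its own small HTTP service. Built on Node's http module,
// so there are no external dependencies.

import { createServer } from "node:http";

const RATES: Record<string, number> = { USD: 1, EUR: 0.9, GBP: 0.8 };

const server = createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  if (url.pathname !== "/convert") {
    res.writeHead(404);
    res.end();
    return;
  }
  const amount = Number(url.searchParams.get("amount") ?? "0");
  const to = url.searchParams.get("to") ?? "USD";
  const rate = RATES[to];
  if (!rate) {
    res.writeHead(400, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ error: "unknown currency" }));
    return;
  }
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ amount, to, converted: amount * rate }));
});

server.listen(3000, () => console.log("convert service listening on :3000"));
```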

In summary

A microservices architecture can help organizations move faster, stay flexible and agile, and spend less on developing and maintaining software applications. By taking a progressive approach to rearchitecting a monolith, businesses can move quickly, reduce risk, improve quality, and control costs.

If you’re interested in introducing composability to your organization, we’d love to help! Contact us today to talk about your options.

Digital customer experience (DCX) is fast becoming a key factor in how consumers choose whom to do business with. Every digital interaction contributes to an overall feeling about your brand — which means digital touchpoints like apps and chatbots can play a big part in what customers think of your company.

What story do you want those interactions to tell? What kind of experiences do you want people to share with others?

This article covers five ways to assess and improve your digital customer experience so you can attract, delight, and retain your target customers.

But First – What IS Digital Customer Experience?

Customer experience, or CX, is the perception that customers form based on all of their interactions, in-person or online, with your brand. If CX is about carefully and consistently meeting your customers’ needs, Digital Customer Experience is the online expression of those efforts.

Digital customer experience is the part of your CX journey that involves digital interactions via your website, mobile app, social media accounts, digital kiosks, etc. Wherever your customers are engaging with your people, products, or services through the internet, it’s a digital experience.

DCX is their perception of those moments.

Brands with a great DCX provide a personalized and consistent online experience throughout the customer journey. Whether someone is considering becoming a client, placing an order, or searching for information, every digital interaction has to be easy and enjoyable.

5 Ways to Improve Your Digital Customer Experience

Technology is a wonderful tool for improving the customer experience, whether mining data for customer insights or leveraging AI for personalization. But technology alone can’t deliver an exceptional digital customer experience. Your DCX strategy must include a human component — one that focuses on customer care through empathy and authenticity. Here’s how to ensure your digital customer experience lives up to your users’ expectations.

Know your target audience

To deliver the kind of digital experience your customers will love, you have to know what they want. Who’s buying your product, and why? When they visit your website or app, what are they hoping to accomplish?

Delighting your customers requires knowing their goals, understanding their pain points, and providing interactions that meet their specific needs. The upshot? 68% of customers will spend more money with a brand that understands and treats them like an individual.

Here are three crucial steps:

  1. Use qualitative and quantitative analyses to learn about your audience. The more you understand their preferences and behaviors, the better you can create an experience that meets their needs.
  2. Apply a user-centered design process, which relies on deeply understanding your audience to craft usable, accessible digital interfaces.
  3. Incorporate personalization techniques to adapt the digital experience for individual users. More than anything else, this will help make the customer journey smooth and enjoyable.

Adopt an omnichannel mindset

Customers expect seamless interactions from brands throughout their journey, whether through digital or non-digital channels. In fact, brands with the strongest omnichannel customer engagement strategies retain an average of 89% of their customers, in comparison to 33% of companies with weak strategies.

Knowing that today’s consumers often jump from channel to channel as they browse, buy, or get in touch, DCX leaders embrace an omnichannel strategy. Note that this is different from a multichannel approach, where customers access multiple channels in separate interactions. An omnichannel approach integrates all digital touchpoints to create a seamless, personalized experience.

Multichannel: some or all channels are available, but no data syncs between channels, causing a disconnected customer experience. Omnichannel: all channels are available, with data syncing between channels, providing a seamless customer experience.
Image sourced from Zingle.com

Here are a few key ways to create personalized experiences that resonate across all your digital channels:

Get help from experts

Expert assessments can remove the guesswork from optimizing your digital customer experience. A digital CX audit, for instance, will show you what’s working and what could be better, and provide actionable insights and a prioritized roadmap.

CX specialists will look beyond the basic digital experience (clunky design, system bugs, etc.) to assess whether your digital channels are effectively serving your customers’ needs. A professional audit can help determine things like:

Make customer feedback easy

Most companies know that customer feedback is crucial for improving the customer experience. But many fall short in providing easy, effective options for people to reach them.

Offering multiple, easy-to-use communication options across your digital channels is one more way to delight your customers. Help people engage with you via the medium of their choice, so they can communicate through the interface they’re most comfortable with.

That could be a chat function or contact form on your website, or the commenting and messaging features on your social profiles. Or, maybe it’s good old-fashioned phone calls and emails. Whatever the avenue, make it easy to find and intuitive to use.

One more thing: when someone does reach out, respond quickly. The faster a problem is resolved, the better the experience.

Plan for the post-launch reality

You might design and launch an amazing new website, app, or service that delights your customers and sends revenue through the roof. But, without a long-term plan to keep it effective and relevant, your digital CX will likely diminish over time.

To maintain the quality of customer experience across all your digital touchpoints, apply a measurement framework based on the principles above:

Remember, too, that new technological trends are going to keep emerging and influencing consumer expectations. Be prepared to evolve what digital CX looks like for your business, especially if it means extending your digital services to new platforms or devices.

Putting the “C” in Digital CX

Technology has made so many things possible for today’s consumers that, ultimately, the power is in their hands. As digital capabilities continue to evolve, people may become increasingly selective about which brands earn their trust and business — and companies will need to make the digital customer experience more beneficial for both sides.

As you can see from the steps above, the key is putting your customers’ needs above all else.

If you’re not sure where to start, you’re not alone! We’ve helped dozens of clients dive into customer research, omnichannel strategies, and strategic planning for digital platforms. Reaching out to a digital CX expert (like Oomph) can help you do things right the first time, saving you time and money and, most importantly, building a foundation to get results.

Excited about crafting an exceptional DCX? So are we. Check out our DCX audit service to learn how we can help set you up for success.

Google Analytics 4, or GA4, is Google’s fourth iteration of its website analytics platform. This is no ordinary upgrade! Leveraging the power of big data and machine learning, GA4 offers entirely new ways to collect and analyze user activity data across websites and apps.

While GA4 provides access to robust new tools and features for data-driven decision making, it also sheds many of the metrics and reports we’re used to in Google Analytics 3 (a.k.a. Universal Analytics, or UA).

Google will be sunsetting UA properties in July 2023. Here’s what you need to know about GA4’s capabilities — and why you should start the transition sooner rather than later.


Not sure which platform you currently have (UA vs. GA4)? 

Take a look at this cheat sheet.


Key Benefits of Google Analytics 4

We’re living in a more privacy-centric world, and GA4 is Google’s answer to stricter data laws and browser regulations. GA4 is designed to function without relying on third-party cookies, using machine learning and statistical modeling to fill gaps in the data it collects.

This change comes with a range of benefits, from more actionable user insights to enhanced reporting capabilities.

Broader Insights

Unlike UA, GA4 has the ability to track users across devices and platforms, combining all the data into a single property with a unified set of metrics and dimensions. This gives you a more complete picture of how users interact with your brand, whether they’re on your website, your mobile app, or both.

Another major advantage is more effective conversion tracking, particularly for users who might visit on a mobile device, come back on desktop, and then download, purchase, or register through your app. Because GA4 attributes actions to users across devices and platforms, you can see the entire journey a user takes from start to finish.

Predictive Metrics

Using machine learning, GA4 offers powerful new metrics to predict user actions and includes new data buckets like Acquisition, Engagement, Monetization, and Retention. These predictive metrics can help you better understand your audience and make more informed decisions, so you can do things like tailoring your website experience for different users or creating targeted marketing campaigns.

Customized Reporting

UA offers a set of standard reports with some customization options. By contrast, GA4 enables and encourages users to create custom reports with only the data they need.

With greater freedom to create reports, you can declutter your dashboard and make decisions more quickly by drilling down to the data that’s most important to you. You can even create a separate “Audiences” report with custom user definitions, further tailoring the data to support your business needs.

Key Features of Google Analytics 4

With comprehensive user tracking, predictive metrics, customizable reports, and more, GA4 promises to be much more powerful than any previous version of Google Analytics. Here are the core capabilities driving all of those benefits.

Event-Based Tracking

One of the biggest changes in GA4 is how user data is collected. In UA, data is collected via tags placed on each page of a website. Users are tracked via sessions, or set periods that begin and end when a user enters and exits a site.

Instead of relying on pageviews and sessions, GA4 tracks user interactions, known as “events,” as users complete them. This focus on individual user interactions provides a more complete picture of each user’s journey across your website or app.

This event-based model also makes it possible to track interactions that don’t happen on web pages but can be influenced by digital marketing, such as in-store visits or in-app purchases. And, it allows Google to more accurately deduplicate users.
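
For a sense of what event collection looks like in practice, here’s a minimal sketch in TypeScript using the standard gtag.js interface. It assumes the gtag.js loader snippet is already on the page; the measurement ID, event names, and parameters are examples rather than requirements.

```typescript
// Minimal sketch: sending GA4 events with gtag.js.
// Assumes the standard gtag.js loader snippet is already on the page,
// so we only declare the global here. "G-XXXXXXXXXX" is a placeholder ID.

declare function gtag(...args: unknown[]): void;

// Initialize the GA4 property (normally part of the loader snippet).
gtag("js", new Date());
gtag("config", "G-XXXXXXXXXX");

// A recommended e-commerce event...
gtag("event", "purchase", {
  transaction_id: "T-1001",
  value: 49.99,
  currency: "USD",
});

// ...or a custom event with your own parameters.
gtag("event", "brochure_download", {
  program: "nursing",
  file_name: "nursing-brochure.pdf",
});
```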

Cross-Platform Data Consolidation

In UA, “properties” are where Analytics data is collected for individual websites and apps. You can then use views to see and report on the data in various ways.

GA4 instead uses data streams to funnel data from different platforms into a single property. You can add multiple streams to one property, such as one for your website and one for each mobile app, and GA4 consolidates their data for unified reporting.

For example, a web stream and an app stream can feed the same property, so a visitor who browses your site and later opens your app shows up as one user. By connecting each platform to the same property, you can consolidate data and track users who move between the streams.

Advanced Analytics

Maybe the most exciting feature for data geeks like us, GA4’s Explorations Hub offers a suite of advanced data and analytical techniques that go well beyond standard reports. The Explore section lets you create custom analyses to uncover deeper insights about your website and app performance, with filters and segments so you can drill down even further.

GA4 also integrates with BigQuery, Google’s cloud-based data warehouse, where you can run complex analyses of very large datasets. Bonus: BigQuery offers near-unlimited data storage.
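
As a small example of the kind of analysis that opens up, here’s a minimal sketch in TypeScript using the @google-cloud/bigquery client to count events by name from a GA4 export. The project and dataset names are placeholders for your own export, which lands in daily events_YYYYMMDD tables.

```typescript
// Minimal sketch: querying a GA4 BigQuery export with the official
// @google-cloud/bigquery client. The project and dataset names below are
// placeholders for your own export.

import { BigQuery } from "@google-cloud/bigquery";

async function topEvents(): Promise<void> {
  const bigquery = new BigQuery();
  const query = `
    SELECT event_name, COUNT(*) AS event_count
    FROM \`my-project.analytics_123456789.events_*\`
    WHERE _TABLE_SUFFIX BETWEEN '20230101' AND '20230131'
    GROUP BY event_name
    ORDER BY event_count DESC
    LIMIT 10
  `;
  const [rows] = await bigquery.query({ query });
  rows.forEach((row) => console.log(`${row.event_name}: ${row.event_count}`));
}

topEvents().catch(console.error);
```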

Machine Learning

In an increasingly cookie-less world, Google is attempting to balance privacy limitations with usable insights. Using machine learning, GA4 fills in data gaps and provides predictive insights about user behavior and trends.

Machine learning, a branch of artificial intelligence (AI), fills in those gaps and makes predictions by finding patterns in data. GA4 essentially looks for patterns of user activity and feeds them into algorithms that learn to understand and predict how users behave online.

As an example, GA4’s AI-powered insights can help identify user actions that are most likely to lead to conversions. Using metrics like purchase probability, churn probability, and revenue prediction, you can customize marketing campaigns or target specific audiences to achieve your conversion goals.

Why You Should Switch to GA4 ASAP

You’ll be able to collect and use platform data in your existing UA property until July 1, 2023. After that, you’ll be able to access historical data for only six months. That’s why we strongly recommend you implement GA4 as soon as possible.

Transitioning now will allow you to:

Feed The Machine

Many of GA4’s core features rely on machine learning, and in order for machine learning to be effective, the algorithm needs time to learn. The sooner you set up and start collecting data in GA4, the more time your models will have to analyze and learn, shaping the insights you’ll need down the road.

Train Your People

Those using GA4 will need time to learn the new terminology, user interface, and capabilities. Switching early gives your team time to get used to the new platform and work out new processes and reporting while you still have UA to fall back on.

Get Year-Over-Year Data

GA4 is forward-facing only, which means your new GA4 property will only collect data from the time of creation; it won’t import past data from UA. Once UA sunsets next year, you’ll be relying solely on GA4 for year-over-year data.

Why does that matter? Here at Oomph, when we launch client projects, we use Google Analytics data to analyze digital platform performance so we can develop the best possible user experience. By examining user flows, page visits, common search terms, engagement metrics, and more, we can very quickly get a picture of where a platform has strengths and weak points. And we need your historical data to do it.

Ready to switch to Google Analytics 4? It’s a relatively simple process. Just follow the steps Google provides, whether you want to switch from UA to GA4 or set up a GA4 property alongside an existing UA property.

If you’re not feeling confident about handling the transition alone, we’d love to help. Get in touch with us today.

Many organizations today, large and small, have a digital asset problem. Companies are amassing huge libraries of images, videos, audio recordings, documents, and other files — while relying on shared folders and email to move them around the organization. As asset libraries explode, digital asset management (DAM) is crucial for keeping things accessible and up to date, so teams can spend more time getting work done and less time hunting for files.

First Things First: DAM isn’t Dropbox

Some folks still equate DAM with basic digital storage solutions, like Dropbox or Google Drive. While those are great for simple sharing needs, they’re essentially just file cabinets in the cloud.

DAM technology is purpose-built to optimize the way you store, maintain, and distribute digital assets. A DAM platform not only streamlines day-to-day content work; it also systematizes the processes and guidelines that govern content quality and use.

Today’s DAMs have sophisticated functionality that offers a host of benefits, including:

Is it time for your business to invest in a DAM? Let’s see if you recognize the pain points below:

The 5 Signs You Need a DAM

Some investments you simply can’t afford to skip: the ones that significantly impact your team’s creativity and productivity, and your business’s bottom line. Here are some of the most common signs it’s time to invest in a DAM:

It takes more than a few seconds to find what you need.

As your digital asset library grows, it’s harder to keep sifting through it all to find things — especially if you’re deciphering other people’s folder systems. If you don’t know the exact name of an asset or the folder it’s in, you’re often looking for a needle in a haystack.

Using a DAM, you can tag assets with identifying attributes (titles, keywords, etc.) and then quickly search the entire database for the ones that meet your criteria. DAMs also offer AI- and machine-learning–based tagging, which automatically adds tags based on the content of an image or document. Voila! A searchable database with less manual labor.
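
To illustrate the core idea, here’s a minimal sketch in TypeScript of the kind of tag-based lookup a DAM performs for you at scale. The assets and tags are made up; a real DAM layers on AI tagging, permissions, renditions, and versioning.

```typescript
// Minimal sketch: tag-based asset search, the core idea behind a DAM's
// searchable library. The assets and tags below are made up for illustration.

interface Asset {
  id: string;
  title: string;
  tags: string[];
}

const assets: Asset[] = [
  { id: "img-001", title: "Campus aerial photo", tags: ["campus", "aerial", "photo"] },
  { id: "img-002", title: "Nursing lab photo", tags: ["nursing", "lab", "photo"] },
  { id: "doc-001", title: "Brand guidelines", tags: ["brand", "pdf", "guidelines"] },
];

// Return assets that carry every requested tag.
function findByTags(library: Asset[], wanted: string[]): Asset[] {
  return library.filter((asset) =>
    wanted.every((tag) => asset.tags.includes(tag.toLowerCase()))
  );
}

console.log(findByTags(assets, ["photo", "campus"])); // -> [img-001]
```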

You have multiple versions of documents — in multiple places.

Many of our clients, including universities, healthcare systems, libraries, and nonprofits, have large collections of policy documents. These files often live on public websites, intranets, and elsewhere, with the intent that staff can pull them up as needed.

Problem is, if there’s a policy change, you need to be sure that anywhere a document is accessed, it’s the most current version. And you can’t just delete old files on a website, because any previous links to them will go up in smoke.

DAMs are excellent at managing document updates and variations, making it easy to find and replace old versions. They can also perform in-place file swaps without breaking the connections to the pieces of content that refer to a particular file.

You’re still managing assets by email.

With multiple team members or departments relying on the same pool of digital assets for a variety of use cases, some poor souls will spend hours every day answering email requests, managing edits, and transferring files. The more assets and channels you’re dealing with, the more unwieldy this gets.

DAMs facilitate collaboration by providing a single, centralized platform where team members can assign tasks, track changes, and configure permissions and approval processes. As a result, content creators know they’re using the most up-to-date, fully approved assets.

Your website doubles as a dump bin.

If your website is the source of assets for your entire organization, it can be a roadblock for other departments that need to use those assets in other places. They need to know how to find assets, download copies, and obtain sizes or formats that differ from the web-based versions… and there may or may not be a web team to assist.

What’s more, some web hosting providers offer limited storage space. If you have a large and growing digital library, you’ll hit those limits in no time.

A DAM provides a high-capacity, centralized location where staff can easily access current, approved digital assets in various sizes and formats.

You’re duplicating assets you already have.

How many times have you had different teams purchase assets like stock photography and audio tracks, when they could have shared the files instead? Or, maybe your storage folders are overrun with duplicates. Instead of relying on teams to communicate whenever they create or use an asset, you could simplify things with a DAM.

Storing and tagging all your assets, in various sizes and formats, in a DAM enables your teams to:

When Should You Implement a DAM?

You can implement a DAM whether you have an existing website or you’re building a new one. DAM technology easily complements platform builds or redesigns, helping to make websites and intranets even more powerful. Organizing all of your assets in a DAM before launching a web project also makes it easier to migrate them to your new platform and helps ensure that nothing gets lost.

Plus, we’ve seen companies cling to old websites when too many departments are still using assets that are hosted on the site. Moving your assets out of your website and into a DAM frees you up to move on.

If you’re curious about your options for a DAM platform, there are a number of solutions on the market. Our partner Acquia offers an excellent DAM platform with an impressive range of functions for organizing, accessing, publishing, and repurposing assets, automating manual processes, and monitoring content metrics.

Other candidates to consider include Adobe Experience Manager Assets, Bynder, PicturePark, Canto, Cloudinary, Brandfolder, and MediaValet.

Given the number of DAMs on the market, choosing the right solution is a process. We’re happy to share our experience in DAM use and implementation, to help you find the best one for your needs. Just get in touch with any questions you have.