More than two years after Google announced the launch of its powerful new website analytics platform, Google Analytics 4 (GA4), the final countdown to make the switch is on.

GA4 will officially replace Google’s previous analytics platform, Universal Analytics (UA), on July 1, 2023. It’s the first major analytics update from Google since 2012 — and it’s a big deal. As we discussed in a blog post last year, GA4 uses big data and machine learning to provide a next-generation approach to measurement.

At Oomph, we’ve learned a thing or two about making the transition seamless while handling GA4 migrations for our clients – including a few platform “gotchas” that are definitely better to know in advance. Before you start your migration, do yourself a favor and explore our GA4 setup guide.

Your 12-Step GA4 Migration Checklist

Step 1: Create a GA4 Analytics Property and Implement Tagging

The Gist: Launch the GA4 setup assistant to create a new GA4 property for your site or app. For sites that already have UA installed, Google will begin creating GA4 properties automatically in March 2023 (unless you opt out). If you’re migrating from UA, you can connect your UA property to your GA4 property to use the existing Google tracking tag on your site. For new sites, you’ll need to add the tag directly to your site or via Google Tag Manager.
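
If you’re adding the tag by hand, the snippet itself is small. Here’s a minimal TypeScript sketch of what the gtag.js bootstrap does, assuming the async gtag.js loader script is already on the page; the G-XXXXXXXXXX measurement ID is a placeholder for your own:

```ts
// Minimal sketch of the Google tag (gtag.js) bootstrap for a GA4 property.
// Assumes the async loader script from googletagmanager.com is already on
// the page. G-XXXXXXXXXX is a placeholder measurement ID.
declare global {
  interface Window {
    dataLayer: unknown[];
  }
}

window.dataLayer = window.dataLayer || [];

function gtag(..._args: unknown[]): void {
  // gtag.js expects the Arguments object itself (not a copied array),
  // so push `arguments` rather than the rest parameter.
  window.dataLayer.push(arguments);
}

gtag('js', new Date());
gtag('config', 'G-XXXXXXXXXX');

export {};
```

In Google Tag Manager, a GA4 Configuration tag accomplishes the same thing without hand-written code.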

The Gotcha: During property setup, Google will ask you which data streams you’d like to add (websites, apps, etc.). This is simple if you’re just tracking one site, but gets more complex for organizations with multiple properties, like educational institutions or retailers with individual locations. While UA allowed you to separate data streams by geography or line of business, GA4 handles this differently. This Google guide can help you choose the ideal configuration for your business model.

Step 2: Update Your Data Retention Settings

The Gist: GA4 lets you control how long you retain data on users and events before it’s automatically deleted from Google’s servers. For user-level data, including conversions, you can hang on to data for up to 14 months. For other event data, you have the option to retain the information for 2 months or 14 months.

The Gotcha: The data retention limits are much shorter than UA, which allowed you to keep Google-signals data for up to 26 months in some cases. The default retention setting in GA4 is 2 months for some types of data – a surprisingly short window, in our opinion – so be sure to extend it to avoid data loss.

Step 3: Initialize BigQuery

The Gist: Have a lot of data to analyze? GA4 integrates with BigQuery, Google’s cloud-based data warehouse, so you can store historical data and run analyses on massive datasets. Google walks you through the steps here.

The Gotcha: Since GA4 has tight time limits on data retention as well as data limits on reporting, skipping this step could compromise your reporting. BigQuery is a helpful workaround for storing, analyzing, and visualizing large amounts of complex data.
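
To give a flavor of what the export enables, here’s a hedged sketch that counts GA4 events by name using the official @google-cloud/bigquery Node client. The project and dataset names are placeholders; by default, the GA4 export writes daily events_YYYYMMDD tables that you can query with a table wildcard:

```ts
// Sketch: counting GA4 events by name from the BigQuery export.
// Assumes the GA4 BigQuery link is enabled and the default dataset
// naming (analytics_<property-id>) — both are placeholders here.
import { BigQuery } from '@google-cloud/bigquery';

const bigquery = new BigQuery();

async function topEvents(): Promise<void> {
  const query = `
    SELECT event_name, COUNT(*) AS event_count
    FROM \`my-project.analytics_123456789.events_*\`
    WHERE _TABLE_SUFFIX BETWEEN '20230101' AND '20230131'
    GROUP BY event_name
    ORDER BY event_count DESC
    LIMIT 10
  `;
  const [rows] = await bigquery.query({ query });
  for (const row of rows) {
    console.log(`${row.event_name}: ${row.event_count}`);
  }
}

topEvents().catch(console.error);
```

Because the export lives in your own Google Cloud project, queries like this aren’t subject to GA4’s reporting limits (though BigQuery’s own storage and query pricing applies).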

Step 4: Configure Enhanced Measurements

The Gist: GA4 measures much more than pageviews – you can now track actions like outbound link clicks, scrolls, and engagements with YouTube videos automatically through the platform. When you set up GA4, simply check the box for any metrics you want GA4 to monitor. You can still use Google tags to customize tracking for other types of events or use Google’s Measurement Protocol for advanced tracking.
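
For server-side or offline events, the Measurement Protocol amounts to a single HTTP POST. A rough sketch, with placeholder credentials and an invented event name:

```ts
// Sketch: sending a server-side event to GA4 via the Measurement Protocol.
// The measurement_id, api_secret, client_id, and event name/params below
// are all placeholders — create an API secret under the data stream's
// Measurement Protocol settings in GA4.
async function sendServerEvent(): Promise<void> {
  const endpoint =
    'https://www.google-analytics.com/mp/collect' +
    '?measurement_id=G-XXXXXXXXXX&api_secret=YOUR_API_SECRET';

  const response = await fetch(endpoint, {
    method: 'POST',
    body: JSON.stringify({
      client_id: '555.1234567890', // normally read from the _ga cookie
      events: [
        {
          name: 'subscription_renewed',
          params: { plan: 'annual', value: 99.0, currency: 'USD' },
        },
      ],
    }),
  });

  // The collect endpoint returns a 2xx status with an empty body on success
  console.log('Measurement Protocol status:', response.status);
}

sendServerEvent().catch(console.error);
```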

The Gotcha: If you were previously measuring events through Google tags that GA4 will now measure automatically, take the time to review which ones to keep to avoid duplicating efforts. It may be simpler to use GA4 tracking – giving you a good reason to do that Google Tag Manager cleanup you’ve been meaning to get to.

Step 5: Configure Internal and Developer Traffic Settings

The Gist: To avoid having employees or IT teams cloud your insights, set up filters for internal and developer traffic. You can create up to 10 filters per property.

The Gotcha: Setting up filters for these users is only the first step – you’ll also need to toggle the filter to “Active” for it to take effect (a step that didn’t exist in UA). Make sure to turn yours on for accurate reporting.

Step 6: Migrate Users

The Gist: If you were previously using UA, you’ll need to migrate your users and their permission settings to GA4. Google has a step-by-step guide for migrating users.

The Gotcha: Migrating users is a little more complex than just clicking a button. You’ll need to install the GA4 Migrator for Google Analytics add-on, then decide how to migrate each user from UA. You also have the option to add users manually.

Step 7: Migrate Custom Events

The Gist: Event tracking has fundamentally changed in GA4. While UA offered three default parameters for events (event category, event action, and event label), GA4 lets you create any custom conventions you’d like. With more options at your fingertips, it’s a great opportunity to think through your overall measurement approach and which data is truly useful for your business intelligence.

When mapping UA events to GA4, look first to see if GA4 is collecting the data as an enhanced measurement, automatically collected, or recommended event. If not, you can create your own custom event using custom definitions. Google has the details for mapping events.
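
To make the shift concrete, here’s a hedged before-and-after sketch of one mapped event using gtag.js; the event and parameter names are illustrative, not prescriptions:

```ts
// Illustrative mapping of one UA event to a GA4 custom event.
declare function ga(...args: unknown[]): void;   // UA's analytics.js
declare function gtag(...args: unknown[]): void; // GA4's gtag.js

// Before (UA): category / action / label
ga('send', 'event', 'Brochures', 'request', 'spring-catalog');

// After (GA4): one named event with whatever parameters are useful.
// catalog_name is a custom parameter; register it as a custom
// dimension in GA4 if you want it available in reports.
gtag('event', 'brochure_request', {
  catalog_name: 'spring-catalog',
});
```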

The Gotcha: Don’t go overboard creating custom definitions – GA4 limits you to 50 per property.

Step 8: Migrate Custom Filters to Insights

The Gist: Custom filters in UA have become Insights in GA4. The platform offers two types of insights: automated insights based on unusual changes or emerging trends, and custom insights based on conditions that matter to you. As you implement GA4, you can set up custom insights for Google to display on your Insights dashboard. Google will also email alerts upon request.

The Gotcha: Similar to custom events, GA4 limits you to 50 custom insights per property.

Step 9: Migrate Your Segments

The Gist: Segments work differently in GA4 than they do in UA. In GA4, you’ll only find segments in Explorations. The good news is you can now set up segments for events, allowing you to segment data based on user behavior as well as more traditional segments like user geography or demographics.

The Gotcha: Each Exploration has a limit of 10 segments. If you’re using a lot of segments currently in UA, you’ll likely need to create individual reports to see data for each segment. While you can also create comparisons in reports for data subsets, those are even more limited at just four comparisons per report.

Step 10: Migrate Your Audiences

The Gist: Just like UA, GA4 allows you to set up audiences to explore trends among specific user groups. To migrate your audiences from one platform to another, you’ll need to manually create each audience in GA4.

The Gotcha: You can create a maximum of 100 audiences for each GA4 property (starting to sense a theme here?). Also, keep in mind that GA4 audiences don’t apply retroactively. While Google will provide information on users in the last 30 days who meet your audience criteria — for example, visitors from California who donated more than $100 — it won’t apply the audience filter to users earlier than that.

Step 11: Migrate Goals to Conversion Events

The Gist: If you were previously tracking goals in UA, you’ll need to migrate them over to GA4, where they’re now called conversion events. GA4 has a goals migration tool that makes this process pretty simple.

The Gotcha: GA4 limits you to 30 custom conversion events per property. If you’re in e-commerce or another industry with complex marketing needs, those 30 conversion events will add up very quickly. With GA4, it will be important to review conversion events regularly and retire ones that aren’t relevant anymore, like conversions for previous campaigns.

Step 12: Migrate Alerts

The Gist: Using custom alerts in UA? As we covered in Step 8, you can now set up custom insights to keep tabs on key changes in user activity. GA4 will deliver alerts through your Insights dashboard or email, based on your preferences.

The Gotcha: This one is actually more of a bonus – GA4 will now evaluate your data hourly, so you can learn about and respond to changes more quickly.

The Future of Measurement Is Here

GA4 is already transforming how brands think about measurement and user insights – and it’s only the beginning. While Google has been tight-lipped about the GA4 roadmap, we can likely expect even more enhancements and capabilities in the not-too-distant future. The sooner you make the transition to GA4, the sooner you’ll have access to a new level of intelligence to shape your digital roadmap and business decisions.

Need a hand getting started? We’re here to help – reach out to book a chat with us.

Was this blog written by ChatGPT? How would you really know? And what impact would it have on Oomph’s site if it were?

Yes, we know there are some great AI-detecting tools out there. But for the typical reader, picking an AI article out of a crowd can be challenging. And with AI tools like ChatGPT delivering better-quality results than ever, many companies are struggling to decide whether to hand their content and SEO reins over to the machines.

While AI can add value to your content, companies should proceed with caution to avoid some potentially big pitfalls. Here’s why.

Quality Content Is Critical to SEO

All the way back in 1996, Bill Gates said “Content is King.” This phrase became ubiquitous in the early years of SEO. At that time, you could rank well simply by writing about a search topic, then optimizing your writing with the right keywords.

Since then, search algorithms have evolved, and the Google search engine results page (SERP) is more crowded than ever (not to mention the new continuous scroll). While ranking isn’t as easy as it used to be, content – whether it’s a video, an image, a product, a blog, or a news story – still matters. When content ranks well, it’s an ad-spend-free magnet for readers that eventually become customers and subscribers. What else on your website can do that?

That makes your content special. It also puts a premium on producing a high volume of relevant content quickly. For years, brands have done this the old-fashioned way: with copywriters and designers researching, writing, revising, creating images, and publishing ad infinitum.

Until AI.

AI-Powered Content Generation Changes How We Make Content

There’s no point in denying it: AI will impact SEO. But it’s still up for debate just how deep that impact will be.

The rise of AI-powered tools like ChatGPT and DALL-E makes quick content generation a reality. They can easily produce high-quality content that will likely only get better with time. ChatGPT can produce an article in minutes, not hours, and even suggest keywords for you.

For all those reasons, marketers have embraced these new tools – ChatGPT shattered records when it reached 100 million monthly active users in a mere two months. As the saying goes, though, just because we can, doesn’t mean we should – especially if using it means compromising on quality or losing what makes us, well, us.

After all, AI is a tool, not a replacement human. If you’re considering integrating AI into your martech stack, here’s what to keep in mind.

4 Things to Know Before Using AI for Content

1. Google may ding you.

In August 2022, Google released an algorithm update called the “helpful content update.” You can read more on Google’s blog, but what’s important is this: Google prioritizes content by people, for people.


In other words, using AI solely to churn out articles that feed the search engine beast won’t work. You should continue to create content that puts your audience first, is genuinely helpful, and is 100% unique to your brand.


But Google’s stance does raise the question: Will they really know if you used AI to generate your content? Well, they do have a very special set of skills…

Animated gif of Liam Neeson mouthing the words “I will find you” from the movie Taken (2008)

While AI content can be good quality and easy to read, tools like Crossplag’s AI Content Detector and OpenAI’s GPT-2 Output Detector can detect whether something is written by a human or by an AI bot. That comes down, in part, to the fact that AI lacks E-E-A-T – experience, expertise, authoritativeness, and trustworthiness – which is not just a major factor in SEO, but a major reason why people consume content in the first place.

2. AI isn’t always accurate.

AI culls information from across the web, and it’s remarkably good at it. But aggregating massive amounts of information isn’t the same as fact-checking. AI tools often get it wrong – even with easily verifiable, fact-based questions.

Take the question: What is the largest country in Central America that isn’t Mexico?

AI’s answer:

The largest country in Central America that isn’t Mexico is Guatemala. Guatemala has an area of about 108,890 square kilometers, making it the largest country in Central America by land area. It is bordered by Mexico to the north, Belize to the northeast, Honduras to the east, El Salvador to the southeast, and the Pacific Ocean to the south.

Source for AI’s answer

Google’s answer:

Nicaragua is the largest country in the subregion, with a total area of over 130 thousand square kilometers, followed by Honduras, with more than 112 thousand square kilometers.

(Hint: Google is right)

This is a problem for your business and for your SEO. Accuracy is an important part of E-E-A-T. It’s especially critical for “your money or your life” (YMYL) content, like financial or medical advice. In these cases, the content you publish can and does impact real people’s lives and livelihoods.

Spotty accuracy has even prompted some sites, like Stack Overflow, to ban AI-generated content.

3. You don’t have the rights to your AI-generated content.

AI-generated content isn’t actually copyrightable. Yes, you read that right.

As it stands, the courts have interpreted the Copyright Act to mean that only human-authored works can be copyrighted. A work is only legally defensible when it involves at least a minimal amount of human creativity.

We’re all familiar with this concept when it comes to books, TV shows, movies, and paintings, but it matters for your website, too. You want your content and your ideas to be yours. If you use AI-generated content, be aware that it isn’t subject to standard intellectual property rules and may not be protected.

4. AI-generated content can’t capture your voice.

Even if you fly under Google’s radar with your AI content, it still won’t really feel like you. You are the only you. We know that sounds like it belongs on an inspirational poster, but it’s true. Your voice is what readers will connect with, believe in, and ultimately trust.

Sure, AI may succeed at stringing together facts and keywords to create content that ranks. And that content may even drive people to your site. But it lacks the emotional intelligence to infuse your content with real-life examples and anecdotes that make readers more likely to read, share, and engage with your content and your brand.

Your voice is also what sets you apart from other brands in your industry. Without that, why would a customer choose you?

AI and SEO Is a Journey, Not a Destination

AI is not the end of human-driven SEO. In reality, AI has only just arrived. But the real opportunity lies in finding out how AI can enhance, not replace, our work to create winning SEO content.

Think about content translation. Hand translation is the most premium translation option out there. It’s also costly. While machine translation on its own can be a bit of a mess, many translation companies actually start with an automated solution, then bring in the humans to polish that first translation into a final product. If you ask us, AI and SEO will work in much the same way.

Even in a post-AI world, SEO all comes down to this guidance from Google:

“If it is useful, helpful, original, and satisfies aspects of E-E-A-T, it might do well in Search. If it doesn’t, it might not.”

If and when you do decide to leverage AI, keep the cautions above in mind.

At Oomph, we believe quality branded content is just one component of a digital experience that engages and inspires your audience.

Need help integrating SEO content into your company’s website? Let’s talk.

There’s a phrase often used to gauge healthcare quality: the right care, at the right time, in the right place. When those elements are out of sync, the patient experience can take a turn for the worse. Think about missed appointments, misunderstood pre-op instructions, mismanagement of medication… all issues that require clear and timely communication to ensure positive outcomes.

Many healthcare organizations are tapping into patient engagement tools that use artificial intelligence (AI) to drive better healthcare experiences. In this article, we’ll cover a number of use cases for AI within healthcare, showing how it can benefit providers, their patients, and their staff in an increasingly digital world.

Healthcare Consumers are Going Digital

Use of AI in the clinical space has been growing for years, from Google’s AI aiding diagnostic screenings to IBM’s Watson AI informing clinical decision making. But there are many other touchpoints along a patient’s continuum of care that can impact patient outcomes.

The industry is seeing a shift towards more personalized and data-driven patient engagement, with recent studies showing that patients are ready to integrate AI and other digital tools into their healthcare experiences.

For instance, healthcare consumers are increasingly comfortable with doctors using AI to make better decisions about their care. They also want personalized engagement to motivate them on their health journey, with 65% of patients agreeing that communication from providers makes them want to do more to improve their health.

At the same time, 80% of consumers prefer to use digital channels (online messaging, virtual appointments, text, etc.) to communicate with healthcare providers at least some of the time. This points to significant opportunities for digital tools to help providers and patients manage the healthcare experience.

Filling in Gaps: AI Use Cases for Healthcare

Healthcare will always need skilled, highly trained experts to deliver high quality care. But, AI can fill in some gaps by addressing staffing shortages, easing workflows, and improving communication. Many healthcare executives also believe AI can provide a full return on investment in less than three years.

Here are some ways AI can support healthcare consumers and providers to improve patients’ outcomes and experiences.

Streamline basic communications

Using AI as the first line to a patient for basic information enables convenient, personalized service without tying up staff resources. With tools like text-based messaging, chatbots, and automated tasks, providers can communicate with people on the devices, and at the times, that they prefer.

Remove barriers to access

AI algorithms are being used in some settings to conduct initial interviews that help patients determine whether they need to see a live medical professional — and then send them to the right provider.

AI can offer a bridge for patients who, for a host of reasons, are stuck on taking the first step. For instance, making the first touchpoint a chatbot helps overcome a barrier for patients seeking care within often-stigmatized specialties, such as behavioral health. It can also minimize time wasted at the point of care communicating things like address changes and insurance providers.

Reduce no-show rates

In the U.S., patient no-show rates range from 5.5% to 50%, depending on the location and type of practice. Missed appointments not only result in lost revenue and operational inefficiencies for health systems, they can also delay preventive care, increase readmissions, and harm long-term outcomes for patients.

AI-driven communications help ensure that patients receive critical reminders at optimal times, mitigating these risks.

Close information gaps

Imagine a patient at home, alone, not feeling well, and confused about how to take their medication or how to handle post-operative care. Not having that critical information can lead to poor outcomes, including readmission.

Delivering information at the right time, in the right place, is key. But multiple issues can arise along the way.

By providing consistent, accurate, and timely information, AI-enabled tools can provide critical support for patients and care teams.

Minimize staff burnout

Burnout and low morale have contributed to severe staffing shortages in the US healthcare system. The result is an increase in negative patient outcomes, in addition to massive hikes in labor costs for hospitals and health systems.

AI can help lighten the burden on healthcare employees through automated touchpoints in the patient journey, such as self-scheduling platforms or FAQ-answering chatbots. AI can even perform triage informed by machine learning, helping streamline the intake process and getting patients the right care as quickly as possible.

This frees up staff to focus on more meaningful downstream conversations between patients and care teams. It can also reduce phone center wait times for those patients (often seniors) who still rely on phone calls with live staff members.

Maximize staff resources

When 80% of healthcare consumers are willing to switch providers for convenience factors alone, it’s crucial to communicate with patients through their preferred channels. Some people respond to asynchronous requests (such as scheduling confirmations) late at night, while others must speak to a live staff member during the day.

Using multimodal communication channels (phone, text, email, web) offers two major benefits for healthcare providers. For one, you can better engage patients who prefer asynchronous communication. You can also identify the ratio of patients who prefer live calls and staff accordingly when it’s needed most.

Leverage customer feedback

AI provides fast, seamless avenues to gather and track patient satisfaction data and create a reliable, continual customer feedback loop. Tools like chatbots and text messaging expand the number of ways patients can communicate with healthcare providers, making it easier to leave feedback. That not only drives a better digital customer experience, but can also lead to better satisfaction scores that may impact payment or quality ratings.

AI offers another benefit, too: the ability to identify and respond more quickly to negative feedback. The more swiftly a problem is resolved, the better the consumer experience.

A Few Tips for Getting Started

First, find a trusted technology partner who has experience with healthcare IT stacks and understands how AI fits into the landscape. The healthcare industry is distinctly different from other verticals that might use tools like chatbots and automated tasks. You need a partner who’s familiar with the nuances of the healthcare consumer experience and regulatory compliance requirements.

Next, start small. It’s best to choose your first AI applications in a strategic, coordinated manner. One approach is to identify the biggest bottlenecks for care teams and/or patients, then assess which areas present the lowest risk to the customer experience and the greatest chance of operational success.

Finally, track the progress of your first implementation. Evaluate, iterate, evaluate again, and then expand into other areas when you’re comfortable with the results.

Above all, remember that successful use of AI isn’t just about how well you implement the technology. It’s about the impact those digital tools have on improving patient outcomes and increasing patient satisfaction with their healthcare experience.

Interested in exploring the specific ways AI can benefit your care team and patients? We’re here to help! Contact us today.

The circular economy aims to help the environment by reducing waste, mainly by keeping goods and services in circulation for as long as possible. Unlike the traditional linear economy, in which things are produced, consumed, and then discarded, a circular economy ensures that resources are shared, repaired, reused, and recycled, over and over.

What does this have to do with your digital platform? In a nutshell: everything.

From tackling climate change to creating more resilient markets, the circular economy is a systems-level solution for global environmental and economic issues. By building digital platforms for the circular economy, your business will be better prepared for whatever the future brings.

The Circular Economy isn’t Coming. It’s Here.

With environmental challenges growing day by day, businesses all over the world are going circular.

One area where nearly every business could adopt a circular model is the creation and use of digital platforms. The process of building websites and apps, along with their use over time, consumes precious resources (both people and energy). That’s why Oomph joined 1% For the Planet earlier this year. Our membership reflects our commitment to do more collective good — and to hold ourselves accountable for our collective impact on the environment.

But, we’re not just donating profits to environmental causes. We’re helping companies build sustainable digital platforms for the circular economy.

Curious about your platform’s environmental impact? Enter your URL into this tool to get an estimate of your digital platform’s carbon footprint.

Changing Your Platform From Linear to Circular

If protecting the environment and promoting sustainability is a priority for your business, it’s time to change the way you build and operate your websites and apps. Here’s what switching to a platform for the circular economy could look like.

From a linear mindset…

When building new sites or apps, many companies fail to focus on longevity or performance. Within just a few years, their platforms become obsolete, either as a result of business changes or a desire to keep up with rapidly evolving technologies.

So, every few years, they have to start all over again — with all the associated resource costs of building a new platform and migrating content from the old one.

Platforms that aren’t built with performance in mind tend to waste a ton of energy (and money) in their daily operation. As these platforms grow in complexity and slow down in performance, one unfortunate solution is to just increase computing power. That means new hardware to power the computing cycles, which leads to more e-waste, more mining for metals, more pollution from manufacturing, and more electricity to power the entire supply chain.

Enter the circular economy.

…to a circular approach.

Building a platform for the circular economy is about reducing harmful impacts and wasteful resource use, and increasing the longevity of systems and components. There are three main areas you can address:

1. Design out waste and pollution from the start.

At Oomph, we begin every project with a thorough and thoughtful discovery process that gets to the heart of what we’re building, and why. By identifying what your business truly needs in a platform — today and potentially tomorrow — you’ll minimize the need to rebuild again later.

It’s also crucial to build efficiencies into your backend code. Clean, efficient code makes pages load faster and run more smoothly, with fewer energy cycles required per output.

Look for existing frameworks, tools, and third-party services that provide the functions you need and will continue to stay in service for years or decades to come. And, instead of building a monolith platform that has to be upgraded every few years or requires massive computing power, consider switching to a more nimble and efficient microservices architecture.

2. Keep products and services in use.

Regular maintenance and timely patching is key to prolonging the life of your platform. So is proactively looking for performance issues. Be sure to regularly test and assess your platform’s speed and efficiency, so you can address problems early on.

While we’re advocating for using products and services for as long as possible, if your platform is built on microservices, don’t be afraid to replace an existing service with a new one. Just make sure the new service provides a benefit that outweighs the resource costs of implementing it.

3. Aim to regenerate natural systems.

The term “regenerate” describes a process that mimics the cycles of nature by restoring or renewing sources of energy and materials. It might seem like the natural world is far removed from your in-house tech, but there are a number of ways that your IT choices impact the environment.

For starters, you can factor sustainability into your decisions around vendors and equipment. Look for digital hosting companies and data centers that are green or LEED-certified. Power your hardware with renewable energy sources. Ultimately, the goal is to consider not just how to reduce your platform’s impact on the environment, but how you can create a net-positive effect by doing better with less.

Get Ready for the Future

We’ve long seen that the ways in which businesses and societies use resources can transform local and global communities. And we know that environmental quality is inextricably linked to human wellbeing and prosperity. The circular economy, then, provides a way to improve our future readiness.

Companies that invest in sustainability generally experience better resilience, improved operational performance, and longer-lasting growth. They’re also better suited to meet the new business landscape, as governments incentivize sustainable activities, customers prefer sustainable products, and employees demand sustainable leadership.

Interested in exploring how you can join the new circular economy with your digital platforms? We’d love to help you explore your options. Just contact us.

Google Analytics 4, or GA4, is Google’s fourth iteration of its website analytics platform. This is no ordinary upgrade! Leveraging the power of big data and machine learning, GA4 offers entirely new ways to collect and analyze user activity data across websites and apps.

While GA4 provides access to robust new tools and features for data-driven decision making, it also sheds many of the metrics and reports we’re used to in Google Analytics 3 (a.k.a. Universal Analytics, or UA).

Google will be sunsetting UA properties in July 2023. Here’s what you need to know about GA4’s capabilities — and why you should start the transition sooner rather than later.


Not sure which platform you currently have (UA vs. GA4)? 

Take a look at this cheat sheet.


Key Benefits of Google Analytics 4

We’re living in a more privacy-centric world, and GA4 is Google’s answer to stricter data laws and browser regulations. GA4 is designed to function without third-party cookies, using machine learning and statistical modeling instead to fill gaps in the data it collects.

This change comes with a range of benefits, from more actionable user insights to enhanced reporting capabilities.

Broader Insights

Unlike UA, GA4 has the ability to track users across devices and platforms, combining all the data into a single property with a unified set of metrics and dimensions. This gives you a more complete picture of how users interact with your brand, whether they’re on your website, your mobile app, or both.

Another major advantage is that you can more effectively track conversions — particularly for users who might visit on their mobile, come back on desktop, and then download/purchase/register through your app. Because GA4 attributes actions to users across devices and platforms, you can see the entire journey a user takes from start to finish.

Predictive Metrics

Using machine learning, GA4 offers powerful new metrics to predict user actions and includes new data buckets like Acquisition, Engagement, Monetization, and Retention. These predictive metrics can help you better understand your audience and make more informed decisions, so you can do things like tailoring your website experience for different users or creating targeted marketing campaigns.

Customized Reporting

UA offers a set of standard reports with some customization options. By contrast, GA4 enables and encourages users to create custom reports with only the data they need.

With greater freedom to create reports, you can declutter your dashboard and make decisions more quickly by drilling down to the data that’s most important to you. You can even create a separate “Audiences” report with custom user definitions, further tailoring the data to support your business needs.

Key Features of Google Analytics 4

With comprehensive user tracking, predictive metrics, customizable reports, and more, GA4 promises to be much more powerful than any previous version of Google Analytics. Here are the core capabilities driving all of those benefits.

Event-Based Tracking

One of the biggest changes in GA4 is how user data is collected. In UA, data is collected via tags placed on each page of a website. Users are tracked via sessions, or set periods that begin and end when a user enters and exits a site.

Instead of relying on pageviews and sessions, GA4 tracks user interactions, known as “events,” as users complete them. This focus on individual user interactions provides a more complete picture of each user’s journey across your website or app.

This event-based model also makes it possible to track interactions that don’t happen on web pages but can be influenced by digital marketing, such as in-store visits or in-app purchases. And, it allows Google to more accurately deduplicate users.

Cross-Platform Data Consolidation

In UA, “properties” are where Analytics data is collected for individual websites and apps. You can then use views to see and report on the data in various ways.

GA4 uses individual data streams to combine data from different platforms into a single property. You can add multiple data streams into a property and create different views based on certain criteria.

For example, you could create a stream for all web traffic, a stream for all app traffic, or a stream for traffic from both that covers a given geographic area. By placing the same tracking code across different digital platforms, you can consolidate data to track users who move between the streams.

Advanced Analytics

Maybe the most exciting feature for data geeks like us, GA4’s Explorations Hub offers a suite of advanced data and analytical techniques that go well beyond standard reports. The Explore section lets you create custom analyses to uncover deeper insights about your website and app performance, with filters and segments so you can drill down even further.

GA4 also integrates with BigQuery, Google’s cloud-based data warehouse, where you can run complex analyses of very large datasets. Bonus: BigQuery offers near-unlimited data storage.

Machine Learning

In an increasingly cookie-less world, Google is attempting to balance privacy limitations with usable insights. Using machine learning, GA4 fills in data gaps and provides predictive insights about user behavior and trends.

Machine learning combines artificial intelligence (AI) and computer science to fill in gaps and make predictions. It essentially looks for patterns of activity that can be fed into an algorithm to understand and predict how users behave online.

As an example, GA4’s AI-powered insights can help identify user actions that are most likely to lead to conversions. Using metrics like purchase probability, churn probability, and revenue prediction, you can customize marketing campaigns or target specific audiences to achieve your conversion goals.

Why You Should Switch to GA4 ASAP

You’ll be able to collect and use platform data in your existing UA property until July 1, 2023. After that, you’ll be able to access historical data for only six months. That’s why we strongly recommend you implement GA4 as soon as possible.

Transitioning now will allow you to:

Feed The Machine

Many of GA4’s core features rely on machine learning, and in order for machine learning to be effective, the algorithm needs time to learn. The sooner you set up and start collecting data in GA4, the more time your models will have to analyze and learn, shaping the insights you’ll need down the road.

Train Your People

Those using GA4 will need time to learn the new terminology, user interface, and capabilities. Switching early gives your team time to get used to the new platform and work out new processes and reporting while you still have UA to fall back on.

Get Year-Over-Year Data

GA4 is forward-facing only, which means your new GA4 property will only collect data from the time of creation; it won’t import past data from UA. Once UA sunsets in July 2023, you’ll be relying solely on GA4 for year-over-year data.

Why does that matter? Here at Oomph, when we launch client projects, we use Google Analytics data to analyze digital platform performance so we can develop the best possible user experience. By examining user flows, page visits, common search terms, engagement metrics, and more, we can very quickly get a picture of where a platform has strengths and weak points. And we need your historical data to do it.

Ready to switch to Google Analytics 4? It’s a relatively simple process. Just follow the steps Google provides, whether you want to switch from UA to GA4 or set up a GA4 property alongside an existing UA property.

If you’re not feeling confident about handling the transition alone, we’d love to help. Get in touch with us today.

In the age of hyper-personalization by the likes of Amazon and Netflix, customized user experiences are now table stakes for digital platforms. Businesses that invest in personalization are rewarded with loyalty and revenue. Those that don’t, get left behind.

But making that investment isn’t a straightforward affair. Many services that pitch themselves as personalization tools don’t even come close to creating a truly customized experience. And today’s savvy web users aren’t fooled.

Where we’ve seen businesses stumble is in substituting personification for true personalization. While personalization involves tailoring content based on direct personal information, personification is based on categories of consumers, not individual people.

Here’s what you need to know about the difference.

Perils of Personification

Gartner defines personification as “the delivery of relevant digital experiences to individuals based on their inferred membership in a defined customer segment, rather than their personal identity.” It’s the digital equivalent of calling someone “buddy” or “champ” because you can’t remember their name. I know that I know you, but I don’t know who you are.

Personification tools can track user behavior and use AI to place users into, say, one of several marketing personas you’ve developed. But in terms of driving meaningful, personalized interactions with users, personification falls down.

Here are a few critical issues with commonly used personification tools:

User Session Data

Information about a user’s interactions with an application is stored temporarily on the application’s server, not the browser.

EXAMPLE: During this session, I see that you’ve visited a piece of content that falls in a specific category. For the rest of your session, I can serve up other content tagged with the same category (often in Featured, Related, or You May Also Like sections).

PROBLEM 1: As soon as the browser session is closed, the user data is lost.

PROBLEM 2: The moment you switch from one device (e.g. mobile) to another (e.g. tablet) you lose all session data.
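
For illustration, here’s roughly what server-side session personification looks like in a Node/Express app using express-session; the route and the category-tracking logic are hypothetical:

```ts
// Sketch: session-based personification with Express and express-session.
// The route and "viewedCategories" logic are hypothetical.
import express from 'express';
import session from 'express-session';

// Let TypeScript know about our custom session field
declare module 'express-session' {
  interface SessionData {
    viewedCategories?: string[];
  }
}

const app = express();

app.use(
  session({
    secret: 'replace-me',    // placeholder secret
    resave: false,
    saveUninitialized: true, // default in-memory store: demo only
  })
);

app.get('/articles/:category/:slug', (req, res) => {
  // Remember which categories this visitor has viewed — but only for as
  // long as this session lives (hence Problems 1 and 2 above).
  const viewed = req.session.viewedCategories ?? [];
  if (!viewed.includes(req.params.category)) {
    viewed.push(req.params.category);
  }
  req.session.viewedCategories = viewed;

  res.send(`Related content drawn from: ${viewed.join(', ')}`);
});

app.listen(3000);
```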

Contextual Data

Marketing automation or location intelligence software can use AI to gather environmental data about a user to deliver customized content or services.

EXAMPLE: I see that you’re in Los Angeles, California. Knowing your local weather, time zone, and other regional attributes, I can tailor the content you see to be more specific to your area.

PROBLEM: I have to ask you first if I can track your location, and you might say no.

First Party Cookie Data

By storing information about a user’s behavior directly on a domain, site owners can collect analytics data and remember language settings, among other functions.

EXAMPLE: Last time you visited my website, you commented on a certain piece of content. I may even have asked, “Do you want to see more of this type of content?” Now that you’re back, I can serve up newly published content of the same type. I can even feature it right on the homepage.

PROBLEM 1: I need to ask you if I can use cookies with you, and you can say no.

PROBLEM 2: If you clear the cookies in your browser, I’ll lose that valuable data.

PROBLEM 3: Another family member is using the same application on the same device, and now I’m getting mixed signals. This is completely messing with my AI.
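
For comparison, first-party cookie personification can be as simple as a pair of browser-side helpers. A minimal sketch, assuming the visitor has consented to cookies (the cookie name and lifetime are arbitrary):

```ts
// Sketch: first-party-cookie personification in the browser.
// The cookie name "preferred_category" and 180-day lifetime are arbitrary.
function setPreferredCategory(category: string): void {
  const maxAge = 60 * 60 * 24 * 180; // 180 days, in seconds
  document.cookie =
    `preferred_category=${encodeURIComponent(category)}; ` +
    `max-age=${maxAge}; path=/; SameSite=Lax`;
}

function getPreferredCategory(): string | null {
  const match = document.cookie.match(/(?:^|;\s*)preferred_category=([^;]*)/);
  return match ? decodeURIComponent(match[1]) : null;
}

// e.g. call setPreferredCategory('recipes') after a user opts in, then
// getPreferredCategory() on the next visit to feature that category.
```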

Bottom line: personification is not really personalization. Even worse, you may lose your data and have to start from square one. To deliver true personalization, you need first-party data from authenticated users. Instead of guessing who your customer is, get to know who they really are.

Next-Level Personalization

True personalization is difficult to achieve outside of a digital platform, where people register as users (versus just casually visiting a website). Once someone becomes an authenticated user, it’s easier to learn a number of things about them.

83% of consumers are willing to share their data to enable personalized experiences. Platform users in particular are more open to providing personal information, because they’re specifically looking for a customized experience. With that first-party data, you can track preferences and interactions to improve the user experience. And you’re not going to lose the historical data when a user closes a session or clears their cookies.

Looking for Middle Ground?

In the end, you’ll deliver the best personalization (and earn the most engagement) by building an interactive platform and leveraging first-party data. But what if you have a decent website, and you’re not ready to shift to a platform?

You could approach it as a testing ground for personalization instead. By creating a series of micro-interactions using personification tools, you can test whether your users actually want a personalized experience, and if so, what they want to personalize.

Let’s say you’re a news outlet. You could just let people come and read your content online. At the next level, you can try to guess who they are through personification (via cookie requests, location prompts, etc.). If users are interacting with your prompts, it’s likely they’re interested in having a personalized experience.

Finally, you could build a platform for registered users and offer true personalization. You’ll not only deliver a better user experience, you’ll increase engagement and return visits — not to mention sales and other revenue.

At whatever level you can, go the extra mile and give your users what they want. We’re happy to help! Contact us today to learn more.

You’ve decided to decouple, you’re building your stack, and the options are limitless – oh, the freedom of escaping the LAMP stack and the boundaries of the conventional CMS! Utilities that were once lumped together into one immovable bundle can now be separately selected, or not selected at all. It is indeed refreshing to pick and choose the individual services best fitted to your project. But now you have to choose.

One of those choices is your backend content storage. Even though decoupling means breaking free from monolithic architecture, certain concepts persist: content modeling, field types, and the content editor experience.

I recently evaluated four headless CMS options: Contentful, Cosmic, Dato, and Prismic. Prior to that, I had no experience with any of them. Fortunately they all offer a free plan to test and trial their software. For simpler projects, that may be all you need. But if not, each CMS offers multiple tiers of features and support, and costs vary widely depending on your requirements.

I was tasked with finding a CMS and plan that met a few specific requirements: support for roughly 10 content editors, more than one user role, and draft/scheduled publishing.

Although this doesn’t seem like a big ask for any CMS, these requirements eliminated the free plans for all four services, so cost became a factor.

Along with cost, I focused my evaluation on the editor experience, modeling options, integration potential, and other features. While I found lots of similarities between the four, each had something a little different to offer.

It’s worth mentioning that development is active on all four CMSs. New features and improvements were added just within the span of time it took to write this article. So keep in mind that current limitations could be resolved in a future update.

Contentful

Contentful’s Team package is currently priced at $489 per month, making it the most expensive of the four. This package includes 10 content editors and 2 separate roles. There is no editorial workflow without paying extra, but scheduled publishing is included.

Terminology

A site is a “space” and content types are “content types.”

What I love

The media library. Media of many different types and sources – from images to videos to documents and more – can be easily organized and filtered. Each asset has a freeform name and description field for searching and filtering. And since you can provide your own asset name, you’re not stuck with image_8456_blah.jpeg or whatever nonsense title your asset had when you uploaded it. Additionally, image dimensions are shown on the list view, which is a quick, helpful reference.

RUNNER UP

Dato’s Media Area offers similar filtering and a searchable notes field.

What I like

Commenting. Every piece of content has an admin comments area for notes or questions, with a threaded Reply feature.

My Views. My Views is an option in the content navigation panel. With a single click, you can display only content that you created or edited – very convenient when working with multiple editors and a large volume of content.

What could be better

Price. Contentful is expensive if your project needs don’t allow you to use the free/community plan. You do get a lot of features for the paid plans, but there’s a big jump between the free plan and the first paid tier.

Cosmic

Cosmic ranks second most pricey for our requirements at $299 per month for the Pro Package. This package includes 10 editors and 4 predefined roles. It has draft/scheduled publishing, and individual editor accounts can be limited to draft status only.

Terminology

A site is a “bucket” and content types are “object types.”

What I love

Developer Tools. Developer Tools is a handy button you can click at the object or object type level to view your REST endpoint and response. It also shows other ways (GraphQL, CLI, etc.) to connect to a resource, using real code that is specific to your bucket and objects.

RUNNER UP

Dato has an API Explorer for writing and running GraphQL queries.

The Slack Community. The Cosmic Slack community offers a convenient way to get technical support – in some cases, even down to lines-of-code level support – with quick response times.

What I like

View as editor. This is a toggle button in the navigation panel to hide developer features – even if your account is assigned the developer or admin role – allowing you to view the CMS as the editor role sees it. This is useful for documenting an editor’s process or troubleshooting their workflow.

Extensions. Cosmic provides several plug-and-play extensions, including importers for Contentful and WordPress content, as well as Algolia Search, Stripe, and more. I tested the Algolia extension, and it only took minutes to set up and immediately began syncing content to Algolia indexes.¹ You can also write your own extensions and upload them to your account.

What could be better

Price/price structure. I found Cosmic’s pricing structure to be the most confusing, with extra monthly charges for common features like localization, backups, versioning, and webhooks. It’s hard to know what you’ll actually pay per month until you add up all the extras. And once you do, you may be close to the cost of Contentful’s lower tier.

Content model changes. Changing the content model after you’ve created or imported a lot of content is tricky. Content model changes don’t flow down to existing content without a manual process of unlocking, editing and re-publishing each piece of content, which can be very inefficient and confusing.

Dato

Dato’s Professional package is priced at €99 (about $120) per month, making it the second least pricey for our requirements. It includes 10 content editors and 15 roles, with configurable options to limit publishing rights.

Terminology

A site is a “project” and content types are “models.”

What I love

Tree-like collections. Dato lets you organize and display records in a hierarchical structure with visual nesting. The other CMSs give you roundabout ways to accomplish this, usually requiring extra fields. But Dato lets you do it without altering the content model. And creating hierarchy is as simple as dragging and dropping one record under another, making things like taxonomy a breeze to build.

RUNNER UP

No other CMS in this comparison offers hierarchical organizing quite like Dato, but Cosmic provides a parent field type, and Prismic has a documented strategy for creating hierarchical relationships.

What I like

Maintenance Mode. You can temporarily disable writes on your project and display a warning message to logged in editors. If you need to prevent editors from adding/editing content — for instance, during content model changes — this is a useful feature.

What could be better

Field types. Out of the box, Dato doesn’t provide field types for dropdowns or checkboxes. There’s a plugin available that transforms a JSON field into a multiselect, but it’s presented as a list of toggles/booleans rather than a true multiselect. And managing that field means working with JSON, which isn’t a great experience for content editors.

Dato is also missing a simple repeater field for adding one or more of something. I created repeater-like functionality using the Modular Content field type, but this feels overly complicated, especially when every other CMS in my comparison implements either a Repeater field type (Cosmic, Prismic) or a multi-value field setting (Contentful).

Prismic

Prismic ranks least pricey, at $100/mo for the Medium Package. This package includes 25 content editors, 3 predefined roles, draft/scheduled publishing and an editorial workflow.

Terminology

A site is a “repository”, and content types are “custom types.”

What I love

Field types. Prismic gives you 16 unique field types for modeling your content, but it’s not the number of types that I love; it’s the particular combination of options: the dedicated Title type for headings, the Media link type, the Embed type, the Color Picker. Plus, the UI is so intuitive, content editors know exactly what they’re expected to do when populating a field.

Take the Title type for example. If you need a heading field in the other CMSs, you’d probably use a plain text or rich text field type. Using rich text almost guarantees you’ll get unwanted stuff (paragraph tags, in particular) wrapped around whatever heading the editor enters. Using plain text doesn’t let the editor pick which heading they want. Prismic’s Title type field solves both of these problems.

RUNNER UP

This is a tough one, but I’m leaning toward Contentful. What they lack in the number of available field types, they make up for in Appearance settings that allow you to render a field type to the editor in different formats.

Price. Unlimited documents, custom types, API calls and locales are included in the Medium package for a reasonable price. Additionally, Prismic has more packages and support tiers than any of the others, with one paid plan as low as $7/mo.

What I like

Slices. Slices are an interesting addition to back-end content modeling, because they’re essentially components: things you build on the front end. Prismic lets you create custom components, or use their predefined ones — blockquotes, a list of articles, an image gallery, etc. I admit I didn’t test how these components render on the front end, but Slices deserve further exploration.

What could be better

Integration options/plugins. Although Webhooks are included in all of Prismic’s plans, there doesn’t seem to be any development of plugins or ways to quickly extend functionality. Every other CMS in this comparison offers simple, click-to-install extensions and integrations to common services.


A note on Front-end Frameworks

A headless CMS, by simple definition, is a content storage container. It does not provide the markup that your website visitors will see and use. Therefore, your project planning will include choosing and testing a front-end system or framework, such as Gatsby JS. It’s important to find out early in the process what, if any, obstacles exist with connecting your choice CMS to your choice front-end.

At Oomph, we’ve successfully used both Contentful and Cosmic with a Gatsby front-end. However, Gatsby plugins exist for Prismic and Dato as well.
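
As one example of what that connection looks like, here’s a sketch of a gatsby-config.ts wired up with the gatsby-source-contentful plugin, assuming a recent Gatsby version with TypeScript config support. The environment variable names are our own convention; the values come from your Contentful space settings:

```ts
// gatsby-config.ts — sketch of sourcing a headless CMS into Gatsby.
// Uses the gatsby-source-contentful plugin; the env var names are our
// own convention, and both values come from your Contentful space.
import type { GatsbyConfig } from 'gatsby';

const config: GatsbyConfig = {
  plugins: [
    {
      resolve: 'gatsby-source-contentful',
      options: {
        spaceId: process.env.CONTENTFUL_SPACE_ID,
        accessToken: process.env.CONTENTFUL_ACCESS_TOKEN,
      },
    },
  ],
};

export default config;
```

Swapping in Cosmic, Prismic, or Dato is largely a matter of substituting the corresponding source plugin and its credentials.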

Summary

As with any decoupled service, your headless CMS choice will be determined by your project’s distinct requirements. Make sure to build into your project plan enough time to experiment with any CMS options you’re considering. If you haven’t worked with a particular CMS yet, give yourself a full day to explore, build a sample content model, add some content and media, and test the connection to your front-end.

Does a clear winner emerge from this comparison? I don’t think so. They each succeed and stand out in different ways. Use this article to kickstart your own evaluation, and see what works for you!


¹ At the time of this writing, there are some field types that the extension doesn’t pass from Cosmic to Algolia.

If you live in an area with a lot of freight or commuter trains, you may have noticed that trains often have more than one engine powering the cars. Sometimes it is an engine in front and one in back, or in the case of long freight lines, there could be an engine in the middle. This is known as “distributed power,” and it is actually a recent engineering strategy. Evenly distributed power allows trains to carry more, and carry it more efficiently.

When it comes to your website, the same engineering can apply. If the Content Management System (CMS) is the only source of power, it may not have enough oomph to load pages quickly and concurrently for many users. Not only that, but a single source of power may slow down innovation and delivery to multiple sources in today’s multi-channel digital ecosystems.

One of the benefits of decoupled platform architecture is that power is distributed more evenly across the endpoints. Decoupled means that the authoring system and the rendering system for site visitors are not the same. Instead of one CMS powering content authoring and page rendering, two systems handle each task discretely.

Digital properties are ever growing and evolving. While evaluating how to grow your own system, it’s important to know the difference between coupled and decoupled CMS architectures. Selecting the best structure for your organization will ensure you not only get what you want, but what is best for your entire team — editors, developers, designers, and marketers alike.

Bombardier Zefiro vector graphic designed for Vexels

What is a traditional CMS architecture?

In a traditional, or coupled, CMS, the architecture tightly links the back-end content administration experience to the front-end user experience.

Content such as basic pages, news, or blog articles is created, managed, and stored, along with all media assets, through the CMS’s back-end administration screens. The back end is also where site developers create and store customized applications and design templates for use by the front end of the site.

Essentially, the two sides of the CMS are bound within the same system: it stores content created by authenticated users (back end) and is also directly responsible for delivering that content to the browser and end users (front end).

From a technical standpoint, a traditional CMS platform is comprised of:

  1. A private database-driven CMS in which content editors create and maintain content for the site, generally through some CMS administration interfaces we’re used to (think WordPress or Drupal authoring interfaces)
  2. An application where engineers create and apply design schemas. Extra permissions and features within the CMS give developers more options to extend the application and control the front end output
  3. A public front end that displays published content on HTML pages

What is a decoupled CMS architecture?

Decoupled CMS architecture separates, or decouples, the back-end and front-end management of a website into two different systems — one for content creation and storage, and another for consuming content and presenting it to the user.

In a decoupled CMS, these two systems are housed separately and work independently of each other. Once content is created and edited in the back end, this front-end-agnostic approach takes advantage of flexible and fast web services and APIs to deliver the raw content to any front-end system on any device or channel. One authoring system can even deliver content to more than one front end (e.g., an article is published in the back end and pushed out to a website as well as a mobile app).

From a technical standpoint, a decoupled CMS platform is comprised of:

  1. A private database-driven CMS in which content editors create and maintain content for the site, generally through the same CMS administration interfaces we’re used to — though it doesn’t have to be2
  2. A web-service API through which the front-end application consumes the data — usually RESTful, and delivering a mashup-friendly format such as JSON
  3. A popular front-end framework such as React, VueJS, or GatsbyJS, which delivers the public visitor experience via a JavaScript application that renders the output of the API into HTML
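As a deliberately simplified illustration of that flow, the sketch below fetches raw JSON from a CMS’s web-service API and renders it as HTML in the browser; the endpoint URL and response fields are hypothetical:

```javascript
// A minimal sketch of the decoupled pattern: the front end fetches raw
// content as JSON from the CMS's web-service API and turns it into HTML.
// The endpoint URL and response fields (title, summary) are hypothetical.
async function renderArticles(containerId) {
  const response = await fetch('https://cms.example.com/api/articles');
  if (!response.ok) {
    throw new Error(`CMS API request failed: ${response.status}`);
  }
  const articles = await response.json();

  // The CMS delivered only data; the markup is entirely the front end's job.
  const container = document.getElementById(containerId);
  container.innerHTML = articles
    .map((a) => `<article><h2>${a.title}</h2><p>${a.summary}</p></article>`)
    .join('');
}

renderArticles('news');
```

The same JSON feed could just as easily power a mobile app, which is exactly the one-authoring-system, many-front-ends scenario described above.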

Benefits of decoupled

By moving the responsibility for the user experience completely into the browser, the decoupled model provides a number of benefits:

Push the envelope

Shifting the end-user experience out of the conventions and structures of the back-end allows UX Engineers and front-end masterminds to push the boundaries of the experience. Decoupled development gives front-end specialists full control using their native tools.

This is largely because traditional back-end platforms have been focused on the flexibility of authoring content and less so on the experience of public visitors. Too often the programming experience slows engineers down and makes it more difficult to deliver an experience that “wows” your users.

Need for speed

Traditional CMS structures are bogged down by “out-of-the-box” features that many sites don’t use, causing unnecessary bloat. Decoupled CMS structures allow your web development team to choose only what code they need and remove what they don’t. This leaner codebase can result in faster content delivery times and can allow the authoring site to load more quickly for your editors.

Made to order

Not only can decoupled architecture be faster, but it can allow for richer interactions. The front-end system can be focused on delivering a truly interactive experience in the form of in-browser applications, potentially delivering content without a visitor reloading the page.

The back end becomes the system of record and “state machine,” while the back-and-forth interaction happens in the browser, in real time.
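A small hypothetical example of that kind of no-reload interaction: a “Load More” button that asks the back end, as the system of record, for the next page of content and appends it in place.

```javascript
// Sketch of in-browser, no-reload interaction: each click fetches the next
// page of content from the back end and appends it without a page load.
// The endpoint and response fields are hypothetical.
let page = 1;

document.getElementById('load-more').addEventListener('click', async () => {
  page += 1;
  const res = await fetch(`https://cms.example.com/api/articles?page=${page}`);
  const articles = await res.json();

  const list = document.getElementById('article-list');
  for (const article of articles) {
    const item = document.createElement('article');
    item.innerHTML = `<h2>${article.title}</h2><p>${article.summary}</p>`;
    list.appendChild(item);
  }
});
```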

Security Guard

Decoupling the back end from the front end also improves security. Since the front end does not expose its connection to the authoring system, the ecosystem is less vulnerable to attackers. Further, depending on how the front-end communication is set up, the front-end experience may continue uninterrupted even if the back end goes offline.

In it for the long haul

Decoupled architectures integrate easily with new technologies and leave room for future innovation. More and more, this is the way that digital platform development is moving. Lean, back-end-only or “flat-file” content management systems have entered the market — like Contentful and Cosmic — while server hosting companies are adapting to the needs of decoupled architecture as well.

The best of both worlds

Decoupled architecture allows the best decisions for two very different sets of users. Content editors and authors can continue to use some of the same CMSs they have been familiar with. These CMSs have great power and flexibility for content modelling and authoring workflows, and will continue to be useful and powerful tools. At the same time, front-end developers can get the power and flexibility they need from a completely different system. And your customers can get the amazing user experiences they have come to expect.

The New Age of Content Management Systems

Today’s modern CMS revolution is driving up demand for more flexible, scalable, customizable content management systems that deliver the experience businesses want and customers expect. Separating the front- and back-ends can enable organizations to quicken page load times, iterate new ideas and features faster, and deliver experiences that “wow” your audience.


  1. Great article on the distributed power of trains: Why is there an engine in the middle of that train?
  2. Non-monolithic CMSs have been hitting the market lately, and include products like Contentful, CosmicJS, and Prismic, among others.

We’ve dreamed about having conversations with our computers for a long time. Stanley Kubrick’s 1968 film 2001: A Space Odyssey imagined a sentient computer named HAL. In the past few years, with the rise of Siri, Alexa, and more, we have come to live in that reality.

Chatbots are a simpler version of natural-language-processing apps like Siri. 2016 saw the rise of the chatbot, and 2017 will continue that trend, with more and more users having “conversations” via the keyboard to find information and complete tasks instead of clicking around in search engines and on websites. Many of us have never interacted with a chatbot before, so what is it like? And what is it like to design one?


The experience of interacting with a chatbot is very similar to interacting with a smart and eager-to-please dog — the bot understands a lot, but won’t always give you the feedback that you expect; its intentions are good, but it is not always helpful; and you end up training yourself as much as you try to train it.


When it comes to designing one, it is very much like training a dog as well. Since there is no visual interface, the tools of branding are limited to the voice and tone of the snippets of conversation that the bot can have. If you are lucky, there might be an avatar that the bot can use, but beyond that, the “personality” of the bot comes solely from the way it reacts and the words that it uses.

No UI is still a UI

First, a slight tangent, if I may.

There is a saying, coined by Golden Krishna, that goes, “The best user interface is no user interface,” or, “The future of UI is no UI.” While on the surface that is a nice, catchy statement, I don’t think it is true. To get semantic, I’d refine it to say, “The future of UI is no GUI.”

The absence of a graphical user interface (GUI) is still an interface. If your bot uses Facebook Messenger, or WeChat, or Slack, it still has a GUI. It’s just not one that you designed, and you have to work within the constraints of that system.

Further, I would argue that a Conversational UI is still a UI. How could it not be? To design one, you need to make the same decisions that you would if you were designing anything else — there are just a lot more constraints to work within.

You can’t choose the typeface, and you can’t rely on colors or on design details such as borders, drop shadows, and the like. Instead, your UI and all the personality of this little bot come through in the way it can have a conversation. It would be a very different experience to have the bot use a military tone of voice — gruff, coarse, quick, and to the point, with very few conversational niceties — than a friendly, happy, and more naturally intoned one.

A Conversational UI may not be a graphical UI, but it is still a UI. There are decisions to be made, but because there are no visuals to work with, the decisions you make are even more crucial to the expression of your bot’s personality and the brand that you want it to reflect.

Our good little boy, Gitpup

As a way to experiment with Conversational UIs, and as a way to create something useful for our team that might also be useful in the larger marketplace, Oomph designed, built, and implemented a chatbot for Slack (a Slackbot) and HipChat, called Gitpup.

The problem that we wanted to solve was “How do you make peer-to-peer code review easier to manage and more fun?” As the name suggests, Gitpup is an assistant that we connect to our Github account. Its job is to announce when a Pull Request is ready for review, assign it to someone, track it, and then close it when the review is approved. Once a review has been completed, the user is given “karma” points that come with bragging rights and have become a currency of sorts amongst our team.

Designing the chatbot

Our “design,” and the bot’s personality, came from deciding who this bot was. We toyed with a bunch of ideas, but once we decided that the bot was a dog, it was easy to start having some fun with it.

As a dog, Gitpup is everything you would want one to be: helpful, quick to respond, and obedient. But Gitpup has a little sass, too. We wanted it to have some personality and be something of a jokester. It has random phrases that it throws back at you if you mention it by name, and while the responses are random by nature, they lead to some fun interactions.

A sample conversation with Gitpup

nate whats unassigned
Gitpup APP No open, unapproved and unassigned Pull Requests
nate Thanks gitpup
Gitpup APP You’re welcome, nate!
nichole hey gitpup
Gitpup APP See if I play fetch with you next time, @nichole
nichole y r u so mean?

It utters plenty of encouragement when a Pull Request is taken or completed by a team member.

Sample workflow with Gitpup

Gitpup APP [Review #3366] marc has been assigned to Bring react ish into main oomph repo!, #233 on oomphinc.com. Go get ’em, Champ!
Gitpup APP [Review #3364] brett closed and merged Add logintoboggan module, #269 on drupal-repo +1 Karma (1,766) Yes. You. Can!
Gitpup APP [Review #3365] brandon approved PR Add style for when there are logos in a grid within content (SASS), #207 on kab. Magnifique!
Gitpup APP [Review #3365] Thanks for participating, brandon +5 Karma (903). Woof!

We also built some randomness into the daily grind. In addition to helping with reviews, assignments, and approvals, Gitpup announces a raffle once a day, at a random time, for any Karma that might be unassigned. You can type I’m in or I’ll join to enter the raffle.

Joining Gitpup’s daily Karma raffle

Gitpup APP Attention Today’s community Karma raffle is in effect! Say I’m in! within the next 5 minutes to join for your chance to win 45 Karma!
nate I’m in
Gitpup APP Thanks for joining, @nate, you’re the first and only participant in this raffle, Purrrrrfect! Chances are looking good!
nichole im in
Gitpup APP Fingers crossed, @nichole! There are now 2 participants in this raffle.
brett ill join
Gitpup APP This could be the one, @brett! There are now 3 participants in this raffle.
stephen I’m in as well
Gitpup APP May the odds be ever in your favor, @stephen! There are now 4 participants in this raffle.
Gitpup APP And the winner of today’s raffle is… @nathandentzau! Congratulations! +45 Karma

After interacting with Gitpup for a few days, it is easy to forget that it is just a bot that we programmed. Personality is an easy thing to create if you start from a place of authenticity. We were not trying to create a person; we simply wanted an amiable cartoon character that would be fun to interact with and would turn what might be a dreary task into something a little more engaging.

A well-trained Bot will also train the User

This little experiment with Conversational UIs showed us how any UI also trains the user in how to get the most out of it. With a visual UI, it might be lots of trial and error — clicking around to get a sense of what things are and where they go. With Gitpup, it was the same sort of trial and error, but the downside is that a command can simply go unnoticed if it is not in the bot’s vocabulary.

For instance, our patterns match phrases with a certain degree of fuzziness — we ignore punctuation and plurals, just so the commands don’t have to be perfectly typed. To ask Gitpup which PRs are not assigned to someone yet, you can type “whats unassigned” and Gitpup will respond. There is a drawback to a Slackbot that is in an open room where other conversations are happening — you don’t want it to respond to everything that anyone says, so, if you try to get Gitpup’s attention but don’t use the right terms, you won’t get any response.
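In rough strokes, that matching might look something like the sketch below. The patterns, handlers, and canned replies are hypothetical stand-ins, not Gitpup’s actual code:

```javascript
// A rough sketch of fuzzy command matching: messages are normalized
// (lowercased, punctuation stripped) before being tested against known
// patterns, so "What's unassigned?" and "whats unassigned" both match.
// The handlers and canned replies below are hypothetical stand-ins.
const listUnassigned = () => 'No open, unapproved and unassigned Pull Requests';
const joinRaffle = () => 'Thanks for joining!';

const commands = [
  { pattern: /^whats unassigned$/, handle: listUnassigned },
  { pattern: /^(im in|ill join)( as well)?$/, handle: joinRaffle },
];

function respond(message) {
  const text = message.toLowerCase().replace(/[^a-z0-9\s]/g, '').trim();
  const command = commands.find((c) => c.pattern.test(text));
  // In an open room, stay silent on no match rather than answer everything.
  return command ? command.handle(text) : null;
}
```

The trade-off described above falls out naturally: any message that normalizes to an unknown string simply gets no response.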

Users can quickly learn the different commands that a simple bot will recognize, and it’s not so bad to use trial and error to figure them out. But with a more complicated bot that you are in a direct conversation with (via a text app), it can be a little more awkward if your bot does not recognize a lot of natural language. Be sure to design some phrases in the bot’s conversational database that can be used when it does not understand something or cannot complete a particular task, as in the exchange and sketch below. Success messages are easy to create, but it is harder to create a failure message that does not frustrate users or make them think the bot is simply useless.

brett gimme 4034
Gitpup APP Ruh roh, I don’t know about Review #4034
nichole give 3466 to @thatguy
Gitpup APP Darn, I don’t know who that is!
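For a bot in direct conversation, one lightweight approach is a small bank of friendly failure phrases to draw from at random. These examples are hypothetical, written in Gitpup’s voice:

```javascript
// Hypothetical fallback phrases for when the bot understands the command
// but not its target (an unknown review number, an unknown user, etc.).
// Picking one at random keeps failures from feeling canned.
const fallbacks = [
  "Ruh roh, I don't know about that one. Try a review number I've seen?",
  "Darn, I don't know who that is!",
  "I'm just a pup! I haven't learned that trick yet.",
];

const randomFallback = () =>
  fallbacks[Math.floor(Math.random() * fallbacks.length)];
```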

When is a Chatbot right for you?

While chatbots are all the rage right now, I don’t think normal point-and-click websites are going away anytime soon. A chatbot isn’t right for everyone, because the service it offers is very transactional in nature: get me this, research that. A bot can offer information that users didn’t know they wanted, but that is often more complicated to build, and in the end, users need to opt into the service somehow.

The questions you can ask are:

Find a Chatbot and Use it!

If you are a marketer looking for new ways to connect with customers, or a manager looking for tools to keep employees efficient and engaged, you should become familiar with chatbots. We’d recommend these as good starting points:

The barriers between our computers and our normal everyday interactions are coming down. A computer that can hear your voice commands is probably not far away from you at any given time. Conversational UIs are making it easier to get what you need at any moment without finding a particular app, typing on a tiny screen, or doing a lot of clicking around. They become another tool in the belt and another way to be where your customers are.

If you’d like more information about chatbots and whether or not your company can benefit from the power of No GUI, drop us a line. We’d love to chat person to person. 😉


More reading about chatbots: