By Rachel Heidenry & Rachel Hart

Museum websites are beautifully complex. As the digital counterpart to a physical space, they serve many essential functions. They must reflect the museum’s mission and values, while guiding users clearly to key areas of information. Museums with collections often need dedicated sections for research and archives; zoos may focus on telling the stories of their animals; and contemporary art institutions sometimes even use their sites as platforms for artists to showcase new work. At the same time, nearly all museum websites must serve practical needs like selling tickets or memberships, promoting events or fundraisers, and providing essential visitor information, like hours and directions.

Managing that much critical and varied information is a challenge for any website, which is why strong information architecture (IA) is essential. A successful navigation should be intuitive and accessible, with clear labels and well-organized categories.

Mobile responsiveness is also crucial, especially for visitors who need quick access to information, like admission prices or current exhibitions, or who want to purchase tickets on the go.

For cultural organizations, a strong menu and navigation system is arguably the most important indicator of a successful website. 

A Survey of Website Navigation Trends for Art Museums 

While not all cultural organizations prioritize aesthetics, art museums inherently do. As institutions dedicated to the presentation of art, they think critically about how visual design shapes their brand identity. In some cases, aesthetics can overshadow usability, resulting in beautiful or cutting-edge websites that are ultimately difficult for both internal staff members and external visitors to navigate. 

As part of a recent engagement with the Isabella Stewart Gardner Museum in Boston, we conducted a cohort analysis of other leading museum websites. The study focused on key elements of art museum digital platforms including menu design, navigation, content organization, and user flows. One striking insight was that many art museum websites avoid dropdown menus, instead favoring a simple list of four or five top-level categories. These are often labeled with opaque or “insider” terms, raising questions like: What does “Programs” signify? Does “Art” lead to the permanent collection or temporary exhibitions? Does the general population know the difference? And where in the world is the museum’s blog? 

Let’s take a look at the building blocks of a site’s navigation and what we learned from reviewing a cohort of cultural institution websites.

Utility Navigation

The utility navigation should help visitors quickly access essential information. As the name suggests, the utility navigation traditionally contains tools and actions (like login, search, and language select) that help visitors use the website. You’ll typically see it as a secondary list of items above the main menu, often in a smaller font. 

When deciding what to include in it, consider your primary visitors’ goals: What do they need to know or do on your website? Museums often use the utility navigation to drive high-value actions like purchasing tickets or memberships. Our analysis also showed that museums with online shops frequently included links to the store or member login portals when relevant. In general, it’s best practice to limit the utility navigation to 2-4 key items, not including search.

The Dropdown or Mega Menu

Museums have a lot of content. And the larger the institution, the more content its website undoubtedly has to provide visitors. Our analysis showed that institutions that embrace the dropdown menu are overall easier to navigate and more often mobile-friendly. The bottom line: you don’t want visitors to your website to have to go down rabbit holes to find essential information.

Categories & Language

A navigation menu requires words (obviously). These are among the first words anyone sees when they land on your website. Thus they set the tone and expectation for what kind of museum you are, while also telling the story of what someone can do both on-site and online. Making sure the words that comprise the navigation are distinctive, accessible, and concise is key.

Hamburger Menu

To keep the main navigation simple and clean, some museums, like The Barnes Foundation, opt to put additional links behind a hamburger menu, even at desktop widths. In this way, less significant information does not busy up the navigation, but visitors can still intuitively click through to find other key subpages. If you do this, be sure to still repeat the top level menu items, as this pop-out navigation will become your mobile view.

Conclusion: Building a Successful Navigation for Museums

If you are a cultural institution that is starting to rethink your website navigation, the first step is to put yourself in the shoes of your visitor. It’s critical to put aside internal org charts and take a user-centered design approach. Come up with a few key user journeys for various audiences. How would a first-time visitor purchase a ticket? How would a repeat visitor find more information about a particular work of art they loved? And then navigate your site as your user would. What are the pain points? What works well? What makes absolutely no sense at all? 

Once you’ve done that, be sure to take a step back from your website to see what types of content you have and the common ways they might intersect. This is important for establishing the key categories of your site, as well as its subcategories. You’ll often be surprised at the connections you can make and the overlaps in content that can be streamlined together. 

From there, you have the building blocks to start conceptualizing your new navigation, one that is usable, clear, and beautifully intuitive. Learn more about building a successful navigation in a Case Study of our 2025 Re-Architecture project for the Isabella Stewart Gardner Museum.

As a digital services firm partnering with destination marketing organizations (DMOs) across the U.S., we’re helping teams navigate what’s already proving to be a volatile 2025—especially on the inbound side. Analysis from the World Travel & Tourism Council (WTTC) projects a stark reality: the U.S. economy will miss out on $12.5 billion in international visitor spending this year, with inbound spend expected to dip to just under $169B, down from $181B in 2024. Even more concerning, the U.S. is the only country among 184 economies in WTTC’s study forecast to see an inbound-spend decline this year.

While external market forces remain largely beyond control, we’ve identified three strategic areas where DMOs can focus their digital platforms to weather this storm and continue demonstrating measurable demand to their partners.

1. Transform Content Into Action-Driving Experiences

Why this strategic shift matters now

With inbound spend shrinking by $12.5B and key feeder markets weakening, undecided travelers need clarity and confidence to choose your destination. Content that reduces uncertainty and highlights immediate value converts better than generic inspiration.

Strategic implementation approach

Activate “Go Now” signals. Combine always-on inspiration with time-sensitive reasons to visit—shoulder-season value, midweek deals, cooling weather breaks—strategically mapped to the soft periods your analytics reveal. 

Elevate discovery through intelligent architecture. Curate SEO-optimized content hubs organized by Themes (outdoors, arts, culinary) and Moments (fall colors, winter lights). Implement structured data (FAQ, Event, Attraction) with strategic internal linking architecture so travelers find relevant options fast.
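As a sketch of what this structured data looks like in practice, here is a minimal schema.org `Event` object generated in Python. The property names come from schema.org; the event name, venue, and dates are placeholders, not real data.

```python
import json

# Minimal schema.org Event markup; all values below are placeholders.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Fall Colors Festival",
    "startDate": "2025-10-04",
    "endDate": "2025-10-05",
    "location": {
        "@type": "Place",
        "name": "Riverside Park",
        "address": "123 Main St, Example City",
    },
    "offers": {
        "@type": "Offer",
        "price": "0",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Emit as the payload for a <script type="application/ld+json"> tag in the page template.
print(json.dumps(event, indent=2))
```

Embedding this JSON in a page's head lets search engines and AI crawlers read the event's dates, venue, and pricing without parsing your prose.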

Deploy micro-itineraries for immediate conversion. Design 24–48-hour “micro-itins” featuring embedded maps, transit and parking guidance, and seamless handoffs to bookable partners. Partnering with platforms like MindTrip reduces content team effort while accelerating output—a strategy that’s proven particularly effective for our DMO clients facing resource constraints.

Authority-driven event content optimization. Event pages generate the highest intent traffic. Enhance them with rich media, last-minute planning resources, and strategic “if sold-out, try this” alternatives.

Transparent value communication. Feature free experiences prominently, implement intuitive budget filters, and deploy “Best Time to Visit” calendars comparing crowds and pricing by week and month. Transparency builds trust, and trust drives conversion.

2. Build Your Competitive Moat Through Data-Driven Audience Cultivation

Your first-party data represents your most defensible competitive advantage. As platform targeting becomes increasingly constrained and inbound spending softens, DMOs that build and activate their own audience will capture attention far more efficiently than those relying solely on paid channels.

Strategic audience development

Implement high-intent capture everywhere. Deploy contextual email and SMS prompts across high-intent templates—events, itineraries, trip planners, partner directories. Offer valuable micro-perks like exclusive maps and early event alerts. 

Master progressive profiling. Collect visitor preferences—season, interests, party type, origin market—over multiple touchpoints rather than overwhelming users with lengthy initial forms. 

Create actionable audience segments. Develop cohorts around 2025’s market realities: last-minute planners, shoulder-season seekers, road-trippers, value hunters, family weekenders, and meetings planners. 

Future-proof attribution systems. Combine GA4 with server-side tagging and standardized UTM schemas for every partner handoff. Track outbound clicks, partner session quality, itinerary saves and usage, offer redemptions, and newsletter-driven sessions. This comprehensive approach ensures you maintain visibility into conversion paths as third-party cookies disappear.
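A standardized UTM schema can be enforced in code rather than by convention. The sketch below uses the standard Google Analytics UTM parameter names; the source/medium values and the helper name are illustrative conventions, not prescriptions.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def tag_partner_link(base_url: str, partner: str, placement: str,
                     campaign: str = "2025-shoulder-season") -> str:
    """Append a standardized UTM schema to an outbound partner link.

    The parameter names are the standard GA4 UTM fields; the
    source/medium conventions here are illustrative.
    """
    params = {
        "utm_source": "dmo-site",    # traffic always attributed to the DMO site
        "utm_medium": "referral",
        "utm_campaign": campaign,
        "utm_content": placement,    # which module or page drove the click
        "utm_term": partner,         # which partner received the handoff
    }
    sep = "&" if urlparse(base_url).query else "?"
    return base_url + sep + urlencode(params)

link = tag_partner_link("https://partner.example.com/book", "hotel-x", "local-favorites")
print(link)
```

Generating every outbound link through one helper like this is what makes partner-level reporting consistent enough to trust.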

Deploy trend-driven editorial strategy. Develop weekly dashboards blending organic query trends, on-site search terms, partner click-through rates, and feeder-market signals. When interest dips in one market, pivot homepage modules and paid social toward value and itinerary content targeting more resilient markets.

3. Transform Partner Relationships Through Measurable Value Delivery

In a softening inbound environment where domestic spending carries approximately 90% of the economic load, your partners need two critical elements: qualified attention and proof of conversion. Your website should function as the region’s premier meta-directory and conversion engine.

Experience optimization strategies

Enable one-click handoffs with context preservation. Pass user filters—dates, neighborhoods, price ranges—directly into partner sites and booking engines while preserving state if travelers return. 
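One way to implement this handoff is to map your site's internal filter names onto the partner's query parameters. In the sketch below, both the internal filter names and the partner parameter names are hypothetical; a real partner booking engine will define its own.

```python
from urllib.parse import urlencode

# Hypothetical mapping from the DMO site's filter names to a partner
# booking engine's query parameters; real partner APIs will differ.
PARTNER_PARAM_MAP = {
    "checkin": "arrival",
    "checkout": "departure",
    "neighborhood": "area",
    "max_price": "price_cap",
}

def build_handoff_url(partner_base: str, filters: dict) -> str:
    """Carry the user's current filters into the partner's booking flow
    so they don't have to re-enter dates and preferences."""
    mapped = {PARTNER_PARAM_MAP[k]: v for k, v in filters.items()
              if k in PARTNER_PARAM_MAP}
    return partner_base + "?" + urlencode(mapped)

url = build_handoff_url(
    "https://partner.example.com/search",
    {"checkin": "2025-10-03", "checkout": "2025-10-05", "neighborhood": "downtown"},
)
print(url)
```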

Deploy persistent trip planning tools. Allow users to save places and generate shareable itineraries with intelligent handoffs: “Book these two hotels,” “Reserve rentals,” “Get festival passes.” 

Create compelling partner storefronts. Develop rich partner profiles featuring availability widgets, authentic reviews, social proof, and clear calls-to-action. 

Implement strategic co-op modules. Design paid placements that provide value rather than feeling like advertisements: “Local Favorites” carousels, sponsor highlights, seasonal deal tiles—rotated by audience cohort and season. This generates additional revenue while maintaining user experience quality.

Establish closed-loop reporting systems. Standardize UTM tracking, monitor outbound events, and where permitted, implement partner pixels and offer codes to report assisted conversions by category and campaign. Partners need proof of ROI, and data-driven reporting builds stronger, more profitable relationships.

How Oomph Can Accelerate Your Success

If you’re experiencing softer international interest, shorter booking windows, or declining partner satisfaction, you’re facing the same challenges as DMOs nationwide. The organizations pulling ahead aren’t waiting for market recovery—they’re strengthening their digital platforms through strategic content optimization, systematic audience cultivation, and demonstrable partner value creation.

Our proven methodology transforms these challenges into competitive advantages.

We’ll conduct a comprehensive audit of your digital platform against these three strategic pillars, quantify immediate optimization opportunities, and provide your partners with what they need most: qualified, measurable demand. The market headwinds are real, but the right strategic approach can help you maintain resilience and emerge stronger when conditions improve. Let’s navigate these challenges together.

One question we frequently hear from clients, especially those managing web content, is “How can we implement accessibility best practices without breaking the bank or overwhelming our editorial team?”

It’s a valid concern. As a content editor, you’re navigating the daily challenge of maintaining quality while meeting deadlines and managing competing priorities.

When your team decides to prioritize website accessibility, the initial scope can feel daunting. You might wonder “Does this really make a difference?” or “Is remediation worth the effort?” The answer is always a resounding yes.

Whether you’re working on a small site or managing thousands of pages, accessible content improves user experience, ensures legal compliance, boosts SEO performance, and reinforces your brand as inclusive and responsible. As a content editor, you have the power to make steady, meaningful progress with the content you touch every day.

Why Accessibility Creates Business Impact

Accessible content delivers measurable outcomes across multiple business objectives:

Expanded Market Reach: When your content is inaccessible to users with disabilities, you’re limiting your potential audience. Consider that disabilities can be temporary, like a broken arm, and that 70% of seniors are now online—a demographic that often benefits from accessible design principles.

Risk Mitigation: Inaccessible websites can lead to legal complaints under the ADA and other regulations, creating both financial and reputational risks.

Enhanced User Experience: Clear structure, descriptive alt text, and keyboard-friendly navigation improve usability for all users while boosting SEO performance.

Brand Differentiation: Demonstrating commitment to accessibility positions your organization as inclusive and socially responsible.

Implementing Accessibility in Your Editorial Workflow

The challenge isn’t whether to implement accessibility—it’s how to do it efficiently without overwhelming your team or budget.

The Fix-It-Forward Approach

Rather than attempting to overhaul your entire site overnight, we recommend a “fix-it-forward” strategy. This approach ensures all new and updated content meets accessibility standards while gradually improving legacy content. The result? Steady progress without resource strain.

Leverage Open Source Tools

Many CMS platforms offer free accessibility tools that integrate directly into your editorial workflow:

Drupal: Editoria11y Accessibility Checker, Accessibility Scanner, CKEditor Accessibility Auditor

WordPress: WP Accessibility, Editoria11y Accessibility Checker, WP ADA Compliance Check Basic

These tools scan your content and flag common WCAG 2.2 AA issues before publication, transforming accessibility checks into routine quality assurance.
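To make this concrete, here is a minimal sketch of the kind of check those tools automate: flagging images published without alt text, one of the most common WCAG failures. It uses only Python's standard library and is an illustration, not a substitute for the checkers named above.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flag <img> tags that have no alt attribute, one of the most
    common WCAG issues editorial accessibility checkers catch."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_names = {name for name, _ in attrs}
            if "alt" not in attr_names:
                src = dict(attrs).get("src", "(unknown source)")
                self.missing.append(src)

html = """
<p>Our new exhibition</p>
<img src="gallery.jpg" alt="Visitors in the main gallery">
<img src="untagged.jpg">
"""

checker = MissingAltChecker()
checker.feed(html)
print(checker.missing)  # images that need alt text before publishing
```

Real checkers go much further (heading order, link text, color contrast), but the workflow is the same: scan before publishing, fix what's flagged.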

Prioritize High-Impact Changes

Focus your efforts on fixes that significantly improve usability for screen reader and keyboard users.

Less critical issues can be addressed during routine content updates, spreading the workload over time.

Manage Legacy Content Strategically

Don’t let your content backlog create paralysis. Prioritize high-traffic pages and those supporting key user journeys. Since refreshing legacy content annually is already an SEO best practice, use these updates as opportunities to implement accessibility improvements.

Build Team Capabilities

Make accessibility part of your content culture through targeted education and resources. Provide internal training, quick reference guides, and trusted resources to keep editors confident and informed.


Track Progress and Celebrate Wins

Measure success by tracking pages published with zero critical accessibility issues. Share achievements in editorial meetings to reinforce your team’s impact and maintain momentum.

Scaling Your Accessibility Program

While regular content checks provide immediate value, sustainable accessibility success requires periodic comprehensive assessments and usability testing. If your team lacks bandwidth for advanced testing, consider adding this to your 1-2 year digital roadmap. Consistent attention over time proves more sustainable and cost-effective than attempting massive one-time remediation.

Start with Free Tools: Google Lighthouse provides immediate insights into accessibility issues and actionable remediation guidance.

Advanced Assessment Options: For teams ready to expand their program, tools like SortSite, SiteImprove, and JAWS screen reader testing offer comprehensive assessments. These advanced tools can uncover complex issues beyond content-level checks, though they may require developer collaboration for implementation.


Consider engaging someone who navigates the web differently than your team does. This perspective will expand your understanding of accessibility’s real-world impact and inform more effective solutions.

Accessibility as Continuous Improvement

Accessibility isn’t a one-time project—it’s an ongoing commitment to inclusive digital experiences.

By integrating accessibility best practices into your publishing workflow, you’ll build a stronger, more inclusive website that protects your brand, empowers your users, and demonstrates digital leadership.

The fix-it-forward approach transforms what seems like an overwhelming challenge into manageable, sustainable progress.

Ready to Accelerate Your Accessibility Journey?


Ready to take action? Contact Oomph to see how we can support your accessibility journey. We start with targeted accessibility audits that identify your highest-impact opportunities, then collaborate with your team to develop a strategic roadmap that aligns with your internal goals while respecting your resources and team size.

When you’re responsible for your organization’s digital presence, it’s natural to focus on what’s visible: the design, the content, the user experience. But beneath every modern website lies a complex ecosystem of technologies, integrations, and workflows that can either accelerate your team’s success or create hidden friction that slows everything down.

That’s where a technical audit becomes invaluable. It’s not just a diagnostic tool—it’s a strategic opportunity to understand the foundation of your platform and make informed decisions about your digital future.

It’s Like a Home Inspection for Your Website

Think about buying a house. You walk through focusing on the big picture—does the kitchen work for your family? Is there enough space? But a good home inspector looks deeper, checking the foundation, examining the electrical system, and spotting that small leak under the bathroom sink that could become a major problem later.

A technical audit takes the same comprehensive approach to your digital platform. We examine not just what’s working today, but what might impact your team’s ability to execute tomorrow. The goal isn’t to find problems for the sake of finding them—it’s to give you the complete picture you need to plan strategically.

Creating Shared Understanding Across Your Entire Team

One of the most powerful outcomes of a technical audit is alignment. Whether you’re managing internal developers, partnering with an agency, or preparing to issue an RFP, having a clear baseline allows everyone to ask better questions and make more accurate decisions.

A strategic technical audit delivers:

Proactive Problem-Solving: Surface technical issues before they become roadblocks to important campaigns or launches.

Performance Optimization: Identify specific improvements that will measurably enhance user experience and conversion rates.

Workflow Enhancement: Reveal friction points that slow down content updates, campaign launches, or day-to-day management tasks.

Vendor Enablement: Provide partners and potential vendors with the context they need to scope work accurately and ask intelligent questions.

Strategic Planning: Create a foundation for long-term digital strategy decisions, from infrastructure investments to editorial tooling.

The organizations we work with often tell us that a technical audit helped them transition from reactive maintenance to proactive digital platform management—a shift that pays dividends across every initiative.

What We Typically Discover

While every platform is unique, certain patterns emerge across industries and organization types. Technical audits frequently reveal:

Security and Maintenance Opportunities: Outdated software, plugins requiring updates, or access configurations that can be strengthened with minimal effort. This often includes ensuring accessibility compliance meets current standards.

Performance Enhancements: Specific optimizations in areas like image compression, caching strategies, or database queries that directly impact user experience. Modern audits also examine search visibility and performance optimization.

Scalability Considerations: Code or architectural decisions that work fine today but could limit growth or flexibility as your needs evolve. This includes evaluating search infrastructure and international expansion capabilities.

Process Improvements: Gaps in version control, deployment workflows, or change management that create unnecessary risk or slow down development cycles.

Editorial Workflow Optimization: Content management processes that feel cumbersome or inconsistent, often because they evolved organically rather than being designed strategically. For global organizations, this includes reviewing translation and localization systems.

Many of these findings aren’t urgent fixes—they’re strategic insights that become incredibly valuable when you’re planning a redesign, launching a major campaign, or evaluating new partnerships.

When a Technical Audit Delivers Maximum Value

You don’t need to wait for problems to emerge. Technical audits are particularly valuable when:

Taking Over Digital Responsibility: You’ve inherited a platform and need a comprehensive understanding of what you’re working with and where the opportunities lie.

Planning Major Initiatives: Before investing in a redesign, platform migration, or significant feature development, understanding your current foundation prevents costly surprises.

Preparing for Vendor Selection: Whether you’re issuing an RFP or evaluating agencies, giving potential partners accurate technical context leads to better proposals and more realistic timelines.

Developing Digital Strategy: When you’re ready to create a roadmap for digital growth, grounding decisions in technical reality rather than assumptions leads to better outcomes. This is especially important when considering AI integration or generative engine optimization strategies.

Our Approach to Technical Audits

We design our audits to build clarity and confidence, not overwhelm you with technical jargon. Rather than simply delivering a report, we walk through findings with your team, prioritize recommendations based on your specific goals, and translate technical insights into actionable business language you can share with stakeholders.

Our methodology goes beyond code analysis. We examine how your platform supports your current workflows, aligns with your organizational objectives, and positions you for future growth. This combination of technical depth and strategic perspective ensures you get insights that drive real business outcomes.

The audit process focuses on partnership, not judgment.

We’re not looking for flaws to criticize—we’re identifying opportunities to help you and your partners make smarter decisions. The result is visibility into the hidden layers of your digital platform and a foundation for more strategic planning, better technology investments, and sustainable long-term success.

Ready to understand what’s really happening under the hood of your digital platform? Let’s talk about how a technical audit could support your goals and strengthen your team’s ability to execute on your digital vision.

In 2025, the way people discover and engage with digital content has shifted dramatically. Traditional Search Engine Optimization (SEO) is no longer the only strategy that brings people to your website. Meet Generative Engine Optimization (GEO), the emerging frontier for content creators and researchers looking to earn visibility through AI-driven platforms like ChatGPT, Google’s Gemini, and Perplexity.

If your organization hasn’t begun adapting its content strategy for GEO, now is a great opportunity. Here’s everything you need to know about what GEO is, why it matters, and how to start optimizing for it.

What is GEO and How Is It Different From SEO?

While SEO focuses on improving your visibility on traditional search engine results pages (SERPs) by using keywords, backlinks, and technical performance, GEO is about making your content the answer in AI-generated responses.

Rather than presenting users with a list of links as typically experienced with a Google Search, GEO centers on AI tools that synthesize information. These platforms use large language models (LLMs) to provide direct answers to a range of questions. Instead of competing for a top 10 ranking on Google, you’re aiming to be cited, summarized, or linked to by tools like Gemini or ChatGPT.

In short: SEO gets you found, GEO gets you featured.

Why GEO Matters in 2025

AI tools are no longer sidekicks to Google. They’re central players in how people research, compare options, and make decisions. As of May 2025, ChatGPT alone receives over 4.5 billion monthly visits, while Perplexity processes over 500 million searches per month. Google remains the dominant force in online search, with billions of daily visits from users worldwide. But with the direct integration of Gemini into search results, the way people find information is changing. Users can now get answers without ever clicking through to your website (this is called a “zero-click search result”).

Consequently, if your content isn’t showing up in AI answers, you’re missing out on a massive and growing segment of online visibility. Depending on what your website offers, this can be especially important for brand recognition and perception, traffic and lead potential, as well as establishing authority and credibility. In 2025, AI summaries are the new front page of search.

How GEO Works: What AI Tools Are Looking For

Each generative engine has its quirks, but several patterns are emerging across platforms:

1. Structure Matters More Than Ever

AI tools rely on clear, structured content. Use schema markup generously, particularly FAQPage, Organization, Article, and Product types. Structured data helps AI understand your content contextually, making it easier to reference in generated answers.

Tip: Google’s Structured Data Markup Helper is a great place to start reviewing your schema.
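For a sense of what this markup looks like, here is a minimal `FAQPage` object built in Python. The type and property names follow schema.org; the question and answer text are placeholders.

```python
import json

# Sketch of schema.org FAQPage markup; question and answer text are placeholders.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What are your opening hours?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "We are open Tuesday through Sunday, 10am to 5pm.",
            },
        }
    ],
}

# Drop this JSON into a <script type="application/ld+json"> tag on the FAQ page.
print(json.dumps(faq, indent=2))
```

Because the question/answer pairs are explicit in the markup, an LLM-backed engine can lift them directly into a generated answer and cite your page.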

2. E-E-A-T Principles Still Rule

Google’s Expertise, Experience, Authoritativeness, and Trustworthiness (E-E-A-T) framework, a core concept for SEO, now extends to AI tools like Gemini. Show credentials, cite data, link to reputable sources, and provide content authored by credible experts.

If you have certifications, awards, partnerships, or original research, feature them clearly. This shows your authority in your area of expertise.

3. Conversation > Keywords

GEO is less about keywords and more about natural language. Write in a conversational tone and frame your content in terms of questions and answers. Think: “What are the best family vacation spots in California?” instead of “California vacation destinations.”

4. Content Freshness is Key

AI platforms (especially Perplexity, which indexes content daily) prioritize content that’s up to date. Refresh evergreen posts annually and use a content calendar to help track when to review content. Be sure to prioritize articles with titles like “Top” or “Best,” as these perform well in answer generation, particularly on ChatGPT.

5. Visuals Are Increasingly Important

Gemini and Perplexity are both investing in multimodal search. Media assets like charts, videos, and well-optimized images can increase the chance of being featured. Also make sure your image alt text, captions, and surrounding content are descriptive.

6. Prioritize Performance & Mobile-Responsiveness 

Don’t ignore performance or the site’s mobile experience. A site that performs well on mobile loads quickly, displays clearly on small screens, and avoids frustrating interactions (like unclickable buttons or pop-ups). Poor mobile performance (e.g., slow Core Web Vitals) can hurt your rankings, which in turn reduces your visibility to LLMs that rely on search results as part of their input sources.

Tool-Specific GEO Tips

Gemini (Google)

Perplexity

ChatGPT

Tracking GEO Performance

A consequence of AI summaries is that websites may see a drop in clicks and visits within their analytics, particularly a decrease in organic traffic month over month. With users getting the answers they need from AI-generated search responses, they may no longer need to visit your website to get information. However, those users who do click through often stay longer and discover more pages than they did previously.  

Additionally, websites may also see an increase in impressions or referrals from AI assistants. This data is increasingly important to track. 

So even if AI tools don’t always send traffic directly, you can still measure their impact. Here’s how:
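One practical approach is to classify sessions by referrer domain so AI-assistant referrals are reported separately from ordinary organic search. The domain lists below are illustrative, not exhaustive, and will need updating as new tools emerge.

```python
from urllib.parse import urlparse

# Illustrative referrer domains for AI assistants; extend as new tools emerge.
AI_REFERRERS = {"chat.openai.com", "chatgpt.com", "perplexity.ai", "gemini.google.com"}
SEARCH_ENGINES = {"google.com", "bing.com", "duckduckgo.com"}

def classify_referrer(referrer_url: str) -> str:
    """Bucket a session's referrer so AI-driven visits can be reported
    separately from organic search and generic referrals."""
    host = urlparse(referrer_url).netloc.lower().removeprefix("www.")
    if not host:
        return "direct"
    if host in AI_REFERRERS:
        return "ai-assistant"
    if host in SEARCH_ENGINES:
        return "organic-search"
    return "referral"

print(classify_referrer("https://www.perplexity.ai/search?q=best+museums"))
```

Run against your analytics export (or configured as a custom channel grouping), a rule like this makes the "invisible" AI segment show up in month-over-month reporting.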

Action Items for Digital Teams & Clients

  1. Audit your existing content with these optimization strategies in mind. (Tip: You can even use AI tools like Gemini to identify optimization opportunities for particular pages).
  2. Update schema across all major content types, especially Q&A and organizational pages.
  3. Refresh your high-performing or evergreen content regularly, especially pieces tied to seasons, events, or top lists.
  4. Revise your content strategy to include multimedia assets, structured data, and topic clustering.
  5. Optimize your About page and author bios to strengthen trust signals for LLMs.

Final Thoughts

Optimizing for GEO isn’t just a trend; it’s a fundamental shift in how people find and interact with content online. As AI-generated answers become a dominant part of the discovery experience, your brand’s ability to show up in these spaces could mean the difference between gaining trust and going unnoticed.

By embracing schema, writing conversationally, and refreshing content with purpose, your digital presence can evolve to meet the moment, one where the best answer often wins over the best ranking.

Ready to optimize your content for AI-powered search? Let’s make it happen.

Today I learned about a military term that has come into the culture: VUCA, which stands for volatility, uncertainty, complexity, and ambiguity. That certainly describes our current times.

All of this VUCA makes me concentrate on what is stable and slow to change. It’s easy to get distracted by that which changes quickly and shines in the light. It’s harder to be grateful for what changes slowly. It’s harder to see what those things might even be.

In the face of AI and the way it will transform all industries (if not now, very soon), it’s important to remember what AI cannot yet do well. Maybe it will learn how to create a facsimile of these traits in the future as it becomes more “human” (being trained on human data, with all its flaws, might mean it has embedded within it the traits we find undeniably human). However, these skills seem like the ones that can help us navigate the VUCA that is life today.

Be Curious

AI can ask follow-up questions for clarification, but it does not (yet) ask questions for its own curiosity. It asks when it has been directed to do something. It does not sit idle and wonder what the world is like beyond the walls of the chat window.

Humans and high-order animals have curiosity. We seek information and naturally have questions about our world — why is the sky blue? why does the wind blow? why do waves crash onto the shore?

In our operations, Oomph prides itself on Discovery. This is our chance to ask the big questions — why does your business work the way it does? why are those your goals? who is the audience you have versus the audience you want?

In life and work, curiosity is one of our best traits. This means trying new tools, changing our processes and habits for improved outcomes, and exploring something new just to see what it can do. Even with all the VUCA in the world, approaching uncertainty with curiosity keeps us open and engaged with what we can learn next.

Use Judgement

Another important human trait is judgement, and this continues to be invaluable as humans are needed to evaluate AI outputs. 

AI is very good at creating dozens, if not hundreds, of outputs. In fact, probabilistic (not deterministic) output is both the strength and, sometimes, the weakness of AI — you almost never get the same answer twice.

Our human expertise is needed to curate these outputs. We need to discard what is average and unremarkable to find the outputs that are surprising and valuable. We need to use our judgement and experience to find the ideas that are applicable to the client, the project, and the moment. Given the same 100 outputs, the right ones might be a different selection depending on the problem we want to solve and the industry in which it will be applied.
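That generate-then-curate workflow can be sketched as code, though only loosely: the model call below is a random stand-in, and the scoring function is a placeholder for the step that is really human judgement. All names and outputs here are invented for illustration:

```python
import random

def generate(prompt: str, seed: int) -> str:
    # Stand-in for a probabilistic model call: with a nonzero temperature,
    # the same prompt yields different outputs on different runs.
    rng = random.Random(seed)
    variants = ["bold concept", "safe concept", "surprising concept", "average concept"]
    return f"{prompt}: {rng.choice(variants)}"

def curate(prompt: str, n: int, keep: int, score) -> list:
    """Generate n candidates, then keep the top `keep` by a scoring function.

    The scoring function is where human judgement lives: given the same
    candidates, a different client or problem would rank them differently.
    """
    candidates = [generate(prompt, seed) for seed in range(n)]
    return sorted(candidates, key=score, reverse=True)[:keep]

# Example: for this particular brief, prefer the "surprising" outputs
shortlist = curate("logo direction", n=20, keep=3,
                   score=lambda c: "surprising" in c)
```

The code makes the shape of the process visible; the judgement itself — deciding what "surprising and valuable" means for this client and this moment — stays with the human.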

Exude Empathy

In the world of design and creating software for humans, empathy is what drives the decisions we need to make. In the flow of vibe coding, our judgement will drive technical and architectural decisions while empathy drives interface design and product feature decisions. Humans are still the ones who need to find the problems that are worth solving.

The language on the page, the helpfulness of the tooltip, and the order in which the form elements appear are some examples of how empathy drives interactions. Empathy helps team members identify confusion and redundancy. 

Further, until we are designing for AI Agents and robots as our product’s primary users, we are designing for humans. This means we need to continue to ask humans for feedback, monitor human behavior on our sites and in our apps, and understand why they make the decisions they make. All of this continues to make empathy an important human trait to cultivate.

Make Connections

Mike Bechtel, Chief Futurist at Deloitte Consulting, gave a talk at SXSW this year about how the future favors polymaths instead of specialists. His argument boils down to this: AI is a specialist at almost anything, but what humans have shown over time is that the greatest inventions and insights come from disparate teams putting their expertise together, or from individuals making new connections between disciplines.

Novel ideas are mash-ups of existing ideas more than brand-new ideas that have never been thought of. And these mash-ups come from curious humans who have broad experience, not deep specialization. They are the ones who can identify and bring the specialists together if need be, but most of all, they can make the connections and see the bigger picture to create new approaches. 

Support Culture

No matter how smart AI gets, it doesn’t “read the room.” It doesn’t build relationships between others, react to group dynamics, or pick up on body language. In an ambiguous human way, it does not sense when something “feels off.”

In group settings, humans command culture. AI won’t directly help you build trust with a client. It won’t read the faces in the room or over Zoom and pause for questions. It won’t sense that people are not engaging and reacting, and therefore you need to change a tactic while speaking. AI is interested in the facts and not the feelings.

Broad team culture and the culture that exists between individuals is built and nurtured by the humans within them. AI might help you craft a good sales pitch, internal memo, or provide ice breaker ideas, but in the end, humans deliver it. Mentoring, supporting culture, collaborating, and building trust continue to be human endeavors.

Break Patterns

AI is very good at replicating patterns and what has already been created. AI is very good at using its vast amount of data to emphasize best practices with patterns that are the most prevalent and potentially the most successful. But it won’t necessarily find ways to break existing patterns to create new and disruptive ones. 

Asking great questions (being curious), applying our experience and judgement, and doing it all with empathy for the humans we support leads to creative, pattern-breaking solutions that AI has not seen before. Best practices don’t stay the best forever. Changes in technology and our interface with it create new best practices. 

The easiest answer (the common denominator that AI may reach for) is not always the best solution. There is a time and a place to repeat common patterns for efficiency, but then there are times when we need to create new patterns. Humans will continue to be the ones who can make that judgement.

Be Human

AI will continue to evolve. It may get better at some of the attributes I mention — or, at best, it may get better at looking like it has empathy, supports culture, and mashes existing patterns together to create new ones. But for humans, these traits come naturally; we don’t have to be trained or prompted to use them.

Of all these traits, curiosity may be the most important and impactful one. AI has become our answer-engine, making it less necessary to know it all. But we need to continue to be curious, to wonder about “what if?” AI shouldn’t tell us what to ask, but it should support us in asking deeper questions and finding disparate ideas that could create a new approach.

We no longer need to learn everything. All the answers to what is already known can be provided. It is up to humans to continue with curiosity into what we do not yet know.

The tech industry has never been accused of moving slowly. The exponential explosion of AI tools in 2024, though, sets a new standard for fast-moving. The past few months of 2024 rewrote what happened in the past few years. If you have not been actively paying attention to AI, now is the time to start.

I have been intently watching the AI space for over a year. I started from a place of great skepticism, not willing to internalize the hype until I could see real results. I can now say with confidence that when applied to the correct problem with the right expectations, AI can make significant advancements possible no matter the industry.

In 2024, not only did the large language models get more powerful and extensible, but the tools are being created to solve real business problems. Because of this, skepticism about AI has shifted to cautious optimism. Spurred by the Fortune 500’s investments and early impacts, companies of every shape and size are starting to harness the power of AI for efficiency and productivity gains.

Let’s review what happened in Quarter Four of 2024 as a microcosm of the year in AI.

New Foundational Models in the AI Space

A foundational large language model (LLM) is one which other AI tools can be built from. The major foundational LLMs have been ChatGPT, Claude, Llama, and Gemini, operated by OpenAI & Microsoft, Anthropic, Meta, and Google respectively.

In 2024, additional key players entered the space to create their own foundational models. 

Amazon

Amazon has been pumping investments into Anthropic, as its own operations are huge consumers of AI to drive efficiency. With its own internal foundational LLM, it could remove the need to share its operational data with an external party. Further, as it did with its AWS business, it can monetize its own AI services with its own models. Amazon Nova was launched in early December.

xAI

In May of 2024, xAI secured funding to start to create and train its own foundational models. Founder Elon Musk was a co-founder of OpenAI. The company announced it would build the world’s largest supercomputer in June, and it was operational by December.

Nvidia

In October, AI chip-maker Nvidia announced its own LLM, named Nemotron, to compete directly with OpenAI and Google — organizations that rely on its chips to train and power their own LLMs.

Rumors of more to come

Apple Intelligence launched slowly in 2024 and uses OpenAI’s models. Industry insiders think it is natural to expect Apple to create its own LLM and position it as a privacy-first, on-device service. 

Foundational Model Advancements

While some companies are starting to create their own models, the major players have released advanced tools that can use a range of inputs to create a multitude of outputs: 

Multimodal Processing

AI models can now process and understand multiple types of data together, such as images, text, and audio. This allows for more complex interactions with AI tools. 

Google’s NotebookLM was a big hit this year for its ability to use a range of data as sources, from Google Docs to PDFs to web links for text, audio, and video. The tool essentially allows the creation of small, custom RAG databases to query and chat with.
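The retrieval idea behind such a RAG setup can be shown with a toy example. This sketch uses simple word-count overlap in place of the learned vector embeddings real systems use, and the source text is invented, but the shape is the same: embed the sources, embed the query, and hand the closest chunks to an LLM as context:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count. Real RAG systems use learned
    # vector embeddings, but the retrieval step works the same way.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str, documents: list, k: int = 1) -> list:
    """Return the k source chunks most similar to the query; an LLM would
    then answer using only these retrieved chunks as context."""
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

sources = [
    "The museum is open Tuesday through Sunday from 10am to 5pm.",
    "Our annual gala raises funds for education programs.",
    "The permanent collection includes over 2,500 works of art.",
]
best = retrieve("what are the museum opening hours", sources, k=1)
```

Grounding answers in a small, user-chosen set of sources like this is what makes a tool such as NotebookLM feel reliable compared to open-ended chat.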

Advanced Reasoning

OpenAI’s o1 reasoning model (pronounced “Oh One”) uses step-by-step “Chain of Thought” to solve complex problems, including math, coding, and scientific tasks. This has led to AI tools that can draw conclusions, make inferences, and form judgments based on information, logic, and experience. The queries take longer but are more accurate and provide more depth.

Google’s Deep Research is a similar product that was released to Gemini users in December.

Enhanced Voice Interaction

More and more AI tools can engage in natural and context-aware voice interactions — think Siri, but way more useful. This includes handling complex queries, understanding different tones and styles, and even mimicking personalities such as Santa Claus.

Vision Capabilities

AI can now “see” and interpret the world through cameras and visual data. This includes the ability to analyze images, identify objects, and understand visual information in real time. Examples include Meta’s DINOv2, OpenAI’s GPT-4o, and Google’s PaliGemma.

AI can also interact with screen displays on devices, allowing for a new level of awareness of sensory input. OpenAI’s desktop app for Mac and Windows is contextually aware of what apps are available and in focus. Microsoft’s Copilot Vision integrates with the Edge browser to analyze web pages as users browse. Google’s Project Mariner prototype allows Gemini to understand screen context and interact with applications.

While still early and fraught with security and privacy implications, the technology will lead to more advancements for “Agentic AI” which will continue to grow in 2025.

Agentic Capabilities

AI models are moving towards the ability to take actions on behalf of users. No longer confined to chat interfaces alone, these new “Agents” will perform tasks autonomously once trained and set in motion.

Note: Enterprise leader Salesforce launched Agentforce in September 2024. Despite the name, these are not autonomous Agents in the same sense. Custom agents must be trained by humans, given instructions, parameters, prompts, and success criteria. Right now, these agents are more like interns that need management and feedback.

Specialization

2024 also saw an increase in models designed for specific domains and tasks. With reinforcement fine-tuning, companies are creating tools for legal, healthcare, finance, stocks, and sports. 

Examples include Sierra, who offers a specifically trained customer service platform, and LinkedIn agents as hiring assistants.

What this all means for 2025

It’s clear that AI models and tools will continue to advance, and businesses that embrace AI will be in a better position to thrive. To be successful, businesses need an experimental mindset of continuous learning and adaptation: 

While the models will continue to get better into 2025, don’t wait to explore AI. Even if the existing models never improve, they are powerful enough to drive significant gains in business. Now is the time to implement AI in your business. Choose a model that makes sense and is low-friction — if you are an organization that uses Microsoft products, start with a trial of AI add-ons for office tools, for example. Start accumulating experience with the tools at hand, and then expand to include multiple models to evaluate more complex AI options that may have greater business impact. It almost doesn’t matter which you choose, as long as you get started.

Oomph has started to experiment with AI ourselves and Drupal has exciting announcements about integrating AI tools into the authoring experience. If you would like more information, please reach out for a chat.

Oomph has been quiet about our excitement for artificial intelligence (A.I.). While the tech world has exploded with new A.I. products, offerings, and add-ons to existing product suites, we have been formulating an approach to recommend A.I.-related services to our clients. 

One of the biggest reasons why we have been quiet is the complexity and the fast-pace of change in the landscape. Giant companies have been trying A.I. with some loud public failures. The investment and venture capitalist community is hyped on A.I. but has recently become cautious as productivity and profit have not been boosted. It is a familiar boom-then-bust of attention that we have seen before — most recently with AR/VR after the Apple Vision Pro five months ago and previously with the Metaverse, Blockchain/NFTs, and Bitcoin. 

There are many reasons to be optimistic about applications for A.I. in business. And there continue to be many reasons to be cautious as well. Just like any digital tool, A.I. has pros and cons and Oomph has carefully evaluated each. We are sharing our internal thoughts in the hopes that your business can use the same criteria when considering a potential investment in A.I. 

Using A.I.: Not If, but How

Most digital tools now have some kind of A.I. or machine-learning built into them. A.I. has become ubiquitous and embedded in many systems we use every day. Given investor hype for companies that are leveraging A.I., more and more tools are likely to incorporate A.I.

This is not a new phenomenon. Grammarly has been around since 2015 and by many measures, it is an A.I. tool — it is trained on human written language to provide contextual corrections and suggestions for improvements.

Recently, though, embedded A.I. has exploded across markets. Many of the tools Oomph team members use every day have A.I. embedded in them, across sales, design, engineering, and project management — from Google Suite and Zoom to Github and Figma.

The market has already decided that business customers want access to time-saving A.I. tools. Some welcome these options, and others will use them reluctantly.

Either way, the question has very quickly moved from should our business use A.I. to how can our business use A.I. tools responsibly?

The Risks That A.I. Poses

Every technological breakthrough comes with risks. Some pundits (both for and against A.I. advancements) have likened its emergence to the Industrial Revolution of the early 20th century. A similarly high level of positive impact is possible, while the cultural, societal, and environmental repercussions could also follow a similar trajectory.

A.I. has its downsides. When evaluating A.I. tools as a solution to our clients’ problems, we keep this list of drawbacks handy, so that we may review it and think about how to mitigate their negative effects:

We have also found that our company values are a lens through which we can evaluate new technology and any proposed solutions. Oomph has three cultural values that form the center of our approach and our mission, and we add our stated 1% For the Planet commitment to that list as well.

For each of A.I.’s drawbacks, we use the lens of our cultural values to guide our approach to evaluating and mitigating those potential ill effects. 

A.I. is built upon biased and flawed data

At its core, A.I. is built upon terabytes of data and billions, if not trillions, of individual pieces of content. Training data for Large Language Models (LLMs) like Chat GPT, Llama, and Claude encompass mostly public content as well as special subscriptions through relationships with data providers like the New York Times and Reddit. Image generation tools like Midjourney and Adobe Firefly require billions of images to train them and have skirted similar copyright issues while gobbling up as much free public data as they can find. 

Because LLMs require such a massive amount of data, it is impossible to curate those data sets to only what we may deem as “true” facts or the “perfect” images. Even if we were able to curate these training sets, who makes the determination of what to include or exclude?

The training data would need to be free of bias and free of sarcasm (a very human trait) for it to be reliable and useful. We’ve seen this play out with sometimes hilarious results. Google “A.I. Overviews” have told people to put glue on pizza to prevent the cheese from sliding off or to eat one rock a day for vitamins & minerals. Researchers and journalists traced these suggestions back to the training data from Reddit and The Onion.

Information architects have a saying: “All Data is Dirty.” It means no one creates “perfect” data, where every entry is reviewed, cross-checked for accuracy, and evaluated by a shared set of objective standards. Human bias and accidents always enter the data. Even the simple act of deciding what data to include (and therefore, which data is excluded) is bias. All data is dirty.

Bias & flawed data leads to the perpetuation of stereotypes

Many of the drawbacks of A.I. are interrelated — “all data is dirty” is related to D.E.I. Gender and racial biases surface in the answers A.I. provides. A.I. will perpetuate the harms that these biases produce as these tools become easier and easier to use and more and more prevalent. These harms are ones which society is only recently grappling with in a deep and meaningful way, and A.I. could roll back much of our progress.

We’ve seen this start to happen. Early reports from image creation tools discuss a European white male bias inherent in these tools — ask one to generate an image of someone in a specific occupation, and receive many white males in the results, unless that occupation is stereotypically “women’s work.” When A.I. is used to perform HR tasks, the software often advances those it perceives as male more quickly, and penalizes applications that contain female names and pronouns.

The bias is in the data and very, very difficult to remove. The entirety of digital written language over-indexes privileged white Europeans who can afford the tools to become authors. This comparably small pool of participants is also dominantly male, and the content they have created emphasizes white male perspectives. To curate bias out of the training data and create an equally representative pool is nearly impossible, especially when you consider the exponentially larger and larger sets of data new LLM models require for training.

Further, D.E.I. overflows into environmental impact. Last fall, the Fifth National Climate Assessment outlined the country’s climate status. Not only is the U.S. warming faster than the rest of the world, but the assessment directly linked reductions in greenhouse gas emissions with reducing racial disparities. Climate impacts are felt most heavily in communities of color and low-income communities; therefore, climate justice and racial justice are directly related.

Flawed data leads to “Hallucinations” & harms Brands

“Brand Safety” and How A.I. can harm Brands

Brand safety is the practice of protecting a company’s brand and reputation by monitoring online content related to the brand. This includes content the brand is directly responsible for creating about itself as well as the content created by authorized agents (most typically customer service reps, but now A.I. systems as well).

The data that comes out of A.I. agents will reflect on the brand employing the agent. A real-life example is Air Canada: its A.I. chatbot gave a customer an answer that contradicted the information in the URL it provided. The customer chose to believe the A.I. answer, while the company tried to argue that it could not be responsible if the customer didn’t follow the URL to the more authoritative information. In court, the customer won and Air Canada lost, resulting in bad publicity for the company.

Brand safety can also be compromised when a third party feeds A.I. tools proprietary client data. Some terms and conditions statements for A.I. tools are murky, while others are direct. Midjourney’s terms state:

“By using the Services, You grant to Midjourney […] a perpetual, worldwide, non-exclusive, sublicensable no-charge, royalty-free, irrevocable copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, sublicense, and distribute text and image prompts You input into the Services” 

Midjourney’s Terms of Service Statement

That makes it pretty clear that by using Midjourney, you implicitly agree that your data will become part of their system.

The implication that our clients’ data might become available to everyone is a huge professional risk that Oomph avoids. Even using ChatGPT to provide content summaries of NDA-protected data can open hidden risks.

What are “Hallucinations” and why do they happen?

It’s important to remember how current A.I. chatbots work. Like a smartphone’s predictive text tool, LLMs form statements by stitching together words, characters, and numbers based on the probability of each unit succeeding the previously generated units. The predictions can be very complex, adhering to grammatical structure and situational context as well as the initial prompt. Given this, they do not truly understand language or context. 
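That probability-driven stitching can be illustrated with a tiny next-token sampler. The tokens and scores below are invented for illustration; real models work over vocabularies of tens of thousands of tokens, but the mechanism is the same weighted draw:

```python
import math
import random

def sample_next_token(logits, temperature, rng):
    """Pick the next token by weighted chance, not by fixed rule.

    Higher temperature flattens the distribution (more surprising picks);
    lower temperature concentrates it on the most likely token.
    """
    scaled = {tok: math.exp(score / temperature) for tok, score in logits.items()}
    total = sum(scaled.values())
    r = rng.random() * total
    for tok, weight in scaled.items():
        r -= weight
        if r <= 0:
            return tok
    return tok  # fallback for floating-point edge cases

# "The cheese slid off the ..." — invented next-token scores
logits = {"pizza": 2.0, "table": 1.0, "rock": -1.0}
rng = random.Random(42)
picks = [sample_next_token(logits, temperature=1.0, rng=rng) for _ in range(100)]
```

Run the draw a hundred times and “pizza” dominates, but the improbable tokens still appear sometimes — which is one intuition for why the same prompt can occasionally produce an answer nobody expected.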

At best, A.I. chatbots are a mirror that reflects how humans sound without a deep understanding of what any of the words mean. 

A.I. systems try their best to provide an accurate and truthful answer without a complete understanding of the words they are using. A “hallucination” can occur for a variety of reasons, and it is not always possible to trace its origins or reverse-engineer it out of a system.

As many recent news stories state, hallucinations are a huge problem with A.I. Companies like IBM and McDonald’s can’t get hallucinations under control and have pulled A.I. from their stores because of the headaches they cause. If they can’t make their investments in A.I. pay off, it makes us wonder about the usefulness of A.I. for consumer applications in general. And all of these gaffes hurt consumers’ perception of the brands and the services they provide.

Poor A.I. answers erode Consumer Trust

The aforementioned problems with A.I. are well-known in the tech industry. In the consumer sphere, A.I. has only just started to break into the public consciousness. Consumers are outcome-driven. If A.I. is a tool that can reliably save them time and reduce work, they don’t care how it works, but they do care about its accuracy. 

Consumers are also misinformed or have a very surface level understanding of how A.I. works. In one study, only 30% of people correctly identified six different applications of A.I. People don’t have a complete picture of how pervasive A.I.-powered services already are.

The news media loves a good fail story, and A.I. has been providing plenty of those. With most of the media coverage of A.I. being either fear-mongering (“A.I. will take your job!”) or about hilarious hallucinations (“A.I. suggests you eat rocks!”), consumers will be conditioned to mistrust products and tools labeled “A.I.” 

And for those who have had a first-hand experience with an A.I. tool, a poor A.I. experience makes all A.I. seem poor. 

A.I.’s appetite for electricity is unsustainable

The environmental impact of our digital lives is invisible. Cloud services that store our lifetime of photographs sound like feathery, lightweight repositories but are actually giant, electricity-guzzling warehouses full of heat-producing servers. Cooling these data factories and providing the electricity to run them are major infrastructure issues cities around the country face. And then A.I. came along.

While difficult to quantify, there are some scientists and journalists studying this issue, and they have found some alarming statistics: 

While the consumption needs are troubling, quickly creating more infrastructure to support them is not possible. New energy grids take multiple years and millions, if not billions, of dollars of investment. Parts of the country are already straining under the weight of our current energy needs and will continue to do so — peak summer demand is projected to grow by 38,000 megawatts nationwide in the next five years.

While a data center can be built in about a year, it can take five years or longer to connect renewable energy projects to the grid. While most new power projects built in 2024 are clean energy (solar, wind, hydro), they are not being built fast enough. And utilities note that data centers need power 24 hours a day, something most clean sources can’t provide. It should be heartbreaking that carbon-producing fuels like coal and gas are being kept online to support our data needs.

Oomph’s commitment to 1% for the Planet means that we want to design specific uses for A.I. instead of very broad ones. The environmental impact of A.I.’s energy demands is a major factor we consider when deciding how and when to use A.I.

Using our Values to Guide the Evaluation of A.I.

As we previously stated, our company values provide a lens through which we can evaluate A.I. and look to mitigate its negative effects. Many of the solutions cross over to mitigate more than one effect and represent a shared commitment to extracting the best results from any tool in our set.

Smart

Driven

Personal

1% for the Planet

In Summary

While this article may read as strongly anti-A.I., we still have optimism and excitement about how A.I. systems can be used to augment and support human effort. Tools created with A.I. can make tasks and interactions more efficient, can help non-creatives jumpstart their creativity, and can eventually become agents that assist with complex tasks that are draining and unfulfilling for humans to perform.

For consumers or our clients to trust A.I., however, we need to provide ethical evaluation criteria. We cannot use A.I. as a solve-all tool when it has clearly displayed limitations. We aim to continue to learn from others, experiment ourselves, and evaluate appropriate uses for A.I. with a clear set of criteria that align with our company culture.

To have a conversation about how your company might want to leverage A.I. responsibly, please contact us anytime.



Everyone’s been saying it (and, frankly, we tend to agree): We are currently in unprecedented times. It may feel like a cliché. But truly, when you stop and look around right now, not since the advent of the first consumer-friendly smartphone in 2008 has the digital web design and development industry seen such vast technological advances.

A few of these innovations have been kicking around for decades, but they’ve only moved into the greater public consciousness in the past year. Versions of artificial intelligence (AI) and chatbots have been around since the 1960s, and even virtual reality (VR)/augmented reality (AR) has been attempted with some success since the 1990s (Thad Starner). But now, these technologies have reached a tipping point as companies join the rush to create new products that leverage AI and VR/AR.

What should we do with all this change? Let’s think about the immediate future for a moment (not the long-range future, because who knows what that holds). We at Oomph have been thinking about how we can start to use this new technology now — for ourselves and for our clients. Which ideas that seemed far-fetched only a year ago are now possible? 

For this article, we’ll take a closer look at VR/AR, two digital technologies that either layer on top of or fully replace our real world.

VR/AR and the Vision Pro

Apple’s much-anticipated launch into the headset game shipped in early February 2024. With it came much hype, most centered around the price tag and limited ecosystem (for now). But after all the dust has settled, what has this flagship device told us about the future? 

Meta, Oculus, Sony, and others have been in this space since 2017, but the Apple device has debuted a better experience in many respects. For one, Apple nailed the 3D visuals, using many cameras and low latency to reproduce a digital version of the real world around the wearer — in real time. All of this tells us that VR headsets are moving beyond gaming applications and becoming more mainstream for specific types of interactions and experiences, like virtually visiting the Eiffel Tower or watching the upcoming Summer Olympics.

What Is VR/AR Not Good At?

Comfort

Apple’s version of the device is large, uncomfortable, and too heavy to wear for long. And its competitors are not much better. The device will increasingly become smaller and more powerful, but for now, wearing one as an infinite virtual monitor for the entire workday is impossible.

Space

VR generally needs space for the wearer to move around. The Vision Pro is very good at overlaying virtual items into the physical world around the wearer, but for an application that requires the wearer to be fully immersed in a virtual world, it is a poor experience to pantomime moving through a confined space. Immersion is best when the movements required to interact are small or when the wearer has adequate space to participate.

Haptics

“Haptic” feedback is the sense that physical objects provide. Think about turning a doorknob: You feel the surface, the warmth or coolness of the material, how the object can be rotated (as opposed to pulled like a lever), and the resistance from the springs.

Phones provide small amounts of haptic feedback in the form of vibrations and sounds. Haptics are on the horizon for many VR platforms but have yet to be built into headset systems. For now, haptics are provided by add-on products like this haptic gaming chair.

What Is VR/AR Good For? 

Even without haptics and free spatial range, immersion and presence in VR are very effective. It turns out that the brain only requires sight and sound to create a believable sense of immersion. Have you tried a virtual roller coaster? If so, you know it doesn’t take much to feel a sense of presence in a virtual environment.

Live Events

VR and AR’s most promising applications are with live in-person and televised events. In addition to a flat “screen” of the event, AR-generated spatial representations of the event and ways to interact with the event are expanding. A prototype video with Formula 1 racing is a great example of how this application can increase engagement with these events.

Imagine if your next virtual conference were available in VR and AR. How much more immersed would you feel? 

Museum and Cultural Institution Experiences

Similar to live events, AR can enhance museum experiences greatly. With AR, viewers can look at an object in its real space — for example, a sarcophagus would actually appear in a tomb — and access additional information about that object, like the time and place it was created and the artist.

Museums are already experimenting with experiences that leverage your phone’s camera or VR headsets. Some have experimented with virtually displaying works owned by other museums to show a wider range of an artist’s work within an exhibition. 

With the expansion of personal VR equipment like the Vision Pro, the next obvious step is to bring the museum to your living room, much like the National Gallery in London has brought its collection into public spaces.

Try Before You Buy (TBYB)

Using a version of AR with your phone to preview furniture in your home is not new. But what other experiences can benefit from an immersive “try before you buy” experience? 

What’s Possible With VR/AR?

The above examples of what VR/AR is good at are just a few ways the technology is already in use — each of which can be a jumping-off point for leveraging VR/AR for your own business.  

But what are some new frontiers that have yet to be fully explored? What else is possible? 

Continue the AR/VR Conversation

The Vision Pro hasn’t taken the world by storm the way Apple likely hoped. It may still be too early for the market to figure out what AR/VR is good for. But we think it won’t go away completely, either. With big investments like Apple’s, it is reasonable to assume the next version will find a stronger foothold in the market.

Here at Oomph, we’ll keep pondering and researching impactful ways that tomorrow’s technology can help solve today’s problems. We hope these ideas have inspired some of your own explorations, and if so, we’d love to hear more about them. 

Drop us a line and let’s chat about how VR/AR could engage your audience. 

High-quality content management systems (CMS) and digital experience platforms (DXP) are the backbone of modern websites, helping you deliver powerful, personalized user experiences. The catch? You have to pick your platform first. 

At Oomph, we have a lot of love for open-source platforms like Drupal and WordPress. Over the years, we’ve also built applications for our clients using headless CMS tools, like Contentful and CosmicJS. The marketplace for these solutions continues to grow exponentially, including major players like Adobe Experience Manager, Sitecore, and Optimizely.

With so many options, developers and non-developers with a project on the horizon typically start by asking themselves, “Which CMS or DXP is the best fit for my website or application?” While that is no doubt an excellent question to consider, I think it’s equally important to ask, “Who is going to implement the solution?” 

CMS/DXP Solutions Are More Alike Than You Might Think

I recently attended the annual Healthcare Internet Conference and spoke with quite a few healthcare marketers about their CMS tools. I noticed a common thread: Many people think their CMS (some of which I mentioned above) is hard to use and doesn’t serve them well. 

That may very well be the case. Not all CMS tools are created equal; some are better suited for specific applications. However, most modern CMS and DXP tools have many of the same features in common; they just come at different price points. So here’s the multi-million-dollar question: If most of these products provide access to the same or similar tools, why are so many customers displeased with them? 

Common Challenges of CMS/DXP Implementation

Often, we find that CMS users get frustrated because the tool they chose wasn’t configured to meet their specific needs. That doesn’t necessarily mean that it was set up incorrectly. That’s the beauty of many of today’s CMS and DXP products: They don’t take a one-size-fits-all approach. Instead, they allow for flexibility and customization to ensure that each customer gets the most out of the product.

While enticing, that flexibility also puts the burden on the user to ensure the system is implemented effectively for their specific use case. In our experience, implementation is the make-or-break factor in a website development project. These are just a handful of things that can derail the process:

  1. The implementation partner didn’t fully understand how their client works, so features weren’t configured accordingly.
  2. The demands of user experience overshadowed the needs of content editors and admins. 
  3. Hefty licensing fees ate away at the budget, leaving behind funds that don’t quite cover a thorough implementation. 
  4. The project was rushed to meet a tight deadline. 
  5. The CMS introduces new features over time that add complexity to the admin or editing experience. 
  6. Old features get sunsetted as new capabilities take their place. 

Most of the work we do at Oomph is to help our clients implement new websites and applications using content management systems like Drupal. We have decades of combined experience helping our clients create the ideal user experience for their target audience while also crafting a thoughtful content editing and admin experience that is easy to use.

But what does that look like in practice? 

4 Steps for a Successful CMS Implementation

Implementation can be the black box of setting up your CMS: You don’t know what you don’t know. So, we like to get our clients into a demo environment as soon as possible to help them better understand what they need from their CMS. Here’s how we use it to navigate successful CMS implementation: 

  1. Assess the Capabilities of the CMS

At face value, the first step is the simplest: Consider what the CMS needs to do for you, then find a CMS that includes all of those features. Content modeling (more on that below) is a key part of that process, but so is auditing your team’s abilities. 

Some teams may be developer-savvy and can handle less templated content-authoring features. Others may need a much more drag-and-drop experience. Either use case is normal and acceptable, but what matters is that you identify your needs and find both a CMS and an implementation process that meets them. That leads us to the next point.

  2. Test-Drive the CMS Early and Often

You wouldn’t buy a car without test-driving it first. Yet we find that people are often more than willing to license a CMS without looking under the hood.

Stepping into the CMS for a test drive is a huge part of getting the content editing experience right. We’ve been designing and engineering websites and platforms using CMS tools for well over a decade, and we’ve learned a thing or two along the way about good content management and editing experiences. 

Even out-of-the-box, vanilla Drupal can be configured in nearly limitless ways. But that also means nothing is configured for you, and it can be hard to get a sense of how best to set it up. Rather than diving into the deep end, we work with our clients to test the waters. We immediately set up a project sandbox with pre-configured content types, allowing you to enter content and play with a suite of components within a sleek drag-and-drop interface.

  3. Align User Experience with Content Authoring

Beyond pre-configured content and components, our sandbox sites include a stylish, default theme. The idea is to give you a taste both of what your live site could look like and what your content authoring experience might be. Since so many teams struggle to balance those two priorities, this can be a helpful way to figure out how your CMS can give you both. 

  4. Finalize Your Features & Capabilities 

While a demo gives you a good idea of the features you’ll need, it might include features you don’t. But discovering where our pre-built options aren’t a good fit is a good thing — it helps us understand exactly what YOUR TEAM does and does not need.

Our goal is to give you something tangible to react to, whether that’s love at first type or a chance to uncover capabilities that would serve you better. We’ve found this interactive yet structured process is the CMS silver bullet that leads to a better outcome. 

Content Modeling

Another key part of our project workflow is what we call content modeling. During this phase, we work with you to identify the many content types you’ll have on your website or application. Then, we visualize them in a mapping system to determine how those types relate to one another and what fields each one needs. 

With a solid content model in place, we can have a higher level of confidence that our CMS implementation will create the right content editing experience for your team. From there, we actually implement the content model in the CMS as soon as possible so that you can test it out and we can make refinements before getting too far along in the process.
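To make this concrete, here is a minimal sketch of what a content model can look like once it leaves the whiteboard. The content types, fields, and the `danglingReferences` check are all illustrative, not tied to any particular CMS:

```typescript
// A hypothetical content model: each content type declares its fields
// and its references to other content types.
type FieldType = "text" | "richText" | "image" | "date" | "reference";

interface FieldDef {
  name: string;
  type: FieldType;
  required: boolean;
  refTo?: string; // for "reference" fields: the content type being linked
}

interface ContentType {
  name: string;
  fields: FieldDef[];
}

const contentModel: ContentType[] = [
  {
    name: "author",
    fields: [
      { name: "fullName", type: "text", required: true },
      { name: "bio", type: "richText", required: false },
    ],
  },
  {
    name: "article",
    fields: [
      { name: "title", type: "text", required: true },
      { name: "body", type: "richText", required: true },
      { name: "publishDate", type: "date", required: true },
      { name: "byline", type: "reference", required: true, refTo: "author" },
    ],
  },
];

// A quick integrity check: every reference must point to a defined type.
function danglingReferences(model: ContentType[]): string[] {
  const names = new Set(model.map((t) => t.name));
  return model.flatMap((t) =>
    t.fields
      .filter((f) => f.type === "reference" && !names.has(f.refTo ?? ""))
      .map((f) => `${t.name}.${f.name}`)
  );
}
```

Mapping relationships in a structure like this makes it easy to spot missing types and fields before implementation begins, which is exactly the kind of refinement that is cheap on paper and expensive after launch.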

Content Moderation & Governance

Many clients tell us they either have too much or too little control over their content. In some cases, their content management system is so templated or rigid that marketing teams can’t quickly spin up landing pages and instead have to rely on development teams to assist. Other teams have too much freedom, allowing employees to easily deploy content that hasn’t been approved by the appropriate team members or strays from company brand standards. 

Here at Oomph, our mantra is balance. A good content editing process needs both flexibility and governance, so teams can create content when they need to, but avoid publishing content that doesn’t meet company standards. Through discovery, we work with clients to determine which content types need flexibility and which ones don’t. 

If a content type needs to be flexible, we create a framework that allows for agility while still ensuring that users can only select approved colors, font types, and font sizes. We also identify which content needs to be held in moderation and approved before it can be published on the website. 
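As an illustration of that balance, here is a minimal sketch of a moderation workflow in which each role can only perform approved transitions. The states, roles, and `transition` helper are hypothetical, not drawn from any specific CMS:

```typescript
// A minimal editorial workflow: content moves draft -> review -> published,
// and only a reviewer can approve content for publication.
type State = "draft" | "review" | "published";
type Role = "editor" | "reviewer";

// Which transitions each role may perform.
const allowed: Record<Role, Array<[State, State]>> = {
  editor: [
    ["draft", "review"],     // editors submit drafts for review
    ["published", "draft"],  // and can pull content back to draft
  ],
  reviewer: [
    ["review", "published"], // reviewers approve and publish
    ["review", "draft"],     // or send content back with feedback
  ],
};

function transition(state: State, next: State, role: Role): State {
  const ok = allowed[role].some(([from, to]) => from === state && to === next);
  if (!ok) {
    throw new Error(`${role} may not move content from ${state} to ${next}`);
  }
  return next;
}
```

An editor can move a draft into review, but an attempt to publish directly throws an error. Encoding governance as explicit transitions like this gives marketing teams agility within guardrails instead of either total freedom or total lockdown.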

Taking the time to discuss governance in advance creates a CMS experience that strikes the right balance between marketing freedom and brand adherence. 

Implementation Turns a Good CMS Into a Great One

Modern CMS/DXP solutions have mind-blowing features, and they will only continue to get more complex over time. But the reality is that while picking a CMS that has the features you need is important, how it’s configured and implemented might matter even more. After all, how helpful is it to have a CMS with embedded artificial intelligence if making simple copy updates to your home page is a nightmare? 

Implementation is the “it” factor that makes the difference between a CMS you love and one you’d rather do your job without.

Interested in solving your CMS headaches with better implementation? Let’s talk.

So much of healthcare happens in person. But as the pressure to connect online continues to climb, what are the challenges you face as a healthcare marketer — and the opportunities you’d love to capitalize on? 

Whatever they are, chances are that attendees of the most recent Healthcare Internet Conference (HCIC) can relate. HCIC brings together marketers and digital leaders to explore the unique and sometimes unexpected ways digital innovation is shaping the industry. 

Though this was Oomph’s first time attending, HCIC has actually been around since 1996. The tight-knit community that’s formed over the past few decades offered a safe space for candid conversations about navigating digital in a post-pandemic world. Here are five topics that ruled those conversations, how marketers like you are approaching them, and what we see as the biggest opportunities for each. 

1. To Adopt or Not To Adopt Artificial Intelligence (AI)

A whopping 86% of healthcare companies use some form of AI. But despite the number of organizations adopting AI for everything from IT operations to workforce management, healthcare marketers are still largely stuck in an AI gray zone. 

Many HCIC attendees shared that they were unsure which AI tools were ready to use today or if using AI could introduce regulatory, privacy, and ethical concerns. Would using AI-generated photos in marketing misrepresent patients or providers? Can chatbots effectively and appropriately provide support beyond basic admin and billing functions? 

While some organizations are already building their own tools (a diagnostic AI to help call center employees determine when a caller should go to urgent care caught our eye), others are interested in out-of-the-box solutions.

Our take: Proceed with caution. AI can revolutionize patient care and operations, but it can also introduce costly and reputationally damaging privacy and regulatory issues. If you aren’t well-versed in compliance, work with a partner who is to ensure your AI actually helps — not hurts. 

2. How To Combat Skyrocketing Employee Turnover

The 2020s will go down in history as one of the most difficult decades to work in healthcare. Kicked off by the COVID-19 pandemic, employee attrition only continues to rise as more employees enter retirement or simply burn out. 

While exact attrition rates vary by the healthcare segment, data from Oracle shows that hospitals lose nearly 20% of their employees every year. For nursing homes, that number skyrockets to 94%. 

Given that the cost of replacing an employee is between six and nine months of that employee’s salary, HCIC attendees were understandably interested in swapping ideas to boost employee retention. An intranet was a fairly universal solution, but the question of what makes a truly effective intranet remained.

Our take: Embrace personalization. Talk to your employees, understand their needs, then build custom features and integrations that meet them. We saw firsthand through our work with Rhode Island-based health system Lifespan that this is an effective way to build community and engagement, both of which are key to retention. 

3. The Eternal Quest for Patient Acquisition

There are two questions that keep most healthcare organizations awake at night: How do you find patients? And, once you’ve found them, how do you keep them? 

Attendees almost universally agreed that healthcare is a long way from creating seamless experiences that keep patients coming back. Many systems are fragmented, regulated, or outdated, creating barriers to patient care that patients are all too happy to leave behind. 

Our take: We think healthcare organizations can take a page out of other industries’ books here. As in any industry, a well-designed user experience (UX) is the foundation for interactions that delight patients. 

4. Personalizing the Patient Experience

On the topic of patient experience, one of the most talked-about strategies was personalization. While personalization has long been a favorite technique of ours, we were encouraged to hear the number of HCIC attendees who shared our focus. 

Many saw landing pages as the “front door” to the digital patient experience and understood that personalization could level up those interactions. We also heard excitement around combining personalization with integration — using everything from implicit data about patients’ online actions to explicit data from Epic’s MyChart to personalize the information users see. 

Our take: In our experience, adding a digital or content experience platform to your content management system (CMS) can do a lot of the heavy lifting for you. But like with anything in healthcare, the key is operating within privacy and regulatory restraints. Be sure to work with an implementation partner who’s equally skilled in technology and compliance. 

5. Finding the Right CMS

Content management systems (CMS) aren’t always a hot topic at conferences, but we were pleasantly surprised by how often they came up in conversation at HCIC — and how many opinions attendees had about them.

Most attendees felt strongly about which platforms they loved and which ones they hated. While heavyweights like Sitecore, Drupal, Optimizely, and ScorpionCMS were fixtures of the conversation, the primary takeaway is that having a good CMS experience is critical, but can be challenging to achieve. 

Our take: Take the time to get your CMS right. Choosing a lackluster CMS or underwhelming implementation partner could lock you into a multi-year headache. Many attendees we spoke to are still extremely cost-conscious in the wake of COVID, so they expect a major investment like a CMS to last at least five years. We always suggest setting a budget, mapping an ideal content architecture, and inventorying key features, then finding the right CMS to meet all those needs. 

Let’s Continue the Conversation 

The thing we’ll remember most about HCIC is the connection. As challenging as healthcare can be, it also brings people together: patients, providers, and, yes, even healthcare marketers. The five topics HCIC homed in on are important, but they’re just a snapshot of the many conversations healthcare teams are having about marketing, technology, and the patient experience. 


Our hope is that the conversation will continue until the next HCIC and beyond. If you’re a healthcare marketer, what else is on your mind? We’d love to talk about it.

There’s a new acronym on the block: MACH (pronounced “mock”) architecture. 

But like X is to Twitter, MACH is more a rebrand than a reinvention. In fact, you’re probably already familiar with the M, A, C, and H and may even use them across your digital properties. While we’ve been helping our clients implement aspects of MACH architecture for years, organizations like the MACH Alliance have recently formed in an attempt to provide clearer definition around the approach, as well as to align their service offerings with the technologies at hand. 

One thing we’ve learned at Oomph after years of working with these technologies? It isn’t an all-or-nothing proposition. There are many degrees of MACH adoption, and how far you go depends on your organization and its unique needs. 

But first, you need to know what MACH architecture is, why it’s great (and when it’s not), and how to get started. 

What Is MACH?

MACH is an approach to designing, building, and testing agile digital systems — particularly websites. It stands for microservices, APIs, cloud-native, and headless. 

Like a composable business, MACH unites a few tried-and-true components into a single, seamless framework for building modern digital systems. 

The components of MACH architecture are: 

  1. Microservices: Many online features and functions can be separated into more specific tasks, or microservices. Modern web apps often rely on specialized vendors to offer individual services, like sending emails, authenticating users, or completing transactions, rather than a single provider to rule them all. 
  2. APIs: Microservices interact with a website through APIs, or application programming interfaces. Because applications depend only on the API, developers can change the site’s underlying architecture without breaking the applications that use it, and can easily offer those same APIs to their customers.
  3. Cloud-Native: A cloud-based environment hosts websites and applications via the Internet, ensuring scalability and performance. Modern cloud technology like Kubernetes, containers, and virtual machines keep applications consistent while meeting the demands of your users. 
  4. Headless: Modern JavaScript frameworks like Next.js and Gatsby empower intuitive front ends that can be coupled with a variety of back-end content management systems, like Drupal and WordPress. This gives administrators the authoring power they want without impacting end users’ experience. 
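To sketch how the headless pattern works in practice, the example below maps a raw CMS API payload into the view model a front-end component would render. The endpoint, payload shape, and `toView` helper are hypothetical:

```typescript
// In a headless setup, the CMS exposes content as JSON over an API,
// and the front end maps that payload into its own view model.
interface CmsArticle {
  id: string;
  attributes: { title: string; body: string; published_at: string };
}

interface ArticleView {
  id: string;
  title: string;
  excerpt: string;
  publishedYear: number;
}

function toView(raw: CmsArticle): ArticleView {
  return {
    id: raw.id,
    title: raw.attributes.title,
    excerpt: raw.attributes.body.slice(0, 140),
    publishedYear: new Date(raw.attributes.published_at).getUTCFullYear(),
  };
}

// A front end (Next.js, Gatsby, etc.) would fetch from the CMS API, e.g.:
//   const res = await fetch("https://cms.example.com/api/articles/42");
//   const article = toView(await res.json());
// Because components depend on the view model rather than the raw payload,
// the CMS can change shape without rippling through the whole front end.
```

That thin mapping layer is what lets administrators keep their authoring tools while the front end evolves independently.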

Are You Already MACHing? 

Even if the term MACH is new to you, chances are good that you’re already doing some version of it. Maybe you rely on dedicated vendors for payments or email, connect services through APIs, host in the cloud, or run a decoupled front end.

If any of those sound familiar, you’re already MACHing. But the magic of MACH is in bringing them all together, and there are plenty of reasons why companies are taking the leap. 

5 Benefits of MACH Architecture

If you make the transition to MACH, you can expect: 

  1. Choice: Organizations that use MACH don’t have to settle for one provider that’s “good enough” for the countless services websites need. Instead, they can choose the best vendor for the job. For example, when Oomph worked with One Percent for America to build a platform offering low-interest loans to immigrants pursuing citizenship, that meant leveraging the Salesforce CRM for loan approvals, while choosing “Click and Pledge” for donations and credit card transactions. 
  2. Flexibility: MACH architecture’s modular nature allows you to select and integrate individual components more easily and seamlessly update or replace those components. Our client Leica, for example, was able to update its order fulfillment application with minimal impact to the rest of its Drupal site. 
  3. Performance: Headless applications often run faster and are easier to test, so you can deploy knowing you’ve created an optimal user experience. For example, we used a decoupled architecture for our client Wingspans to create a stable, flexible, and scalable site with lightning-fast performance for its audience of young career-seekers.     
  4. Security: Breaches are generally limited to individual features or components, keeping your entire system more secure. 
  5. Future-Proofing: A MACH system scales easily because each service is individually configured, making it easier to keep up with technologies and trends and avoid becoming out-of-date. 

5 Drawbacks of MACH Architecture

As beneficial as MACH architecture can be, making the switch isn’t always smooth sailing. Before deciding to adopt MACH, consider these potential pitfalls. 

  1. Complexity: With MACH architecture, you’ll have more vendors — sometimes a lot more — than if you run everything on one enterprise system. That’s more relationships to manage and more training needed for your employees, which can complicate development, testing, deployment, and overall system understanding. 
  2. Challenges With Data Parity: Following data and transactions across multiple microservices can be tricky. You may encounter synchronization issues as you get your system dialed in, which can frustrate your customers and the team maintaining your website. 
  3. Security: You read that right — security is a potential pro and a con with MACH, depending on your risk tolerance. While your whole site is less likely to go down with MACH, working with more vendors leaves you more vulnerable to breaches for specific services. 
  4. Technological Mishaps: As you explore new solutions for specific services, you’ll often start to use newer and less proven technologies. While some solutions will be a home run, you may also have a few misses. 
  5. Complicated Pricing: Instead of paying one price tag for an enterprise system, MACH means buying multiple subscriptions that can fluctuate more in price. This, coupled with the increased overhead of operating a MACH-based website, can burden your budget. 

Is MACH Architecture Right for You? 

In our experience, most brands could benefit from at least a little bit of MACH. Some of our clients are taking a MACH-lite approach with a few services or apps, while others have adopted a more comprehensive MACH architecture. 

Whether MACH is the right move for you depends on your: 

  1. Platform Size and Complexity: Smaller brands with tight budgets and simple websites may not need a full-on MACH approach. But if you’re managing content across multiple sites and apps, handling a high volume of communications and transactions, and needing to iterate quickly to keep up with rapid growth, MACH is often the way to go. 
  2. Level of Security: If you’re in a highly regulated industry and need things locked down, you may be better off with a single enterprise system than a multi-vendor MACH solution.  
  3. ROI Needs: If it’s time to replace your system anyway, or you’re struggling with internal costs and the diminishing value of your current setup, it may be time to consider MACH. 
  4. Organizational Structure: If different teams are responsible for distinct business functions, MACH may be a good fit. 

How To Implement MACH Architecture

If any of the above scenarios apply to your organization, you’re probably eager to give MACH a go. But a solid MACH architecture doesn’t happen overnight. We recommend starting with a technology audit: a systematic, data-driven review of your current system and its limitations.

We recently partnered with career platform Wingspans to modernize its website. Below is an example of the audit and the output: a seamless and responsive MACH architecture. 

The Audit

  1. Surveys/Questionnaires: We started with some simple questions about Wingspans’ website, including what was working, what wasn’t, and the team’s reasons for updating. They shared that they wanted to offer their users a more modern experience. 
  2. Stakeholder Interviews: We used insights from the surveys to spark more in-depth discussions with team members close to the website. Through conversation, we uncovered that website performance and speed were their users’ primary pain points. 
  3. Systems Access and Audit: Then, we took a peek under the hood. Wingspans had already shared its poor experiences with previous vendors and applications, so we wanted to uncover simpler ways to improve site speed and performance. 
  4. Organizational Structure: Understanding how an organization functions helps us design a system that meets its needs. The Wingspans team was excited about modern technology and relatively savvy, but they also needed a system that could accommodate thousands of authenticated community members. 
  5. Marketing Plan Review: We also wanted to understand how Wingspans would talk about their website. They sought an “app-like” experience with super-fast search, which gave us insight into how their MACH system needed to function. 
  6. Roadmap: Wingspans had a rapid go-to-market timeline. We simplified our typical roadmap to meet that goal, knowing that MACH architecture would be easy to update down the road. 
  7. Delivery: We recommended Wingspans deploy as a headless site (a site we later developed for them), with documentation we could hand off to their design partner. 

The Output 

We later deployed Wingspans.com as a headless site using the following components of MACH architecture:

  1. Microservices: Wingspans leverages microservices like Algolia Search for site search, Amazon AWS for email sends and static site hosting, and Stripe for managing transactions.
  2. APIs: Wingspans.com communicates with the above microservices through simple APIs. 
  3. Cloud-Native: The new website uses cloud-computing services like Google Firebase, which supports user authentication and data storage. 
  4. Headless: Gatsby powers the front-end design, while CosmicJS is the back-end content management system (CMS). 
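The division of labor above can be sketched as a thin service layer: each microservice sits behind its own small interface, so swapping one vendor changes a single adapter rather than the whole site. The interfaces and stub classes below are illustrative, not Wingspans’ actual code:

```typescript
// Each microservice hides behind a small interface; the site code
// depends on the interface, not the vendor.
interface SearchService {
  search(query: string): string[];
}
interface PaymentService {
  charge(cents: number): { ok: boolean; cents: number };
}

// A stub standing in for a hosted search vendor (e.g., Algolia).
class StubSearch implements SearchService {
  constructor(private docs: string[]) {}
  search(query: string): string[] {
    return this.docs.filter((d) =>
      d.toLowerCase().includes(query.toLowerCase())
    );
  }
}

// A stub payment provider; a real adapter would call the vendor's API.
class StubPayments implements PaymentService {
  charge(cents: number) {
    return { ok: cents > 0, cents };
  }
}

// The site composes services without knowing which vendor backs them.
function buildSite(search: SearchService, payments: PaymentService) {
  return {
    findCareers: (q: string) => search.search(q),
    buy: (cents: number) => payments.charge(cents),
  };
}
```

Swapping `StubSearch` for a real search-vendor adapter leaves `buildSite` untouched, which is the flexibility benefit MACH promises.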

Let’s Talk MACH

As MACH evolves, the conversation around it will, too. Wondering which components may revolutionize your site and which to skip (for now)? Get in touch to set up your own technology audit.