Generative Engine Optimization (GEO) is making organizations scramble — our clients have been asking “Are we ready for the new ways LLMs crawl, index, and return content to users? Does our site support evolving GEO best practices? What can we do to boost results and citations?”  

Large language models (LLMs) and the services that power AI summaries don’t “think” like humans, but they do perform similar actions. They seek content, split it into memorable chunks, and rank the chunks for trust and accuracy. If pages use semantic HTML, state facts and cite sources, and include structured metadata, AI crawlers and retrieval systems will find, store, and reproduce content accurately. That improves your chance of being cited correctly in AI overviews.

While GEO has disrupted the way people use search engines, the fundamentals of SEO and digital accessibility continue to be strong indicators of content performance in LLM search results. Making content understandable, usable, and memorable for humans also has benefits for LLMs and GEO.

How LLM systems (and AI-driven overviews) get their facts

Understanding how LLMs crawl, process, and retrieve web content helps us understand why semantic structure and accessibility best practices have a positive effect. When an AI system generates an answer that cites the web, several distinct back-end steps usually happen: 

  1. Crawling — Bots visit URLs and download page content. Some crawlers execute JavaScript like a browser (Googlebot), while others prefer raw HTML and limit their rendering.
  2. Chunking — Large documents are split into small, logical “chunks” of paragraphs, sections, or other units. These chunks are the pieces that are later retrieved for an answer. How a page’s content is structured with headings, paragraphs, and lists determines the likely chunk boundaries for storage.
  3. Vectorization — Each chunk is then converted into a numeric vector that captures its semantic meaning. These embeddings live in a vector database and enable systems to find chunks quickly. The quality of the vector depends on the clarity of the chunk’s text.
  4. Indexing — Systems store additional metadata (URL, title, headings) to filter and rank results. Structured data like schema metadata is especially valuable.
  5. Retrieval — A user asks a question or performs a search and the system retrieves the most semantically similar chunks via a vector search. It re-ranks those chunks using metadata and other signals and then composes its answer while citing sources (sometimes). 

The Case for Human-Accessible Content

Digital accessibility is simply the right thing to do for many reasons. It turns out that in addition to boosting SEO, accessibility best practices help LLMs crawl, chunk, store, and retrieve content more accurately.

During retrieval, small errors like missing text, ambiguous links, or poor heading order can fail to expose the best chunks. Let’s dive into how this can happen and what common accessibility pitfalls contribute to the confusion.

For Content Teams — Authors, Writers, Editors

Illustration of the problem with poor alt text on images, comparing one poor example and one good example

Lack of descriptive “alt” text

While some LLMs can employ machine-vision techniques to “see” images as a human would, descriptive alt text verifies what they are seeing and the context in which the image is relevant. The same best practices for describing images for people will help LLMs accurately understand the content. 
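A minimal sketch of the difference, using a hypothetical image and wording:

```html
<!-- Poor: gives crawlers, LLMs, and screen readers almost nothing -->
<img src="coastline.jpg" alt="image">

<!-- Better: confirms what the image shows and the context it supports -->
<img src="coastline.jpg"
     alt="Cyclists on the Pacific Coast Highway at sunset, with Big Sur cliffs behind them">
```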

Illustration of poor heading structure, where the poor example shows skipped heading levels while the good example shows consecutive heading levels

Out-of-order heading structures

Similar to semantic HTML, headings provide a clear outline of a page. Machines (and screen readers!) use heading structure to understand hierarchy and context. When a heading level skips from an <h2> to an <h4>, an LLM may fail to determine the proper relationship between content chunks. During retrieval, the model’s understanding is dictated by the flawed structure, not the content’s intrinsic importance. (Source: research thesis PDF, “Investigating Large Language Models’ ability to evaluate heading-related accessibility barriers”)
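For example, with a hypothetical page outline:

```html
<!-- Skipped level: the relationship between content chunks is ambiguous -->
<h2>Planning Your Visit</h2>
<h4>Parking and Transit</h4>

<!-- Consecutive levels: the outline mirrors the content hierarchy -->
<h2>Planning Your Visit</h2>
<h3>Parking and Transit</h3>
```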

Illustration of poor link text context, where the poor example shows Click Here and Read more links and the good example shows more descriptive and unique text samples

Descriptive and unique links

All of the accessibility barriers surrounding poor link practices affect how LLMs evaluate their importance. Link text is a short textual signal that is vectorized to make proper retrieval possible. Vague link text like “Click here” or “Learn More” does not provide valuable signals. In fact, the same “Learn More” text multiple times on a page can dilute the signals for the URLs they point to.

Using the same link text for more than one destination URL creates a knowledge conflict. Like people, an LLM is subject to “anchoring bias,” which means it is likely to overweight the first link it processes and underweight or ignore the second, since both carry the same text signal.

Example of the duplicate-link problem: <a href="[URL-A]">Duplicate Link Text</a>, and then later in the same article, <a href="[URL-B]">Duplicate Link Text</a>. Conversely, when the same URL is used more than once on a page, the same link text should be repeated exactly.
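A sketch of both patterns, with hypothetical URLs and link text:

```html
<!-- Vague and duplicated: two destinations share one weak text signal -->
<a href="/events/fall-festival">Learn more</a>
<a href="/itineraries/weekend-trip">Learn more</a>

<!-- Descriptive and unique: each link's text describes its destination -->
<a href="/events/fall-festival">See the Fall Festival schedule</a>
<a href="/itineraries/weekend-trip">Browse our 48-hour weekend itinerary</a>
```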

Illustration of plain language with a poor example and a more positive example. The poor example is dense and wordy while the good example is succinct and uses a list to break the text into chunks.

Logical order and readable content

Simple, direct sentences (one fact per sentence) produce cleaner embeddings for LLM retrieval. Human accessibility best practices of plain language and clear structure are the same practices that improve chunking and indexing for LLMs.

For Technical Teams — IT, Developers, Engineers

An illustration of poor semantic structure, where the left shows a potential structure made only of HTML div elements, while the good example shows semantic elements used correctly.

Poorly structured semantic HTML

Semantic elements (<article>, <nav>, <main>, <h1>, etc.) add context and suggest relative ranking weight. They make content boundaries explicit, which helps retrieval systems isolate your content from less important elements like ad slots or lists of related articles. 
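A simplified before-and-after, with hypothetical class names and content:

```html
<!-- Generic containers: no machine-readable content boundaries -->
<div class="wrap">
  <div class="head">Site menu…</div>
  <div class="body">Article text…</div>
  <div class="side">Related articles…</div>
</div>

<!-- Semantic elements: the primary content is explicit and separable -->
<nav>Site menu…</nav>
<main>
  <article>
    <h1>Article title</h1>
    <p>Article text…</p>
  </article>
</main>
<aside>Related articles…</aside>
```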

Illustration of data in written form as one way to parse information, but contrasted with schema markup which can make it easier for robots to collect correct information about a subject.

Lack of schema

This is technical and under the hood of your human-readable content. Machines love additional context, and structured schema data is how facts are declared in code — product names, prices, event dates, authors, etc. Search engines have used schema for rich results, and LLMs are no different. Right now, server-rendered schema data ensures the widest visibility, as not all crawlers execute client-side JavaScript completely.
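For example, a server-rendered JSON-LD block using schema.org’s Event type (the values here are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Harvest Wine Festival",
  "startDate": "2025-10-04T11:00",
  "location": {
    "@type": "Place",
    "name": "Riverside Park",
    "address": "Healdsburg, CA"
  },
  "offers": {
    "@type": "Offer",
    "price": "25.00",
    "priceCurrency": "USD"
  }
}
</script>
```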

How to make accessibility even more actionable

The work of digital accessibility is often pushed to the bottom of the priority list. But there are additional ways to frame this work as high value. While it is beneficial for SEO, our recent research shows that it continues to be impactful in the new and evolving world of GEO.

If you need to frame an argument to those who control the investments of time and money, some talking points are:

Staying steady in the storm

Let’s be clear — this summer was a “generative AI search freak out.” Content teams have scrambled to get smart about LLM-powered search quickly while search providers rolled out new tools and updates weekly. It’s been a tough ride in a rough sea of constant change.

To counter all that, know that the fundamentals are still strong. If your team has been using accessibility as a measure for content effectiveness and SEO discoverability, don’t stop now. If you haven’t yet started, this is one more reason to start applying these principles today.

If you have questions within this rapidly evolving landscape, talk to us about SEO, GEO, content strategy, and accessibility conformance. Ask about our training and documentation available for content teams.

As a digital services firm partnering with destination marketing organizations (DMOs) across the U.S., we’re helping teams navigate what’s already proving to be a volatile 2025—especially on the inbound side. Analysis from the World Travel & Tourism Council (WTTC) projects a stark reality: the U.S. economy will miss out on $12.5 billion in international visitor spending this year, with inbound spend expected to dip to just under $169B, down from $181B in 2024. Even more concerning, the U.S. is the only country among 184 economies in WTTC’s study forecast to see an inbound-spend decline this year.

While external market forces remain largely beyond control, we’ve identified three strategic areas where DMOs can focus their digital platforms to weather this storm and continue demonstrating measurable demand to their partners.

1. Transform Content Into Action-Driving Experiences

Why this strategic shift matters now

With inbound spend shrinking by $12.5B and key feeder markets weakening, undecided travelers need clarity and confidence to choose your destination. Content that reduces uncertainty and highlights immediate value converts better than generic inspiration.

Strategic implementation approach

Activate “Go Now” signals. Combine always-on inspiration with time-sensitive reasons to visit—shoulder-season value, midweek deals, cooling weather breaks—strategically mapped to the soft periods your analytics reveal. 

Elevate discovery through intelligent architecture. Curate SEO-optimized content hubs organized by Themes (outdoors, arts, culinary) and Moments (fall colors, winter lights). Implement structured data (FAQ, Event, Attraction) with strategic internal linking architecture so travelers find relevant options fast.
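As one sketch of the structured data mentioned here, a schema.org FAQPage block (the question and answer are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "When is the best time to see fall colors?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Peak foliage typically runs from late September through mid-October."
    }
  }]
}
</script>
```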

Deploy micro-itineraries for immediate conversion. Design 24–48-hour “micro-itins” featuring embedded maps, transit and parking guidance, and seamless handoffs to bookable partners. Partnering with platforms like MindTrip reduces content team effort while accelerating output—a strategy that’s proven particularly effective for our DMO clients facing resource constraints.

Authority-driven event content optimization. Event pages generate the highest intent traffic. Enhance them with rich media, last-minute planning resources, and strategic “if sold-out, try this” alternatives.

Transparent value communication. Feature free experiences prominently, implement intuitive budget filters, and deploy “Best Time to Visit” calendars comparing crowds and pricing by week and month. Transparency builds trust, and trust drives conversion.

2. Build Your Competitive Moat Through Data-Driven Audience Cultivation

Your first-party data represents your most defensible competitive advantage. As platform targeting becomes increasingly constrained and inbound spending softens, DMOs that build and activate their own audience will capture attention far more efficiently than those relying solely on paid channels.

Strategic audience development

Implement high-intent capture everywhere. Deploy contextual email and SMS prompts across high-intent templates—events, itineraries, trip planners, partner directories. Offer valuable micro-perks like exclusive maps and early event alerts. 

Master progressive profiling. Collect visitor preferences—season, interests, party type, origin market—over multiple touchpoints rather than overwhelming users with lengthy initial forms. 

Create actionable audience segments. Develop cohorts around 2025’s market realities: last-minute planners, shoulder-season seekers, road-trippers, value hunters, family weekenders, and meetings planners. 

Future-proof attribution systems. Combine GA4 with server-side tagging and standardized UTM schemas for every partner handoff. Track outbound clicks, partner session quality, itinerary saves and usage, offer redemptions, and newsletter-driven sessions. This comprehensive approach ensures you maintain visibility into conversion paths as third-party cookies disappear.

Deploy trend-driven editorial strategy. Develop weekly dashboards blending organic query trends, on-site search terms, partner click-through rates, and feeder-market signals. When interest dips in one market, pivot homepage modules and paid social toward value and itinerary content targeting more resilient markets.

3. Transform Partner Relationships Through Measurable Value Delivery

In a softening inbound environment where domestic spending carries approximately 90% of the economic load, your partners need two critical elements: qualified attention and proof of conversion. Your website should function as the region’s premier meta-directory and conversion engine.

Experience optimization strategies

Enable one-click handoffs with context preservation. Pass user filters—dates, neighborhoods, price ranges—directly into partner sites and booking engines while preserving state if travelers return. 

Deploy persistent trip planning tools. Allow users to save places and generate shareable itineraries with intelligent handoffs: “Book these two hotels,” “Reserve rentals,” “Get festival passes.” 

Create compelling partner storefronts. Develop rich partner profiles featuring availability widgets, authentic reviews, social proof, and clear calls-to-action. 

Implement strategic co-op modules. Design paid placements that provide value rather than feeling like advertisements: “Local Favorites” carousels, sponsor highlights, seasonal deal tiles—rotated by audience cohort and season. This generates additional revenue while maintaining user experience quality.

Establish closed-loop reporting systems. Standardize UTM tracking, monitor outbound events, and where permitted, implement partner pixels and offer codes to report assisted conversions by category and campaign. Partners need proof of ROI, and data-driven reporting builds stronger, more profitable relationships.

How Oomph Can Accelerate Your Success

If you’re experiencing softer international interest, shorter booking windows, or declining partner satisfaction, you’re facing the same challenges as DMOs nationwide. The organizations pulling ahead aren’t waiting for market recovery—they’re strengthening their digital platforms through strategic content optimization, systematic audience cultivation, and demonstrable partner value creation.

Our proven methodology transforms these challenges into competitive advantages.

We’ll conduct a comprehensive audit of your digital platform against these three strategic pillars, quantify immediate optimization opportunities, and provide your partners with what they need most: qualified, measurable demand. The market headwinds are real, but the right strategic approach can help you maintain resilience and emerge stronger when conditions improve. Let’s navigate these challenges together.


The Brief

Visit California is a Destination Marketing Organization (DMO) with over 25 years of experience in promoting California as a premier travel destination. The organization, funded through a unique partnership between the state and the travel industry, operates with a substantial budget of over $185 million (2023). As the leader of California’s brand messaging, Visit California previously anchored its campaigns under the “Dream Big” brand positioning. However, following a significant shift in traveler motivations post-pandemic, Visit California recognized a growing desire for experiences that foster joy, connection, and adventure.

This insight led to the evolution of the brand into “The Ultimate Playground,” a strategic repositioning that highlights the state’s unparalleled diversity of geography, activities, and cultural experiences. 

The “Let’s Play” campaign leveraged a robust mix of television, out-of-home, digital, and social media activations to showcase California’s diverse offerings — from wine tasting and rock climbing to luxury hotel stays, food truck adventures, and outdoor music festivals. By highlighting the playful spirit inherent in every experience, the campaign aimed to deepen the audience’s emotional connection to the state, reinforcing California’s identity as The Ultimate Playground and setting the stage for sustained brand engagement.


The APPROACH

The initial roll-out of a rebrand update is critical, and transitioning from “Dream Big” to “The Ultimate Playground” required careful internal alignment and thorough message testing. Oomph, alongside Visit California’s partner agencies, began collaborating on this effort about nine months before the campaign launch. During these early planning meetings, our team contributed key ideas and strategic approaches to help shape the campaign.

Our focus remained on the web user and their position in the customer journey. Previous campaigns had not fully optimized the user flow, so we saw this as an opportunity to reimagine the experience. With paid media driving traffic to the website, it was essential to provide visitors with clear actions once they arrived. The advertising had done its job by capturing attention and sparking interest. Now, our challenge was to build on that momentum, guiding users from interest to meaningful engagement and action.

An interactive Quiz 

The Ultimate Playground campaign needed to accomplish a few things:

  • Educate the consumer about the new brand positioning and why California should be considered the Ultimate Playground
  • Inform the consumer about play and how it is not just for kids and theme parks. Adults can and should play as well, and serious activities can be conducted in playful ways
  • Activate the consumer with inspiration by giving them a unique experience and curating inspirational, playful activities throughout the state

Our teams settled on a quiz as a way to engage visitors and serve them personalized content. Based on initial research, we decided an image-based quiz would be the fastest and most fun way to answer questions and receive a set of recommendations. Choosing preferences from a set of images is a quick way to make progress tangible. We limited the questions to nine, and most visitors took two minutes to complete the quiz.

Play Styles

The eight Play Styles were based on personas researched and created by the National Institute for Play, headquartered in California. Content creators at Visit California crafted a series of TV spots with glimpses into different styles of play. Our Play Quiz would highlight which Play Style matched the participant’s preferences, and our results pages served relevant, curated content, a similar Celebrity personality, and even a secondary play style. 

Email collection allowed visitors to send their play style results to themselves and allowed opt-in to more personalized content. Our team worked quickly over three months to solidify the approach, choose the quiz method and weighting criteria of the questions, and design the eight play style pages, two landing pages, a homepage takeover, and supporting pages for the new campaign. 


The Results

Play Quiz: Avg. session duration

3.01 m

Play Styles: Avg. session duration

2.25 m

Compared to Site: Avg. session duration

0.36 m

Our approach to the campaign was to support the bottom of the funnel and give visitors coming from digital ads something useful. Given the wealth of content the Visit California website contains, these broad Play Style personas helped visitors see themselves in California. The quiz brought curated content to them and provided what we thought of as a personal homepage with relevant recommendations.

“Let’s Play” was the first part of a years-long brand campaign. We are already working on the campaign for 2025 which we hope will be even more engaging than the first!

The tech industry has never been accused of moving slowly. The exponential explosion of AI tools in 2024, though, sets a new standard for fast-moving. The past few months of 2024 rewrote what happened in the past few years. If you have not been actively paying attention to AI, now is the time to start.

I have been intently watching the AI space for over a year. I started from a place of great skepticism, not willing to internalize the hype until I could see real results. I can now say with confidence that when applied to the correct problem with the right expectations, AI can make significant advancements possible no matter the industry.

In 2024, not only did the large language models get more powerful and extensible, but the tools are being created to solve real business problems. Because of this, skepticism about AI has shifted to cautious optimism. Spurred by the Fortune 500’s investments and early impacts, companies of every shape and size are starting to harness the power of AI for efficiency and productivity gains.

Let’s review what happened in Quarter Four of 2024 as a microcosm of the year in AI.

New Foundational Models in the AI Space

A foundational large language model (LLM) is one which other AI tools can be built from. The major foundational LLMs have been ChatGPT, Claude, Llama, and Gemini, operated by OpenAI & Microsoft, Anthropic, Meta, and Google respectively.

In 2024, additional key players entered the space to create their own foundational models. 

Amazon

Amazon has been pumping investments into Anthropic, as its operations are huge consumers of AI to drive efficiency. With its own internal foundational LLM, Amazon could remove the need to share operational data with an external party. Further, as it did with its AWS business, it can monetize its own AI services with its own models. Amazon Nova was launched in early December.

xAI

In May of 2024, xAI secured funding to create and train its own foundational models. Founder Elon Musk was a co-founder of OpenAI. The company announced it would build the world’s largest supercomputer in June, and it was operational by December.

Nvidia

In October, AI chip-maker Nvidia announced its own LLM, named Nemotron, to compete directly with OpenAI and Google — organizations that rely on its chips to train and power their own LLMs.

Rumors of more to come

Apple Intelligence launched slowly in 2024 and uses OpenAI’s models. Industry insiders think it is natural to expect Apple to create its own LLM and position it as a privacy-first, on-device service. 

Foundational Model Advancements

While some companies are starting to create their own models, the major players have released advanced tools that can use a range of inputs to create a multitude of outputs: 

Multimodal Processing

AI models can now process and understand multiple types of data together, such as images, text, and audio. This allows for more complex interactions with AI tools. 

Google’s NotebookLM was a big hit this year for its ability to use a range of data as sources, from Google Docs to PDFs to web links for text, audio, and video. The tool essentially allows the creation of small, custom retrieval-augmented generation (RAG) databases to query and chat with.

Advanced Reasoning

OpenAI’s o1 reasoning model (pronounced “Oh One”) uses step-by-step “Chain of Thought” reasoning to solve complex problems, including math, coding, and scientific tasks. This has led to AI tools that can draw conclusions, make inferences, and form judgments based on information, logic, and experience. The queries take longer but are more accurate and provide more depth.

Google’s Deep Research is a similar product that was released to Gemini users in December.

Enhanced Voice Interaction

More and more AI tools can engage in natural and context-aware voice interactions — think Siri, but way more useful. This includes handling complex queries, understanding different tones and styles, and even mimicking personalities such as Santa Claus.

Vision Capabilities

AI can now “see” and interpret the world through cameras and visual data. This includes the ability to analyze images, identify objects, and understand visual information in real time. Examples include Meta’s DINOv2, OpenAI’s GPT-4o, and Google’s PaliGemma.

AI can also interact with screen displays on devices, allowing for a new level of awareness of sensory input. OpenAI’s desktop app for Mac and Windows is contextually aware of what apps are available and in focus. Microsoft’s Copilot Vision integrates with the Edge browser to analyze web pages as users browse. Google’s Project Mariner prototype allows Gemini to understand screen context and interact with applications.

While still early and fraught with security and privacy implications, the technology will lead to more advancements for “Agentic AI” which will continue to grow in 2025.

Agentic Capabilities

AI models are moving towards the ability to take actions on behalf of users. No longer confined to chat interfaces alone, these new “Agents” will perform tasks autonomously once trained and set in motion.

Note: Enterprise leader Salesforce launched Agentforce in September 2024. Despite the name, these are not autonomous agents in the same sense. Custom agents must be trained by humans and given instructions, parameters, prompts, and success criteria. Right now, these agents are more like interns that need management and feedback.

Specialization

2024 also saw an increase in models designed for specific domains and tasks. With reinforcement fine-tuning, companies are creating tools for legal, healthcare, finance, stocks, and sports. 

Examples include Sierra, which offers a specifically trained customer service platform, and LinkedIn’s hiring-assistant agents.

What this all means for 2025

It’s clear that AI models and tools will continue to advance, and businesses that embrace AI will be in a better position to thrive. To be successful, businesses need an experimental mindset of continuous learning and adaptation: 

While the models will continue to get better into 2025, don’t wait to explore AI. Even if the existing models never improve, they are powerful enough to drive significant gains in business. Now is the time to implement AI in your business. Choose a model that makes sense and is low-friction — if you are an organization that uses Microsoft products, start with a trial of AI add-ons for office tools, for example. Start accumulating experience with the tools at hand, and then expand to include multiple models to evaluate more complex AI options that may have greater business impact. It almost doesn’t matter which you choose, as long as you get started.

Oomph has started to experiment with AI ourselves and Drupal has exciting announcements about integrating AI tools into the authoring experience. If you would like more information, please reach out for a chat.


THE BRIEF

When Seraphic Group’s founder, Zach Bush, MD, saw patterns in people’s health linked directly to problems with the food supply, he became an advocate for regenerative farming. As a potential solution to deteriorating public health, global warming, and even poverty, regenerative farming offers benefits for local and global communities. But, getting farmers to switch to it from conventional techniques is a challenge.

Regenerative farming is good for the environment and the economy in the long run—but, short term, it’s more work and more expensive than chemical-heavy, conventional farming. Add in that the appropriate techniques depend on variables like geography, soil type, and climate, and it’s a difficult thing for people to figure out on their own.

Their platform idea, Atlus∗U, needed to not only educate farmers about regenerative agriculture, but also motivate them to try it, and stick with it, for the long haul.


THE APPROACH

Understanding the Educational Purpose

As we noted in an article on different types of online learning platforms, a platform’s educational purpose determines the tools and features that will best achieve its objectives. Atlus∗U spans two purpose categories, Student Stakes Learning and Broad Stakes Learning, which means that effective education is crucial for both the learners and their larger communities.

To that end, our design vision focused heavily on content comprehension, along with keeping users motivated and engaged. Our framework included educational content and tools, accountability systems, and community features. A key component was personal stories: sharing the experiences of farmers who had successfully converted their businesses to regenerative farming and could help and encourage others to do the same.

Above all, Seraphic wanted Atlus∗U to grow and evolve over time as a kind of living guide to regenerative farming. While most online learning platforms stop when the coursework ends (think of a CPR course, where you get a certificate and you’re done), for this platform, the end of the coursework was just the beginning of the journey.


THE RESULTS

In our design, the whole community drives the learning experience, not just the teachers and coursework. It’s easy for students to connect with others who are taking the same courses, while members-only forums provide a place for productive networking, questions, stories, and support. Some forums are attached to specific lessons, so that the dialogue isn’t just between teachers and students; all members, including alumni, can participate and share their learnings on a given topic.

Another component, the accountability partner system, was crucial for achieving Seraphic’s goal of driving lasting change. Research shows that publicly sharing a goal gives people a 65% chance of success, while reporting to a specific accountability partner boosts that chance to 95%.

Finally, our learning tools were designed to enhance both content comprehension and retention. Course videos were a key feature, designed not just for the course, but for reference over time. Students have the ability to bookmark videos and attach notes to specific sections, letting them revisit important info whenever they need it.


THE IMPACT

While online learning has been around for a long time, recent advancements in design and functionality make it possible for learning platforms to have a transformative impact on individuals and across society.

In the case of Atlus∗U, it’s not just the coursework that drives users’ learning; an entire community is mobilized to help you succeed. With a focus on collaborative, lifelong learning, our design brings together farmers from around the world to improve their business, grow healthier food, and protect our world.

Need help building an effective online learning platform? Let’s talk about your goals and how to achieve them.

Oomph has been quiet about our excitement for artificial intelligence (A.I.). While the tech world has exploded with new A.I. products, offerings, and add-ons to existing product suites, we have been formulating an approach to recommend A.I.-related services to our clients. 

One of the biggest reasons why we have been quiet is the complexity and the fast pace of change in the landscape. Giant companies have been trying A.I. with some loud public failures. The investment and venture capitalist community is hyped on A.I. but has recently become cautious as productivity and profit have not been boosted. It is a familiar boom-then-bust of attention that we have seen before — most recently with AR/VR after the Apple Vision Pro launch five months ago, and previously with the Metaverse, Blockchain/NFTs, and Bitcoin.

There are many reasons to be optimistic about applications for A.I. in business. And there continue to be many reasons to be cautious as well. Just like any digital tool, A.I. has pros and cons and Oomph has carefully evaluated each. We are sharing our internal thoughts in the hopes that your business can use the same criteria when considering a potential investment in A.I. 

Using A.I.: Not If, but How

Most digital tools now have some kind of A.I. or machine learning built into them. A.I. has become ubiquitous, embedded in many systems we use every day. Given investor hype for companies leveraging A.I., more and more tools are likely to incorporate it.

This is not a new phenomenon. Grammarly has been around since 2015 and by many measures, it is an A.I. tool — it is trained on human-written language to provide contextual corrections and suggestions for improvements.

Recently, though, embedded A.I. has exploded across markets. Many of the tools Oomph team members use every day have A.I. embedded in them, across sales, design, engineering, and project management — from Google Suite and Zoom to Github and Figma.

The market has already decided that business customers want access to time-saving A.I. tools. Some welcome these options, and others will use them reluctantly.

Either way, the question has very quickly moved from “Should our business use A.I.?” to “How can our business use A.I. tools responsibly?”

The Risks That A.I. Poses

Every technological breakthrough comes with risks. Some pundits (both for and against A.I. advancements) have likened its emergence to the Industrial Revolution. A similarly high level of positive impact is possible, while the cultural, societal, and environmental repercussions could also follow a similar trajectory.

A.I. has its downsides. When evaluating A.I. tools as a solution to our clients’ problems, we keep this list of drawbacks and negative effects handy, so that we can review it and think about how to mitigate each one:

We have also found that our company values are a lens through which we can evaluate new technology and any proposed solutions. Oomph has three cultural values that form the center of our approach and our mission, and we add our stated 1% for the Planet commitment to that list as well: 

For each of A.I.’s drawbacks, we use the lens of our cultural values to guide our approach to evaluating and mitigating those potential ill effects. 

A.I. is built upon biased and flawed data

At its core, A.I. is built upon terabytes of data and billions, if not trillions, of individual pieces of content. Training data for Large Language Models (LLMs) like ChatGPT, Llama, and Claude encompasses mostly public content, plus special subscriptions arranged with data providers like the New York Times and Reddit. Image generation tools like Midjourney and Adobe Firefly require billions of images for training and have skirted similar copyright issues while gobbling up as much free public data as they can find.

Because LLMs require such a massive amount of data, it is impossible to curate those data sets to only what we may deem as “true” facts or the “perfect” images. Even if we were able to curate these training sets, who makes the determination of what to include or exclude?

The training data would need to be free of bias and free of sarcasm (a very human trait) for it to be reliable and useful. We’ve seen this play out with sometimes hilarious results. Google “A.I. Overviews” have told people to put glue on pizza to prevent the cheese from sliding off or to eat one rock a day for vitamins & minerals. Researchers and journalists traced these suggestions back to the training data from Reddit and The Onion.

Information architects have a saying: “All Data is Dirty.” It means no one creates “perfect” data, where every entry is reviewed, cross-checked for accuracy, and evaluated by a shared set of objective standards. Human bias and accidents always enter the data. Even the simple act of deciding what data to include (and therefore, which data is excluded) is bias. All data is dirty.

Bias & flawed data lead to the perpetuation of stereotypes

Many of the drawbacks of A.I. are interrelated — dirty data is tied directly to D.E.I. Gender and racial biases surface in the answers A.I. provides, and A.I. will perpetuate the harms those biases produce as its tools become easier to use and more prevalent. These are harms society has only recently begun grappling with in a deep and meaningful way, and A.I. could roll back much of our progress.

We’ve seen this start to happen. Early reports from image creation tools describe an inherent white European male bias — ask for an image of someone in a specific occupation, and you receive mostly white males in the results, unless that occupation is stereotypically “women’s work.” When A.I. is used to perform HR tasks, the software often advances those it perceives as male more quickly and penalizes applications that contain female names and pronouns.

The bias is in the data and very, very difficult to remove. The entirety of digital written language over-indexes privileged white Europeans who can afford the tools to become authors. This comparably small pool of participants is also dominantly male, and the content they have created emphasizes white male perspectives. To curate bias out of the training data and create an equally representative pool is nearly impossible, especially when you consider the exponentially larger and larger sets of data new LLM models require for training.

Further, D.E.I. overflows into environmental impact. Last fall, the Fifth National Climate Assessment outlined the country’s climate status. Not only is the U.S. warming faster than the rest of the world, but the assessment directly linked reductions in greenhouse gas emissions with reducing racial disparities. Climate impacts are felt most heavily in communities of color and low-income communities; therefore, climate justice and racial justice are directly related.

Flawed data leads to “Hallucinations” & harms Brands

“Brand Safety” and How A.I. can harm Brands

Brand safety is the practice of protecting a company’s brand and reputation by monitoring online content related to the brand. This includes content the brand is directly responsible for creating about itself as well as content created by authorized agents (most typically customer service reps, but now A.I. systems as well).

The data that comes out of A.I. agents reflects on the brand employing the agent. A real-life example comes from Air Canada: its A.I. chatbot gave a customer an answer that contradicted the information at the URL the chatbot provided. The customer chose to believe the A.I. answer, while the company argued it could not be held responsible if the customer didn’t follow the URL to the more authoritative information. In court, the customer won and Air Canada lost, resulting in bad publicity for the company.

Brand safety can also be compromised when a third party feeds proprietary client data into A.I. tools. Some terms-and-conditions statements for A.I. tools are murky, while others are direct. Midjourney’s terms state:

“By using the Services, You grant to Midjourney […] a perpetual, worldwide, non-exclusive, sublicensable no-charge, royalty-free, irrevocable copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, sublicense, and distribute text and image prompts You input into the Services” 

Midjourney’s Terms of Service Statement

That makes it pretty clear that by using Midjourney, you implicitly agree that your data will become part of their system.

The implication that our clients’ data might become available to everyone is a huge professional risk that Oomph avoids. Even using ChatGPT to summarize content covered by an NDA can open hidden risks.

What are “Hallucinations” and why do they happen?

It’s important to remember how current A.I. chatbots work. Like a smartphone’s predictive text tool, LLMs form statements by stitching together words, characters, and numbers based on the probability of each unit succeeding the previously generated units. The predictions can be very complex, adhering to grammatical structure and situational context as well as the initial prompt. Even so, the models do not truly understand language or context.
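As a drastically simplified illustration, the next-word mechanic can be sketched with a toy lookup table. Real LLMs predict over learned vector representations, not tables, and every word and probability below is invented, but the core idea of always choosing a statistically likely next unit is the same:

```javascript
// Toy next-word predictor: a tiny table of invented follow-word
// probabilities stands in for the learned model. Real LLMs do not use
// lookup tables, but the "most probable next unit" idea is the same.
const bigrams = {
  put: { glue: 0.6, cheese: 0.4 },
  glue: { on: 0.9, in: 0.1 },
  on: { pizza: 1.0 },
};

// Greedy decoding: always pick the most probable next word.
function nextWord(word) {
  const candidates = bigrams[word];
  if (!candidates) return null; // nothing learned for this word
  return Object.entries(candidates).sort((a, b) => b[1] - a[1])[0][0];
}

// Chain predictions into a "statement" with no understanding involved.
let sentence = ["put"];
while (true) {
  const next = nextWord(sentence[sentence.length - 1]);
  if (!next) break;
  sentence.push(next);
}
console.log(sentence.join(" ")); // "put glue on pizza"
```

The chain happily produces a fluent, confident, and wrong sentence, which is essentially what a hallucination is.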

At best, A.I. chatbots are a mirror that reflects how humans sound without a deep understanding of what any of the words mean. 

A.I. systems try their best to provide an accurate and truthful answer without a complete understanding of the words they are using. A “hallucination” can occur for a variety of reasons, and it is not always possible to trace its origins or reverse-engineer it out of a system.

As many recent news stories attest, hallucinations are a huge problem with A.I. Companies like IBM and McDonald’s couldn’t get hallucinations under control and have pulled A.I. from their stores because of the headaches it causes. If they can’t make their investments in A.I. pay off, it makes us wonder about the usefulness of A.I. for consumer applications in general. And all of these gaffes hurt consumers’ perception of the brands and the services they provide.

Poor A.I. answers erode Consumer Trust

The aforementioned problems with A.I. are well-known in the tech industry. In the consumer sphere, A.I. has only just started to break into the public consciousness. Consumers are outcome-driven. If A.I. is a tool that can reliably save them time and reduce work, they don’t care how it works, but they do care about its accuracy. 

Consumers are also misinformed, or have only a surface-level understanding of how A.I. works. In one study, only 30% of people correctly identified six different applications of A.I. People don’t have a complete picture of how pervasive A.I.-powered services already are.

The news media loves a good fail story, and A.I. has been providing plenty of those. With most of the media coverage of A.I. being either fear-mongering (“A.I. will take your job!”) or about hilarious hallucinations (“A.I. suggests you eat rocks!”), consumers will be conditioned to mistrust products and tools labeled “A.I.” 

And for those who have had a first-hand experience with an A.I. tool, a poor A.I. experience makes all A.I. seem poor. 

A.I.’s appetite for electricity is unsustainable

The environmental impact of our digital lives is invisible. Cloud services that store a lifetime of photographs sound like feathery, lightweight repositories; in reality, they are giant, electricity-guzzling warehouses full of heat-producing servers. Cooling these data factories and supplying the electricity to run them is a major infrastructure issue for cities around the country. And then A.I. came along.

While difficult to quantify, there are some scientists and journalists studying this issue, and they have found some alarming statistics: 

While the consumption needs are troubling, quickly building more infrastructure to support them is not possible. New energy grids take multiple years and millions, if not billions, of dollars of investment. Parts of the country are already straining under our current energy needs and will continue to do so — peak summer demand is projected to grow by 38,000 megawatts nationwide in the next five years.

While a data center can be built in about a year, it can take five years or longer to connect renewable energy projects to the grid. While most new power projects built in 2024 are clean energy (solar, wind, hydro), they are not being built fast enough. And utilities note that data centers need power 24 hours a day, something most clean sources can’t provide. It should be heartbreaking that carbon-producing fuels like coal and gas are being kept online to support our data needs.

Oomph’s commitment to 1% for the Planet means that we want to design specific uses for A.I. instead of very broad ones. The environmental impact of A.I.’s energy demands is a major factor we consider when deciding how and when to use A.I.

Using our Values to Guide the Evaluation of A.I.

As we previously stated, our company values provide a lens through which we can evaluate A.I. and look to mitigate its negative effects. Many of the solutions cross over to mitigate more than one effect, and all represent a shared commitment to extracting the best results from any tool in our set.

Smart

Driven

Personal

1% for the Planet

In Summary

While this article may read as strongly anti-A.I., we still have optimism and excitement about how A.I. systems can be used to augment and support human effort. Tools created with A.I. can make tasks and interactions more efficient, help non-creatives jumpstart their creativity, and eventually become agents that assist with complex tasks that are draining and unfulfilling for humans to perform.

For consumers or our clients to trust A.I., however, we need to provide ethical evaluation criteria. We cannot use A.I. as a solve-all tool when it has clearly displayed limitations. We aim to continue to learn from others, experiment ourselves, and evaluate appropriate uses for A.I. with a clear set of criteria that aligns with our company culture.

To have a conversation about how your company might want to leverage A.I. responsibly, please contact us anytime.


Additional Reading List


THE BRIEF

Same Look, Better Build

Ordinarily, when we embark on rearchitecting a site, it happens as part of a complete front-end and back-end overhaul. This was a unique situation. Visit California users enjoyed the site’s design and helpful content features, so we did not want to disrupt that. At the same time, we needed to upgrade the frustrating back-end experience, look for broken templates, and find optimizations in content and media along the way.   

An underperforming API (which functions like an information pipeline to move content from one part of the site to another) and bloated data/code resulted in sluggish site performance and slow content updates/deployments. If the Visit California team wanted to change a single sentence on the site, pushing it live took well over an hour, sometimes longer — and often the build failed. Poorly optimized images slowed the site down even further, especially for the mobile visitors who make up the majority of site traffic. 

They were in dire need of an overhaul of their decoupled site’s connections so they could: 

  • Reduce time and effort spent on updating site content
  • Implement a more reliable build process, decreasing frustration and delays
  • Create a better, faster browsing experience for users

THE APPROACH

Oomph started by looking under the hood — or, in this case, under the APIs. While APIs are supposed to make sites perform better, an outdated API was at the root of Visit California’s problem. Over the course of the project, Oomph integrated a new API, optimized images, and corrected bottlenecks across the site to make updates a breeze.

Putting Visit California in the Fast Lane

Implemented a New API

Visit California needed an API that could move data from the back end to the front end more quickly. Two previous clients shared Visit California’s back-end architecture but used the modern JSON API Drupal module successfully. Switching from the GraphQL module to JSON API on the back end streamlined the amount of data being processed, so the team could update content or code in minutes instead of hours or days.

Streamlined Data During Deployments

On the front end, a Gatsby Source GraphQL plugin contributed to the issue by pulling and refreshing all data from the entire system with each content update. Oomph replaced the faulty plugin, which had known limitations and lacked support, with the Gatsby Source Drupal plugin.  On the back end, the Gatsby Integration module was configured to work with JSON API to provide incremental builds — a process that pulls only updated content for faster deployments.
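In a Gatsby project, that swap happens in gatsby-config.js. The sketch below shows the general shape of the change: the plugin names are real, but the option values are placeholders, not Visit California's actual configuration.

```javascript
// gatsby-config.js — replacing gatsby-source-graphql with
// gatsby-source-drupal. Option values here are illustrative only.
const config = {
  plugins: [
    {
      resolve: "gatsby-source-drupal",
      options: {
        baseUrl: "https://cms.example.com", // hypothetical Drupal host
        apiBase: "jsonapi", // default JSON:API path prefix in Drupal
      },
    },
  ],
};

module.exports = config;
```

With the Gatsby Integration module configured on the Drupal side, saves can then trigger incremental builds that pull only the changed content.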

  • Avg. full build time: 64 min
  • Avg. incremental build time: 42 min
  • Unexplained failure rate before: 52%
  • Unexplained failure rate after: 0%

Fixed Image Processing Bottlenecks

Because we were already in the code, both teams agreed this was a great opportunity to identify improvements that would boost page performance. We found that image processing was a drag — the site processed images during deployment rather than ahead of time on the back end. Oomph used the JSON API Image Styles module to create image derivatives (copies in different sizes) in advance, ultimately decreasing build times.

Lightened the Load on the Back-End

As Oomph configured the new architecture, we scoured the site for other opportunities to reduce cruft. Additional improvements included removing deprecated code and rewriting code responsible for creating front-end pages, eliminating static queries running thousands of times during page creation. We also resized large images and configured their Drupal site to set sizing guardrails for photos their team may add in the future.

Home page weight before and after:

Page Weight | Before   | After   | % Change
Desktop     | 25.41 MB | 3.61 MB | Down 85.79%
Mobile      | 12.07 MB | 3.62 MB | Down 70.01%

Visualizing the improvements to loading speed:

Core Web Vitals Improvements:


THE RESULTS

Exploring the Golden State, One Story at a Time

Once Oomph was done, the Visit California site looked the same, but the load times were significantly faster, making the site more easily accessible to users. By devising a strategy to pull the same data using completely different methods, Oomph created a streamlined deployment process that was night and day for the Visit California team. 

The massive initiative involved 75,000 lines of code, 23 front-end templates, and plenty of collaboration, but the results were worth it: a noticeably faster site, a markedly less frustrating authoring experience, and page performance that would make any Californian proud.

Have you ever waved to someone and they didn’t wave back? Awkward, right? But are you sure they could see you and recognize you? Was the sun in their eyes? Were you too far away? Were you wearing a face mask?

There is a similar situation with your branding on your website. On a smaller mobile device, is your logo legible, or are the words shrunk down and too small? Are the colors high-contrast enough to be seen on a sunny day? Is there consistency between your social media avatar and your website, between your print materials and your digital advertising? Can customers recognize your brand wherever it might be displayed? 

For your brand to be the most successful, it takes a little extra effort to think through all of these possible scenarios. But it’s worth it: otherwise, your customers may give you the cold shoulder, whether they intend to or not.

This extra bit of strategy and planning around your brand is called “Responsive Branding.” Just like responsive design, where your website’s content adapts to the device a customer is using, responsive branding adapts to the device, the medium, and the platform while also considering situations like low light, high light, animated, or static.

Oomph works with organizations across industries to build or refresh responsive brands that serve and delight their users across the full spectrum of digital experiences. Here’s what we’ve learned about responsive branding and our tips for creating one that works. 

What Is Responsive Branding?

Let’s first start with what you’ve probably already heard of — responsive web design. When Ethan Marcotte coined the term in 2010, “responsive” meant a web design that responded to the size of the screen, from a phone to a tablet to a widescreen desktop monitor.

Then came responsive logos. These take the elements of the main logo and adapt them for different sizes and use cases. A logo might have too much detail to be legible as a small social media icon, for example.

Responsive Branding blends these ideas and looks at the design system holistically. A successful responsive brand may include:

Why Responsive Branding Matters

Your business makes a huge investment in building a brand that stands apart from the competition while communicating your personality and value. You are building trust with customers through every interaction. When your brand works well in one situation but not another, it erodes trust. 

A strong brand will be clear, understandable, and memorable for all users in all situations. Whether you have physical locations or digital ones, the brand works with the same consistent strength and message every time.

When you invest in a responsive brand, you: 

3 Elements of a Responsive Brand

A responsive brand is more than a shape-shifting logo. The most responsive brands make strategic use of these three elements: 

1. Logo

Your logo is the first piece of your brand that customers will recognize. Using a single-state logo can compromise that impression — a logo that looks great at a large scale is often unintelligible as a small icon. 

Responsive logo designs help ensure your logomark is clear and impactful no matter where you apply it. Beyond the size considerations we mentioned, it should include different formats like horizontal, vertical, and square to support many different digital, social, and print platforms. 

Some other techniques we use to create scalable logos include:

Oomph Tip: It’s okay to take several design rounds to get it right. Iterating helps uncover where you’ll use the logo, what it must convey, and which colors and iconography can best support that purpose. We went through several design iterations with our client AskRI before settling on a bold, simple font and clear chat bubble icon that plays off the state of Rhode Island’s distinctive shape. 

2. Color Palette

A responsive color palette is less about picking complementary shades on a color wheel and more about creating an experience that works in all situations. People with visual impairments and people on low-lit smartphones, for example, rely on high-contrast color combinations to engage with your brand. 

Start by following the Web Content Accessibility Guidelines (WCAG), which include specific recommendations for color contrast ratios: at the AA level, at least 4.5:1 for normal text and 3:1 for large text. Combinations that meet the standard pair light text with dark backgrounds, or vice versa.
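The contrast ratio itself comes straight from WCAG's definition of relative luminance, so it can be checked programmatically rather than by eye. A minimal sketch:

```javascript
// WCAG 2.x contrast ratio between two sRGB colors, each given as
// [r, g, b] with channel values 0-255.
function luminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((c) => {
    const s = c / 255; // normalize, then linearize the channel
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Black text on a white background hits the maximum ratio of 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255])); // 21
```

A result of 4.5 or higher passes WCAG AA for normal-size text.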

Depending on where your brand appears, you may need to adjust your color palette for different settings. For example, your full-color logo might look stunning against a solid white background but become illegible against bright or dark colors. A single-color logo is useful for some digital use cases like Windows web icons and iOS pinned tabs. In non-digital spaces, single-color logos are great when color printing isn’t an option, such as with engraving or embroidery. Build out alternate color variations where necessary to make sure your palette works with you — not against you — across your materials.

Oomph Tip: If your brand palette is already set in stone, try playing with the brightness or saturation of the values to meet the recommendations. Brand colors often have a little wiggle room when combinations are already close to passing conformance ratios. Check out our article about this issue for more pointers.

3. Typography and Layouts

Responsiveness is also important to consider when structuring web pages or marketing collateral. The most legible layouts will incorporate adaptable typography with clear contrast and simple scaling. 

When selecting a font, be sure to think about: 

Oomph Tip: Don’t go it alone. Tools like Typescale and Material UI’s The Type System can simplify typography selection by recommending font sets that meet usability and scalability requirements. And the U.S. Web Design System has suggestions for which typefaces are the most accessible.
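Under the hood, tools like Typescale generate a modular scale: each step multiplies a base size by a fixed ratio. A small sketch (the base size and ratio here are common defaults, not a recommendation):

```javascript
// Modular type scale: step i is base * ratio^i, rounded to 2 decimals.
// A ratio of 1.25 is the "major third" scale many tools default to.
function typeScale(base, ratio, steps) {
  return Array.from({ length: steps }, (_, i) =>
    Math.round(base * Math.pow(ratio, i) * 100) / 100
  );
}

console.log(typeScale(16, 1.25, 5)); // [16, 20, 25, 31.25, 39.06]
```

Because every size derives from one base and one ratio, the whole hierarchy stays proportional when you adjust either value for smaller screens.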

How To Get Started With Responsive Branding

To create a responsive brand that resonates, you first have to identify what elements you need and why you need them. That second part is your secret sauce: finding a balance between a design your users can recognize and one that inspires them. 

A design audit can zero in on the needs of your brand and your audience, so you can create a responsive design system that meets both. Not sure where to start? Let’s talk.


The Brief

Cultivating A Meaningful Website

Much like planted seeds, a digital platform needs careful nurturing to ensure success. While the website originally fit the needs of its audience, as Farm Fresh Rhode Island’s (FFRI) programs have grown, so has the need to reframe its digital presence. As new content is continuously added to an inflexible structure, valuable information competes for customers’ attention, and messages fall through the cracks.

While the client team was aware of some pain points, they had no clear direction for making improvements — where to start? Which changes would have the most impact? What could they do themselves, and where did they need help? Farm Fresh RI needed a quick set of valuable deliverables that could provide a foundation of understanding and a roadmap for improvements.

The Customer Experience Audit

Our design team is passionate about helping organizations thrive. We also understand that some organizations do not have the resources to support a full-scale website redesign. Oomph has explored ways to offer value to those in FFRI’s position — an affordable, efficient set of exercises that can create a “Guiding Star” to help clients steer internal initiatives towards iterative improvements.

We created our Customer Experience Audit to provide a streamlined yet comprehensive look at an organization’s digital presence. It combines impactful exercises that uncover user experience (UX) gaps, accessibility issues, and opportunities to improve content and navigation. The goal is to empower organizations with the knowledge and direction they need to implement changes, whether independently or with our support — changes that directly address their audience’s needs and, in turn, strengthen the organization’s impact.


The Approach

Tilling the Existing Farm Fresh RI Website

We focused our audit on FFRI’s public-facing site, the primary audience’s first touchpoint. Our aim was to highlight any barriers preventing these consumers from efficiently accessing information or completing key tasks. Though there is a mobile app in the ecosystem, we focused on the introduction of the brand and its value, reasoning that if the site’s initial impression were stronger, more customers would go on to use the other digital tools.

Through the audit, we uncovered several key areas for improvement that could significantly enhance the user experience:

Accessibility Gaps  

Through automated and manual scans, we found and organized a number of accessibility and usability issues across the site.

Organizing Content

Large hero images push important content below the fold, making it difficult for users to access crucial information quickly. By placing content above the fold, the site can quickly show users they have landed in the right place. We suggested revising the layout to shorten the height of page hero imagery, which helps users reach key resources with minimal scrolling.

Finding the Content

The journey to find critical information, such as farmers’ market details, is complicated and fragmented across multiple pages. By consolidating this information, we recommended streamlining the user journey while also freeing up space to highlight other essential offerings.


The Results

Supporting a Better Harvest

This audit and roadmap project wasn’t just about identifying existing problems — it was about providing actionable next steps. It was not meant to lock Farm Fresh into working with us to complete those steps, either — even though we would be happy to — but rather to facilitate and supply them with the tools to push forward internally.

With our comprehensive roadmap in hand, Farm Fresh RI is already implementing our suggestions. The primary contact for the project noted that our audit validated many of their internal concerns and provided a clear path for solving issues they had struggled with for years. The deliverables included:

  • A SortSite accessibility audit with outlined improvements
  • Initial user journeys & information architecture “North Star” ideal to discuss internally
  • Quick wireframes for the Homepage and Farmers Market Listing page to show how a new page flow can support customers
  • Plan and direction for next steps and what they can do now
  • Conversation about how we can help in the future

Our improvement roadmap equips Farm Fresh RI to serve their community more effectively and deliver on their mission. If your organization needs data and expert advice that sets a path forward to an improved customer experience, reach out and contact us. For a small investment, your organization can gain clarity and direction with actionable short- and long-term activities.

The Digital Customer Experience Roadmap

Would your organization benefit from a Digital Customer Experience Roadmap?

  • Do you hear of customer complaints through email or phone outreach?
  • Do you feel your navigation is bloated with too much content and not enough organization?
  • Do pages look too similar, making customers miss important content or get lost within pages that all look the same?
  • Have portions of your organization “gone rogue” to create sub-sites and offshoots they can more easily control?

If your digital platforms — website, e-commerce site, mobile app — suffer from any of these common problems, our exercise will define the next steps to address them. Download our information sheet (PDF) and then get in touch with us to discuss your needs.


THE BRIEF

Never Stopping, Always Evolving

Leica Geosystems was founded on cutting-edge technology and continues to push the envelope with revolutionary products. Founded by Heinrich Wild, the company made its first rangefinder in 1921. Fast forward to the 21st century, and Leica Geosystems is the leading manufacturer of precision laser technology used for measurements in architecture, construction, historic preservation, and DIY home remodeling projects.

Oomph and Leica collaborated on an initial project in 2014 and have completed multiple projects since. We transitioned the site into a brand-new codebase on Drupal 8. With this conversion, Oomph smoothed out the Leica team’s pain points related to a multisite architecture, creating a tightly integrated single site that can still serve multiple countries, languages, and currencies.


THE CHALLENGE

Feeling the Pain Points with Multisite

Leica’s e-commerce store is active in multiple countries and languages. Managing content in a Drupal multisite environment meant managing multiple sites. Product, content, and price changes were difficult. It was Oomph’s challenge to make content and product management easier for the Leica team as well as support the ability to create new country sites on demand. Leica’s new e-commerce site needed to support:

MULTIPLE COUNTRIES AND A GLOBAL OPTION

SIX LANGUAGES

MANY 3RD-PARTY INTEGRATIONS

The pain points of the previous Multisite architecture were that each country was a silo:

  • No Single Sign On (SSO): Multiple admin log-ins to remember
  • Repetitive updates: Running Drupal’s update script on every site and testing was a lengthy process
  • Multiple stores: Multiple product lists, product features, and prices
  • Multiple sites to translate: each site was sent individually to be translated into one language

THE APPROACH

Creating a Singularity with Drupal 8, Domain Access, & Drupal Commerce

A move to Drupal 8 in combination with some smart choices in module support and customization simplified many aspects of the Leica team’s workflow, including:

  • Configuration management: Drupal 8 introduced configuration management in core, so point-and-click admin configuration can be exported from one environment and imported into another, keeping environments in sync and storing configuration in the code repository
  • One Database to Rule Them All: Admins have a single site to log into and do their work, and developers have one site to update, patch, and configure
  • One Commerce Install, Multiple Stores: A single Drupal Commerce 2.x install serves multiple stores sharing one set of products. Each product can be assigned to multiple stores, and per-country price lists control product pricing
  • One Page in Multiple Countries and Multiple Languages: The single-site model gives each piece of content one place to live; authors control which countries the content is available in, and each piece is translated into all available languages just once
  • Future-proof: With a smooth upgrade path to Drupal 9 in 2020, the Drupal 8 site gives Leica more longevity in the Drupal ecosystem
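The “one product, many stores, per-country price lists” relationship above can be sketched in a few lines. This is an illustrative model only, with hypothetical names (`Product`, `Catalog`, `price_for`); it is not the Drupal Commerce API, just the data relationships that setup enables.

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    sku: str
    stores: set  # country codes of the stores this product is assigned to

@dataclass
class Catalog:
    products: dict = field(default_factory=dict)     # sku -> Product
    price_lists: dict = field(default_factory=dict)  # country -> {sku: price}

    def price_for(self, sku, country):
        """Resolve one product's price for one country's store."""
        product = self.products[sku]
        if country not in product.stores:
            return None  # product not offered in this country's store
        return self.price_lists[country][sku]

# One shared product, assigned to two country stores, priced per country.
catalog = Catalog()
catalog.products["BLK360"] = Product("BLK360", stores={"US", "DE"})
catalog.price_lists = {"US": {"BLK360": 18500.00}, "DE": {"BLK360": 17000.00}}

print(catalog.price_for("BLK360", "US"))  # 18500.0
print(catalog.price_for("BLK360", "FR"))  # None: no French store assignment
```

The point of the single-install model is visible here: one product record, with availability and pricing layered on per store, rather than one duplicated product per country site.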

LEARN VS. SHOP

Supporting Visitor Intention with Two Different Modes

While the technical challenges were being worked out, the user experience and design had to reflect a cutting-edge company. With the launch of their revolutionary product, the BLK 360, in 2018, Leica positioned itself as the Apple of the geospatial measurement community: sleek, cool, cutting-edge, and easy to use. While many companies want to look as good as Apple, few of them actually have the content and product to back it up.

The navigation for the site went through many rounds of feedback and testing before deciding on something radically simple — Learn or Shop. A customer on the website is either in an exploratory state of mind — browsing, comparing, reviewing pricing and specifications — or they are ready to buy. We made it very clear which part of the website was for which.

This allowed us to talk directly to the customer in two very different ways. On the Learn side, the pages educate and convince, giving the customer information about the product, reviews, articles, sample data files, and the like. The content is big and sleek, leveraging video and other embedded media, like VR, to educate.

On the Shop side, the pages are unapologetically transactional: they give the visitor the right information to support a purchase, clearly delivering specs and options like software and warranties, without any marketing. We could assume the customer was there to purchase, not to be convinced, so the page content concentrates on order completion. The entire checkout process was simplified as much as possible to reduce friction. We studied the buying habits and patterns of Leica’s user base over the past few site iterations to inform our choices about where to simplify and where to offer options.


THE RESULTS

More Nimble Together

The willingness of the Drupal community to support the needs of this project cannot be overstated. Oomph has been able to leverage our team’s commitment to open source contributions to get other developers to add features to the modules they support. Without the give and take of the community and our commitment to give back, many modifications and customizations for this project would have been much more difficult. The team at Centarro, maintainers of the Commerce module, were fantastic to work with, and we thank them.

We look forward to continuing to support Leica Geosystems and their product line worldwide. With a smooth upgrade path to Drupal 9 in 2020, the site is ready for the next big upgrade.


The Challenge

Execute on a digital platform strategy for a global private equity firm to create a centralized employee destination to support onboarding, create interpersonal connections between offices, and drive employee satisfaction.

The key components would be:

  • An employee directory complete with photos, bios, roles, and organizational structure
  • News, events, and other communications, easily available and organized per location as well as across all locations
  • The firm’s investment portfolio, shared through a dashboard view with all pertinent information, including the team involved

These components, and the expected tactical assets that an intranet provides, would help the firm deepen connections with and among employees at the firm, accelerate onboarding, and increase knowledge sharing.

The Approach

Supporting Multiple Intentions: Browsing vs. Working

An effective employee engagement platform, or intranet, needs to support two distinct modes — task mode and explore mode. In task mode, employees have access to intuitive navigation, quick page loading, and dynamic search or filtering while performing daily tasks. They get what they need fast and proceed with their day.

At the same time, a platform must also encourage and enable employees to explore company knowledge, receive company-wide communications, and connect with others. For this firm, the bulk of content available in explore mode revolves around the firm’s culture, with a special focus on philanthropic initiatives and recognition of key successes.

Both modes benefit from intuitive searching and filtering of team members, news, events, FAQs, and portfolio content. News and events can be browsed in a personalized way (what is happening at my location) or a global way (what is happening across the company). We considered the user’s mode for every interaction within the platform, and it influenced nearly every design decision.
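The personalized-versus-global browsing described above boils down to a simple filtering rule. The sketch below is an assumption-laden illustration (the field names and functions are hypothetical, not the platform’s actual implementation): a firm-wide item has no location, a personalized feed keeps firm-wide items plus the reader’s own office, and a global feed keeps everything.

```python
# Sample news items; a location of None marks a firm-wide item.
news = [
    {"title": "Boston holiday drive", "location": "Boston"},
    {"title": "NYC office move",      "location": "New York"},
    {"title": "Annual results",       "location": None},
]

def my_feed(items, my_location):
    """Personalized browsing: my office's items plus firm-wide items."""
    return [n for n in items if n["location"] in (my_location, None)]

def global_feed(items):
    """Global browsing: everything happening across the company."""
    return list(items)

print([n["title"] for n in my_feed(news, "Boston")])
# ['Boston holiday drive', 'Annual results']
```

The same rule generalizes to events, FAQs, and portfolio content: tag each item with a location (or none), then let the reader’s chosen mode decide the filter.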

From a technical standpoint, the private equity firm needed to host the intranet on its own network for security. That requirement, plus the need to completely customize the experience for close alignment with the firm’s brand, meant that no off-the-shelf intranet solution would work. We went with Drupal 8 to make the intranet scalable, secure, and tailored to an optimal employee experience.

The Results

The platform deployment came at a time when it was most needed, playing a crucial role for the firm during a global pandemic that kept employees at home. What was originally designed as a platform to deepen employee connections between offices quickly became the firm’s hub for connecting employees within an office. Like many businesses, the firm is actively re-evaluating its approach to the traditional office model, and the early success of the new platform indicates that it is likely to play an even larger role in the future.