Context
Using Adobe Experience Manager
Adobe Experience Manager (AEM) is the content and digital asset management tool used by marketing and development teams. It supports day-to-day authoring, management, and delivery of content and digital media.
The challenge was straightforward: create a clear, delightful 30-day trial for Adobe Experience Manager. The users I was designing for would test the product on behalf of their teams and decide whether to adopt the tool.
In addition, this trial was the first trial experience on the company’s Digital Experience side. Therefore, this trial model needed to be scalable as a repeatable framework for other products and services.
I hit the ground running because our team had a tight deadline to launch the trial, giving product, design, and engineering only a few months to fully implement the trial features.
Meeting the users
Our primary personas, invited to the AEM trial, are developers and marketers. Our content needed to be tailored to each role's needs and workflows for using the tool.
Defining the product direction
The main idea is that a free trial lets marketers and developers see how the product actually functions and explore for themselves all the benefits you have been trying to describe to them.
The central goals I set for designing this trial experience are …
Learning
Giving a sequential, uninterrupted experience that builds on the user's knowledge
Hierarchy
Guiding the eyes to the most important elements on the page
User Satisfaction
Meeting the user's expectations, accompanied by a sense of completion
Delight
That little feeling of happiness evoked by surprise or success


Competitive Analysis
As trials were a newer concept at Adobe, there was only one existing trial, from Document Cloud, to reference.
Adobe Sign’s trial helped us understand what trial experiences already existed at Adobe so we could stay consistent with the design systems in place. It also allowed us to leverage tools already implemented, such as knowledge bots (depicted in the image above).
Other tools I referenced included:
Contentful
Its landing page gave a comprehensive overview of what to expect from the tools available.
Duolingo
Strong user engagement and learning continuity, helping users feel accomplished with each step.
Success Metrics
To validate the design against what makes sense for the end users, the team, and the product direction, I chose these success metrics:
Progress of trial tasks: learning the tool through recommended tasks for selected role
Interaction with tutorials and documentation: enabling users to find resources and help when needed
Engagement & continuity: Are users inclined to continue learning and exploring the tool?
handoff
Finalized designs
Taking the testing insights into consideration, the main UX touchpoints were revised for launch.
Hero card
- Giving each user clear tasks customized to their role
- Additional tutorials, support, and documentation available
Knowledge bot
- Knowledge Bot (lightbulb icon) available across opened tools to show progress and offer help
- Wizard guides to highlight key buttons and interactions throughout the task walkthrough
iterations
Diving into design decisions
Let’s focus on the design decisions around the main Hero card. This component directs the user to the key tasks to learn during the trial, so I explored many variations of it.
Option 1: Horizontal card layout
I started out aiming to show the user a detailed view of all of their steps. Selecting a card in the bottom row would change the content above, similar to a slideshow.
Considerations:
- Cards didn’t seem clickable as a way to change the main card above.
- Hard to understand that there was an intended sequential order.
Option 2: Vertical card layout
To highlight the relationship between the steps and the detailed view, I tried laying out the steps on the right as clickable items, with the details of each step displayed on the left.
Considerations:
- Felt too visually busy, with too many containers nested within containers.
Option 3: Collapsible side menu
To make the hero less busy, I tried a collapsible sidebar with a different card treatment, similar to video platforms like YouTube.
Considerations:
- The button to view the complete menu was easy to miss.
- Difficult to realize that there were more steps.
Option 4: Carousel
Another option was a carousel, which could show the sequence of the tasks clearly as well as show a zoomed out view of all of the steps.
Considerations:
- Users might miss the View All button.
- No descriptions on the cards, so less context.
- New to the design system, so it would take more time to develop.
User testing
Conducting qualitative user interviews
To test my designs, I created a research plan and got approval from the user research team, incorporating their feedback on how to help participants talk through their reactions, expectations, and decisions.
I wanted user impressions of different onboarding wizard guides, main hero card designs, and navigation.
Using a clickable prototype, marketers and developers went through the trial experience of first entering the AEM tool.
Documenting all of the findings and notes, I grouped quotes and insights from each user based on the flow and step they were in.
Key insights from the testing ...
Naming is confusing
Users were confused about the difference between the learning activities: guide vs. module vs. checklist.
Unclear info hierarchy
Developers assumed the content shown was for other roles, even though it was tailored to them.
Stuck when needing help
Fallback paths were unclear when users got stuck and needed to find help or documentation.
final thoughts
Next steps
After fleshing out the trial experience across onboarding, learning tasks, resources, and support, the next step for this project is to continue testing after the trial’s release. With further refinement, this trial can ideally be leveraged by other tools and teams within the Experience Cloud.
Learnings
Reflecting on my personal internship experience, I spent a significant portion of time collaborating with product, research, and engineering. We worked together at a fast pace to deliver the trial experience starting in June with the release in November. Through advocating for the user experience in the direction of the tool, I gained a stronger understanding of how design plays into the product strategy and the trialists’ journey.