2024
Cernel contextualises your entire webshop inventory & brand to write the best product content on a large scale — saving time and lots of headaches.
1
Here you can glean a quick overview of Cernel’s finished Personalisation feature, with showcases and a short description of the project.
Cernel's latest product feature is an attempt to mature how users automate their webshop content writing.
The design process for this project tackles the cognitive pressure users feel when using AI tools to write content: discovering where that pressure originates (research), and figuring out how to build product features that alleviate doubt and uncertainty mid-session (execution).
While this is not an exercise in branding or marketing, you will still see thoughts and decisions made to push for stylistic choices. Cernel, as a digital B2B platform, lacked a structured approach to design, so part of my work included strengthening their visual communication, mapping out a design system for repeat use of components, and introducing much-needed accessibility paradigms for the user's benefit.
The finished product turned out to be quite a luxurious settings menu. Users have a vast pool of knowledge at their fingertips, arranged in an easy-to-navigate structure and presented with fully animated visual descriptions of what each setting will do.
User beta testing yielded very positive results, with most of the critique aimed at animations that could be clearer in their purpose — generally reflecting a solid foundation for a v1.0 release.
2
In this section I’ll take you through the groundwork and decision-making that made us confident in the actions taken throughout the project.
Through Cernel’s customers, we knew that ChatGPT had left them with an unpleasant SEO-generation experience. We found that coaxing GPT prompts into yielding usable results took a heavy toll on ecommerce users, and that we should focus on reducing cognitive pressure while allowing for high levels of customisation.
I analysed the customer’s typical Cernel onboarding experience and mapped out each step to see where repeat pain points arose. Fairly quickly, an ugly pattern emerged: in trying to understand WHAT Cernel does, users spend much of their time testing and moving between our platform and their store, causing confusion and leading to frustration.
Being at the forefront of AI SEO writing meant constantly treading new ground and opening doors that customers didn’t even know existed. There was a certain “obscurity” to the tech that made explaining the product tricky, so our experts were spending a lot of time helping customers understand it. That had to change.
Feature requirements came straight from speaking with our users and from testing new features and technologies. As a result, the structure for this project was built iteratively; building and maintaining overhead views of our system design was vital for communicating feature details clearly internally.
Working iteratively at the cutting edge, where technology keeps changing, meant we needed clear communication. We developed a common plan for every feature being implemented; each one was assigned the critical information all teams needed to contribute in parallel and minimise wasted resources, with me as the mediator.
3
Finally, I’ll be showing you how we designed and developed the product, with some of the visual features that made the personalisation a stunningly luxurious experience.
Based on our customer insights and expert knowledge pool, I set off with a mission:
Allow the user to personalise their SEO setup with all the tips and functionality usually provided by our experts. They need to feel a certain comfort when generating their content, seeing that the configurations they make are reflected in the output.
Early on, one of the key concepts came from trying to visualise our experts’ knowledge. Many of the customisations our users would be facing were, at the time, unintuitive. I believed we could bridge that gap by showing what we were telling.
“We were introducing an entirely new method of content generation; there was no user intuition we could rely on from other sources. What’s worse, everyone came in with the preconception that AI, while cool, isn’t giving us reliable results.”
I toyed with the idea of a two-part interface, where one side would hold the buttons and toggles, and the other would explain and visualise. A design like this could fit quite well on the standard office widescreen monitor, and stay well within scope for desktop scalability.
In those same early sketches I wanted to evoke the feeling of being able to easily switch and test what each button does. Much of the inspiration for the interactive interface comes from apps like Duolingo, where learning is a key part of the UI design.
Always at the back of my mind was the idea that the whole feature should be able to fit into a tutorial/onboarding flow. It would be quite beneficial to be able to use this feature flexibly within the platform, but most importantly we had a list of prerequisite settings that the user should deal with before finishing their site setup. Being able to plant these settings within the platform’s onboarding flow was therefore a much-wanted design feature.
The visual half of the design was very much a derivative of the existing platform’s style. My job as both UX & UI designer would be to bring a bit of order to the chaos, introducing color signaling rules, visual hierarchy and a design system to let the developers begin building a component library.
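To give a sense of what those colour-signalling rules could look like once they reach a component library, here is a minimal TypeScript sketch. The token names and values are hypothetical placeholders of mine, not Cernel’s actual design system:

```ts
// Hypothetical colour-signal tokens; names and hex values are illustrative only.
export const colorSignals = {
  info: "#2f6fed",    // neutral guidance and explanations
  success: "#1d9a6c", // a setting saved / content generated
  warning: "#e0a500", // settings that affect already-published content
  danger: "#d64545",  // destructive or hard-to-undo actions
} as const;

export type ColorSignal = keyof typeof colorSignals;

// Components request a signal by meaning, never by raw hex value,
// which keeps the visual language consistent across the platform.
export function signalColor(signal: ColorSignal): string {
  return colorSignals[signal];
}
```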
Ultimately, the design we stuck with was decided within the design/frontend department by weighing the flexibility-to-workload ratio of each implementation. Early agreements tended towards relying on interactive Rive animations for simple visualisation, while we would use GSAP to display and animate live data when needed.
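As a rough illustration of the GSAP half of that split, here is a minimal sketch of tweening a live preview value when a setting changes. The element id and function names are hypothetical, not the actual Cernel implementation:

```ts
import { gsap } from "gsap";

// Hypothetical preview state; in practice this would be fed by live store data.
const previewValue = { count: 0 };

function updatePreview(newCount: number): void {
  // Tween the displayed number so the user sees the live data respond
  // to the setting they just changed.
  gsap.to(previewValue, {
    count: newCount,
    duration: 0.6,
    ease: "power2.out",
    onUpdate: () => {
      const el = document.querySelector("#preview-counter");
      if (el) el.textContent = Math.round(previewValue.count).toString();
    },
  });
}

// Example: enabling a brand-keywords toggle bumps the previewed word count.
updatePreview(142);
```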
In early concepts I tried representing “tabs” and/or “steps” that help the user move through the personaliser. I wanted to both save space and provide a comprehensive overview of the full content a user could customise. So how did we decide on a design?
During this iteration phase we were still discussing how to split up the three configurations (Global / Products / Categories) to avoid confusion — and while the split may seem obvious now, at the time we spent days discussing pros and cons.
The condensed version of these discussions would be:
“We have both very similar and very different settings across the configurations; how do we visually represent groupings that follow a logical pattern?”
“What do we do if things change and the balance of similarity tips into either side?”
“Can we future-proof our decision-making, or should we build a multi-step plan to reach a final design in the far future?”
I chose to rely on visual indicators that professionals would recognise from other pieces of software, drawing inspiration from programs like Excel and Outlook as well as from the ecommerce stores’ own dashboards. This decision paid off in the end, as users easily recognised the visual segmentation, which performed double duty by also being extremely space- and scaling-efficient.
Fundamentally, the whole personalisation experience builds on showing the user what’s going to happen when they enable or disable settings. In essence, we built a fancy settings menu. Because of our users’ limited intuition with the technology, we made it as clear as possible what each button, toggle and radio would mean for their content generation.
And the users absolutely loved it! 🎉
Of course, this being the very first implementation of a new feature, it wasn’t a perfect one-to-one show-and-tell. Some of the visual animations were misinterpreted by users, who were unsure whether the visuals were showing their ACTUAL actions or just a “looping” preview. These issues were particularly prominent in settings that had seen drastic conceptual changes throughout the project and were now mere husks of their originally intended functionality.
During early stages of testing, I wanted to combine many of the settings into singular pages. However, as new features started to emerge and other features got deprecated due to advancements in tech, the “combined designs” became harder and harder to execute on. It was tough, killing some of my earliest darlings, but we were dealing with a complex web of features and had to make decisions that made sense within the entire design structure.
With the finalized design I strongly wanted people to be able to play around and understand each setting at their own pace. It always frustrates me personally whenever an interface hides information behind an activation interaction — such as a toggle only displaying useful added information upon being toggled on.
This choice of design thinking came with its own set of problems. When are your settings saved? How do you see your interactions on the visualisation? What happens to the visualisation when you keep going back and forth between settings?
With a bit of inspiration from Webflow, and a lot of back and forth during testing, we ended up with a design that lets you both see and change visualisations as you interact with each setting. Once the user makes a decision, they can choose to save each individual setting at their leisure.
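A minimal sketch of the per-setting save model this implies, with a hypothetical setting shape and endpoint (neither is Cernel’s actual data model or API):

```ts
// Hypothetical setting shape: each setting keeps a saved value and a draft value.
interface PersonalisationSetting<T> {
  key: string;
  savedValue: T; // what the backend currently has
  draftValue: T; // what the user is previewing right now
}

// Shallow comparison is enough here, assuming primitive setting values.
function isDirty<T>(s: PersonalisationSetting<T>): boolean {
  return s.savedValue !== s.draftValue;
}

// Changing a draft updates the visualisation immediately, but the saved value
// stays untouched until the user explicitly saves that one setting.
async function saveSetting<T>(s: PersonalisationSetting<T>): Promise<void> {
  if (!isDirty(s)) return;
  // Hypothetical endpoint; the real API path is not part of this case study.
  await fetch(`/api/personalisation/${s.key}`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ value: s.draftValue }),
  });
  s.savedValue = s.draftValue;
}
```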
Now, after my departure, my dear colleagues at Cernel have all the tools and guidelines needed to continue iterating on the personalisation feature and the platform as a whole.
New features will be developed in the context of their now ready-made visual design system, allowing for quick iterations with defined use-case assets throughout the platform, hopefully inspiring further testing and iteration of future features ❤️