2024-2025

Atlas Metrics
Scaling complex ESG reporting through user-centered design.

About the product

Atlas Metrics is a Berlin-based SaaS platform that helps its clients report, analyze, and act on their ESG (Environmental, Social, Governance) data using AI.

My role

As a Product Designer, I designed and launched critical features, unified the platform’s UX, and introduced a research-driven design process, driving adoption from 0 to 225+ product teams and boosting retention by 25%.

Main project goal

As the company scaled, usability and compliance needs increased rapidly. My role was to design solutions that made complex reporting standards simple to use while ensuring the platform and design system could scale sustainably.

Problem framing and research

I introduced structured UX research, mixing qualitative and quantitative methods:

  • PCAF Module: Expert interviews (ESG consultants) to decode technical standards, user interviews (Data Managers) for workflows, Mixpanel funnel analysis for friction points. → Resulted in guided, audit-ready flows.

  • Analytics Dashboard: Customer surveys + customer success team feedback to confirm demand, competitive benchmarking for industry best practices. → Prioritized interactive charts, benchmarking, and simulations.

  • Settings Redesign: Support ticket analysis, card-sorting workshop, usability testing. → Clearer IA, higher engagement.

  • Responsive Layout: Heuristic evaluation + device testing. → Unified spacing/grid system.

  • Website: Stakeholder workshops + A/B copy testing. → Higher conversion and engagement.

First Iterations and Challenges

  • PCAF Module: Early prototypes felt too complex. I simplified the flow by collapsing multiple steps into fewer, clearer screens.

  • Analytics Dashboard: The first version was cluttered with too many metrics. I reduced it to the most essential KPIs and tucked advanced insights behind interactions.

  • Responsive Layout: Long German words broke layouts during testing. Adjusting the grid and spacing solved it without cutting content.

  • Website Redesign: Marketing wanted heavy animations, but I balanced this with lighter micro-interactions to keep performance fast while still improving storytelling.

Home Dashboard before. It felt unresponsive, with a cluttered layout and unused white space.

Home Dashboard after. Unified spacing and grid system. Responsive across all devices.

Execution & Handoff

Design critique round with ESG consultants as primary users.

  • Maintained clean Figma files, documented edge cases, and integrated new components into the design system.

  • Before handoff, I ran design critique rounds with PMs, ESG consultants, and users to test flows and uncover blind spots. This made sure business needs, user satisfaction, and technical constraints were aligned early.

  • For handoff, I ran async reviews with engineers using Jira tickets, comments, and short walkthroughs in Figma. This reduced back-and-forth during development and sped up delivery.

Outcomes & Impact

Sources: Mixpanel, Google Analytics, Atlas Metrics Customer Success Team.

Key Learnings

  • UX research doesn’t need to be perfect to be valuable. At the end of the day, choosing the right mix of user feedback, analytics, and SME input provided the most actionable insights.

  • Compliance-heavy tasks benefit from guided, outcome-oriented flows.

  • Standardizing design (responsive grids, design ops) pays off in reduced bugs and faster delivery.

  • AI tools accelerated ideation, prototyping, copy testing, and edge-case exploration. I learned to effectively combine AI-assisted design, hands-on design work, and product decision-making.

Final designs

Contact for collab

© 2025 Iskander Alibayev
