AI code pairing interviewer

Experience Design • Visual Design • User testing
Interview Agent is an AI-powered tool that conducts live coding interviews with human-like guidance, delivers unbiased evaluations, and generates detailed performance reports that help recruiters make faster, data-backed hiring decisions.
Project overview
The Interview Agent is a key pillar of the Geektrust AI Hiring Ecosystem, designed to revolutionize how technical interviews are conducted. It enables live code pairing sessions where candidates interact with a human-guided AI, replicating the feel of a real interviewer.

The AI evaluates the candidate’s problem-solving approach, code quality, and communication in real time and generates an unbiased performance report that helps recruiters make faster, data-backed decisions.
MY ROLE
Product designer,
Research, Ideation, Prototyping, User testing, Iterations.
TEAM
1 Product Designer, 1 Product Manager, 5+ Developers
TIMELINE
3 months
Jun 2024 - Sep 2024
🔰  2.5×
Faster interview cycles
Reduced end-to-end interview time enabling teams to focus only on top-qualified candidates.
🔰  60%
Less recruiter involvement
The AI interviewer independently conducted structured interviews, minimizing manual coordination.
🔰  87%
Consistency in evaluation reports
Automated scoring and analysis eliminated human bias, ensuring every candidate was assessed fairly.
How does it work?
Below is a user story explaining how the Geektrust candidate app works.
How it started
The problem
Tech hiring had become a tiring, uncertain process. Recruiters received perfect-looking code submissions, only to realise many were generated with AI tools, making it hard to trust a candidate’s true ability. Interviews were long, inconsistent, and often biased, causing great talent to slip away. Closing even a single role took 30–45 days, leaving both teams and candidates frustrated.
How might we design a human-like AI interview, with no market template to follow, that balances recruiter efficiency with candidate fairness?
Have a look at the demo video to understand the product better.
Finding the gap
The opportunity
Amidst the growing mistrust in AI-assisted code submissions, we realised the problem wasn’t AI itself. It was how it was being used. Instead of fighting AI, what if we used it to restore trust in hiring? That thought became our turning point.
We imagined an AI that could sit across from the candidate, interact like a real interviewer, and truly understand how they think, not just what they code.

This shift from AI as a threat to AI as an ally became the foundation for what we later built as the Interview Agent.
Diving Deep
Understanding the user
Hiring manager
Recruiter
Candidate
Extensive user research was conducted through interviews and surveys involving over 50 hiring managers, recruiters, and candidates across the tech industry.

Key findings included:

78% of managers expressed concern about AI-generated plagiarism affecting candidate evaluation.
65% reported slowed hiring cycles due to increased manual screening to detect dishonesty.
Candidates desired a more interactive process that fairly showcased their coding ability beyond code submission.
These insights underscored the need for a system that preserves fairness, reduces manual overhead, and delivers a streamlined candidate experience.
Defining the Direction
Shaping the Interview Agent
To adapt and innovate, we envisioned an AI-driven solution: an AI interviewer agent capable of conducting code-pairing interviews autonomously, eliminating the need for human tech recruiters. After extensive research and deliberation on its scope and limitations, we were confident in our vision.

While we can’t completely replicate human ingenuity, presence of mind, and fluidity, we can get pretty close. Our team’s goal was to design an AI agent with agency, intelligence, and empathy, mirroring the capabilities of a seasoned senior engineer in a code pairing interview.
Designing the Experience
Ideation and Exploration
Initial approach: Integrating chatbot with IDE

ITERATION 1:
A code-pairing interview has three parts: an interviewer, an interviewee, and an IDE to write code in. Working with the product team, I proposed a very simple solution: integrating an AI chatbot into the IDE.

In this minimalist approach, the bot would deliver problem statements, ask questions, provide code review comments, and address candidate queries.



Testing the version internally

After multiple rounds of internal testing (keeping almost every best-case and worst-case scenario in mind), we identified several limitations and pain points. The most common were:

🔶  Difficulty finding and understanding the code review button.
🔶  Voice interaction was not smooth; users wanted a voice-to-text feature to speed up longer answers.
🔶  Confusion about the “Submit solution” button: was it for the final submission or the current task?
🔶  Code review comments opening in the IDE terminal left users unsure how to engage with them.
🔶  Confusion about the “Reply via chat” button, with users unsure whether they could discuss the comment with the bot.

The intention was to conduct user testing to gather feedback on this approach, so we could understand users’ preferences and needs, collect suggestions, and learn how closely the experience resembled a real code-pairing interview.

Early concept phases generated over 15 design prototypes for an AI-assisted live coding interview. The team evaluated models emphasizing real-time collaboration, plagiarism detection, and AI-driven code support. Rapid user testing filtered out less viable ideas, narrowing the focus to a tool that replaces the first interview round with AI moderation and interactive candidate engagement, increasing accuracy and fairness in screening.


ITERATION 2:

The initial round of feedback highlighted concerns that the experience felt more like an online coding environment than a real interview, along with navigation challenges, confusion about where code belonged, and other distracting UX issues. I worked to address each root cause and propose fixes.

However, despite our efforts, I remained unsatisfied with the proposed solutions.

It was during discussions with the entire product and development teams that we stumbled upon the concept of an iFrame.
An inline frame (iframe) is an HTML element that embeds another HTML page within the document, essentially integrating a separate webpage into the parent page.

This revelation prompted us to consider developing a completely new product, leveraging the iFrame technology.


What was the idea? 💡

The idea was to create a platform that could seamlessly switch between two modes: a chat mode for generic question-and-answer, to evaluate candidates’ basic knowledge and understand their high-level approach, and an IDE embedded in the iframe for real-time coding, to evaluate their technical skills.
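For the two modes to stay in sync, the parent page and the embedded IDE iframe need a way to talk to each other. A minimal sketch of a `postMessage`-style protocol with an origin check is shown below; the message types and the `ide.example.com` origin are illustrative assumptions, not the actual Geektrust implementation.

```typescript
// Hypothetical message protocol between the parent page (AI Interviewer
// chat) and the embedded IDE iframe. Names and origin are illustrative.
type IdeMessage =
  | { type: "ide-ready" }
  | { type: "code-changed"; file: string }
  | { type: "run-result"; passed: boolean };

// Assumed trusted origin of the IDE iframe; messages from anywhere else
// are ignored, which is the standard postMessage safety check.
const IDE_ORIGIN = "https://ide.example.com";

function parseIdeMessage(origin: string, data: unknown): IdeMessage | null {
  if (origin !== IDE_ORIGIN) return null;
  if (typeof data !== "object" || data === null) return null;
  const t = (data as { type?: unknown }).type;
  if (t === "ide-ready" || t === "code-changed" || t === "run-result") {
    return data as IdeMessage;
  }
  return null;
}
```

On the real page this function would sit inside a `window.addEventListener("message", …)` handler, with the iframe posting events as the candidate codes.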


Rough conceptualisation of the idea

Initial rough sketches

After we discovered the potential of using an iFrame, I jumped into Figma and started sketching rough mockups.

This hands-on approach helped me visualize and structure the interface for the AI Interviewer, giving me a clear direction to move forward.
Initial ideation for the product
Figma iterations before landing on to the final design.
After numerous iterations on the AI Interviewer UI and UX, and countless design calls with stakeholders, it was time for the final designs.
High fidelity outcome
Final designs

1. Onboarding

Onboarding went through multiple iterations. Initially, every product feature was explained during onboarding, but testing surfaced two issues:
(a) Candidates often skipped the steps and missed how the product features worked.
(b) The process felt too long.

So we trimmed onboarding to just the essentials needed to start the interview, and designed the interactions so candidates can’t miss anything shown during onboarding.

Below is how the final onboarding looks:

2. Introduction

This is the first screen users see when they land on the AI Interviewer.

The AI Interviewer greets them with:
✔ an introductory message explaining the structure of the interview, and
✔ a question asking if they are ready to begin.



⬇️ Header:

Now that we had our own full-fledged product, I included a header to:
✔ enhance branding,
✔ incorporate main action items, and
✔ add essential UI elements.

The header also improves:
✔ accessibility, by providing a structured layout, and
✔ consistency, ensuring a cohesive and user-friendly experience.


Image: Header consisting of UI elements

⬇️ Progress bar:

Based on user feedback and observing their struggles with the flow and lack of awareness about their current stage in the interview process, we decided to improve transparency.

After internal discussions, I implemented a progress bar that clearly
✔ highlights all stages of the interview and
✔ shows users exactly where they are in the process.
Image: Progress bar with different stages of the interview
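The state logic behind such a progress bar can be sketched as a small pure function: given the ordered stages and the candidate's current stage, label each stage so the UI can render it as done, active, or upcoming. The stage names below are examples, not the real interview stages.

```typescript
// Pure state logic for the interview progress bar: label each stage as
// completed, current, or upcoming relative to the candidate's position.
type StageStatus = "completed" | "current" | "upcoming";

function stageStatuses(stages: string[], current: string): StageStatus[] {
  const idx = stages.indexOf(current); // -1 (unknown stage) marks everything upcoming
  return stages.map((_, i) =>
    i < idx ? "completed" : i === idx ? "current" : "upcoming"
  );
}

// stageStatuses(["Intro", "Problem 1", "Code review", "Wrap-up"], "Code review")
// → ["completed", "completed", "current", "upcoming"]
```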

3. Presenting the coding problem

Once the user clicks “Let’s start” or says “I’m ready,” the AI Interviewer presents the first coding problem in a side panel that slides in from the left.

✔ The user can read the detailed problem statement and ask any questions to the AI Interviewer on the right side of the screen.
✔ Additionally, the AI Interviewer checks via message if the user has understood the problem statement.



⬇️ Problem statement:

I made the coding problem statement sticky, ensuring it remains accessible throughout the task, whether you are in the IDE or the main chat.

Additionally, the problem title, language, problem number, and details are clearly highlighted and well-formatted for easy reference.

Image: Examples of different coding problems

4. Bringing the IDE into Action

Once the user understands the problem, they can click the “Start coding” button presented by the AI Interviewer to bring up the IDE.




⬇️ How the IDE works:


1. Loading the IDE
Tips on coding, file access, and IDE use are shown while it loads.


2. IDE loaded

The IDE opens in an iframe.
The AI chat collapses to a floating icon and can be reopened anytime.


3. Using the IDE

Write, test, and review code.
Move to the next task or expand the IDE to full screen.

All action items and CTAs are conveniently placed at the top of the iFrame, above the IDE, for improved visibility and accessibility.

⬇️ Receiving code review comments:

❗️Previously, code review comments appeared in the IDE terminal, disrupting the flow of the coding experience.
To align with the dynamics of a real code pairing interview, where feedback is typically given by the interviewer, I collaborated with the product team to implement a solution.

We decided to display code review comments within the chat interface alongside the AI Interviewer.
To prevent confusion between code comment discussions and regular chat, we introduced separate tabs in the chat:
one for normal conversation with the AI Interviewer and
another dedicated solely to code review comments.

This segregation ensures a clear distinction and facilitates focused discussions on code-related feedback.

Image: Code review feedback shown in the chat
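One way the tab segregation could work under the hood is a simple message router that assigns each incoming message to the conversation tab or the review tab. The message shape below is an assumption for illustration, not the production schema.

```typescript
// Route each incoming message into one of the two chat tabs: normal
// conversation with the AI Interviewer vs. code review comments.
interface ChatMessage {
  kind: "interviewer" | "candidate" | "code-review";
  text: string;
}

type Tab = "conversation" | "review";

const tabFor = (msg: ChatMessage): Tab =>
  msg.kind === "code-review" ? "review" : "conversation";

function splitByTab(msgs: ChatMessage[]): Record<Tab, ChatMessage[]> {
  const out: Record<Tab, ChatMessage[]> = { conversation: [], review: [] };
  for (const m of msgs) out[tabFor(m)].push(m);
  return out;
}
```

Keeping this as a pure function means the same routing rule can drive both unread-count badges and the tab contents.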

5. Voice interaction

Since voice interaction was a key feature in our product for assessing candidates’ communication skills, I decided to prioritize it.

I identified and categorized all the pain points related to the voice feature, mapping them to specific UX improvements.
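The voice-to-text behavior users asked for can be sketched as a transcript buffer: interim recognition results are displayed immediately but only committed once the recognizer marks them final, so long spoken answers build up without flicker. This is an illustrative sketch with an assumed result shape, not the production code.

```typescript
// Transcript buffer for voice-to-text: interim results are shown but not
// committed; finalized segments are appended to the permanent transcript.
interface RecognitionResult {
  transcript: string;
  isFinal: boolean;
}

function mergeTranscript(
  finalized: string,
  result: RecognitionResult
): { finalized: string; display: string } {
  const combined = (finalized + " " + result.transcript).trim();
  return result.isFinal
    ? { finalized: combined, display: combined } // commit the segment
    : { finalized, display: combined }; // show interim text only
}
```

In a browser, results like these would come from the Web Speech API's `SpeechRecognition` events, which expose exactly this interim/final distinction.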


6. End interview

The Post-Interview Experience
The interview report
Testing the Experience
How did users respond?

Following this iteration, we recognized the necessity of testing the initial version with real users.

We reached out to developers in our network, inviting them to take a 60-minute interview with our product so we could gather authentic user insights and gauge their feedback.

After gathering feedback from users, I meticulously categorized their input into four key areas: UX, Content, Interview Structure, and Product Features.

I then identified the root causes behind each piece of feedback and tried to develop appropriate solutions to address them.
Snapshots of users attending Geektrust AI code pairing interviews.
After multiple iterations and rounds of user testing with recruiters and candidates, the updated version of the Interview Agent received highly positive feedback and stronger adoption across pilot clients.

✔️ What's working well
Users appreciated how close the experience felt to a real human interview, while still benefiting from the consistency and speed of AI.
Some key highlights include:
• Feels very close to a real interview: the human-like voice interaction and adaptive questioning felt natural and engaging.
• Guided flow throughout the interview: users felt supported through every stage of the process.
• Easy access to the problem statement: candidates could refer back at any point without breaking focus.
• Clear visibility of progress: transparent stage indicators helped users stay oriented and calm.
• Professional, modern appearance: the clean interface built confidence in the AI’s credibility.

🔶 What needs improvement?
While the response was largely positive, testing also revealed valuable areas to improve:
• Refining the AI interviewer’s behavior: making prompts more contextual and less repetitive.
• Enhancing voice interaction UX: smoother transitions and real-time transcription could increase clarity.
• Improving technical stability: ensuring zero interruptions during live coding interviews.
The outcomes
Result and impact
This AI-powered code pairing platform enabled companies to automate their first-round interviews helping teams assess candidates quickly, fairly, and at scale.
• 2.5× faster interview process: The AI interviewer streamlined code pairing rounds, cutting down the overall interview cycle significantly.
• 60% less recruiter involvement: Recruiters were no longer required to attend initial interviews, leading to faster shortlisting and reduced fatigue.
• 87% report consistency: Standardized evaluation reports provided reliable insights, making candidate comparison objective and fair.
• 45% faster hiring decisions: The AI-generated interview reports empowered clients to make data-backed decisions more quickly and confidently.

A key highlight was the automatically generated interview report, which consolidated AI observations, code quality analysis, and behavioral signals into a single, actionable document, helping recruiters make confident, data-driven hiring decisions faster.
What’s Next
Next steps
🔰 As a designer, the focus now is to refine how naturally and intelligently the AI interviewer interacts, making voice conversations feel more human-guided and context-aware.

🔰 Simplifying the onboarding and overall flow remains key, ensuring even first-time candidates can intuitively navigate the experience.

🔰 The interview report, being central to decision-making, needs deeper visual clarity, highlighting performance insights, strengths, and improvement areas in a recruiter-friendly format.

🔰 I also plan to strengthen the product’s accessibility and reliability across varied environments, ensuring smooth performance regardless of tech constraints. Finally, expanding user testing across diverse roles and experience levels will help validate usability, improve personalization, and make the entire interview experience truly seamless and fair.