Product Market Fit | Kandor

OVERVIEW

Kandor is a tool built to give feedback on speaking skills. As a bootstrapped start-up, our first priority was identifying one niche community that would give us traction and feedback. After running several targeted campaigns on Facebook & LinkedIn, we identified IELTS prep as the user segment that valued speaking feedback the most. We joined several IELTS communities to learn more about our users and to build a strong community that could help us with testing, give us feedback and help us grow.

Some of my key contributions to Kandor are:

  1. UI clean-up of the initial product for launch

  2. Humanizing the feedback model

  3. Creatives for running targeted campaigns

  4. Redesign of the product for the IELTS community

  5. Explorations of practice-centric pedagogy

ROLE

Product [50%] & Design [50%]

TIMELINE

Aug 2020 - Apr 2021

TEAM

Navdeep (product + front end), Alex (backend), Mayank (ML + POCs) & me (design).

UPDATE [JAN 2023]

Kandor is now used by 50,000+ users with an app rating of 4.4. We are currently in the process of monetizing our product and going deep into the immigration space. Read more here.

PRODUCT DEMO
(Video recorded on 12 Apr 2021)

UI CLEAN-UP FOR PRODUCT LAUNCH

A promising bootstrapped startup with an uncertain future. That’s what it looked like when I joined. My objective was to get the product out of the door ASAP so that we could start evaluating our technology & market.

Initial UI

After UI clean up

GOALS & CHALLENGES

We were pressed for time. To optimize for it, I decided to go with the universally accepted blue because it’s easy to work with. We were using a Bootstrap theme, so the UI couldn’t be changed much. But hey, constraints make for good design, and we were able to design, code & debug in 2 weeks.

HUMANIZING THE FEEDBACK MODEL

Kandor’s initial technology POC gave feedback on 6 parameters. It was tedious to make sense of, and it was difficult to figure out what to do with the feedback.

Making metrics understandable

Data model
Final MVP

GOALS & CHALLENGES

Our objective with this exercise was to make the feedback as easy to understand as possible. We did sessions with friends and acquaintances after we changed our feedback model. People were able to understand clarity and confidence easily; engagement proved to be a little abstract. To give more context and educate users, we started showing what made up clarity, confidence and engagement. Once this granular picture was in, people saw real potential in our product. We also added a delta for each metric.

We were able to complete a full loop from effort to reward.
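For a concrete picture, here is a minimal sketch of what the simplified feedback model could look like as a data structure. The field names, the 0–100 scale and the delta calculation are illustrative assumptions for this write-up, not Kandor’s actual schema.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

# Illustrative only: names and structure are assumptions, not Kandor's real schema.

@dataclass
class Metric:
    name: str                      # "clarity", "confidence" or "engagement"
    score: float                   # 0-100 score shown to the user
    components: Dict[str, float]   # granular signals that make up the metric
    delta: Optional[float] = None  # change vs. the user's previous recording

@dataclass
class RecordingFeedback:
    recording_id: str
    metrics: Dict[str, Metric] = field(default_factory=dict)

def with_deltas(current: RecordingFeedback,
                previous: Optional[RecordingFeedback]) -> RecordingFeedback:
    """Attach a delta to each metric so users can see effort turning into reward."""
    if previous is None:
        return current
    for name, metric in current.metrics.items():
        prev = previous.metrics.get(name)
        if prev is not None:
            metric.delta = round(metric.score - prev.score, 1)
    return current
```

The point of the delta field is exactly the effort-to-reward loop above: each new recording is shown against the previous one, so improvement is visible without the user having to interpret raw parameters.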

CREATIVES FOR RUNNING TARGETED CAMPAIGNS

We started running campaigns to evaluate the need for our tool in different user segments such as:

  1. People who want to practice for interviews

  2. People whose work involves giving a lot of presentations

  3. Startup founders who want to practice their pitch

  4. People interested in self-improvement

For this exercise, I made creatives and shared them on PowerPoint so that we could test as many ideas as we wanted.

Ads PowerPoint deck

We got 60+ signups in a month, but very few of these users actually used the platform. We were feeling uncertain about Kandor’s future.

The Aha! moment


One day, going through our data, we noticed 2 odd users: one had done 20 recordings, the other 8. They were the only ones who used the platform with some regularity, so we reached out to them. During those conversations, we found that they were preparing for IELTS.

The next day we ran Facebook ads targeted at IELTS aspirants:

  1. We got 10x signups compared to our interview, pitch & presentation ads.

Our cost per signup dropped drastically

To evaluate the segment further, we found and joined Telegram & WhatsApp groups of IELTS aspirants and saw what was happening there.

The majority of IELTS community discussions are around recordings, feedback requests & requests for speaking partners.


After customer research and market studies, we realized that our product was a good fit for this user segment. We could provide feedback on their skills, suggest actionable items & help improve their band score, all at 1/10th the cost of a coaching institute.

REDESIGN OF THE PRODUCT FOR THE IELTS COMMUNITY

We wanted to be really sure about the redesign exercise before we started it. We spent considerable time talking with IELTS aspirants, doing secondary research online & evaluating our technology fit for the IELTS segment.

What we found:

  1. Around 3.5 million people took the test in 2018. Of these, at least a third take it multiple times because they didn’t get the desired band score on the first attempt

  2. Coaching centers charge anything from Rs. 50,000 to a few lakhs for a 2-month IELTS prep course.

  3. Several online communities exist on Telegram & WhatsApp where people share their recordings and request feedback

  4. Speaking & writing are the 2 areas where people struggle the most

On the technology front, we found that with less than a month of effort we could refactor our models to replicate the parameters on which IELTS evaluates candidates: fluency, vocabulary, grammar & pronunciation.
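To illustrate that refactor, here is a minimal sketch of how four criterion scores could be rolled up into a band estimate. The simple average rounded to the nearest half band is my assumption for readability; Kandor’s actual scoring model and weighting are its own.

```python
# Illustrative sketch: assumes the band estimate is a plain average of the four
# criteria rounded to the nearest half band. Kandor's real model is not shown here.

CRITERIA = ("fluency", "vocabulary", "grammar", "pronunciation")

def estimate_band(scores: dict) -> float:
    """Average the four criterion scores (each on a 0-9 scale) and round to 0.5."""
    missing = [c for c in CRITERIA if c not in scores]
    if missing:
        raise ValueError(f"missing criteria: {missing}")
    average = sum(scores[c] for c in CRITERIA) / len(CRITERIA)
    return round(average * 2) / 2

# Example:
# estimate_band({"fluency": 6, "vocabulary": 7, "grammar": 6, "pronunciation": 7})
# -> 6.5
```

Speaking the user’s language this way (band scores rather than abstract metrics) is what later shaped the feedback screen.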

Now we knew we had a market. We had a product that could solve its problems. But we also had major UX issues with our MVP.

  1. The first-time experience didn’t communicate the product’s value.

    1. The value of our product was the analysis part. For analysis, people needed to record first!

  2. The first-time experience was unguided and confusing. It felt like “what do I do next?”

    1. It had too many CTAs, and catered to too many different user segments

  3. People struggled with starting practice

    1. The blank canvas effect: What do I speak?

  4. There was nothing to engage and retain users

    1. We didn’t have any strategy for it.

Based on this learning, we began the redesign exercise to:

  1. Create a guided first-visit user experience

  2. Communicate the product’s value on the first visit itself

  3. Create a daily practice flow using daily prompts & reminders

  4. Design a recording screen that didn’t feel like a blank canvas

We deprioritized everything else since these goals provided 80% of the product’s value. We went from start to launch in 2 weeks, reusing existing design components.

Our best case looked like “People sign up > practice daily with our daily prompts > band scores improve with practice”.

Our worst case looked like “People sign up > practice once > find the feedback stupid > never return… > we shut down Kandor”.

HOME SCREEN

→ Initially, we weren't sure about the segment so we tried to cater to everyone. This time we focused on IELTS daily practice.

→ We added a performance widget to make the home screen more useful

→ A practice calendar was introduced to reinforce regular practice

These changes are giving us better click rates on the home screen and better practice rates. We have also uncovered a user behaviour of practising in bursts starting a week before the exam date, which we plan to support on the home screen.

RECORDING SCREEN

→ We changed how people come to this screen: now they start with a prompt sent to their email. The email’s open rate is around 50%

→ To remove the fear of what to record, we give a prompt & structural hints here

→ We are also adding vocab help

→ We added micro-interactions like a time buffer before recording starts, as well as a message to calm users

→ We use sound as a trigger

Our weekly practice numbers indicate that people are practising a lot more than before. We’ll also add more hints here, and we want to make the help specific to the round and test the user is practising for.

FEEDBACK SCREEN

→ We reduced the items on this screen! We weren’t clear about the feedback’s actionability, so we removed it. Nobody complained.

→ We speak the user's language: band scores

→ We are not sure about the value of the video here, so for now we have de-emphasized it.

People share their feedback scores with us when they do well; they also send us screenshots when they can’t understand why a certain score is low.

DESIGNING FOR PRACTICE HABIT FORMATION

Once it was ready, we posted about the tool in various online communities and got great traction. People were checking out Kandor and signing up for it. A lot of them were also practising. In 1-on-1 interviews with our users, we found that they really liked the granular feedback and the recording feature our tool had. But we also discovered that practice happened mostly when we reminded them; without our reminders, they did not record on their own. We took this as an opportunity to design a daily practice flow on Kandor.

Habit loop
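To make the habit loop concrete, here is a minimal sketch of the cue → action → reward cycle behind daily practice: the daily prompt is the cue, recording is the action, and feedback plus the updated practice calendar is the reward. The prompt list, in-memory state and print statements are hypothetical stand-ins, not Kandor’s real reminder system.

```python
import datetime
import random

# Hypothetical sketch of the cue -> action -> reward loop behind daily practice.
# Prompts, state and the "send" step are placeholders, not Kandor's real system.

PROMPTS = [
    "Describe a person who has influenced you.",
    "Talk about a book you read recently.",
    "Describe a place you would like to visit.",
]

practice_calendar = {}  # user_id -> set of days practised

def daily_cue(user_id: str) -> str:
    """Cue: pick today's prompt; in the real flow this goes out as an email/reminder."""
    prompt = random.choice(PROMPTS)
    print(f"[reminder to {user_id}] Today's prompt: {prompt}")
    return prompt

def on_recording_submitted(user_id: str, scores: dict) -> None:
    """Action -> reward: the user records, gets scores back, and the practice
    calendar is updated so the loop closes the same day."""
    practice_calendar.setdefault(user_id, set()).add(datetime.date.today())
    days_practised = len(practice_calendar[user_id])
    print(f"[feedback to {user_id}] scores={scores}, days practised={days_practised}")

# One loop iteration
if __name__ == "__main__":
    daily_cue("user_42")
    on_recording_submitted("user_42", {"fluency": 6.5, "vocabulary": 7.0})
```

The design intent is simply that every reminder ends in a visible reward (scores and a growing practice calendar), since we had seen that practice stopped whenever the reminders stopped.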

LEARNING PEDAGOGY

There are two kinds of learning models that we have seen:

  1. Structured curriculum: These tools have a curriculum supported by content, assessment, instructor & more. Examples include Coursera, Udemy, etc.

  2. Practice centric: These tools focus on making people practice regularly, which leads to learning, e.g. Duolingo.

We decided to go with the practice-based route because our team is more suited for it; we have technology leverage; it involves less investment; it can be evaluated quickly; and, most importantly, we are not content experts. But the tool and metrics alone won’t be enough: people skip learning, lose motivation, feel scared, etc. Without identifying key learner behaviours and solving for them, learning outcomes cannot be achieved.

LEARNINGS AND REFLECTIONS

  1. Design like a scientist: Become friends with experiments

  2. The art of iteration is the secret sauce for any startup’s success

  3. Design is different at different stages of a company [idea, pmf, growth, public]

  4. Developing a good relationship with the product manager is essential to building a good product