The blocks below describe the setup of your user feedback survey and show the results that you’ll receive.
We aim to offer you fast and affordable user feedback. To achieve this, we recruit real people from online panels (such as Amazon Mechanical Turk) who match the demographics you specified. We then ask each of these testers to view the URL you’ve provided and answer several questions about it.
Once the desired number of high-quality responses has come in, your feedback is compiled into a comprehensive report and sent to your email inbox (so you don’t need to create an account).
User feedback can be used to achieve different website optimization goals. For instance, your SEO specialist might be looking for ways to reduce the bounce rate on a critical landing page, while your PPC specialist is more interested in improving the Quality Score of an AdWords campaign.
Finding out what actual people have to say about your website or landing pages can help you reach those goals, or serve as input for further UX research.
Our experience shows that the best user feedback comes from asking testers specific questions related to your goal.
For example, a question like “What’s your first impression of this page?” rarely provides novel insight for a customer whose goal is getting new A/B testing ideas. It does, however, often provide valuable insight for customers who want to reduce their homepage bounce rate or learn how a redesigned product page comes across at first glance.
Number of people
Deciding on the optimal number of testers in a usability study can be complex because of the many parameters you should take into account.
To help you with this process, we allow you to choose between a small, medium or large set of testers. Using more testers increases the likelihood of getting new insights, raises the level of reliability and improves the chances of impressing your client or boss.
Ideally, you would recruit testers from your own website and have them answer your questions, since that yields the most representative feedback. However, only a very small percentage of website visitors is willing to participate, so this approach requires high traffic volumes.
That is why we allow you to select your desired tester demographics. While these testers might not be your actual website visitors, their demographics will closely match them, and you don’t have to do any recruiting or processing yourself. Keep in mind, though, that the more specific your demographic requirements, the longer it may take to get your results.
For each of the participants, their individual responses to all the questions will be included in the report. These responses are then grouped per question, ready to be included in a presentation or research document.
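Conceptually, the per-question grouping works like the following minimal Python sketch. The tuple layout and field names here are illustrative assumptions, not the product’s actual data schema:

```python
def group_by_question(answers):
    """Group individual tester answers per question, as in the report.

    `answers` is a list of (tester_id, question, response) tuples;
    this schema is a hypothetical example for illustration.
    """
    grouped = {}
    for tester_id, question, response in answers:
        grouped.setdefault(question, []).append((tester_id, response))
    return grouped

answers = [
    (1, "First impression?", "Looks trustworthy"),
    (2, "First impression?", "A bit cluttered"),
    (1, "Would you buy here?", "Yes, prices seem fair"),
]
report = group_by_question(answers)
print(len(report["First impression?"]))  # two testers answered this question
```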
You can use these individual responses to work towards the goal that you set out to achieve with the user feedback (such as reducing your bounce rate or getting new A/B testing ideas).
The individual responses (especially for a larger number of testers) can sometimes be a lot to digest. That is why, in every report, we include a visual word cloud for each of the questions. In it, words that appear more often in the feedback are displayed larger, making them stand out.
All the user feedback for a particular question will be included in its word cloud, so it can serve as an effective way of summarizing the feedback.
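The sizing idea behind a word cloud boils down to counting word frequencies across all responses to a question. A minimal sketch in Python (the stop-word list and sample responses are illustrative assumptions):

```python
from collections import Counter
import re

def word_frequencies(responses):
    """Count how often each word appears across all responses.

    A word cloud renderer can scale each word's font size in
    proportion to these counts. Common stop words are filtered
    so the cloud highlights meaningful terms.
    """
    stop_words = {"the", "a", "an", "is", "it", "of", "to", "and", "i", "but"}
    words = []
    for response in responses:
        words += [w for w in re.findall(r"[a-z']+", response.lower())
                  if w not in stop_words]
    return Counter(words)

responses = [
    "The checkout felt slow and confusing",
    "Checkout was slow on my phone",
    "Great design, but the checkout is confusing",
]
freq = word_frequencies(responses)
print(freq.most_common(3))  # "checkout" appears in every response, so it dominates
```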
We ask each of our testers for demographic data, including their gender. This data shows you the split between men and women in the random set of testers that provided feedback on your website. The grouped data is visualized in a clear pie chart for a quick overview of the gender split.
Keep in mind that you can specify the gender of the testers so that you only receive feedback from men or from women. You can do this in the demographics section of your order.
Age is another piece of demographic data we collect from our user feedback panel. With it, you can see which age groups were most represented in our random sample of testers. The ages are bucketed into age groups and plotted on a chart to provide a clear visual overview.
Keep in mind that you can restrict testers to a specific age group, so you only get feedback from that group. You can include this requirement in the demographics section of your order.
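The bucketing step can be sketched as follows. Note that the bracket boundaries below are illustrative assumptions, not the exact groups used in the report:

```python
def age_group(age):
    """Map an individual tester's age to a reporting bucket.

    The bracket boundaries are hypothetical examples.
    """
    brackets = [(18, 24), (25, 34), (35, 44), (45, 54), (55, 64)]
    for low, high in brackets:
        if low <= age <= high:
            return f"{low}-{high}"
    return "65+" if age >= 65 else "under 18"

def age_distribution(ages):
    """Count testers per age group, ready to plot as a chart."""
    counts = {}
    for age in ages:
        group = age_group(age)
        counts[group] = counts.get(group, 0) + 1
    return counts

print(age_distribution([22, 29, 31, 41, 67]))  # {'18-24': 1, '25-34': 2, '35-44': 1, '65+': 1}
```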
Net Promoter Score
Finally, we ask each tester to answer the following question on a scale of 0–10: “How likely are you to recommend this website to a colleague or friend?”
These scores determine your Net Promoter Score: testers who answer 9 or 10 count as promoters, those who answer 0 through 6 count as detractors, and the NPS is the percentage of promoters minus the percentage of detractors. This is a popular metric that is often included in User Experience (UX) research documents across the globe, used as a proxy for gauging customers’ overall satisfaction with the tested product or service.
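To make the conventional computation concrete: the standard NPS formula classifies each 0–10 response as promoter (9–10), passive (7–8), or detractor (0–6), then subtracts the detractor percentage from the promoter percentage. A minimal sketch (function name and sample scores are illustrative):

```python
def net_promoter_score(scores):
    """Compute NPS from 0-10 likelihood-to-recommend scores.

    Respondents scoring 9-10 are promoters, 0-6 are detractors
    (7-8 are passives and don't affect the score); NPS is the
    percentage of promoters minus the percentage of detractors,
    yielding a value between -100 and +100.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

scores = [10, 9, 8, 7, 6, 10, 3, 9, 8, 10]
print(net_promoter_score(scores))  # 5 promoters, 2 detractors out of 10 -> 30
```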