A follow-up answer with a reply here might be motivating for past and future participants, if you find this kind of example valuable on your side.
For example:
what impact the participants' input had on the product change that was ultimately shipped,
or whether it was dropped again because the results were too divergent,
or …
I think you get my train of thought. Just my 2 cents, no reply needed.
On Jan 30 @ 9am, you posted saying you’d like us “to participate to this short experiment”, “to make sure our products offer the best user experience possible”.
You closed the study on Feb 1 @ 5:30am. By my count, that makes the call for participants and the entire study window less than 48 hours.
I have to wonder: how many people actually participated in your study? What was your sample size? How does it compare to your user base? Do you believe it is actually representative of your user base?
Let’s say this was a first, tentative test of the concept, methodology and tools. Because this was a test of the tools, it had to be managed a bit more manually and intensively than would probably normally be the case.
In short, this was a bit of an experiment. Hopefully we’ll have something to report back soon on how the experiment went.
I’m sorry to hear that you couldn’t take part in this study. As Ann said, this was a first experiment and the test was conducted during quite a short time.
We conducted a short test from January 30 to February 1. Some of you requested information about the test results. I’m really happy to see that this experiment generated such interest within the community.
This research study was an experiment and we are aware of its biases.
Here are the main aspects and results of this test:
Goal of the research study:
The main goal of this First Click Testing was to understand where people expect to change the settings of the New Code Period.
Participants:
41 members of the community took part in this experiment. This is awesome!
Key findings:
The results were inconclusive and we won’t take them into account to build our design. Most importantly, some aspects of the research methodology make the validity of the data collected questionable.
18 users expect to find this feature under the Administration tab.
13 users expect to find this feature on the Activity page (8 clicked on the Activity tab and 5 clicked on Activity in the right sidebar of the overview page).
Thanks to everyone who participated in the test and for your interest in this experiment.