Challenges faced with Continuous Experimentation

One of ESE’s research focuses and core competencies is introducing and conducting continuous experimentation with companies that develop software-intensive products. Building on this competency, we are working on a continuous experimentation handbook to guide companies in carrying out continuous experimentation. To gather evidence of what people would like to see in the handbook, we conducted a small survey among fellow researchers and company representatives in December 2015. Respondents were asked to rank the importance of five common challenges in continuous experimentation, from 1 to 5, with 1 being the biggest challenge and 5 the smallest. The challenges are: (A) Finding the right hypothesis, (B) Designing the right experiment, (C) Getting the right usage data, (D) Integrating experimentation and delivery, and (E) Changing the organizational culture.

In total, we received 28 responses to the survey. Based on the responses, we found that the majority of respondents considered challenge E, Changing the organizational culture, to be the biggest one.

In addition, we found that the most frequent ranking, from the biggest challenge to the smallest, was as follows:
E: Changing the organizational culture
A: Finding the right hypothesis
B: Designing the right experiment
C: Getting the right usage data
D: Integrating experimentation and delivery

We also computed pairwise correlations between the rankings of the five challenges and identified the following:

       A      B      C      D      E
A   1.00  -0.37   0.04  -0.47  -0.18
B  -0.37   1.00   0.28  -0.12  -0.48
C   0.04   0.28   1.00  -0.52  -0.63
D  -0.47  -0.12  -0.52   1.00   0.02
E  -0.18  -0.48  -0.63   0.02   1.00
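
For readers who want to reproduce this step on their own data, the snippet below is a minimal sketch (not our original analysis script) of how such a pairwise correlation matrix can be computed with Python and pandas. The file name survey_rankings.csv and its layout, one row per respondent with columns A to E holding the ranks 1 to 5, are assumptions made for illustration.

import pandas as pd

# One row per respondent, one column per challenge (A-E), values are ranks 1-5.
rankings = pd.read_csv("survey_rankings.csv", usecols=list("ABCDE"))

# Pairwise correlations between the rank columns, rounded for display.
print(rankings.corr().round(2))

Note that corr() uses Pearson correlation by default; for rank data, Spearman correlation (rankings.corr(method="spearman")) would be an equally reasonable choice.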

If we set the cutoff at 0.6, we can see that challenges C and E are negatively correlated: respondents who ranked “Getting the right usage data” as a big challenge tended to rank “Changing the organizational culture” as a small one, and vice versa. This could be interpreted as follows: there are two groups of respondents, those who tend to focus on “organizational” concerns (for instance, managers) and those who tend to focus on “technical” concerns (for instance, developers).

On the other hand, if we lower the cutoff to 0.5, challenges C and D are also negatively correlated: respondents who ranked “Getting the right usage data” as a big challenge tended to rank “Integrating experimentation and delivery” as a small one, and vice versa. This could be interpreted as follows: in addition to those who focus on “technical” concerns, there might be another group that tends to focus on process (for instance, DevOps).
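
To make the cutoff reasoning above concrete, here is a small, self-contained Python sketch (again illustrative rather than our original script) that hard-codes the rounded correlation matrix reported above and lists the challenge pairs whose correlation magnitude reaches a given cutoff.

import itertools
import pandas as pd

challenges = list("ABCDE")
# The correlation matrix reported above, rounded to two decimals.
corr = pd.DataFrame(
    [
        [ 1.00, -0.37,  0.04, -0.47, -0.18],
        [-0.37,  1.00,  0.28, -0.12, -0.48],
        [ 0.04,  0.28,  1.00, -0.52, -0.63],
        [-0.47, -0.12, -0.52,  1.00,  0.02],
        [-0.18, -0.48, -0.63,  0.02,  1.00],
    ],
    index=challenges,
    columns=challenges,
)

def pairs_above_cutoff(matrix, cutoff):
    """Return off-diagonal challenge pairs whose |correlation| >= cutoff."""
    return [
        (a, b, matrix.loc[a, b])
        for a, b in itertools.combinations(matrix.columns, 2)
        if abs(matrix.loc[a, b]) >= cutoff
    ]

print(pairs_above_cutoff(corr, 0.6))  # [('C', 'E', -0.63)]
print(pairs_above_cutoff(corr, 0.5))  # adds ('C', 'D', -0.52)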

We would love to hear your thoughts! Let us know how you would rank the five challenges and/or whether there are other challenges that you have faced.
