University of Helsinki hosted a Summer Institute in Computational Social Science, following the format of one week of instruction and one week of student project work. The instruction week followed Coding Social Science, the textbook developed by Matti Nelimarkka, and covered research ethics, validity and reliability questions, data science, network analysis, simulation models and interactive systems for social science research. We ended the second week already on Thursday to accommodate Midsummer – a Nordic holiday on Friday.
Pre-activities
We ran targeted advertisements for Finnish and Nordic audiences via list servers. Overall, we received a modest number of applications across disciplines: computer science, physics, communication and media studies, and political science. We accepted a total of twenty participants, about half from Finland and the rest mostly from elsewhere in Europe.
In the spirit of a flipped classroom, students were expected to read seven chapters of Coding Social Science and to self-study the basics of R using Coding Social Science, the SICSS online materials or other online resources. We tried to communicate the expected skill level via a set of exercises students should complete before SICSS started. However, based on feedback survey responses, it seems that about half of the respondents did not feel sufficiently prepared for the instruction and project weeks, citing lack of time or noting that they should have read more and practised programming more.
Action point: This year the pre-activities were laissez-faire: we did not provide a detailed schedule, nor did we require, for example, one-minute essays or the return of coding exercises for evaluation and assessment. However, this approach places a significant burden on participants’ self-regulation skills to ensure they prepare sufficiently. This could partly be addressed with smaller interventions, such as a mandatory diagnostic test for all participants, both communicating the expected skill level and allowing participants to self-assess the amount of work they ought to do. Alternatively, the learning activities could be spread over about one month before SICSS, with more instructor-driven assessments.
Action point: I also observed that participants did not use the global SICSS TA channel to ask questions beforehand, nor did they contact me about programming problems or conceptual and theoretical issues. I am not sure whether this relates more to a generally high barrier to contacting people – especially for novices (at least in the software industry; Begel & Simon, 2008) – or to poor communication about the availability of these resources.
Beyond knowledge and skill improvement, we know from previous years that SICSS is also about building a community of like-minded scholars. To support this, we organised three voluntary one-hour meetings before SICSS Helsinki and invited participants to share their photo and a short bio in the SICSS Slack. About half of the participants joined these activities. All participants who answered the final survey indicated that they felt welcome at SICSS Helsinki, which is a critical part of an online activity like this.
First week: instruction
The instruction activities assumed participants had already familiarised themselves with the corresponding chapter and mostly focused on putting the theoretical ideas into use. We used group activities to explore how various concepts materialise in research (for example, for network analysis we examined what could be seen as a network; in data science we identified potential data sources for empirical questions, etc.). Students were also expected to modify and expand brief tutorial code snippets demonstrating the implementation of the methods in R. Originally we attempted a more group-work-based approach to the programming exercises (in the hope that peer support would be better organised), but following students’ feedback we modified the process to allow students to work either solo or together, with dedicated Zoom breakout rooms for each part of the exercise and a separate space for solo workers (even though participants rarely moved between these spaces; i.e., people either went to the solo workspace or stayed in the room for Exercise 1).
Participants’ feedback highlighted occasional lack of clarity about the learning activities and their goals, and that the level of difficulty increased too quickly relative to their technical skills. Participants also noted that I did not teach much during these activities, but rather facilitated and provided materials and comments.
Action point: I will revise the exercises, both in the book and in the programming snippets, to ensure the exercises are clearer and better indicate their difficulty (and that there are even more beginner-friendly exercises). The programming snippets could be supplemented with voice-over tutorials to make them more accessible, and with references to the book where possible.
Reflection: I think the root cause of students’ observations on the level of difficulty is the heterogeneity of the student population: some have extensive prior programming experience while others are still beginners. This is a difficult challenge to tackle. The heterogeneity can be addressed via more formal pre-activities, bringing the minimum skills to a certain level. However, even then there will always be people with prior programming experience who will complete the programming activities more easily. I do not think the problem can be solved by catering to these different audiences with different exercise sets, as this would break the shared experience of SICSS. I think I need to revise the coding exercises once again to also ensure they communicate the intent: gaining some hands-on experience with these methods through toy examples.
Second week: group projects
We used the Thursday and Friday afternoons of Week 1 to discuss student projects, allowing participants to elaborate their research interests and discuss them in groups. We also conducted a speed-dating activity where pairs or small groups discussed what kinds of projects they were interested in – and got to know each other better. Based on participant feedback, participants enjoyed their groups (comments from the feedback survey: “My team worked very well together, and the collaboration was a pleasant experience.”, “My group worked perfectly as we were on the same level and had the same aims for the project. Our collaboration was wonderful, and it made the project a lot of fun to do despite problems.”).
Each day we had two check-ups with the teams (about ten minutes per team), covering what had been done, what was expected to be done, and any issues, prompted by questions such as “what would you do differently tomorrow?” and “does the team work efficiently in your opinion?” The overall aim of these meetings was to ensure groups communicated with each other, to give the teaching team an overall perspective on how the remote teams were functioning (and the opportunity to mitigate any conflicts early on), and to spot major showstopper issues requiring intervention.
Student feedback indicated that this strategy was not appreciated by all participants, as they felt the meetings were useless and took time away from solving actual problems. However, for me they provided an opportunity to follow the group projects, and they served as a venue to support reflection during the intensive work. As I said on several occasions (and to each group if issues emerged), I was available to help teams during the day if needed and followed Slack for any issues – but I was rarely contacted for support.
Action point: Formalise more clearly the structure and purpose of the check-up meetings as an avenue to support reflection, for example with small pre-assignments focused on group dynamics, project process, and key learnings and insights.
Students also highlighted that the ten minutes for presentation and five minutes for discussion in the final presentations were insufficient, as there were more learnings to be shared. As always, the timing was a balancing act between the number of groups and the amount of time we can expect to keep people on Zoom. However, we could examine opportunities for sharing somewhat wider documentation (such as an extended slide set and any supplementary material) to allow offline engagement with this extended material, keeping the ten-minute slots as quick introductions to it. On the other hand, summarising key ideas in a brief period of time is a key skill in academia, so the constraint is a beneficial learning experience at the same time.