Context
The collaborative problem solving (CPS) teaching and assessment activity described here is set in a business environment, specifically the market research services industry. The activity is designed for someone responsible for managing or implementing a typical quantitative market research project, where data is collected from large samples using survey forms. For the purposes of this assignment, let us focus on three key roles that exist in a market research organization:
- Client servicing: Responsible for securing new business and managing client deliverables (submitting proposals, designing survey forms, interpreting data, writing the report and overall project management)
- Operations: Responsible for data collection (recruiting survey participants directly or through vendors who collect data on their behalf, scripting the questionnaire, providing incentives and checking data quality)
- Data Processing & Analytics: Responsible for collating all the data collected, cleaning the data if required, generating the tables and statistical models used for interpretation, and providing these to the client servicing team
Each of these roles is unique and requires specialized skills, and successful implementation demands full participation and extensive collaboration from all team members. Though the process is the same for most projects, the research objectives vary in scope and complexity, and the outcome is uncertain.
Assumptions
As we are dealing with graduates or postgraduates (though in the early stages of their careers), the participants are assumed to have the requisite ICT skills to use the online collaboration platform.
Objectives
The objective of this activity is to train employees in market research project management and to assess them on both the quality of the research report delivered and their collaborative problem solving skills. As all members working on a project bring very different skills and have distinct roles to play, collaboration is fraught with the risks of inadequate understanding of each other's responsibilities and of communication gaps. Nevertheless, the extent of collaboration does impact the quality of the final research report.
Activity
The teaching and assessment are designed as a role play activity within an office environment, carried out over a period of three to four weeks using an online collaboration platform that also allows for project management.
The training is designed for employees in their first year of employment and is structured as a group project, with each group comprising three members representing the three roles described earlier. The training can be conducted by a senior staff member who doubles up as the “Client”. A trainer should be able to work with 4-6 groups in a training session, i.e. 12-18 employees in total.
The rationale for the extended period of training is to simulate real work scenarios: giving trainees time to interact with the client, source market information, design a questionnaire, process data, write the report and use the online collaboration platform to implement the project.
An example of a platform that can be customized for such an activity is http://aikonlabs.com/how-it-works/.
The customized online collaboration platform records all the activities of the trainees as they are tasked with executing a market research project. All activities of the market research project need to be carried out, barring actual data collection by the operations team. The need for data collection can be eliminated by basing the training on the specifications of an old project for which data has already been collected; the trainer then provides the raw data to the analytics team to generate the tables and statistical models. The trainer can adjust the complexity of the project to the level anticipated for employees in junior to middle management. Teams can be assigned mentors in case they need guidance on the technical aspects of the project.
The trainer must provide challenges
that necessitate interaction between the three roles in each group and are
representative of those encountered by the employees on an ongoing basis as a
part of their work. Some examples of such challenges could be:
- Summary analysis halfway through the project
- Additional analysis at the end of the project
- Reduction in the cost of data collection
- Modification in the sample design
- Poor response to the survey
- Errors in the data collected
Assessment
A CPS assessment rubric has been designed for this activity; it takes into account both the social and cognitive dimensions and relies on peer and self-assessment. Some components of the rubric are specific to certain roles in the organization (and hence in the assignment), and this has been indicated.
The online collaboration platform allows trainees to rate their own contributions and those of their colleagues in terms of usefulness by clicking vote buttons. The trainer (representing the client) also rates all the trainees. The proposed voting scale is as follows:
This contribution is…
+2: vital to the successful implementation of the project
+1: valuable and improves the quality of delivery of the project
 0: somewhat useful
-1: not useful
-2: misleading or of no relevance to the project
The self-assessed score for each contribution is multiplied by 3 and correlated with the total peer-assessed score (as there are three assessors for each trainee, including the trainer). The total peer-assessed score, the self-assessed score and the correlation for each trainee are made visible in real time, adding an element of gamification; the detailed votes for each contribution are revealed only at the end of the assignment. A high positive score combined with a high correlation between self- and peer-assessment indicates that the trainee's performance is aligned with the expectations of the other team members. The rationale for not sharing detailed votes is to avoid conflict and finger pointing in the midst of training, while the overall score offers encouragement and an opportunity to reflect in the event the scores do not meet expectations.
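To make the scoring mechanics concrete, here is a minimal sketch in Python (the source does not prescribe any implementation; the data, variable names and use of the standard-library Pearson correlation are all illustrative) that computes one trainee's total peer-assessed score, the self-assessed score scaled by 3, and the correlation between the two across contributions:

```python
from statistics import correlation  # Pearson correlation, Python 3.10+

# Illustrative votes for one trainee on the -2..+2 scale.
# Each inner list holds the votes of the 3 assessors
# (two fellow trainees + the trainer) on one contribution.
peer_votes_per_contribution = [
    [2, 1, 2],   # contribution 1
    [1, 0, 1],   # contribution 2
    [0, -1, 0],  # contribution 3
    [2, 2, 1],   # contribution 4
]
self_votes = [2, 1, 1, 2]  # the trainee's own vote on each contribution

# Total peer-assessed score per contribution (sum of the 3 assessors' votes).
peer_totals = [sum(votes) for votes in peer_votes_per_contribution]

# Self-assessed score multiplied by 3 so it is comparable with the peer total.
self_scaled = [3 * v for v in self_votes]

# Overall scores shown in real time on the platform.
total_peer_score = sum(peer_totals)
total_self_score = sum(self_scaled)

# A high positive correlation indicates the trainee's view of their own
# contributions is aligned with the expectations of the other assessors.
alignment = correlation(self_scaled, peer_totals)

print(f"Total peer-assessed score: {total_peer_score}")
print(f"Total self-assessed score (x3): {total_self_score}")
print(f"Self-peer correlation: {alignment:.2f}")
```

Run with illustrative data like the above, the sketch prints the two totals and a correlation close to +1, which is the pattern of alignment the activity rewards.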
The trainer (or
2-3 mentors) will review the chat history and assess the performance of the
trainees on a rubric designed for measuring social and cognitive dimensions.
Thus, the assessment is quick, draws on multiple sources (self, peer and expert views), and can be benchmarked across the entire batch of trainees and compared with historical data. The assessment is expected to promote learning by making the trainees' thinking and actions transparent via the online platform. Areas of strength as well as areas for improvement can be identified for each employee at the end of the training.
As each group
submits the final research report to the “client”, the report is evaluated on
the following parameters by the trainer (and 2-3 mentors):
- Meeting the research objectives by answering the client’s business issues
- Timeliness of delivery
- Quality of presentation
- Quality of the report submitted
In providing feedback, the trainer should juxtapose the quality of the report with the CPS assessments of the individual members of the group to pinpoint the presence (or absence) of collaborative actions that contributed to the success (or failure) in delivering a project report that meets client expectations.
Given the workplace setting, we are not looking for a one-off performance but a habit-forming change in the workplace. The assessment brings to the fore aspects of performance that will impact the delivery of real business projects. Providing feedback for the improvement of specific collaboration skills makes the assessment authentic. Once the organization has identified each learner's level of CPS skills, subsequent learning needs can be identified. The training can be repeated annually for entry- and mid-level executives.