Practical Application

Bridging Instructional Design and Active Learning

Summary

While working at the Center for Learning and Academic Student Success (CLASS) at Syracuse University, I learned that the center offers free tutoring in about thirty science courses and is currently piloting the second phase of an Academic Coaching program for first-year students built on research-based study strategies. The center also assists students and faculty in Academic Integrity cases. CLASS has existed for two years, striving to help students succeed in these areas at Syracuse University. Because the center is relatively new, many of its pieces are still being planned, revised, and evaluated for improvement.

During my time at the center, I helped develop Blackboard organizations for tutors, coaches, and students; edited curriculum documents; created and edited videos and presentations; and graded the Academic Integrity Seminar. Among my daily assignments, I had the opportunity to gain a deeper understanding of the AI seminar, study its content to evaluate how well it functions, and develop an improvement plan. Below is a description of the Academic Integrity Seminar's components and functions, details about the performance gap findings, and possible solutions.

Context Description

The Academic Integrity seminar was created as an essential component of the academic integrity case process, with the intent of educating students who have violated the Syracuse University Academic Integrity policy about that policy. The AI seminar was developed about three years ago by a group of IDD&E students under the supervision of Professor Kozaltka, using the information provided and an older model that the office had used under a different supervisor.

The AI seminar allows students to gain an in-depth understanding of the Syracuse University Academic Integrity Policy, review content that reinforces their knowledge, interact with the material through activities, and evaluate themselves and their case. The hope is that students who participate in the seminar can use the information provided to support their decision making in future years as students and, later on, in their professional lives. The office expects most students who register for the seminar and start an attempt to earn a passing grade within a month.

Academic Integrity Seminar Specifics:

In a brief introduction, the seminar provides general information about the parts and required activities students will complete, as well as what is expected to earn a passing grade. However, it does not include specifics such as the seminar's goals, learning objectives, grading system, or the exact scores required to pass. The seminar is entirely asynchronous, and some sessions are timed. It consists of five main sessions and related activities, as described below:

  1. The Syracuse University Academic Integrity Policy session aims to teach students what they need to know to avoid any type of plagiarism violation. This session allows students to explore the Academic Integrity Policy documents in detail, learn how cases are recorded, reviewed, and decided, and obtain more information about the different types of academic violations, sanctions, and classifications. The activities for this session are based on reading assignments.
  2. The Academic Integrity Policy Case Study I session includes readings on what constitutes plagiarism and how to avoid it. This part prompts students to dive into more complex information and clear explanations about plagiarism and how it relates to the SU academic integrity policy. Once students finish this section's readings, they complete a five-question activity that includes multiple-choice and short-answer questions.
  3. The Academic Integrity Policy Case Study II session exposes students to three case-study videos depicting major academic integrity violations, prompting them to reflect on what is happening in each situation and answer questions that draw on their analysis, understanding, and application of knowledge for each case. At the end of this session, students also answer a true/false pledge question indicating that they completed the exercise on their own.
  4. The Academic Integrity Policy Test assesses students' understanding of the University's Academic Integrity policy, drawing on all the previously presented information and review exercises. The test contains a series of true/false questions, is timed (one hour), can be taken multiple times, and requires a score of 100% to pass. This piece also includes a true/false pledge question indicating that students completed all the exercises on their own.
  5. The Personal Reflection Essay session gives students the opportunity to submit thoughtful ideas about their case to the assignment dropbox and to evaluate their writing against an explicit rubric. The first two sessions of the seminar are essential groundwork for the reflection paper, since students are directed to think about their case during the required readings in those sessions.

Application of IDD&E

Although the seminar contains relevant information, meaningful exercises, and assessments, after six months of evaluating the Academic Integrity (AI) Seminar using the ADDIE model I found a discrepancy between students' performance and what the AI staff expects. Given that the office has grown, expectations have shifted, and SU's population of international students has increased, there is a clear indicator of a performance gap. The expectation is that at least 95% of students who register and attempt the AI seminar earn a passing grade of 210 to 220 points by the end of the month in which they start an attempt. Data from the last six months show that the share of students who finish the seminar each month, out of those who register and attempt it, never exceeds 34% and is as low as 16%.

To investigate the primary causes of this performance problem, I performed a limited review of the seminar's instructional characteristics, collecting data from students' email inquiries, Blackboard LMS reports on student engagement and interaction with the content, assessment results and error reports, and interviews with the AI seminar grading staff. This process revealed that the instructional design of the AI seminar is failing to meet current students' instructional needs, resulting in a performance gap in resources, motivation, and knowledge. A deeper evaluation then identified other factors contributing to the gap: the staff was not following the established grading rubric, and graders differed in how they scored the seminar. These discrepancies were affecting students' final seminar grades.

In other words, on average only 25% of students who registered and started an attempt to complete the AI seminar earned a passing grade, against the 95% expected.
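As a rough check of that figure (my own estimate from the numbers reported above, not the office's exact records): averaging the reported monthly low and high completion rates gives (16% + 34%) / 2 = 25%, roughly 70 percentage points below the 95% target.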
Suggestions were made to the office manager for a few significant changes to the design and content structure of the current seminar, so that students can perform better and engage more deeply in their learning during the time they spend in it. It was also suggested that the grading system be revised and aligned with the changes to the seminar's design and content, that professional development be provided for the grading staff, and that the SOP reference documents be revised.

Suggested Changes to the Academic Integrity Seminar: the goals are to

1- increase, over the next quarter, the percentage of students earning a passing grade within the given time (one month) of their first attempt at the AI seminar to 80%,

2- reduce the number of repetitive student questions caused by misleading or missing information in the AI seminar content,

3- provide students with AI seminar content that states clear grade expectations, goals, and learning outcomes to increase engagement with the seminar,

4- using the same content, modify some of the activities to follow Bloom's Taxonomy, where possible, to increase student learning engagement,

5- develop a transparent support system that motivates students to contact the office when they encounter problems in the seminar, to reduce the number of failed attempts,

6- provide the AI seminar grading staff with a revised grading rubric and SOP to minimize errors and grade the seminar effectively, and

7- provide the AI seminar grading staff and other staff with plain-language, meaningful professional development training to reduce the time spent grading, improve the student-staff communication and support system, and increase grading reliability.

These changes can be made within the office in collaboration with IDD&E staff and should require little, if any, adjustment to the budget. Resources such as documents and videos, as well as general content, should be reused; there is no need for new content development. A critical piece will be aligning the AI seminar goal with each section's learning objectives, assessments, and evaluations.

Knowledge Gains

This project helped me practice and refine seven instructional designer competencies from the International Board of Standards for Training, Performance, and Instruction (ibstpi) Instructional Designer Competencies guide (http://ibstpi.org/). I think it also helped me realize which competencies come to me naturally and which ones I still need to work on. Below is a reflection on the competencies I used, and in which context, during my participation in this project.

As I explained, I was allowed to become an Academic Integrity Seminar grader. Through the process of learning about the seminar, I realized that something was not going as the CLASS managers expected. That is when I felt the need to find out more about the instructional pieces of the seminar, which led me to competency 18, Revise instructional and non-instructional solutions based on data. This competency was influential in identifying possible flaws in this educational solution based on data compiled from the program itself, its modes of delivery, previous program evaluations, and outcomes. Indeed, I found myself reviewing the instructional pieces and the data I had access to.

The next competency I explored and used was number 6, Conduct needs assessment to recommend appropriate design solutions and strategies. This was an essential step in determining what the real needs were, through a more in-depth analysis that included stakeholder perceptions, performance gap details, and the resources available for a possible solution.

Then, for this particular project, number 7, Identify and describe the target population and environmental characteristics, was an essential step because SU's international community has grown and behaves very differently from when the AI seminar was first developed. This competency helped strengthen the suggested instructional solution with respect to current students' needs, the infrastructure that holistically supports the delivery of the instruction, and the alignment of the center's mission and values with the instructional solution.

Next came number 10, Use an instructional design and development process appropriate for a given project. Because the seminar had already been built and its functionality had never been evaluated, this skill helped me think about how the seminar was first conceived and what the expectations were at that point, in order to select the best possible modifications. This process guided my reasoning for the suggested solutions.

Once these points were clear, competency number 11, Organize instructional programs and/or products to be designed, developed, and evaluated, was essential for reorganizing the seminar content. It was also critical for mapping the entire sequence of the seminar and strengthening its learning and performance outcomes in line with the new students' demands.

This last process led to competency number 12, Design instructional interventions, which helped me align existing instructional strategies, goals, and learning outcomes while identifying possible new ways to increase interactivity, add feedback to improve motivation, and make reasonable language adjustments to accommodate current students' needs given the increase of international students on campus.

In this final step, competency 16, Design learning assessments, played a fundamental role in the seminar grading component of the project. It helped me go back and make sure that the learning process was aligned with the outcomes the office wants to see and that the activities used reliable grading methods.

I did not have the opportunity to finish the cycle by re-evaluating the changes, but I did have the chance to disseminate the plan and the rationale for each possible solution.

Final Reflection

At this point, I firmly believe that instructional design makes learning stronger, more enjoyable, challenging, rewarding, and targeted. When I first started in the program, I felt that I somehow already knew this place, but the way I saw it then, it was all disorganized. Now that I know it better, I can say that all the knowledge gained during the IDD&E program makes perfect sense. Before landing in this program, while developing lesson plans, I had many loose ends that I did not know how to tie together. I tried to teach myself how to develop a learning management system and how to prepare interactive, exciting lessons, but my understanding was incomplete.

Now it is all clear. Developing instruction is a meticulous and exciting process, full of endless possibilities, held together by one continuous thread. This clarity came from completing the IDD&E projects, through which I learned the importance of the ADDIE model in developing instruction, as I had wanted to do before. I feel I have a strong knowledge base, the resources, and the skills to succeed as an instructional designer and to keep improving.