Roles
The following individuals help manage and approve the IDS. As a general process, the IDS have been formulated through faculty and student data collection, constructed by a team of faculty members, and approved by the Deans Council.
In addition to the individuals who have defined and structured roles within IDS, there are groups and teams that collaboratively participate in the continued development and implementation of IDS.
Person, Role, or Group | Responsibility with IDS
Vice President for Academic Affairs | Manages and maintains the Academic Affairs division. Provides guidance and supervision for the maintenance of division directives that may impact the IDS.
Deans | Manage the revision and communal implementation of the IDS by providing ongoing feedback and assessing student and faculty needs.
IDS Core Team Leaders | Organize and oversee the IDS Core Team, which articulates and plans for the implementation of the IDS.
Lead IDS Coach | Executes the implementation of the IDS according to direction from the IDS Core Team. Leads, trains, and manages a system of IDS coaches built to support faculty.
IDS Core Team | A group of faculty and staff who collaborate on identifying course design and instructional needs at the college that may need to be addressed through IDS.
IDS Faculty Team | A group of faculty members who assist in articulating and revising the IDS to best meet the needs of both faculty and students.
IDS Coaches | A team of coaches who provide support for faculty as they process and actualize the IDS in their course design and teaching methods.
Program Managers and Department Chairs | A group of faculty who evaluate the consistency of IDS implementation and assess opportunities for ongoing improvement.
Relationships to Stakeholders
Within IDS, there are numerous stakeholders that benefit from maintaining discussions about student engagement, faculty confidence, and course design. These stakeholders are intrinsically connected through college systems and are not limited to the connections illustrated here.
Student Engagement: The agency afforded to students within instruction and content delivery
Faculty Confidence: Faculty's perception of the quality of their instruction
Course Design: The interface and presentation of course material
Key Terms and Documents
Best Practice: Recommended approaches and strategies for accomplishing tasks and meeting goals. In higher education, this is typically in reference to instruction and content delivery.
Class Calendar and Due Dates: A standardized document that assists students in anticipating current and future workload within a given course.
College Directive: An official instruction or guidance for how to operate in general professional contexts.
College Policy: Official expectations and procedures for how to operate in specific professional contexts.
Course Design: The intersecting practices and processes that oversee the development of the course environment and assignment sequence.
Delivery Method: The modality through which course content is delivered.
Faculty Cluster: Groups of faculty that collaborate on forming approaches to instruction and course design.
Faculty Confidence: Faculty’s perception of their quality of instruction and ability to deliver course content.
Feedback: A process of communication about a person’s performance on a given task or work toward a desired outcome.
Holistic Approach: An approach that accounts for a variety of contexts and circumstances both inside and outside of a professional work environment.
IDS Checklist: A document provided to faculty that prepares them for course review.
IDS Indicators: A document that outlines a list of characteristics that demonstrate an understanding of IDS.
Inquiry: A process through which individuals research information to problem solve and think critically.
Instructional Design Standards: A document that communicates course design expectations.
Instructor-to-Student Interaction: Instructor-initiated communication about students' progress in a given course.
Learning Management System: A system that facilitates course delivery and student engagement.
Learning Outcomes: Expectations for students’ demonstration of knowledge by the end of the course.
Literature Review: A document that provides a brief overview of current research in connection with the standards provided in IDS.
Pedagogy: A theoretical and philosophical approach to teaching.
Professional Development: A process for continuing reflection and revision on professional practices.
Reflection: An action through which a person situates themselves within a variety of contexts, such as personal, institutional, and global expectations and immediate circumstances.
Rubric: A document provided to program managers and department chairs to evaluate course design in their departments.
Stakeholder: Individuals or groups who have a stake in, or may be affected by, a change in policy or directive.
Student Access: The ability and potential for students in any given circumstance and living condition to participate in and engage with course content.
Student Engagement: The demonstration of students interacting with and completing course content and assignments.
Student-to-Student Interaction: Interactions, facilitated through course content and assignments, in which students communicate with and respond to one another.
Syllabus: A document, revised in response to IDS, that contains all information students need to understand course evaluation methods, course objectives, and course-specific policies.
Literature Review
Student Engagement
What We Know:
What We Can Do:
Faculty Confidence
What We Know:
What We Can Do:
Course Design
What We Know:
What We Can Do:
References
Bachelder, T., Duburguet, D., Simmons, J. L., King, G. G., & De Cino, T. J. (2019). The usability of an online learning management system in an aviation curriculum blended course design: A case study. Collegiate Aviation Review International, 37(2), 38-57.
Borup, J., West, R. E., & Graham, C. R. (2012). Improving online social presence through asynchronous video. The Internet and Higher Education, 15(3), 195-203. https://doi.org/10.1016/j.iheduc.2011.11.001
Cordie, L. A., Lin, X., Brecke, T., & Wooten, M. C. (2020). Co-teaching in higher education: Mentoring as faculty development. International Journal of Teaching and Learning in Higher Education, 32(1), 149-158.
Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., & Jardeleza, S. E. (2011). What we say is not what we do: Effective evaluation of faculty professional development programs. Bioscience, 61(7), 550-558.
Kim, S. W., & Lee, M. G. (2008). Validation of an evaluation model for learning management systems. Journal of Computer Assisted Learning, 24(4), 284-294. https://doi.org/10.1111/j.1365-2729.2007.00260.x
Lewis-Kipkulei, P., Singleton, J., Singleton, T. S., & Davis, K. (2021). Increasing student engagement via a combined roundtable discussion and flipped classroom curriculum model in an OT and special education classroom. Cogent Education, 8(1). https://doi.org/10.1080/2331186X.2021.1911284
Machajewski, S., Steffen, A., Fuerte, E. R., & Rivera, E. (2018). Patterns in faculty learning management system use. TechTrends: Linking Research & Practice to Improve Learning, 63(5), 543-549.
Malikowski, S. R., Thompson, M. E., & Theis, J. G. (2007). A model for research into course management systems: Bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 149-173.
Parks-Stamm, E. J., Zafonte, M., & Palenque, S. M. (2017). The effects of instructor participation and class size on student participation in an online class discussion forum. British Journal of Educational Technology, 48(6), 1250-1259. https://doi.org/10.1111/bjet.12512
Schnitzler, K., Holzberger, D., & Seidel, T. (2020). All better than being disengaged: Student engagement patterns and their relations to academic self-concept and achievement. European Journal of Psychology of Education. https://doi.org/10.1007/s10212-020-00500-6
Urban, E., Navarro, M., & Borron, A. (2017). Long-term impacts of a faculty development program for the internationalization of curriculum in higher education. Journal of Agricultural Education, 58(3), 219-238.
van Heerden, M., Clarence, S., & Bharuthram, S. (2017). What lies beneath: Exploring the deeper purposes of feedback on student writing through considering disciplinary knowledge and knowers. Assessment & Evaluation in Higher Education, 42(6), 967-977. https://doi.org/10.1080/02602938.2016.1212985
Watson, C. E. (2019). Faculty development's evolution: It's time for investment in higher education's greatest resource. Peer Review, 4-7.
Welch, M., & Plaxton-Moore, S. (2017). Faculty development for advancing community engagement in higher education: Current trends and future directions. Journal of Higher Education Outreach and Engagement, 21(2), 131-165.
Yueh, H. P., & Hsu, S. (2008). Designing a learning management system to support instruction. Communications of the ACM, 51(4), 59-63. https://dl.acm.org/doi/abs/10.1145/1330311.1330324
Objective:
During the first year of IDS implementation, at least 75% of faculty should participate in their cluster coaching and the data collection process to establish benchmarks for faculty confidence and perception of instruction.
Rationale:
One of the primary missions of IDS is to increase Faculty Confidence in instruction. The best way to assess faculty's perceptions and use of the resources formulated through IDS is to actively engage faculty in ongoing, reflective dialogue. Therefore, this objective will be met through faculty self-reporting their perceptions of their instruction in survey questions asked during the coaching process. Initial rounds of this data should act as a baseline for establishing a workable benchmark for future semesters of coaching and IDS revision.
Method for Measurement:
Using digital survey software, faculty will be prompted throughout their coaching experience to reflect on their instruction. These questions could be both qualitative and quantitative:
Qualitative data will be coded for themes of collaboration, cooperation, and community. Quantitative data will be used to juxtapose faculty perceptions with those held by students as well as the program managers and department chairs.
The data collected through this survey should be assessed periodically (three times a term) to establish trends in faculty responses and to assess whether participation in the coaching process correlates with positive perceptions of the classroom and of faculty instruction.
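As an illustration only, the following minimal sketch (Python) shows how the benchmark and correlation checks described above might be run once the survey software exports responses to a spreadsheet. The file name and column names (participated_in_coaching coded 0/1, confidence_rating on a 1-5 scale) are hypothetical placeholders, not fields defined by IDS.

import pandas as pd
from scipy.stats import pointbiserialr

# Hypothetical export from the survey software: one row per faculty respondent.
responses = pd.read_csv("faculty_coaching_survey.csv")

# Benchmark check: share of faculty who took part in cluster coaching and data collection.
participation_rate = responses["participated_in_coaching"].mean()
print(f"Coaching participation: {participation_rate:.0%} (objective: at least 75%)")

# Exploratory check: does coaching participation track with self-reported confidence?
r, p = pointbiserialr(responses["participated_in_coaching"], responses["confidence_rating"])
print(f"Point-biserial correlation r = {r:.2f} (p = {p:.3f})")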
Objective:
At least 75% of student respondents indicate that they were able to navigate their D2L courses consistently during the term.
Rationale:
While the mission of Course Design is easily assessable through the IDS rubric, the ultimate purpose of course design is to assist students in accessing their course content consistently. We could emphasize the following two characteristics:
The data could be collected through student self-reporting via survey software.
Method for Measurement:
Using survey software, students will be prompted to reflect on their experience navigating course content as presented in D2L. Because of constraints on student time and prior participation in surveys, this reflection will take the form of objective, scaled questions. Students may have the opportunity to self-articulate their experiences with course design at the end of the survey through an open-response question about the course in general. Likert scale questions could be:
The responses to these questions will reveal students' perceptions of course design and may be compared to the IDS rubric to locate disparities between students' perceptions and faculty's perceptions.
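As an illustration only, a minimal sketch (Python) of how the 75% navigation objective might be checked from a survey export. The file name, column names, and 1-5 Likert coding (5 = strongly agree) are assumptions for the example.

import pandas as pd

# Hypothetical export: one row per student response, with a 1-5 Likert rating.
survey = pd.read_csv("student_course_design_survey.csv")

# Treat 4 (agree) and 5 (strongly agree) as "able to navigate the course consistently".
consistent = (survey["navigation_rating"] >= 4).mean()
print(f"Students reporting consistent navigation: {consistent:.0%} (objective: at least 75%)")

# A per-course breakdown can then be set beside IDS rubric scores to locate disparities.
by_course = (survey["navigation_rating"] >= 4).groupby(survey["course_id"]).mean()
print(by_course.sort_values())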
Objective:
The initial objective for this aspect of the mission is data collection: gathering student responses, 75% faculty participation in cluster coaching and data collection, and 100% participation from program managers and department chairs in course assessment.
Rationale:
The most difficult mission to measure for IDS is Student Engagement. While we may not be able to measure engagement directly, we can collect data about perceptions of engagement in courses. This should occur through the triangulation of three demographics:
Methods for Measurement:
Students will be prompted to reflect on their interactions in the classroom through questions in survey software. Due to constraints on participation and time, this reflection will occur through Likert scale questions:
Faculty will be prompted three times a semester during the coaching process to reflect on their classroom environment through both qualitative and quantitative questions provided in digital survey software:
Program Managers and Department Chairs will assess instructors' courses throughout the term using the IDS rubric to determine if there are opportunities embedded in tasks and content delivery that are conducive to student-to-student and instructor-to-student interactions.
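As an illustration only, a minimal sketch (Python) of the triangulation step, assuming three separate exports (student survey, faculty survey, rubric assessments) that share a course identifier. All file and column names are hypothetical.

import pandas as pd

# Hypothetical exports keyed by a shared course_id column.
students = pd.read_csv("student_engagement_survey.csv")   # course_id, engagement_rating
faculty = pd.read_csv("faculty_engagement_survey.csv")    # course_id, engagement_rating
rubric = pd.read_csv("ids_rubric_scores.csv")             # course_id, interaction_score

# Average each group's view of engagement per course and place them side by side.
merged = (
    students.groupby("course_id")["engagement_rating"].mean().rename("student_view").to_frame()
    .join(faculty.groupby("course_id")["engagement_rating"].mean().rename("faculty_view"))
    .join(rubric.set_index("course_id")["interaction_score"].rename("rubric_score"))
)

# Large gaps between the student and faculty views flag courses for follow-up coaching.
merged["student_faculty_gap"] = (merged["student_view"] - merged["faculty_view"]).abs()
print(merged.sort_values("student_faculty_gap", ascending=False).head())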