3.3.1.1 Educational Programs, to Include Student Learning Outcomes

 

The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional effectiveness)

 

            3.3.1.1 educational programs, to include student learning outcomes

            3.3.1.2 administrative support services

            3.3.1.3 academic and student support services

            3.3.1.4 research within its mission, if appropriate

            3.3.1.5 community/public service within its mission, if appropriate

 

_X_  Compliance           ___  Partial Compliance          ___  Non-Compliance

 

Narrative

 

Mid-America Baptist Theological Seminary (MABTS) identifies expected student learning outcomes that are clearly defined and measurable. The institution’s assessment process guides all aspects of academic programming, and the annual institutional review is used to demonstrate achievement of educational program learning outcomes. Because of its size, the school gathers information from the entire population of faculty, staff, and students; sampling is not used. The institution uses various assessment instruments, such as rubrics, capstone exams, course evaluations, forums, and surveys, which lead to the improvement of educational program learning outcomes.

 

I. Identification of Educational Program Learning Outcomes

 

MABTS’s approach to assessment of educational programming is campus-wide, participatory, decentralized, and based upon the Strategic Planning model [1]. This philosophy rests on the assumption that assessment is more effective when developed and monitored by the academic departmental units providing the instruction. Students, faculty, staff, and graduates give input concerning educational programs through annual surveys [13], forums [14], and the MABTS program learning outcomes assessment models (PLOAMs) [15]. This assessment process supports the institution’s educational programs, and institutional effectiveness focuses on the design and improvement of educational experiences to enhance student learning.

 

The educational program planning process is best understood by first considering the history of the planning process and the methodological design it employs. A description of the PLOAMs and the procedure for their measurement and data collection helps in constructing clearly defined and measurable Student Learning Outcomes (SLOs), which ultimately leads to evidence of SLO and PLOAM assessment.

 

History of the Process

 

In the 2013-14 academic year, in response to SACSCOC feedback on an interim fifth-year report, the institution improved a seven-phase design, begun in 2005, for producing clearer and broader SLOs. The Christian Education department devised an assessment prototype and reported findings back to the Academic Council. The council, in turn, directed the Dean of the Masters and Undergraduate Programs to train all degree program coordinators in producing SLOs and identifying specific assessment criteria. All degree program coordinators then met with their departments to produce SLOs based upon the prototype given to them. They were then asked to submit the new SLOs and assessment criteria for review by departmental and outside reviewers. These data were analyzed by the Director of Institutional Assessment and returned to the degree program coordinators, who then completed columns 4 and 5 (Assessment Results and Use of Results) of the PLOAMs. As a result of implementing this process, the school was notified by SACSCOC that the process was acceptable for determining that the institution identifies expected outcomes, assesses the achievement of those outcomes, and gives evidence of improvement based on stated results for educational programming [16].

 

Phase One – Academic Council. The Academic Council met on September 20, 2013, to begin a discussion on improved ways to assess educational programming based upon SACSCOC Comprehensive Standard 3.3.1.1 [17]. The Academic Council comprises the Executive Vice President, the Academic Vice President, the Dean of the Doctor of Philosophy Program, the Dean of the Doctor of Ministry Program, the Dean of the Masters and Undergraduate Programs, and the Director of Institutional Assessment.

 

Phase Two – Prototype Production. Dr. Seal (Academic Vice President) tasked two of the Academic Council members, Drs. Thompson and Bickley, with creating a prototype to be used in all academic departments. Dr. Bickley, the Master of Arts in Christian Education (MACE) degree program coordinator, met with the Christian Education faculty (Drs. Thompson and Hickman) in room L108 on September 18, 2013, and the team formulated six SLOs in their PLOAM (column 2) by reviewing the goal statement for the degree (column 1) in the 2013-14 Academic Catalog and by using Bloom’s cognitive taxonomy verb list, course descriptions, and course syllabi. Upon completion of the prototype, Drs. Thompson and Bickley met with the Academic Council to discuss using the process in all academic departments.

 

Phase Three – Degree Program Coordinator Training/Departmental Meetings. Dr. Kirk Kilpatrick, Dean of the Masters and Undergraduate Programs, was tasked with instructing each degree program coordinator in the process of defining student learning outcomes (column 2) and identifying assessment criteria (column 3) with their departmental faculty. During October, he met with the degree program coordinators for training. They were encouraged to begin with the goal statement of their program (column 1 – Purpose and Goals) and to review the complete list of required courses for their degree and the corresponding syllabi. Once they reviewed these items, they were encouraged to use action verbs from Bloom’s cognitive taxonomy to craft succinct statements emphasizing assessable student learning outcomes based upon the degree program curriculum. In like manner, each degree program coordinator met with their department and sought to strengthen the stated SLOs.

 

Phase Four – Submission of New SLOs and Assessment Criteria. Upon completing the updated SLOs and assessment criteria, the program coordinators submitted degree program PLOAMs to the Director of Institutional Assessment, with column 1 (Purpose and Goals), column 2 (Student Learning Outcomes), and column 3 (Assessment Criteria) completed, during the first week of November 2013. She, in turn, compiled all artifacts (work product) corresponding to each degree program in a spreadsheet and submitted it to the Dean of the Masters and Undergraduate Programs for review. He then made minor revisions to the spreadsheet of artifacts to be used [18] and met with all faculty whose classes would be used as a basis for assessment criteria in improving student learning outcomes.

 

Phase Five – Rubric Production and Artifact Collection. To aid departmental faculty, the Academic Council produced two generic rubrics during the second week of November 2013 – one for papers [19] and one for projects [20]. A rubric for presentations was discussed, but the idea was abandoned because of the difficulty of reviewing so diverse a course requirement. Faculty were encouraged to personalize rubrics to fit particular course content. Upon completion of the rubrics, the Dean of the Masters and Undergraduate Programs (Dr. Kilpatrick) trained the faculty in the use and grading of both rubrics during the last scheduled faculty meeting of the year in December 2013. He also discussed the process of submitting rubrics and how they would be used for departmental and peer review. Faculty were asked to send two ungraded copies of the students’ papers (or written copies of projects) to the office of the Academic Vice President at the conclusion of the 2013 fall semester. The administrative assistant for the Academic Vice President collected all the items and prepared them for dissemination to the departmental and peer reviewers.

 

Phase Six – Departmental and Peer Review. On January 24, 2014, the departmental and peer review personnel met in the Betty Howard Special Events Room of MABTS at 8:30 a.m. A light breakfast was served and a brief time of instruction was conducted before the peer review process began at 9:00 a.m. Drs. Seal and Kilpatrick assigned each faculty member artifacts to score with either the project or the paper rubric. Some artifacts were scored within the department, and others were peer reviewed by faculty outside the department. The identity of each student was removed from each artifact to prevent scoring bias, and a Likert scale instrument was used. The scores were then transferred to a tally sheet for the analysis of findings. The last rubrics were scored by 1:00 p.m.

 

Phase Seven – Analysis of Findings. All artifacts, with the rubrics attached, were scanned and saved in .pdf format. The Director of Institutional Assessment (Dr. Bickley) created a spreadsheet with a tab for each degree program, and on each tab the percentage of scores for each rubric row was delineated. The findings were then emailed to all degree program coordinators on February 24, 2014, and the coordinators were encouraged to fill in column 4 (Assessment Results) and column 5 (Use of Results) on their degree program PLOAMs. The degree program coordinators were also asked to return their finalized PLOAMs to Dr. Bickley by March 3, 2014; she then used the data as source material for this narrative. Part of the data analysis was a comparison of the 2011-12 SLOs to the updated 2013-14 SLOs produced by this seven-phase process.
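To make the tabulation in this phase concrete, the following is a minimal sketch of how the percentage of scores for each rubric row could be computed for a degree program tab. It is illustrative only: the data layout, the “MACE” key, and the function name are assumptions, not the actual MABTS spreadsheet or its formulas.

```python
# Illustrative sketch only: tabulating, for one degree program, the
# percentage of artifacts receiving each Likert score on every rubric row.
# The nested-list layout and the "MACE" key are hypothetical.
from collections import Counter

scored_artifacts = {
    # program -> list of artifacts, each a list of rubric-row scores (1-5)
    "MACE": [[4, 5, 3, 4], [5, 5, 4, 4], [3, 4, 4, 5]],
}

def row_percentages(artifacts):
    """For each rubric row, return the percentage of artifacts at each score."""
    percentages = {}
    for row in range(len(artifacts[0])):
        counts = Counter(artifact[row] for artifact in artifacts)
        total = sum(counts.values())
        percentages[row + 1] = {
            score: round(100 * n / total, 1) for score, n in sorted(counts.items())
        }
    return percentages

for program, artifacts in scored_artifacts.items():
    print(program, row_percentages(artifacts))
    # MACE {1: {3: 33.3, 4: 33.3, 5: 33.3}, 2: {4: 33.3, 5: 66.7}, ...}
```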

 

This initial seven-phase process served as a prototype for the current two-phase, six-stage process described below.

 

Methodological Design

 

The institution uses an assessment methodology, built upon the PLOAM, with an appropriate balance of direct and indirect assessment measures. The seminary’s plan for continuous improvement of the model is illustrated below: the mission of the institution (column 1), the student learning outcomes based upon that mission (column 2), the assessment criteria based upon the SLOs (column 3), the assessment results based upon the assessment criteria (column 4), and the use of results based upon the assessment results (column 5). The model is effective in identifying expected learning outcomes in that it provides differentiated and accessible SLOs, a diversity of program assignments beyond course assignments and test results, a variety of assessment measurements, and greater reliance upon multiple means of assessing learning outcomes. A description of the two-phase, six-stage PLOAM is followed by an explanation of the measurement and data process.

 

Program Learning Outcomes Assessment Model (PLOAM)

2014-2015 Sample

Purpose & Goals (column 1)

Mission Statement: Mid-America Baptist Theological Seminary is to provide undergraduate and graduate theological training for effective service in church-related and mission vocations through its main campus and branch campuses.

Goal Statement: (blank in this sample)

Expected Outcome (column 2): (blank in this sample)

Assessment Criteria (column 3): (blank in this sample)

Assessment Results (column 4): (blank in this sample)

Use of Results (column 5): (blank in this sample)

Table 1. Sample PLOAM

 

Description of the Program Learning Outcomes Assessment Model (PLOAM)

 

At the end of each academic year, the purpose and goals (column 1) in the five-column model are reviewed, including the mission statement and the goal statement of the seminary. These are reexamined by the executive administration and trustees of the school during summer planning meetings and reaffirmed or changed. The expected outcomes (column 2) are reviewed by program coordinators for each degree program, and are evaluated based upon performance from the previous academic year. Program coordinators then formulate departmental assessment criteria (column 3) and show results of assessment (column 4) at the end of the academic year. The use of results (column 5) closes the loop in presenting student learning outcomes as it informs the purpose and goals (column 1) for the upcoming academic year.

           

The MABTS PLOAM is a two-phase, six-stage planning model which provides the institution with a means of assessing student learning outcomes. The process begins each year (column 1A) with the Trustees, who meet bi-annually, at their August meeting. As part of their agenda, they review and reaffirm the mission statement of the seminary [21]. Based upon this reaffirmation or change in mission, the school then begins to plan student learning outcomes for the upcoming academic year.

 

Phase One – Columns 1A, 1B, and 2. During faculty in-service, the week before the academic year begins in August, the Academic Vice President meets with the Academic Council (all academic program deans and the Director of Institutional Assessment) to discuss the reaffirmation of, or change in, the mission statement (column 1A) in conjunction with the fall trustee meeting. After reviewing the institutional mission with the Academic Council, each program dean (PhD, DMin, Masters, and Undergraduate) then meets with their department chairmen during faculty in-service and tasks them with formulating a goal statement (column 1B) for their respective departments based upon the PLOAM. These goal statements serve as a basis for student learning outcomes.

 

After meeting with the Academic Vice President, the department chairmen meet during faculty in-service with program coordinators (PhD, DMin, MDiv, MCE, MACE, MMICS, ADiv/AAS, ACE) to create expected student learning outcomes (column 2) for departments within each of the four programs. Careful consideration is given to constructing these expected outcomes, and the departmental goals are reviewed by the Academic Council to ensure they are both connected to the mission statement of the seminary and measurable. Columns 1A, 1B, and 2 are completed during in-service week before classes begin for the academic year.

 

Phase Two – Columns 3, 4, and 5. After graduation in May, program coordinators begin collecting data as a source of assessment criteria (column 3). They consult the office of the Registrar, individual professors, and academic and administrative offices for supporting documents to assess the measurement of student learning outcomes. As a result of this analysis, program coordinators determine whether the assessment criteria for student learning outcomes were met, and the results are then displayed as assessment results (column 4). Program coordinators then use the achievement or failure of assessment criteria for the use of results (column 5) and in devising student learning outcomes for the next academic year. Trustees then use the success or failure of these outcomes as a basis for adoption or revision of the mission statement in their bi-annual meeting, which closes the loop in the planning and assessment of student learning outcomes for the institution.
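As a concrete illustration of the met/not-met determination described in this phase, the minimal sketch below checks a target of the kind quoted later in this narrative (“80% of artifacts reviewed will meet or exceed expectations”) against a set of rubric scores. The 1-5 scale, the threshold of 3, and the sample scores are assumptions for illustration only.

```python
# Minimal sketch, assuming a 1-5 Likert scale where a score of 3 or higher
# counts as "meets or exceeds expectations." The threshold, target, and
# sample data are illustrative, not drawn from an actual MABTS file.

def criteria_met(scores, threshold=3, target_pct=80.0):
    """Return (met, pct): whether the share of artifacts scoring at or
    above the threshold reaches the assessment criteria target."""
    meeting = sum(1 for s in scores if s >= threshold)
    pct = 100.0 * meeting / len(scores)
    return pct >= target_pct, pct

met, pct = criteria_met([4, 3, 2, 5, 3, 4, 3, 2])
print(f"Target met: {met} ({pct:.0f}% met or exceeded expectations)")
# Target met: False (75% met or exceeded expectations)
```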

 

Measurement and Data

 

Achievement of student learning outcomes is expressed in the MABTS PLOAM data streams: 1) classroom assignments and educational software, 2) field experience, 3) student evaluations, 4) student surveys and forums, 5) a capstone and/or comprehensive examination, and 6) rubrics/peer review.

 

Classroom Assignments and Educational Software. According to individual course syllabi, completion of classroom assignments is used as a measurement of student learning [22]. Term papers, learning portfolios, oral presentations, and practicums are used to demonstrate learning. Students are also required to purchase language and Bible software and to demonstrate proficient use through classroom assignments.

 

As an example, during faculty in-service, the Academic Council reviewed a syllabus audit for the 2011-12 academic year and discovered that less than half of all classes taught (45%) required a term paper. These findings were discussed in plenary session, and a decision was made to require either a term paper, a portfolio, or a presentation in every class taught as a demonstration of student learning outcomes.

 

Field Experience. As demonstrated on page 61 of the 2014-15 MABTS Catalog, “students are expected to fulfill the biblical command to witness and thus are required to meet mission assignments each week, share their faith, and report on the work completed. This linking of the classroom and the practical aspects of ministry and evangelistic zeal is one of the unique identities of Mid-America.” This field experience, known as the Practical Missions program, is a demonstration of student learning outcomes in that it provides a praxis for classroom learning. Every student must complete this field experience assignment to receive academic credit for coursework [23].

 

Student Evaluations. Each year, perception data is collected from students through course evaluations [24]. Question 14 on the evaluation form states, “this course made a significant contribution to my overall theological/ministry preparation.”

 

Student Surveys and Forums. Each year, perception data is collected from students through student surveys and forums [13, 14]. The responses to the questions on the surveys and the remarks made by students during the forum serve as a basis for future educational program planning.

 

Capstone and/or Comprehensive Examinations. A capstone experience and comprehensive examinations are used to determine student learning outcomes. Non-doctoral degrees require a capstone examination (pre- and post-degree) to measure student learning [25]. Students are tested on degree-specific areas.

 

Doctoral students (PhD, DMin) must pass a written comprehensive examination for every seminar taken in the program (eight examinations for the PhD and six examinations for the DMin [26]) and an oral examination conducted by the doctoral committee. The written examinations are based upon questions concerning course material and must be completed without using outside sources. Doctoral students must pass at least half of their comprehensive examinations (PhD students must pass four of eight and DMin students must pass three of six) in order to continue in the program. If a written comprehensive examination is failed, the student must take the seminar again and retake the exam. The oral examination must receive a passing grade for the student to advance to candidate status [27].
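The continuation rule just described is simple enough to state as code; the sketch below is illustrative only, and the names are hypothetical.

```python
# Illustrative sketch of the continuation rule: doctoral students must pass
# at least half of their written comprehensive examinations
# (four of eight for the PhD, three of six for the DMin).
EXAMS_REQUIRED_TO_CONTINUE = {"PhD": 4, "DMin": 3}  # half of 8 and of 6

def may_continue(program: str, exams_passed: int) -> bool:
    """True if the student has passed enough written comprehensive
    examinations to continue in the program."""
    return exams_passed >= EXAMS_REQUIRED_TO_CONTINUE[program]

print(may_continue("PhD", 5))   # True: 5 of 8 passed
print(may_continue("DMin", 2))  # False: fewer than 3 of 6 passed
```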

 

Rubrics/Peer Review. At the conclusion of the fall and spring semesters, designated faculty gather to review course artifacts (papers and projects) in relation to the rubrics devised for assessment. Some artifacts are scored within the department, and others are peer reviewed by faculty outside the department. The identity of each student is removed from each artifact to prevent scoring bias, and a Likert scale instrument is used [19, 20]. The scores are then transferred to a tally sheet [28] for the analysis of findings.

 

Assessment Population. MABTS does not use population samples due to the size of the school. Every student is included in the assessment process.

 

Clearly Defined SLOs

 

Each educational program possesses clearly defined SLOs as demonstrated on annual PLOAMs.

 

Associate Program. The educational objectives for each associate level degree may be found in the 2014-15 MABTS Catalog [29]. As an example, the educational objectives for the Associate of Divinity program are designed to promote growth in students toward personal maturity and professional ministry competence. These program objectives are accomplished through classroom instruction, the modeling of professors, the practice of ministry through the Practical Missions program, chapel services, and student organizations [30]. These objectives are expressed clearly in the stated student learning outcomes for associate level degree programs (column 2, 2013-14 ADIV/ACE PLOAMs) [31]. For example, SLO 3 for the Associate of Divinity program states that “The student demonstrates good to excellent knowledge of historical and theological foundations of the Christian faith, including identifying the basic historical movements and theological doctrines of the Christian faith.” This SLO clearly fulfills the educational objective of promoting “growth in students toward . . . professional ministry competence.”

 

Bachelor Program. The educational objectives of the Bachelor of Arts in Christian Studies program are designed to promote growth in students toward personal maturity and professional ministry competence. These program objectives are accomplished through classroom instruction, the modeling of professors, the practice of ministry through the Practical Missions program, chapel services, and student organizations [32]. These objectives are expressed clearly in the stated SLOs for bachelor level degree programs (column 2, 2013-14 BACS PLOAM) [33]. For example, SLO 2 for the bachelor program states that “Students demonstrate effective communication skills through writing and technology.” This SLO clearly fulfills the educational objective of promoting “growth in students toward . . . professional ministry competence.”

 

Master Program. The educational objectives for each master level degree may be found in the 2014-15 MABTS Catalog [34]. As an example, the educational objectives for the Master of Divinity program are designed to promote growth in students toward personal maturity and professional ministry competence. These program objectives are accomplished through classroom instruction, the modeling of professors, the practice of ministry through the Practical Missions program, chapel services, and student organizations [35]. These objectives are expressed clearly in the stated SLOs for master level degree programs (column 2, 2013-14 MDIV PLOAM) [36]. For example, SLO 6 for the MDIV program states, “That students demonstrate a good to excellent understanding of learning how to develop and preach a sermon.” This SLO clearly fulfills the educational objective of promoting “growth in students toward . . . professional ministry competence.”

 

Doctoral Program. The educational objectives for both doctoral degrees may be found in the 2014-15 MABTS Catalog [37]. As an example, the educational objectives for the Doctor of Philosophy program are designed “to guide students to develop the capacity for critical evaluation and quality in research which produce creative scholarship and contribute to the field of theological knowledge and literature, to guide students to develop competence in principles of independent research and to achieve a proficiency in the techniques of scholarly writing, to guide students in advanced studies in a specialized field and to help them develop skills which qualify them for teaching at the graduate level in a college, university, or theological seminary, and to prepare students for the assumption of specialized pastoral leadership in the church, in missions, and/or in administrative leadership in the denomination” [38]. These objectives are expressed clearly in the stated SLOs for doctoral level degree programs (column 2, 2013-14 PHD PLOAM) [39]. For example, SLO 1 for the PHD program states that “Students will accurately evaluate and engage critically and productively with major scholarly approaches in their field.” This SLO clearly fulfills the educational objective of guiding students “to develop the capacity for critical evaluation and quality in research which produce[s] creative scholarship….”

 

Measurable SLOs

 

Each educational program possesses measurable SLOs as demonstrated on annual PLOAMs.

 

Associate Program. The SLOs for each associate level degree may be found in the 2013-14 associate level PLOAMs [40]. As an example, SLO 4 for the Associate of Christian Education program states that “The student demonstrates a good to excellent skill in personal spiritual development, ministerial counseling, and basic church administration for the effective practice of ministry.” This SLO is defined in measurable terms (column 3, ACE PLOAM) through the portfolio assignment for PM7557, which is scored by the MABTS project rubric [20].

 

Bachelor Program. The SLOs for the bachelor level degree may be found in the 2013-14 bachelor level PLOAMs [41]. As an example, SLO 2 states that “Students demonstrate effective communication skills through writing and technology.” This SLO is defined in measurable terms (column 3, BACS PLOAM) through the paper assignment for EN4901/4902, which is scored by the MABTS paper rubric [19].

 

Master Program. The SLOs for each master level degree may be found in the 2013-14 master level PLOAMs [42]. As an example, SLO 4 for the Master of Christian Education program states that the student will “understand the historical development and significance of Christian and Baptist beginnings.” This SLO is defined in measurable terms (column 3, MCE PLOAM) through the paper assignment for CH6001 scored by the MABTS paper rubric [19].

 

Doctoral Program. The SLOs for each doctoral level degree may be found in the 2013-14 doctoral level PLOAMs [43]. As an example, SLO 4 for the Doctor of Philosophy program states that “Students will demonstrate a thorough acquaintance with literature in their area of specialization—especially the ability to summarize, analyze, critique, and apply journal articles published in their major field.” This SLO is defined in measurable terms (column 3, PhD PLOAM) through the supervised departmental reading assignment for DR9945 scored by the PhD evaluation rubric [44].

 

II. Achievement of Educational Program Learning Outcomes

 

As described in I. Identification of Educational Program Learning Outcomes, the achievement of educational program learning outcomes is reviewed in Phase Two of the two-phase, six-stage planning process. Column 1 (Mission) and column 2 (Student Learning Outcomes) are submitted by degree program coordinators during faculty in-service at the beginning of the school year. At the conclusion of the spring semester, program coordinators begin collecting data as a source of column 3 (Assessment Criteria). They consult the office of the Registrar, individual professors, and academic and administrative offices for supporting documents to assess the measurement of student learning outcomes. As a result of this analysis, program coordinators determine whether the assessment criteria for student learning outcomes were met, and the results are then displayed as column 4 (Assessment Results).

 

Evidence of SLO Achievement

 

Each educational program provides evidence of the achievement of its measurable SLOs, as demonstrated on annual PLOAMs.

 

Associate Program. The SLOs for each associate level degree may be found in the 2013-14 associate level PLOAMs [45]. As an example, SLO 4 for the Associate of Christian Education program states that “The student demonstrates a good to excellent skill in personal spiritual development, ministerial counseling, and basic church administration for the effective practice of ministry.” This SLO is defined in measurable terms (column 3, ACE PLOAM) through the paper assignment (4a) for PM4300, which is scored by the MABTS paper rubric [19]. The fall 2013 Master Scores SLO spreadsheet indicates that students did not meet the assessment criteria target of “80% of artifacts reviewed will meet or exceed expectations” [46]. Because the target percentage was not achieved, improvements are presented in III. Improvement of Educational Program Learning Outcomes.

 

Bachelor Program. The SLOs for the bachelor level degree may be found in the 2013-14 bachelor level PLOAMs [47]. As an example, SLO 2 states that “75% of students will demonstrate good to excellent ability in expression of Old Testament and New Testament concepts/truths.” This SLO is defined in measurable terms (column 3, BACS PLOAM) through the paper assignment for OT2101/NT5601, which is scored by the MABTS paper rubric [19]. The fall 2013 Master Scores SLO spreadsheet [48] indicates that students met the assessment criteria target of “75% of students will demonstrate good to excellent ability in expression of Old Testament and New Testament concepts/truths,” although it was noted that “Results were generally good; organization/sources need attention.” Improvement of this SLO is presented in III. Improvement of Educational Program Learning Outcomes.

 

Master Program. The SLOs for each master level degree may be found in the 2013-14 master level PLOAMs [49]. As an example, SLO 5 for the Master of Christian Education program states that the student will “develop a comprehensive education model for effective church ministry.” This SLO is defined in measurable terms (column 3, MCE PLOAM) through the portfolio assignment for CE7484, which is scored by the MABTS project rubric [20]. The fall 2013 Master Scores SLO spreadsheet [50] indicates that students met the assessment criteria target of “80% of students scored a 3 or higher on rubric rows 1, 2, 3, and 4.” Improvement of this SLO is presented in III. Improvement of Educational Program Learning Outcomes.

 

Doctoral Program. The SLOs for each doctoral level degree may be found in the 2013-14 doctoral level PLOAMs [50]. As an example, SLO 4 for the Doctor of Philosophy program states that “Students will demonstrate a thorough acquaintance with literature in their area of specialization—especially the ability to summarize, analyze, critique, and apply journal articles published in their major field.” This SLO is defined in measurable terms (column 3, PhD PLOAM) through the supervised departmental reading assignment for DR9945, which is scored by the PhD evaluation rubric [44]. The 2013-14 doctoral level PLOAM [50] indicates that students met the assessment criteria target of “75% rate ‘Excellent’ or ‘Good’ on criteria on the evaluation rubric (composite score).” Improvement of this SLO is presented in III. Improvement of Educational Program Learning Outcomes.

 

III. Improvement of Educational Program Learning Outcomes

 

As described in I. Identification of Educational Program Learning Outcomes, the achievement of educational program learning outcomes is reviewed in Phase Two of the two-phase, six-stage planning process. As discussed in II. Achievement of Educational Program Learning Outcomes, program coordinators collect and analyze data to determine whether program outcomes were met. Based upon this identification and analysis of outcomes, the institution is tasked with providing evidence of program learning outcome improvement.

 

Evidence of Program Learning Outcomes Improvement

 

Each educational program assesses the use of results based upon the assessment of measurable SLOs, as demonstrated on annual PLOAMs. Column 1 contains the mission statement and degree program goals of the institution, column 2 provides the stated student learning outcomes for degree programs, column 3 demonstrates the assessment criteria for meeting those SLOs, column 4 displays the success or failure of achieving the SLOs, and column 5 provides evidence of improvement for program learning outcomes.

 

Associate Program. The use of results (column 5) for each associate level degree may be found in the 2013-14 associate level PLOAMs. As an example, the use of results for SLO 4 for the Associate of Divinity program states that “Courses will continue to be monitored. Rubrics for non-research oriented presentations need to be developed for projects related to these courses” [51]. These results were submitted to the Dean of the Masters and Undergraduate Programs for consideration to develop rubrics for presentations in the associate program, thus improving the strength of learning outcomes within the program.

 

Bachelor Program. The use of results (column 5) for the bachelor level degree may be found in the 2013-14 bachelor level PLOAMs [52]. As an example, the use of results for SLO 4 for the Bachelor of Arts in Christian Studies program states that professors should “help with organization and use of sources.” These results were submitted to the Dean of the Masters and Undergraduate Programs for consideration to revise the syllabi for OT2101/NT5101 to include more instruction on paper organization and citation skills, thus improving the strength of learning outcomes within the program.

 

Master Program. The use of results (column 5) for each master level degree may be found in the 2013-14 master level PLOAMs. As an example, the use of results for SLO 6 for the Master of Christian Education program states that professors should “spend course time on choosing appropriate source material for the notebooks” [53]. These results were submitted to the Dean of the Masters and Undergraduate Programs for consideration to revise the syllabus for MS6405 to include more instruction on choosing source material for a missions notebook, thus improving the strength of learning outcomes within the program.

 

Doctoral Program. The use of results (column 5) for each doctoral level degree may be found in the 2013-14 doctoral level PLOAMs. As an example, the use of results for SLO 3a for the Doctor of Philosophy program states that professors should “help PhD students better communicate their research effectively in writing and that the Form and Style Committee will revise the MABTS Guide for Form and Style for the fall semester” [54]. These results were submitted to the Dean of the Doctor of Philosophy Program for consideration to revise the MABTS Guide for Form and Style, which was accomplished in the fall of 2014, thus improving the strength of learning outcomes within the program.

 

Conclusion

 

In conclusion, MABTS identifies expected outcomes, assesses the achievement of those outcomes, and gives evidence of improvement based upon results within the institutional effectiveness process. This process is built upon the school’s philosophy of planning, which produces the institutional effectiveness plan and culminates in the long-range plan that enables the school to fulfill its mission, thus closing the institutional effectiveness loop.

 

Documentation

 

1. MABTS Strategic Plan Model

13. 2013-14 Faculty, Staff, Student Survey Results

14. 2013-14 Faculty, Staff, Student Forum Results

15. Generic MABTS PLOAM

16. SACSCOC Letter

17. Academic Council Minutes September 20, 2013

18. Artifact Assignment Spreadsheet

19. MABTS Paper Rubric

20. MABTS Project Rubric

21. Trustee Minutes Review of Mission

22. MABTS Course Syllabi Sample

23. 2014-15 MABTS Catalog, p. 61

24. Course Evaluation - Question 14

25. Capstone Exams

26. PhD/DMin Comprehensive Exams

27. PhD/DMin Oral Exam

28. Sample Tally Sheet for Analysis of Findings

29. 2014-15 MABTS Catalog, pp. 93-102

30. 2014-15 MABTS Catalog, p. 93

31. 2013-14 Associate Level PLOAM

32. 2014-15 MABTS Catalog, p. 105

33. 2013-14 BACS PLOAM

34. 2014-15 MABTS Catalog, pp. 121-166

35. 2014-15 MABTS Catalog, p. 121

36. 2013-14 MDIV PLOAM

37. 2014-15 MABTS Catalog, pp. 167-206

38. 2014-15 MABTS Catalog, pp. 181-182

39. 2013-14 PHD PLOAM

40. 2013-14 ACE PLOAM SLO 4

41. 2013-14 BACS PLOAM SLO 2

42. 2013-14 MCE PLOAM SLO 4

43. 2013-14 PHD PLOAM SLO 4

44. PHD Evaluation Rubric

45. 2013-14 ACE PLOAM SLO 4

46. Fall 2013-14 Master Scores SLO Spreadsheet (ACE)

47. 2013-14 BACS PLOAM SLO 2

48. Fall 2013-14 Master Scores SLO Spreadsheet (BACS)

49. 2013-14 MCE PLOAM SLO 5

50. Fall 2013-14 Master Scores SLO Spreadsheet (MCE)

51. 2013-14 ADIV PLOAM column 5 SLO 4

52. 2013-14 BACS PLOAM column 5 SLO 4

53. 2013-14 MCE PLOAM column 5 SLO 6

54. 2013-14 PHD PLOAM column 5 SLO 3a
