Saturday, February 28, 2015

MMC 620: Critical Thinking Case Studies (Self-Assessment Portfolio)

This course introduced case analytic skills. These skills involve dividing a case study into three parts: 

1. data
2. problem
3. intervention

I began mastering the skills necessary to practice critical thinking in my everyday life, and here's how I did it... 
I applied the course learning objectives in our weekly discussion/assignments and in my workplace.
The following is a description of each learning objective:
Learning Objective #1: Identify the relevant data contained in a case study

CLASSROOM APPLICATION
In the Week 1 Discussion Board posting, I analyzed the relevant data in the Midwest Zoo case study.  Below is that post, highlighting the data I identified as relevant. 

The purpose of this essay is to analyze the Midwest Zoo case study in terms of the three argument spheres Inch and Tudor (2014) discuss in our text.  According to Inch and Tudor (2014), “argument spheres are social constructs that guide how arguments are produced and evaluated” (p. 23).  The authors suggest that argument spheres can be broken down into three categories – public, personal and technical.  To clarify which sphere(s) I believe are present in the Midwest Zoo case study, a summary of each sphere can be found below.

Arguments that fall under the personal sphere typically take place between individuals who have a personal relationship.  Furthermore, the argument is evaluated by the individuals involved in that relationship.  These arguments are generally informal in nature and take place in casual settings.

Public sphere arguments take place amongst the general public and are evaluated by the general public.  These arguments are relatively formal in nature and take place on a broader stage for community engagement.

Arguments that fall into the technical sphere involve individuals who are considered specialized members of a field and the arguments are evaluated by standards set in place by the field.  These arguments are quite formal with rules that the individuals are expected to adhere to.  An argument of this nature typically takes place among professional groups like doctors and lawyers.

The Midwest Zoo case study involves two longtime co-workers turned friends who find themselves arguing over how to handle a subordinate’s tardiness at their place of employment.  Jan and Yolanda are team leaders in the concessions department and are the two main individuals involved in the argument. 

A member of the Zoo’s board of trustees had his nephew, Mark, hired as a summer employee. Mark was frequently late for work and sometimes did not show up at all. Jan and Yolanda were reluctant to discipline, or recommend discipline for, relatives of trustees because their supervisor seemed not to support this approach to dealing with this sensitive political problem.  

Jan and Yolanda had discussed this problem with Mark, but never worked out an explicit plan for dealing with him.  They tried various work assignments at different stands in an effort to find some location which would minimize his disruptive nature, but nothing had solved the problem.  They settled into the practice of moving Mark from stand to stand without having really agreed that this was a permanent solution to the problem.  Yolanda thought the best solution was to permanently assign Mark to a concession stand that Jan supervised because it was remote and she thought Mark might quit if he was permanently assigned there.  Jan was very unhappy with this arrangement.

Based on this information, I believe the two main argument spheres that are present and overlap in this scenario are the personal and technical spheres.  Jan and Yolanda are co-workers and friends who disagree on a plan of action for a subordinate.  At first the argument takes place just between the two individuals involved.  Once those individuals cannot come to any kind of compromise or agreement, they enlist the help of their supervisor.  The supervisor is considered a specialized member of a field.  She is neutral to the friendship, and therefore her evaluation of the argument is not personal; it falls under the technical sphere.  This issue was never taken to a broader public for debate and therefore would not fall under the public sphere.  However, the Midwest Zoo is a county-owned entity, and if this issue of family hiring continued to cause problems, it is very likely that a public audience could get involved in the argument to decide best practices regarding the zoo’s hiring policies. 

Learning Objective #2: Develop a plausible problem statement which identifies the root cause of the problem suggested by the data

WORKPLACE APPLICATION
I am the Coordinator of Veteran Services at Iowa Western Community College.  When I took the position, it came to my attention that we did not have an official policy regarding active-duty military students who get deployed during the school year.  I created the following problem statement and presented it to leadership.  Below is the official deployment policy that I created after leadership at Iowa Western agreed with my problem statement. 

"As a veteran friendly school, our active duty military students should be afforded the peace of mind to know what our institutional deployment policy is; however, we do not have one."

STUDENTS CALLED INTO MILITARY SERVICE

1. GENERAL

This Policy shall be implemented in order that Iowa Western Community College might provide equitable, consistent treatment to its students who are called into military service and to facilitate their ability to continue their education once that military service is completed.

2. ELIGIBILITY

Students who are regularly enrolled in any class or program offered by Iowa Western Community College are eligible for the benefits described in this Policy, if they: (a) belong to a military unit that is called into active duty, or (b) are drafted and not eligible for deferment; such that the date upon which they are required to report to active duty prohibits them, as a practical matter, from completing the term in which they are enrolled.

3. COURSE AND GRADE OPTIONS

An eligible student may elect to cancel registration in all classes in which he or she is enrolled at the time the call for duty is received. In such case, the student shall receive a full refund for all tuition and student fees paid on behalf of that student. In the alternative, the student may request his or her instructors to award a grade or an incomplete for all classes. If an incomplete is given, then the instructor shall file in the student’s educational records and provide to the student specific instructions regarding the study and activities required to complete the course. If a grade and credit are awarded, then the instructor shall award a grade reflective of the student’s performance, taking into consideration the quantity and nature of the curriculum through the time of the student’s departure. Finally, the student shall have the option of withdrawing from selected courses, receiving a pro-rated refund of tuition and fees for those courses, while also opting to receive a grade or incomplete in other courses in which the student is enrolled.

4. STUDENTS RECEIVING FINANCIAL AID

Notwithstanding any provision to the contrary in this Policy, administration of financial aid with respect to any eligible student shall be consistent with federal and state law. Students otherwise eligible for these benefits and receiving financial aid should immediately contact the financial aid office, where each case must be addressed individually based upon the particular rules applicable to the relevant student. The campus financial aid office shall address these matters in such a way so as to minimize the financial hardships to the student, while complying with the applicable law and regulations.

5. PUBLICATION

This Policy shall appear in all student catalogs and be placed on ROC.

6. SYSTEM APPLICATION

This Policy applies to all administrative units of Iowa Western Community College. Supplemental guidance may be provided, consistent with this Policy, that is designed to implement the provisions herein, including guidance relating to fees associated with meals and housing, textbooks, parking, lab and course fees, as well as other ancillary fees.


Learning Objective #3: Propose an intervention to correct the identified problem

WORKPLACE APPLICATION
One of the very first problems I identified when I took on my current role as the Coordinator of Veteran Services was that military-affiliated student files were not housed in our online student data system like those of the rest of our students; therefore, I did not have the ability to run automated reports.  

I did not have the time to enter hundreds of files myself, so I proposed that we hire a work-study student to help me enter all military-student data into our online system.  I just hired this student last week!  We will start entering student data next week, and I couldn't be more excited.  Having the ability to run automated reporting will significantly increase efficiency and make our school much more veteran-friendly. 



Learning Objective #4: Develop a detailed plan to implement the intervention

WORKPLACE APPLICATION 
I identified the need to create a part-time position to support the front desk in the Welcome Center.  I developed a detailed plan and then created a proposal.  Below are the plan and the proposal. 

 

New Position Proposal

To: Tori Christie, Vice President of Student Services
CC: Keri Zimmer, Dean of Advising & Academic Success
From: Stephanie Larsen, Coordinator of Advising & Veteran Services
Date: 03/02/15
RE: Student Services Specialist

Proposed: I propose that we create a new part-time position to support the Welcome Desk. The schedule for this position would be Monday through Thursday from 1 p.m. to 6 p.m. for a total of 20 hours a week.

Rationale: A new position is needed to ensure the Welcome Center runs as smoothly and efficiently as possible. This position is necessary because a single staff member at the Welcome Desk is not able to serve students to the full extent. From July 2013 to the end of June 2014, 17,695 people were assisted at the Welcome Center (see breakdown below). In addition to the busy times of year, there are also peak times for traffic on a daily basis. The afternoons between 12 p.m. and 3 p.m. are the busiest time of day, and wait times can be long during those hours. Adding a second staff member at the desk would enable us to serve students efficiently and accurately, while also decreasing wait times and improving the overall experience of visitors to the Welcome Center. Having one part-time position to assist at the desk would also help guarantee that the information being given is consistent rather than varying as it can when multiple people are covering the Welcome Desk.  In addition to the benefit to students, this would also take some strain off of other staff members, since this position would be able to cover breaks and evenings for the College Receptionist.



13-14     July    Aug     Sept    Oct     Nov     Dec     Jan     Feb     Mar     April   May     June
Week 1    171     482     313     311     555     437     146     196     257     274     287     237
Week 2    307     494     344     367     589     542     495     234     283     381     339     164
Week 3    311     1064    334     260     468     193     719     175     82      325     381     160
Week 4    368     546     368     338     236     --      395     265     336     315     260     200
Week 5    505     --      122     267     --      --      286     --      --      133     208     56
Total     1499    2749    1481    1543    1848    1172    2041    870     958     1430    1287    817

Job Responsibilities: The full job description is attached. However, below is an explanation of how the position’s responsibilities would vary based on the time of year.

During the busy times of year when traffic is heavy (July, August, September, October, November, and January), this staff member would assist with:
- Covering the Welcome Desk during the College Receptionist's breaks and from 4 p.m. to 6 p.m. in the evening
- Signing students in at the Welcome Desk and directing them where they need to go (during the busy times, one person signing students in is not enough)
- Scheduling appointments for students with advisors
- Helping students in the lobby with tasks such as:
  - Requesting loans through ROC
  - Completing Master Promissory Notes and Loan Entrance Counseling
  - Printing tax transcripts
  - Making schedule changes
  - Receiving financial aid paperwork and reviewing it to make sure it is complete
  - Signing up for New Student Orientation and Registration (NSRO)

During the slower months when lobby traffic is reduced (June, December, February, March, April, and May), this staff member would work on other important initiatives and projects in addition to their other duties, such as:
- Covering the Welcome Desk during the College Receptionist's breaks and from 4 p.m. to 6 p.m. in the evenings
- Managing the advisor assignment process to ensure that students are assigned an advisor quickly and accurately
- Communicating with students (all phone calls would be made from one of the cubicles in the back, never from the Welcome Desk) who:
  - Have not registered for classes, and scheduling appointments with advisors
  - Have not completed payment arrangements
  - Need to complete a FAFSA, and scheduling appointments with ICAN
  - Are eligible to graduate but haven't applied to graduate
- Reviewing Advising Information Site content for accuracy and clarity and identifying resources that may be missing
- Collecting data for the Welcome Center, including but not limited to student surveys and lobby traffic information from the sign-in system
- Ensuring that students who left without being helped are followed up with and receive the help they need

Budgetary Allocation: The proposed hourly wage for this position is $11-12 an hour.
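
As an aside on the traffic data in this proposal: here is a minimal sketch of how the weekly sign-in counts could be totaled to find the peak months. It is illustrative only; the values are copied from the table above, and the proposal's own Total row remains the authoritative figure where the two differ.

```python
# Illustrative sketch: aggregate the 2013-14 Welcome Center weekly sign-in
# counts (copied from the table above) into monthly totals and rank the
# heaviest-traffic months. Blank cells in the table are simply omitted.
traffic = {
    "July": [171, 307, 311, 368, 505],
    "Aug": [482, 494, 1064, 546],
    "Sept": [313, 344, 334, 368, 122],
    "Oct": [311, 367, 260, 338, 267],
    "Nov": [555, 589, 468, 236],
    "Dec": [437, 542, 193],
    "Jan": [146, 495, 719, 395, 286],
    "Feb": [196, 234, 175, 265],
    "Mar": [257, 283, 82, 336],
    "April": [274, 381, 325, 315, 133],
    "May": [287, 339, 381, 260, 208],
    "June": [237, 164, 160, 200, 56],
}

totals = {month: sum(weeks) for month, weeks in traffic.items()}
for month in sorted(totals, key=totals.get, reverse=True)[:3]:
    print(f"{month}: {totals[month]}")  # the three busiest months by cell sums
```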


Learning Objective #5: Assess the logic of their analysis from data selection to implementation

CLASSROOM APPLICATION

In the Week 10 Discussion Board posting, I read and assessed the logic and contents of a peer-reviewed journal article.  Below is that article analysis. 

Sidelinger, R.J., & Booth-Butterfield, M. (2010).  Co-constructing student involvement: An examination of teacher confirmation and student-to-student connectedness in the college classroom.  Communication Education, 59 (2), 165-184. http://dx.doi.org/10.1080/03634520903390867

In the article Co-constructing Student Involvement: An Examination of Teacher Confirmation and Student-to-Student Connectedness in the College Classroom, Sidelinger and Booth-Butterfield (2010) start by illustrating how college instructors view student involvement in their classrooms.  Many college professors feel that, in spite of numerous attempts to engage students in their college classrooms, students are becoming more and more non-responsive.   Many instructors reported that they have given up asking questions and encouraging discussion and have resorted to a traditional lecture format due to the lack of involvement from students (NSTA, 2007).  Several significant relationships between connected learning environments and student involvement have been found (Smith, Kopfman, & Ahyun, 1996; Chatman, 1997; Gibbs & Lucas, 1996; Coffield, 1981; Fyrenius, Bergdahl, & Silen, 2005; Myers, Edwards, Wahl, & Martin, 2007; Fassinger, 2000).  The authors’ purpose in writing this article was to explore how students and instructors could make a combined effort to co-construct an educational environment that supports the involvement of students; more specifically, an environment that focuses on attitudes and perceptions regarding the classroom experience as they tie to learning within the college environment.

Earlier research on instructional communication has vigorously explored and reported on the effects of different classroom climates.  Research covered in this area includes: (1) enhanced student involvement through connected classrooms, (2) student involvement, (3) teacher confirmation, and (4) classroom connectedness.  Research on the instructor-student relationship has indicated that post-secondary schools are oftentimes not student-centered (Kerr, 2001), that students don’t learn if they don’t have positive attitudes in the classroom (Marzano, 1992), and that student-to-student engagement enables more interaction and better performance in the classroom because students are more linked to their colleagues (Kagan, 1997).  Furthermore, the authors used Marzano’s Dimensions of Learning (DOL) model to explore teacher confirmation behaviors and student-to-student connectedness as a way to forecast a student’s willingness to speak up in a college classroom, as well as to come to class prepared.  Marzano’s DOL model has been adapted by many K-12 programs.  I find this model to be extremely student-centered, and a lot of research points out that post-secondary schools should be incorporating at least some dimensions from this model into the way college classroom environments are constructed and implemented.  Due to these findings, it is important to further research whether the communication variables of teacher confirmation behaviors and student-to-student connectedness co-construct a student’s involvement, both in and out of the classroom setting.

The authors had the following hypotheses and research questions:
H1: Student-to-student connectedness is a stronger predictor of student in-class involvement than class size.
H2: Student-to-student connectedness mediates the relationship between teacher confirmation behaviors and student in-class involvement. 
Q1: Which of the classroom variables, student-to-student connectedness or teacher confirmation behaviors, better predicts student out-of-class involvement?
               
The findings of this study indicated that, overall, student-to-student connectedness is a greater predictor of student in-class involvement than class size alone.  The study also confirmed that student-to-student connectedness fully mediated the relationship between teachers’ response to questions and student involvement.   A direct link was found between teachers who demonstrated interest and student-to-student connectedness as well as teachers who demonstrated interest and student involvement.  Further, the authors found that a sense of peer connectedness with a teaching style that promotes participation also promoted out-of-class involvement with the students.
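
Since H2 turns on the statistical notion of mediation, a short sketch may help make that term concrete. The code below uses simulated data rather than the authors' dataset, and the classic three-regression approach rather than the exact method reported in the article; all variable names are illustrative.

```python
# Illustrative only: simulated data, not the authors' dataset. This sketches
# the classic three-regression logic behind a mediation claim like H2.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
confirmation = rng.normal(size=n)                         # teacher confirmation
connectedness = 0.6 * confirmation + rng.normal(size=n)   # hypothesized mediator
involvement = 0.5 * connectedness + 0.1 * confirmation + rng.normal(size=n)

# Step 1: total effect of the predictor on the outcome.
total = sm.OLS(involvement, sm.add_constant(confirmation)).fit()
# Step 2: effect of the predictor on the mediator.
path_a = sm.OLS(connectedness, sm.add_constant(confirmation)).fit()
# Step 3: outcome regressed on predictor and mediator together. Mediation is
# suggested when the predictor's coefficient shrinks relative to Step 1.
both = sm.add_constant(np.column_stack([confirmation, connectedness]))
direct = sm.OLS(involvement, both).fit()

print("total effect of confirmation:", total.params[1])
print("direct effect once connectedness is controlled:", direct.params[1])
```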

This study’s original premise was that learning environments are co-constructed by the relationships students have with their instructors and peers.  However, I think it is apparent that a complex balance exists, as each could encourage or discourage general student involvement. I believe the most consistent finding that aligned with the DOL model was that positive perceptions of student-to-student connectedness are associated with increased involvement by students, regardless of class size.  Overall, it was found that teachers’ confirming communication, teaching style, and student-to-student connections increased the probability that students actively prepared for in-class interactions outside of class time.

In my role as a college advisor, staff supervisor, and college instructor, I find the DOL model to be a notable resource for all three areas of my job.  Dimension one of Marzano’s DOL encourages instructors to give students the opportunity to think positively and constructively about themselves, peer groups, teachers, and tasks.  Key components that are mentioned include the classroom climate, feeling accepted by teachers and peers, and experiencing a sense of comfort and order.  In summary, I found this article to be extremely relevant.  The authors were able to accurately depict struggles that many instructors face when trying to solve issues with student engagement.  In my opinion, this problem is even more prevalent at community colleges.  It would have been interesting for the authors to study this issue and find out whether two-year colleges and four-year colleges had any discrepancies in findings.  Having attended both types of institutions, I believe student-to-student connectedness is more of an issue at community colleges.  Community colleges have much higher populations of students who are considered first-generation or lower income, not to mention a large population of non-traditional students who work and have families outside of class.  With that being said, it can be even more difficult to cultivate the kinds of environments where students are engaged and feel connected with their peers.  I am actually interested in using some of the research cited in the article to create a survey at the community college where I work to assess how our students and faculty feel about connectedness in our classrooms. 

Tuesday, August 19, 2014


Training Initiatives in the Workplace:
The Importance of Utilizing Business Analytics to Evaluate Organizational Impact 

Stephanie Larsen
Bellevue University 


Leaders of organizations are trying to better understand if and how the training and development program dollars they expend each year are impacting the bottom line of their organizations. Research shows that leadership is willing to spend the money on these programs, but it also shows that their investments lack data to justify their worth. In fact, the American Society for Training and Development (ASTD) found that only three percent of all training courses are evaluated for business impact (Spitzer, 2005). Therefore, the purpose of this paper is to explore the importance of incorporating business analytics into evaluative methods when measuring the effectiveness of training initiatives. Further, this paper will discuss the challenges training professionals face when attempting to implement these evaluation methods. Finally, this paper will identify training evaluation models for both small organizational structures and medium to large organizational structures that rely heavily on business analytics to ultimately better align training initiatives with business objectives.

Mondore, Douthitt, and Carson (2011) defined analytics as “demonstrating the direct impact of people data on important business outcomes” (p. 21). For the purpose of this paper, the term metrics is defined as tangible data that is easy to obtain and measure, but is of low value to overall organizational impact and strategy. Conversely, the term business analytics is defined as intangible data that is difficult to obtain and measure, but is of high value to overall organizational impact and strategy. Analytics shows the actual cause-and-effect relationship between a training activity and business outcomes, and supports building a strategy based on that information (Mondore et al., 2011). The term training professional is defined as any practitioner in the field of learning and development, onboarding, training, and organizational development. Mondore et al. (2011) assert that using analytics to evaluate training programs affords “…leaders an opportunity to show the direct impact of their processes and initiatives on business outcomes” (p. 21). Mondore et al. also found that this kind of measurement gives CEO-level leadership valuable insight into return on investment (ROI). This insight ultimately leads to better decision making in the future. Furthermore, training professionals can better advocate for continued support and funding for programs and initiatives that are working and cut the ones that are not (Mondore et al., 2011).
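
To make the metrics-versus-analytics distinction above concrete, here is a minimal sketch. The employees, training hours, and sales figures are invented for illustration and do not come from any of the cited studies.

```python
# Minimal sketch of "metrics vs. analytics": linking people data (training
# participation) to a business outcome. All data below is hypothetical.
import pandas as pd

training = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5],
    "hours_trained": [0, 4, 8, 2, 10],
})
outcomes = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5],
    "sales_per_quarter": [90, 105, 130, 95, 140],
})

linked = training.merge(outcomes, on="employee_id")

# Attendance alone is a low-value *metric*; relating training to a business
# outcome is the beginning of *analytics* in the sense defined above.
print("Employees trained:", (linked["hours_trained"] > 0).sum())  # metric
print("Training/outcome correlation:",
      linked["hours_trained"].corr(linked["sales_per_quarter"]))  # analytics
```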

Background 

Griffin (2012) reviewed several reasons why training professionals should better understand how to most effectively evaluate their organization’s investments in training initiatives. Griffin’s extensive review of the literature found that training professionals would be able to better justify resource allocation for these programs if they could properly evaluate them. Further, Griffin emphasized that better evaluation will hold training professionals accountable for identifying ways of improving these programs year after year. Moreover, the author claimed that better evaluation enables better decision making when identifying the opportunity cost of investing in one training initiative over another. Lastly, Griffin warned that the most negative side effect of not implementing better evaluation methods could be a workplace that does not understand the impact of learning investments, with the result that trainers are not taken seriously by leadership or employees. Davenport (2006) provided examples of large, profitable organizations using complex data and statistical analysis to rise above their competitors in overall company-wide success.  Davenport argued that industrial-strength business analytics applied across many activities within an organization (including training activities) is what sets organizations like Amazon, Harrah's, Capital One, and the Boston Red Sox apart.

However, even with all the supporting evidence showing the immense benefits of incorporating business analytics when evaluating training programs, the majority of organizations are still not utilizing this type of evaluative method for training. Mondore et al. (2011) found that the definition and process details of using business analytics the right way have not been fully articulated in a consistent way across organizational development literature. Spitzer (2005) argued that it should be distressing for training professionals and organizations to know that only three percent of all training courses are evaluated for business impact. These programs are being measured through happiness indexes rather than bottom-line ROI results. Spitzer also claimed that even the training professionals who are advocating for results-oriented measurement tend to fall back on low-level, easy-to-measure metrics like attendance and participant satisfaction, rather than doing the hard work to uncover true analytics that will impact business objectives.

Similarly, Baker, Chow, Woodford, and Maes (2012) found that training programs are often failures within organizations because they are unable to establish a positive link between training and the organization’s desired change. Baker et al. (2012) found that companies are continuing to spend billions of dollars on training programs, but many are still falling short of their intended goals, or at least lack the ability to measure whether or not they have met those goals. Kapoor (2010) made the argument that organizations have been implementing analytics in other areas such as financial management and forecasting, but that they have not yet taken advantage of the benefits that analytics has to offer in training and organizational development. Most concerning of all is the assertion by Spitzer and Conway (2002) that the survival of training programs is being threatened: “…the crisis is the failure to show that investing in training and employee development produces demonstrable business results” (Spitzer & Conway, 2002, p. 55).

Solution

The research shows strong arguments in favor of better aligning training initiatives with bottom-line business objectives. Researchers and experts in the field of organizational development and training favor utilizing a business analytics framework to more effectively evaluate training and development programs in the workplace. However, the research shows that the majority of organizations have still not settled on a best practice for effectively incorporating analytics into training evaluation. This is not to say that evaluation methods do not exist as potential solutions to this problem. The next portion of this paper will introduce and explain two possible training evaluation models.  The first model could be easily implemented in a smaller organization, while the second model is intended for a medium to large organization. Both training evaluation models emphasize the use of business analytics.

Small Organizational Model 

The Kirkpatrick Four Levels is a training evaluation model that has been an industry standard for more than 50 years. The levels are meant to guide training professionals through different metrics that should be evaluated or measured in any training initiative (See Figure 1).  However, the levels are not meant to be indicative of order or consequential steps in any way. This model requires that practitioners start with level four and work their way back to level one prior to developing or delivering any kind of training program.

[Figure 1. The Kirkpatrick Four Levels]

Kirkpatrick Partners state that the intended use of their model has always been for practitioners to begin implementing the steps prior to executing any training initiative. According to Kirkpatrick and Kayser-Kirkpatrick (2014), it is nearly impossible to create significant training value if the four levels are implemented after a training program has already been delivered. Although this method has been widely recognized in the training field for five decades, Kirkpatrick and Kayser-Kirkpatrick claim that when conducting the Four Levels workshops, training professionals often relate level one to smiley surveys, level two to pre- and post-tests, and levels three and four to a ‘hope for the best’ mentality. Consequently, Kirkpatrick Partners developed the Kirkpatrick Business Partnership model (see Figure 2), which is intended to help eliminate that mentality and provide training professionals with a step-by-step executable framework.

[Figure 2. The Kirkpatrick Business Partnership Model]

As detailed in Figure 2, there are many moving pieces and several steps that must be taken before a training professional begins to evaluate a training event using The Kirkpatrick Four Levels model. The initial steps involve data collection around identification of business needs, which could also be described as collecting business analytics. The next step is creating value or buy-in with key leadership by asking them as many questions as possible about their vision of success for that particular training initiative. The answers to those questions become the targeted outcomes.

Next, the training professional must form business partnerships throughout the organization with key players, like supervisors. According to Kirkpatrick and Kayser-Kirkpatrick (2014), “Learning professionals need assistance and partnership from the business for training to deliver the results it is expected to yield” (p. 4). Furthermore, the authors note that training events alone do not provide substantial positive, bottom-line impact to a business. This claim is backed by Garr and Loew (2011), who found that only 10 percent of learning takes place during a training event; 70 percent of learning takes place on the job, and the other 20 percent comes in the form of coaching and mentoring with immediate supervisors. Kirkpatrick and Kayser-Kirkpatrick assert, “At this point, an organization has the opportunity to move from consumptive training metrics to impact metrics” (Kirkpatrick & Kayser-Kirkpatrick, 2014, p. 9).

Medium to Large Organizational Model

The model that best fits a medium to large organizational structure was created by Mondore et al. (2011). This model is a comprehensive outline with detailed steps for effectively implementing a training initiative that aligns with the organization’s overall objectives, can be measured, and impacts the bottom line. When compared to The Kirkpatrick Four Levels, the Mondore et al. Business Partner Roadmap (see Figure 3) is more comprehensive in nature and would not be as easy to implement in a smaller organization due to a lack of available personnel and resources. This model could be more easily executed in a medium to large organization with several departments and with personnel and resources spanning differing areas of expertise. Effective execution would involve several individuals from various cross-functional teams throughout an organization.

[Figure 3. The Mondore et al. Business Partner Roadmap]

The Mondore et al. Business Partner Roadmap emphasizes that an organization must first determine the top two to three most critical outcomes on which to focus. They suggest interviewing key leadership before finalizing this step, not only to get advice on critical outcomes, but also to get key leadership buy-in. The next step is to create a cross-functional data team. This step involves identifying a team consisting of measurement experts, key metric owners, and organizational development leadership. This team determines data requirements, links the necessary datasets, and conducts analyses. Then, the training team is charged with assessing critical outcomes. This step is intended to determine how an organization currently stores its data, such as frequency of measurement (e.g., monthly, quarterly, annually), level of measurement (e.g., by line of business, by work unit, by manager, at the store level, at the department/function level), and organizational owners (e.g., the department or leader of the particular measurement). Mondore et al. assert that these measurement characteristics must be understood before any linkages to employee data can be made. The authors state, “The goal is to have apples-to-apples comparisons of the data, which means that if you want to look at productivity numbers, you need to have productivity data that is measured at the same interval (e.g., monthly) and at the same level for each manager” (Mondore et al., 2011, p. 23).
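
A brief sketch may help illustrate the apples-to-apples requirement in that quote. The datasets, column names, and figures below are hypothetical placeholders; the point is only that both datasets are rolled up to the same interval (monthly) and level (per manager) before being linked.

```python
# Hypothetical raw records at different granularities (all names invented).
import pandas as pd

surveys = pd.DataFrame({
    "manager_id": [1, 1, 2, 2],
    "date": pd.to_datetime(["2011-01-05", "2011-01-20", "2011-01-07", "2011-02-10"]),
    "score": [3.5, 4.0, 2.5, 3.0],
})
productivity = pd.DataFrame({
    "manager_id": [1, 1, 2, 2],
    "date": pd.to_datetime(["2011-01-31", "2011-02-28", "2011-01-31", "2011-02-28"]),
    "units": [120, 130, 90, 95],
})

def to_manager_month(df, value_col):
    # Roll raw records up to one row per manager per month ("same interval,
    # same level"), so the two datasets can be linked apples-to-apples.
    monthly = df.assign(month=df["date"].dt.to_period("M"))
    return monthly.groupby(["manager_id", "month"], as_index=False)[value_col].mean()

linked = to_manager_month(surveys, "score").merge(
    to_manager_month(productivity, "units"), on=["manager_id", "month"])
print(linked)
```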

Next, the team must build a list of priorities that have data and analysis behind each, which is intended to ensure impact on the organization’s bottom line objectives. Moreover, this step could potentially show which training events are not having the desired outcomes. Actual training event planning is to take place in the next step. This step is intended to show leadership that the training initiative is a deliberate attempt to focus on those employee processes/skills/attitudes/demographics that have been shown to have a direct impact on the organization’s desired bottom line outcomes. Finally, the team must measure and adjust/re-prioritize. This step is an ongoing effort to re-measure, to assess progress, and calculate actual return on investment. Mondore et al. suggest picking two to three priorities, building action plans around those, measuring progress against those plans two to three more times, and then re-calculating the linkages and re-prioritizing.
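
The return-on-investment arithmetic in that final step is itself simple. A minimal sketch, with made-up figures:

```python
def training_roi(monetary_benefits: float, program_costs: float) -> float:
    """ROI as a percentage of program costs: (benefits - costs) / costs * 100."""
    return (monetary_benefits - program_costs) / program_costs * 100

# Made-up figures: a program costing $100,000 that produced $150,000 in
# measurable benefit yields a 50% return.
print(training_roi(monetary_benefits=150_000, program_costs=100_000))  # 50.0
```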

Conclusion 

This paper identified two possible models that serve as potential solutions to the struggle that many training professionals face, which is to better align training initiatives with overall business objectives to gain desired business impact. Both of the aforementioned models heavily rely on the collection and use of business analytics to effectively implement and evaluate these training events from beginning to end. The first model could be easily implemented in a smaller organizational structure, while the second model was intended for a medium to large organizational structure.

In summary, leaders of organizations are trying to better understand if and how the training program dollars they expend each year are impacting the bottom line of the organization. Research shows that leadership is willing to spend the money on these programs, but it also shows that their investments lack data to justify their worth. The purpose of this paper was to explore the importance of incorporating business analytics into evaluative methods when measuring the effectiveness of training initiatives. Further, this paper discussed the challenges training professionals face when attempting to implement these evaluation methods. Finally, this paper identified training evaluation models for both small organizational structures and medium to large organizational structures that rely heavily on business analytics to ultimately better align training initiatives with business objectives.

References

Baker, F., Chow, A., Woodford, K., & Maes, J. D. (2012). Applying continuous improvement techniques to instructional design technology (IDT) for greater organizational effectiveness. Organization Development Journal, 30(1), 53-62. Retrieved from http://ezproxy.bellevue.edu:80/login?url=http://search.proquest.com/docview/963777467?accountid=28125

Davenport, T. H. (2006, January). Competing on analytics. Harvard Business Review, 84, 98-107. Retrieved from http://ezproxy.bellevue.edu:80/login?url=http://search.proquest.com/docview/227818006?accountid=28125

Garr, S., & Loew, L. B. (2011). High-impact leadership development: Best practices for building 21st century leaders. Bersin & Associates. Retrieved from http://www.bersin.com/News/Content.aspx?id=15596

Griffin, R. (2012). A practitioner friendly and scientifically robust training evaluation approach. Journal of Workplace Learning, 24(6), 393-402. doi:10.1108/13665621211250298

Kapoor, B. (2010). Business intelligence and its use for human resource management. The Journal of Human Resource and Adult Learning, 6(2), 21-30. Retrieved from http://ezproxy.bellevue.edu:80/login?url=http://search.proquest.com/docview/867401292?accountid=28125

Kirkpatrick, J., & Kayser-Kirkpatrick, W. (2014). [White paper]. Retrieved from http://www.kirkpatrickpartners.com/Portals/0/Resources/Kirkpatrick%20Four%20Levels%20white%20paper.pdf

Mondore, S., Douthitt, S., & Carson, M. (2011). Maximizing the impact and effectiveness of HR analytics to drive business outcomes. Strategic Management Decisions, 34(2), 21-27.

Spitzer, D. R. (2005). Learning effectiveness measurement: A new approach for measuring and managing learning to achieve business results. Advances in Developing Human Resources, 7(1), 55-70. Retrieved from http://ezproxy.bellevue.edu:80/login?url=http://search.proquest.com/docview/221135772?accountid=28125

Spitzer, D. R., & Conway, M. (2002). Link training to your bottom line. Alexandria, VA: American Society for Training and Development.