RMIT University Learning and Teaching Investment Fund 2008 Final Report
Due date is February 20, 2009 to your LTIF College Coordinator

Project title: New ways of Learning - Teaching and assessment of large classes
Project leader: Mali Abdollahian
Team members: Assoc Prof Cliff Da Costa, Dr Yousong Luo, Dr Claude Zorzan, Dr Ian Grundy, Dr Anthony Bedford
Funds approved: $45,433
Funds acquitted: (attach financial statement)

Introduction
This project is a service teaching innovation which builds on the current RMIT leadership and student feedback project. It is designed to address key educational problems and issues experienced by lecturers, and those identified as the top three issues by students in their course experience feedback. Service teaching is an integral aspect of learning and teaching in higher education and is acknowledged by universities nationally and internationally as an important element of cross-disciplinary study. Service teaching has many models. In a traditional service teaching model the discipline expertise is recognized, valued and utilized across courses, programs and discipline areas within the university. The educational quality of courses and programs is enhanced when students are taught by discipline experts with the appropriate expertise, skills and knowledge base rather than by home program lecturers. Student learning outcomes and experiences should be satisfying and memorable for the students; however, this is not currently the case for some courses in several programs. Effective service teaching and best practices still remain undefined. This project will develop, implement and evaluate three classroom curriculum initiatives, each addressing a specific educational problem in providing maths and stats service teaching to large classes.

Detailed project description and outline of what was done
The initiatives to be addressed in this project are:
Innovation 1 – Addressing student diversity in multi-discipline large classes by providing relevant disciplinary context-related exemplars in teaching service classes
Innovation 2 – Addressing tutor teaching practices by trialling innovative ways of providing and supporting tutors in large service classes
Innovation 3 – Exploring and implementing effective ways to provide feedback to students in large service classes
The proposed integrated learning system will utilise existing IT infrastructure currently available at RMIT and will blend with what is available via resources such as WebLearn.

Description - Innovation 1
This innovation, led by Dr Mali Abdollahian and Dr Yousong Luo, will use the course Maths 2123 to develop, pilot and evaluate relevant discipline-specific examples for teaching the course. During semester 1, 2008, three 3rd and 4th year students from the relevant disciplines will be selected to develop relevant data and examples for teaching statistics and mathematics in the first-year multi-discipline large class. These examples will be reviewed and validated by the chief investigators, then uploaded into the course material on the Distributed Learning Management System (DLS).
The redesign of the online component of the course will include separating the resources and exemplars into sub-categories related to the multiple disciplines and student cohorts taught in the service teaching classes. These exemplars will be used for teaching and evaluated in terms of their effectiveness for student learning experiences and outcomes.

Description - Innovation 2
This innovation, led by Dr Ian Grundy and Dr Claude Zorzan, will pilot at RMIT, in the course Maths 2117, the Queensland University model of volunteer tutoring cited in Australian university educational quality best practice. The two lecturers are exploring ways of implementing the model here and will apply what they learn to RMIT learning and teaching practices. The initiative uses 3rd and 4th year students to peer tutor lower years of the program on a voluntary basis. All tutors are mentored and supported by academics, and only committed, capable tutors are provided with formal training in learning and teaching and accreditation as tutors. In the first semester of 2008 the investigators will set up the scheme, select tutors, and provide support and mentoring while the tutors conduct tutorials. In second semester this will continue, and interviews will be conducted to identify the key success factors and barriers to implementing such a scheme. The interviews will also identify the key elements that tutors need in their training. The project will be evaluated and refined for use in 2009, with a focus on improving teaching practices and student learning outcomes.

Description - Innovation 3
Innovation 3, led by A/Prof Cliff da Costa and Dr Anthony Bedford, will expand on their current online teaching practices by exploring effective ways of providing feedback to students. Currently students use WebLearn applications to self-assess their learning and are provided with hurdles so that they achieve mastery of the topics and their learning objectives. What is lacking in this approach is the provision of immediate and appropriate feedback for learning. In semester 1, 2008 the investigators will identify common mistakes and misconceptions and develop appropriate feedback for the databank. This information will be gleaned from the multiple-choice tests that students will have completed during semester 1 and in 2007. They will enter this into the WebLearn database so that it feeds into the teaching and learning modules on the DLS. In semester 2, first-year students will use it, and the feedback will be refined as students work through the learning assessment tasks. The project will be evaluated and refined for use in 2009, with a focus on improving teaching practices and student learning outcomes.

Attach the full and detailed report and evaluation of your project outcomes, including evidence of the impact the project has had. Also make reference to how the outcomes address the five key objectives: Improved student learning experiences, outcomes and employment opportunities; Innovation; Strategic alignment; University-wide application; Value for money.

Innovation 1
Because the grant money was not finalized until the end of March 2008, we could not recruit enough students in time to complete this part of the project in the first semester of 2008; however, we are ready to implement it in the first semester of 2009.
During semester 1, 2008, three 3rd and 4th year students from the relevant disciplines were selected to develop discipline-related data and examples for teaching statistics in the first-year multi-discipline large class. Lecture notes were prepared in three disciplines by changing the exemplars to applications in Food Science, Environmental Science and Biomedicine. The redesign of the online component of the course involved separating the resources and exemplars into sub-categories related to the multiple disciplines and student cohorts taught in the service teaching classes. These examples were reviewed and validated by the chief investigators, then uploaded into the course material on the Distributed Learning Management System (DLS). Although the lecture material covered in class was common, students were able to refer to discipline-oriented lecture notes with relevant examples in the DLS. Because the examples relate to the disciplinary context, students find them more interesting and easier to understand.

Apart from the discipline-related lecture notes, a step-by-step graphical Learning Guide for MINITAB (the statistical package used heavily in the course) was prepared for two mathematics courses (each with more than 220 students), consisting of 8 web pages that help students learn to use MINITAB to perform statistical analysis on the topics covered in their lecture notes. To make the MINITAB Guide an effective self-teaching tool, we presented step-by-step graphical examples for each topic, enabling students to use the package for the required statistical analysis without getting help from anyone.

We have also generated a smart WebLearn question bank to address the diversity in the academic background of students in the course. In the WebLearn test questions a help link was inserted for each question that requires use of the MINITAB statistical package. The link displays a step-by-step graphical application of MINITAB for the related topic, so students can use the MINITAB Learning Guide for any difficulties they come across in solving problems. Another important change made to the WebLearn tests is the introduction of a statistical definitions help link in the questions of the WebLearn modules. For each question in the question bank, if students click on a statistical term they are not familiar with, they are directed to a webpage of definitions and related examples. These changes enable students to get instant feedback on each question, a significant step that helps them learn independently, at their own pace, about the topics taught during lectures. The MINITAB help link was implemented in 2008 and the feedback was very positive, since the students who needed extra help could get it instantly online. They also mentioned that the link made it easy for them to complete their WebLearn tests with the additional lab assistance. The definitions help link, however, will be implemented for the first time during the first semester of 2009. These changes were trialled in two mathematics courses, each of which has a statistics and a mathematics component.
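As an illustration only, the sketch below shows one way a single entry in such a question bank could be represented, with the MINITAB guide and definitions help links attached to a question. The field names, topic and file paths are hypothetical assumptions for illustration; they are not the actual WebLearn/DLS format.

# Hypothetical sketch of one question-bank entry with its help links attached.
# Field names, topic and paths are illustrative only, not the real WebLearn/DLS schema.
question_entry = {
    "topic": "Confidence intervals",
    "question": "Use MINITAB to construct a 95% confidence interval for the mean ...",
    # Link to the step-by-step graphical MINITAB Learning Guide page for this topic.
    "minitab_help_link": "minitab_guide/confidence_intervals.html",
    # Statistical terms in the question text that link to a definitions page with examples.
    "definition_links": {
        "confidence interval": "definitions/confidence_interval.html",
        "standard error": "definitions/standard_error.html",
    },
    # Feedback shown when the submitted answer is wrong.
    "feedback_on_wrong_answer": "Check whether you used the sample or the population standard deviation.",
}
print(question_entry["minitab_help_link"])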
While the MINITAB Learning Guide help link and the definitions help link were added to the statistics component, other changes were introduced to the WebLearn tests within the mathematics component. New question banks were created for the topics covered in the two courses. The majority of the questions in these banks are new, with only a few modified from existing questions. The new questions with built-in intelligence were created with the help of a maths honours student and a computer science masters student who had knowledge of Maple, LaTeX, HTML and graphics editing. There are two major innovations in the new questions. The first is the feedback mechanism: when students answer a question incorrectly, they are provided with a detailed solution to their individualized question, which helps them achieve a better result on their next attempt. The second is the editing tool for entering Maple code. Most questions require students to enter a symbolic mathematical expression as their answer, and this is done using Maple code. Previous CES results show that students have great difficulty entering Maple code, so the new questions employ an external Java applet, DragMath, to assist students in editing mathematical expressions graphically. These changes to the mathematics component will also be implemented in the first semester of 2009.

Innovation 2
The large class considered for this innovation was a mathematics course servicing an Engineering department. The course had a mathematics and a statistics component and an enrolment of over 220 students. Tutors were selected from the relevant Engineering department and also from the Maths department. For the mathematics component, students were divided into two groups: three tutors from the Engineering department were assigned to one group and three tutors from the Mathematics department were assigned to the other. At the end of semester 2, 2008 the mathematics practice class marks of the two groups were compared using a two-sample t-test to determine whether home-department tutors are more effective in the learning process. The results show that there was no significant difference in marks between the two groups. For the same two groups, students who scored exam marks above 40% were also compared using a two-sample t-test, and again there was no significant difference between the groups. For the statistics component, students were again divided into two groups: a tutor from the Engineering department was assigned to one group and a tutor from the Mathematics department to the other. Statistics WebLearn test marks for each group were then compared using a two-sample t-test. The results showed a significant difference in marks between the two groups, i.e. students who had a tutor from their home department achieved higher marks than students who were tutored by a statistics student. It should be mentioned that students could do each test at any time within its opening period (at least two weeks), so we do not know whether they did the test while the tutor was present in the lab. For these two groups, students with exam marks above 40% were compared using a two-sample t-test; the result showed no significant difference in exam marks between the groups.
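As an illustration only, the group comparisons described above are standard two-sample t-tests. The minimal Python sketch below shows an equivalent calculation with made-up marks; the project's own analysis was carried out in MINITAB, and the actual output is reproduced in Appendix A.

from scipy import stats

# Hypothetical practice-class marks for the two tutor groups (not the project data).
engineering_tutor_group = [17, 19, 16, 18, 20, 15, 17]
maths_tutor_group = [18, 16, 17, 19, 14, 18, 17]

# Welch two-sample t-test (no pooled variance), comparable to MINITAB's default
# two-sample t-test without the "assume equal variances" option.
t_stat, p_value = stats.ttest_ind(engineering_tutor_group, maths_tutor_group, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p-value above 0.05 indicates no significant difference between the tutor groups.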
The graphical and statistical results for this innovation are attached in Appendix A.

Innovation 3
The service courses considered for this innovation were the two compulsory first-year statistics courses included in the Bachelor of Applied Science (Psychology) degree program. Typically these courses have around 90 students enrolled across the Bundoora and City campuses. These students usually have a strong dislike for mathematics, including statistics, and consider it onerous to be required to undertake these courses, despite the clear need for this objective skill set in their studies. Assessment in these service courses comprises mainly multiple-choice, paper-based tests. Initial analysis of student performance on Semester 1, 2008 assessment tasks identified some unexpected, potentially confounding factors, viz: (a) the number of response items; (b) the placement of the correct response; and (c) previous student performance. The project was then modified for these students in Semester 2, 2008 to allow further identification and analysis of these potentially confounding factors.

Analysis of assessment tasks undertaken in Semester 1, 2008 indicated that students, when presented with a four-response question, were more likely to select an incorrect response than expected (p < 0.001). Further, the placement of the correct response in these four-response questions was also found to have a considerable influence on student performance (p < 0.001). Given these results, the first assessment task presented to the students in the Semester 2 unit was adapted to contain only four-response questions, and considerable care was taken to ensure the wording and style of each question were consistent, i.e. only positively phrased questions. Great care was also taken to ensure that the correct responses were randomly placed, to avoid creating predictable patterns in the correct responses. Following this assessment task, students were given a thorough feedback session, covering identified areas of weakness and ensuring that students understood why a particular response was the most correct. Students were also encouraged to discuss why they had selected a particular incorrect response, to assist in identifying areas of weakness and/or confusion. Analysis of their performance on this task demonstrated that the correct-response positioning effect had disappeared (p = 0.309); however, a new and unexpected result arose from the analysis. The performance of these students was found to be highly predictable (R² = 0.98, p < 0.001), perhaps suggesting that whilst students may learn new material, they master this new material to the same level as previous material. Subsequent assessment tasks demonstrated similar results in terms of predicting student performance (see Appendix B).

Despite these potentially confounding factors, we identified four main areas of weakness for students in these service courses, viz: (1) students miss vital steps in calculations; (2) students confuse technical terms (e.g. thinking bimodal is the same as bivariate); (3) students misread the question or response text; and (4) students misread data tables. Following the work undertaken for the first assessment task in Semester 2 and the subsequent feedback session, we determined that these factors had been reduced to just (3) and (4) on the remaining assessment tasks.
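As an illustration only, the response-placement analysis described above is a chi-square test of independence on a contingency table of correct-answer position by correct/incorrect response. The Python sketch below uses the Semester 1 counts reported in Appendix B; the report's own analysis was run in MINITAB.

import numpy as np
from scipy.stats import chi2_contingency

# Rows: correct answer in position 1-4; columns: incorrect, correct
# (counts taken from the Semester 1 table in Appendix B).
observed = np.array([
    [66, 240],
    [68, 306],
    [89, 319],
    [83, 121],
])
chi2, p_value, dof, expected = chi2_contingency(observed, correction=False)
print(f"Chi-square = {chi2:.3f}, df = {dof}, p = {p_value:.3g}")
# A very small p-value indicates an association between answer placement and performance,
# matching the Chi-Sq = 40.438, DF = 3 result reported in Appendix B.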
This reduction in areas of weakness strongly suggests that bi-directional feedback between students and lecturers allows both parties to better identify areas of weakness and implement appropriate intervention strategies. However, further research involving a larger sample size and more teaching staff is required to validate this finding. Work on predicting student performance and refining assessment tasks is ongoing. We are attempting to establish a tool based upon Rasch models (Item Response Theory) to quantify the fairness and reliability of a question, and thereby of an assessment task. This work is intended to inform future assessment development and to aid in providing prompt feedback to both the students and the lecturers involved.

Dissemination of project outcomes, both completed and planned, within RMIT and externally
The major benefits that would accrue from Innovation 1 are:
a. Students learn the statistical and mathematical concepts via exemplars and see their immediate application in their respective discipline
b. Students can simulate the data analysis employed in the discipline, which enhances understanding of the concepts
c. Student learning occurs more efficiently and is integrated within their own discipline areas
Innovation 2 indicated that, with training, tutors from the service department can be as effective as tutors from the mathematics department in tutoring mathematics and statistics courses, despite their mathematical or statistical knowledge not being at the same level as that of maths and stats students. Since the major part of Innovation 1 has only just been completed and will be implemented in the first semester of 2009, we will publish the final guidelines after analysing the CES scores in 2009. A draft paper has already been prepared for submission to an education conference in 2009. We have already achieved an improved CES score of 10 points in one of the pilot courses, where the improved statistics WebLearn tests and Innovation 2 were implemented in semester 2 of 2008.

Summary of the project, outcomes, impacts and dissemination
The main objective of this project was to develop an integrated learning system that would utilise existing IT infrastructure currently available at RMIT and would blend with what is available via resources such as WebLearn. The project was designed to:
Address student diversity in multi-discipline large classes by providing relevant disciplinary context-related exemplars in teaching service classes
Address tutor teaching practices by trialling innovative ways of providing and supporting tutors in large service classes
Explore and implement effective ways to provide feedback to students in large service classes
Lecture notes have been prepared in three disciplines by changing the exemplars to applications in Food Science, Environmental Science and Biomedicine. Students in the multi-discipline class will be able to refer to discipline-oriented lecture notes with relevant examples. This part of the project will be implemented in the first semester of 2009. A smart WebLearn question bank has been generated to address the diversity in the academic background of students in the course. This will ultimately help students learn independently, at their own pace, about the topics taught during lectures.
Within the statistics WebLearn question bank a help link was inserted for each question that required use of the MINITAB statistical package. The link displayed step-by-step graphical applications of the MINITAB package for the related topic. Another significant change made to the WebLearn tests is the introduction of a statistical definitions help link: for each question in the question bank, if students were unfamiliar with a specific statistical term they could click on it and be directed to a webpage of definitions and related examples, which further enhanced their understanding of the term and its application. For the mathematical component of these courses, new questions with built-in intelligence were created. There are two major innovations in the new question bank: the first is the feedback mechanism, which provides a detailed solution for any incorrect answer, and the second is the editing tool for entering Maple code. Most questions require students to enter a symbolic mathematical expression as their answer, and this is done using Maple code. Previous CES results show that students have great difficulty entering Maple code; as a result, the new questions employ an external Java applet, DragMath, to assist students in editing mathematical expressions graphically.

Tutors from the Engineering department were trained to tutor one group of engineering students for the maths and statistics components of the course; the other group's tutors were drawn from the Maths department. The statistical analysis of the exam marks showed no significant difference in exam marks between the groups.

We have identified the following four main areas of weakness for students in the service courses: (1) students miss vital steps in calculations; (2) students confuse technical terms (e.g. thinking bimodal is the same as bivariate); (3) students misread the question or response text; and (4) students misread data tables. Following the work undertaken for the first assessment task in Semester 2 of 2008 and the subsequent feedback sessions, we determined that these factors had been reduced to just (3) and (4) on the remaining assessment tasks. This outcome strongly suggests that the bi-directional feedback between the students and the lecturers allows both parties to better identify areas of weakness and implement appropriate intervention strategies. However, further research involving a larger sample size and more teaching staff is required to validate this finding.

Appendix A: Innovation 2
Group 1 – students who had tutors from the Engineering department for Mathematics
Group 2 – students who had tutors from the Mathematics department for Mathematics

Comparison of the maths assignment marks
[Figure: boxplot of Group 1 and Group 2 maths assignment marks]
Two-Sample T-Test and CI: Group 1, Group 2
Two-sample T for Group 1 vs Group 2
           N   Mean  StDev  SE Mean
Group 1   49  17.64   2.79     0.40
Group 2  119  17.48   3.76     0.34
Difference = mu (Group 1) - mu (Group 2)
Estimate for difference: 0.164346
95% CI for difference: (-0.879030, 1.207722)
T-Test of difference = 0 (vs not =): T-Value = 0.31, P-Value = 0.756, DF = 119
Conclusion: P-Value = 0.756 > 0.05, so do not reject H0. There is no significant difference in maths assignment marks between the engineering department tutor group and the maths department tutor group.
Comparison of the students who achieved exam marks above 40% and had different tutors for the maths component of the course
Group 1 – students who had tutors from the Engineering department for Mathematics
Group 2 – students who had tutors from the Mathematics department for Mathematics
[Figure: boxplot of Group 1 and Group 2 exam marks]
Two-Sample T-Test and CI: Group 1, Group 2
Two-sample T for Group 1 vs Group 2
           N  Mean  StDev  SE Mean
Group 1   43  65.8   14.8      2.3
Group 2  105  64.1   12.0      1.2
Difference = mu (Group 1) - mu (Group 2)
Estimate for difference: 1.72292
95% CI for difference: (-3.35133, 6.79718)
T-Test of difference = 0 (vs not =): T-Value = 0.68, P-Value = 0.500, DF = 65
Conclusion: P-Value = 0.5 > 0.05, so do not reject H0. There is no significant difference in MATH2114 exam marks between the engineering department tutor group and the maths department tutor group.

Comparison of statistics WebLearn test marks
Group 1 – students who had tutors from the Engineering department for Mathematics
Group 2 – students who had tutors from the Mathematics department for Mathematics
[Figure: boxplot of Group 1 and Group 2 statistics WebLearn test marks]
Two-Sample T-Test and CI: Group 1, Group 2
Two-sample T for Group 1 vs Group 2
           N   Mean  StDev  SE Mean
Group 1   43  23.02   3.62     0.55
Group 2  126  21.17   6.49     0.58
Difference = mu (Group 1) - mu (Group 2)
Estimate for difference: 1.84865
95% CI for difference: (0.26691, 3.43039)
T-Test of difference = 0 (vs not =): T-Value = 2.31, P-Value = 0.022, DF = 131
Conclusion: P-Value = 0.022 < 0.05, so reject H0. There is a significant difference in MATH2114 statistics WebLearn marks between the engineering department tutor group and the maths department tutor group.

After eliminating the outliers:
Two-Sample T-Test and CI: Group 1, Group 2
Two-sample T for Group 1 vs Group 2
           N   Mean  StDev  SE Mean
Group 1   42  23.24   3.38     0.52
Group 2  123  21.64   5.82     0.52
Difference = mu (Group 1) - mu (Group 2)
Estimate for difference: 1.59582
95% CI for difference: (0.13229, 3.05935)
T-Test of difference = 0 (vs not =): T-Value = 2.16, P-Value = 0.033, DF = 123
Conclusion: P-Value = 0.033 < 0.05, so reject H0. There is a significant difference in MATH2114 statistics WebLearn test marks between the engineering department tutor group and the maths department tutor group (same conclusion as above).

Comparison of the students who achieved exam marks above 40% and had different tutors for the statistics component of the course
Group 1 – students who had tutors from the Engineering department for Mathematics
Group 2 – students who had tutors from the Mathematics department for Mathematics
[Figure: boxplot of Group 1 and Group 2 exam marks]
Two-Sample T-Test and CI: Group 1, Group 2
Two-sample T for Group 1 vs Group 2
           N  Mean  StDev  SE Mean
Group 1   40  64.1   12.9      2.0
Group 2  110  64.7   12.9      1.2
Difference = mu (Group 1) - mu (Group 2)
Estimate for difference: -0.615909
95% CI for difference: (-5.371550, 4.139732)
T-Test of difference = 0 (vs not =): T-Value = -0.26, P-Value = 0.797, DF = 69
Conclusion: P-Value = 0.797 > 0.05, so do not reject H0. There is no significant difference in MATH2114 exam marks between the engineering department tutor group and the maths department tutor group.
Appendix B: Innovation 3
Analysis of four-response questions (1) versus other questions (2) in the Semester 1 course:
Chi-Square Test: Incorrect, Correct
Expected counts are printed below observed counts
Chi-Square contributions are printed below expected counts
        Incorrect  Correct  Total
1             306      986   1292
           379.46   912.54
           14.220    5.913
2             393      695   1088
           319.54   768.46
           16.886    7.022
Total         699     1681   2380
Chi-Sq = 44.042, DF = 1, P-Value = 0.000
Note: We reject the null hypothesis with p = 3 × 10^-11 and α = 0.05. The data provide significant evidence of an association between the number of response items on a question and the performance of students (i.e. correct or incorrect).

Analysis of correct answer placement in four-response questions in the Semester 1 course:
Chi-Square Test: Incorrect, Correct
Expected counts are printed below observed counts
Chi-Square contributions are printed below expected counts
        Incorrect  Correct  Total
1              66      240    306
            72.47   233.53
            0.578    0.179
2              68      306    374
            88.58   285.42
            4.781    1.484
3              89      319    408
            96.63   311.37
            0.603    0.187
4              83      121    204
            48.32   155.68
           24.899    7.727
Total         306      986   1292
Chi-Sq = 40.438, DF = 3, P-Value = 0.000
Note: We reject the null hypothesis with p = 8.9 × 10^-9 and α = 0.05. The data provide significant evidence of an association between the placement of the correct answer response item on a question and the performance of students (i.e. correct or incorrect).

Analysis of correct answer placement in four-response questions in the Semester 2 course:
Chi-Square Test: Incorrect, Correct
Expected counts are printed below observed counts
Chi-Square contributions are printed below expected counts
        Incorrect  Correct  Total
1              98      210    308
           100.98   207.02
            0.088    0.043
2             129      235    364
           119.34   244.66
            0.782    0.381
3             107      257    364
           119.34   244.66
            1.276    0.622
4             125      239    364
           119.34   244.66
            0.268    0.131
Total         459      941   1400
Chi-Sq = 3.592, DF = 3, P-Value = 0.309
Note: We do not reject the null hypothesis with p = 0.309 and α = 0.05. The data do not provide significant evidence of an association between the placement of the correct answer response item on a question and the performance of students (i.e. correct or incorrect).

Analysis of student performance in the Semester 2 course using the Semester 1 final mark as a predictor:
[Figure: scatterplot of Semester 2, 2008 Task 1 mark versus Semester 1, 2008 course final mark, showing a possible linear relationship]
Regression Analysis: Task 1 mark versus S1, 2008 Final Mark
The regression equation is
Task 1 mark = 0.465 S1, 2008 Final Mark
Predictor               Coef  SE Coef      T      P
Noconstant
S1, 2008 Final Mark  0.46530  0.01402  33.19  0.000
S = 5.19493
Analysis of Variance
Source          DF     SS     MS        F      P
Regression       1  29720  29720  1101.27  0.000
Residual Error  26    702     27
Total           27  30422
Unusual Observations
Obs  S1, 2008 Final Mark  Task 1    Fit  SE Fit  Residual  St Resid
 26                 68.0   43.00  31.64    0.95     11.36     2.22R
 29                 63.0   18.00  29.31    0.88    -11.31    -2.21R
R denotes an observation with a large standardized residual.
Lecturer notes about the unusual observations:
Subject 26 had demonstrated a natural aptitude and curiosity for the material.
Subject 29 had difficulties with the material and was identified as "at risk" following this assessment. A successful intervention resulted in this student achieving a Pass grade in the upper range (i.e. 55-59).
Note: R² = 0.9769. We reject the null hypothesis (H0: β = 0) with p = 3 × 10^-22 and α = 0.05; the data provide significant evidence that the slope of the population regression line is non-zero. We also reject the null hypothesis (H0: ρ = 0) with p = 3 × 10^-22 and α = 0.05; the data provide significant evidence that the population correlation coefficient is non-zero.
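As an illustration only of the Rasch-based approach mentioned under Innovation 3, the sketch below shows the standard one-parameter Rasch item response function: the probability that a student of ability theta answers an item of difficulty b correctly. The function and example values are a generic textbook form, not the actual tool under development.

import numpy as np

def rasch_probability(theta, b):
    # P(correct | ability theta, item difficulty b) under the one-parameter Rasch model.
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Example: a student of average ability (theta = 0) facing items of increasing difficulty.
for b in (-1.0, 0.0, 1.0):
    print(f"difficulty {b:+.1f}: P(correct) = {rasch_probability(0.0, b):.2f}")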