Research

2,962,627 All-Time Assessments

We encourage individual researchers (often learners developing a dissertation) and institutions to conduct research using their SmarterMeasure data. Such research not only further validates the assessment but also adds to a growing body of research on online student readiness.

Use the menu on the left to explore the following content:

  • Research Results – View summaries and full-text white papers of research conducted at peer institutions.
  • Assessment Details – Contains an overview and definition of each of the constructs measured by the assessment.
  • Construct Validity – Measurements of the degree to which the assessment is measuring the defined constructs.
  • Item Reliability – Measurements of the consistency of the assessment items.
  • Case Studies – Detailed descriptions of the implementation and benefit of using SmarterMeasure at select institutions.
  • Usage Patterns – Measurements of the most common implementation patterns for utilizing the assessment.
  • Accreditation – Link to statements made by several major accrediting agencies concerning the requirement to measure online student readiness.

If you are interested in conducting research related to online student readiness using SmarterMeasure data, please contact your SmarterMeasure account manager or contact Dr. Mac Adkins using the information on the Contact page of this website. The table below provides some suggested research strategies.

Construct Analysis Data Sources

Academic Success

What is the correlation between SmarterMeasure™ scores and learners' grades?

Correlation and Analysis of Variance (ANOVA):

Stronger relationships may be found between scores on individual attributes and academic achievement. Other case studies have found individual attributes to be the strongest indicators of academic success.
  • SmarterMeasure™ scores at the scale and sub-scale level as listed in Table 1
  • Student's grades in a specific course
  • Student's overall GPA

Student Engagement

What is the correlation between SmarterMeasure™ scores and metrics of student engagement?

Correlation, Independent Samples t-tests, Discriminant Analysis:

Stronger relationships may be found with scores in technical competency and technical knowledge. This may especially be the case for learners in their first term of enrollment. As is demonstrated in the National Student Readiness Report, scores on technical competency and knowledge improve as the student gains experience in studying online or in a technology-rich environment. First-time online students are often confused about how to participate in an online course. See the section below labeled “Engagement Metrics” for more information.
  • SmarterMeasure™ scores at the scale and sub-scale level as listed in Table 1
  • Numbers of discussion board postings
  • Metrics of total "clicks" a student has made in a learning management system

Student Satisfaction

What is the relationship between SmarterMeasure™ scores and metrics of student satisfaction?

Analysis of Variance (ANOVA), Independent Samples t-tests, Discriminant Analysis, Structural Equation Modeling:

Responses to end-of-course survey items such as “I would enroll in another online course” could be used to segment students into groups, and the means of the SmarterMeasure™ scale scores could then be compared across the groups.
  • SmarterMeasure™ scores at the scale and sub-scale level as listed in Table 1
  • Student satisfaction measures such as items from end-of-course surveys
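
The segmentation step above can be sketched as follows; the survey responses and scores are hypothetical placeholders for institutional data.

```python
from statistics import fmean

# Hypothetical paired records: (end-of-course response, SmarterMeasure scale score)
# Survey item: "I would enroll in another online course."
records = [
    ("Agree", 82), ("Agree", 77), ("Agree", 88), ("Agree", 74),
    ("Disagree", 61), ("Disagree", 58), ("Disagree", 69),
]

# Segment students by their response, then compare mean scale scores
groups = {}
for response, score in records:
    groups.setdefault(response, []).append(score)
means = {response: fmean(scores) for response, scores in groups.items()}
print(means)
```

A t-test or ANOVA (e.g. via scipy.stats) can then test whether the group means differ significantly.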

Student Retention

What is the relationship between SmarterMeasure™ scores and metrics of student retention?

Correlation, Independent Samples t-tests, Discriminant Analysis, Multiple Regression:

Mean SmarterMeasure™ scale scores could be compared between retained and non-retained students.
  • SmarterMeasure™ scores at the scale and sub-scale level as listed in Table 1
  • A listing of students who took SmarterMeasure™ which is adjusted to indicate whether or not the student enrolled for the subsequent term
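
Building that paired data set and comparing group means can be sketched as below; the student IDs, scores, and retention flags are invented for illustration.

```python
from statistics import fmean

# Hypothetical SmarterMeasure scale scores keyed by student ID
scores = {"S01": 81, "S02": 64, "S03": 90, "S04": 58, "S05": 75, "S06": 69}
# Students who enrolled for the subsequent term
retained = {"S01", "S03", "S05"}

# Split scores by retention status and compare group means
kept = [s for sid, s in scores.items() if sid in retained]
lost = [s for sid, s in scores.items() if sid not in retained]
print(f"retained mean = {fmean(kept):.1f}, non-retained mean = {fmean(lost):.1f}")
```

An independent samples t-test, discriminant analysis, or logistic regression could then test whether the difference predicts retention.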

Quantitative Student Feedback

What is the relationship between SmarterMeasure™ scores and quantitative points of student feedback?

Correlation:

Students typically take SmarterMeasure™ near the beginning of their enrollment. After students have completed their first term of enrollment, encourage them to complete a survey about their experiences in online or technology-rich courses. Correlations between these reported experiences and the students' initial SmarterMeasure™ scores can then be calculated as a measure of the construct validity of SmarterMeasure. Questions which would be appropriate for this survey are provided below.
  • SmarterMeasure™ scores at the scale and sub-scale level as listed in Table 1
  • Results from a post-course survey

Qualitative Student Feedback

What is the relationship between SmarterMeasure™ scores and qualitative points of student feedback?

Comparison:

Assemble a focus group of students for a one-hour conversation about topics such as the construct of learner readiness and the realities of online learning. A listing of possible discussion-starting questions is presented below. Compare the observations made by the students either to their individual SmarterMeasure™ scores or to aggregate scores from the general population of students who have taken SmarterMeasure.
  • SmarterMeasure™ scores at the scale and sub-scale level as listed in Table 1
  • Compilation of notes taken during the focus group. These notes can be categorized and quantified to facilitate data analysis

Integration Plan Comparison

Is there a difference in SmarterMeasure™ scores between schools with stronger and schools with weaker implementation plans?

Independent Samples T-test:

Considerable variance exists between the implementation plans of different schools/campuses. The impact that SmarterMeasure™ has may depend on the strength of the implementation plan. Results of some of the suggested research strategies above could be compared between schools with different implementation plans.
  • SmarterMeasure™ scores at the scale and sub-scale level as listed in Table 1
  • A categorization of the strength of implementation plans of various schools/campuses. SmarterServices can assist in the classification of implementation plans based on our experience with other institutions

Comparison to other Standardized Tests

What is the relationship between SmarterMeasure scores and scores on other standardized exams such as the Myers-Briggs, SAT, Compass, or ACCUPLACER?

Correlations:

Calculate the correlations between measurements of online student readiness and other measurements of student aptitude taken through other admissions assessments.
  • SmarterMeasure™ scores at the scale and sub-scale level as listed in Table 1
  • Scale scores from third-party standardized tests

Rationale for Withdrawal

What is the relationship between SmarterMeasure™ scores and the reasons students give for stopping out of or dropping out of courses or degree programs? At one institution, 55% of the reasons provided for dropping out were attributed to "life happening."

Descriptive statistics for qualitative data and correlation for quantitative data:

SmarterMeasure™ sub-scale scores for the Life Factors section could be analyzed against stated reasons for withdrawal.
  • SmarterMeasure™ scores at the scale and sub-scale level as listed in Table 1
  • A paired data set matching an individual student’s stated reason for dropping or stopping out to their SmarterMeasure scores. The reasons for withdrawal could be codified for ease of analysis.
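
Once withdrawal reasons are codified, the descriptive-statistics step can be sketched as a simple frequency tally; the reason codes below are invented for illustration.

```python
from collections import Counter

# Hypothetical codified reasons for withdrawal (one per withdrawn student)
reasons = ["life event", "finances", "life event", "course difficulty",
           "life event", "time management", "finances", "life event"]

# Frequency and percentage of each coded reason, most common first
counts = Counter(reasons)
total = len(reasons)
for reason, n in counts.most_common():
    print(f"{reason}: {n} ({100 * n / total:.0f}%)")
```

These coded categories could then be cross-tabulated with Life Factors sub-scale scores for the quantitative portion of the analysis.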

Extraneous Factors to Consider

When conducting research using SmarterMeasure data one should be aware of the possible impact of these extraneous factors.

Self-Selection Bias – At some institutions students are given the option to take the SmarterMeasure Learning Readiness Indicator. In that case it can be assumed that, at some level, a student demonstrates a degree of diligence simply by completing the 125-item assessment. The data set for the school may then be overpopulated with data from "diligent" students and underpopulated with data from "non-diligent" students. If this skewed data set were then used, for example, to compute a correlation with grades, and the grades were drawn from the entire population of students, including "diligent" and "non-diligent" students, the comparison would not be valid. To compensate for this, schools are encouraged to require all students to complete the assessment.

Treatment Effect – We stress that the SmarterMeasure Learning Readiness Indicator is an indicator of levels of non-cognitive readiness for learning online or in a technology-rich environment; it does not itself provide the remediation. This is much like how a thermometer measures a fever but is not the medicine. While we do link to several resources for remediation, schools are encouraged to link to and provide their own remediation resources. Some schools excel in individually following up with students about their SmarterMeasure scores and providing them with rich resources for remediation. This is exactly what we want to have happen. However, when this happens it can appear to invalidate the assessment scores. For example, suppose a student has low levels of readiness as indicated by SmarterMeasure. The school then diligently provides rich resources for remediation in an orientation course, and the student improves in several areas. Due to this remediation the student achieves higher levels of academic success, engagement, satisfaction, and retention than he or she would have otherwise. If the school then correlates SmarterMeasure scores to these key student performance indicators, the relationship will appear weaker than it actually is.