Support & Troubleshooting
-
When are the Predict Models updated?
Student Success & Engagement Predict Models are updated three times each year before the Fall, Spring, and Summer terms.
Fall term models are updated in late July or early August
Spring term models are updated in late November or early December
Summer term models are updated in late April or early May
Each time a course completion model is retrained, it may include al...
-
Why is there a large drop in High School student retention at the end of the Fall term?
The large drop in High School student retention at the end of the Fall term most likely occurs because, at that point, very few students have yet been "retained", either by earning a certificate or degree at the end of the Fall term or by enrolling in the following Fall term.
By default, retention is measured by calcula...
-
Predict Troubleshooting
The following instructions provide the initial steps for troubleshooting missing and/or incorrect risk level indicators displayed in Student Success & Engagement from Predict.
Overview
By design, the risk indicators displayed in SS&E are only associated with certain students/sub-populations.
For each student included in the Predict Analytics Student population, the Pers...
-
Why is there a question mark on the Course Completion risk indicator?
A question mark will display on the Course Completion risk indicator when the student is not enrolled in any current or future term course sections, i.e., the student does not have any current or future enrollments.
How to check this:
Verify that the student Courses tab does not list any registered courses in the current term.
Verify that the student Courses tab does not list any ...
-
Why is there a question mark on the Persistence risk indicator?
A question mark will display on the Persistence risk indicator when the student's academic calendar term has ended and the student is not registered in a future academic term.
How to check this:
Verify that the student Courses tab does not list any registered courses in a future term.
-
What happens to the predict models when I update a data feed?
Models are only trained three times a year. Each training session ensures that new models accurately predict student data based on the historical data we have access to. When data feeds are updated mid-term, it may change the results of the models since our models were trained on your prior data. Often the results will be similar, and when new data feeds are added they ...
-
Why does a factor show up as a risk when we’d expect it to be a success factor (and vice versa)?
We calculate whether a characteristic/factor is attributed to risk or success based on the historical performance for students with that factor at your institution. Most often, this analysis of historical data comes to the same conclusion that we intuitively believe. However, there are times when the historical data does not agree with human intuition, and, in an effort...
-
Why are a student’s course completion factors the same for each of their enrollments? Why are they the same as the student persistence factors?
While we do have factors specific to the type of course a student is taking, particularly if it is a repeated course, it is often the case that the strongest indicators of risk and success are characteristics of the student overall. This means that the strongest factors related to each course are very similar or the same for each course a student is enrolled in. When th...
-
Why is this student at this risk level when we have strong information pointing them in the opposite direction?
Risk levels can only react to data that we are able to pass through to our models, and predictions are often not 100% accurate. If assignment grades, attendance, and other information are not available to our models, the models will not be able to use that information to assign risk levels.
Often, humans are able to use the contextual information that is not a...
-
Why did this student’s risk level change when none of their data changed?
If a student’s risk level changed when their data hasn’t, then it is likely that the data for other students changed. Because we always split students 10%/30%/60% into High/Medium/Low risk levels, it is possible for students to move risk levels without their probability score changing. This happens when other students move categories, which pushes them into a ...
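As a minimal sketch of this fixed-proportion split (not the actual Predict implementation; student names, scores, and function names here are illustrative assumptions), students can be ranked by completion probability, with the riskiest 10% labeled High, the next 30% Medium, and the remaining 60% Low. A student whose own score never moves can still cross a boundary when another student's score changes:

```python
# Hypothetical sketch of a fixed-proportion (10%/30%/60%) risk split.
# Not the actual Predict implementation; names and logic are illustrative.

def assign_risk_levels(scores):
    """Map each student to High/Medium/Low by rank of completion probability.

    Lower probability of success = higher risk, so we sort ascending:
    the bottom 10% of scores are High risk, the next 30% Medium, the rest Low.
    """
    ranked = sorted(scores, key=scores.get)  # lowest probability first
    n = len(ranked)
    high_cut = round(n * 0.10)
    medium_cut = round(n * 0.40)  # 10% + 30%
    levels = {}
    for i, student in enumerate(ranked):
        if i < high_cut:
            levels[student] = "High"
        elif i < medium_cut:
            levels[student] = "Medium"
        else:
            levels[student] = "Low"
    return levels

# Ten students: exactly one High (10%), three Medium (30%), six Low (60%).
before = {f"s{i}": p for i, p in enumerate(
    [0.55, 0.62, 0.70, 0.74, 0.80, 0.83, 0.86, 0.90, 0.93, 0.97])}
after = dict(before, s0=0.99)  # only s0's score changes

# s1's own score never moved, but its risk level still changes
# because s0 leapfrogged it in the ranking.
print(assign_risk_levels(before)["s1"])  # Medium before
print(assign_risk_levels(after)["s1"])   # High after
```

In this sketch, s1's probability stays at 0.62 throughout, yet it moves from Medium to High once s0's improved score pushes s1 to the bottom of the ranking.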
-
This student has a high percentage for their risk level, so why are they marked as medium/high risk? This student has a low percentage for their risk level, so why are they marked as low risk?
Risk levels are determined by each student’s risk relative to their peers. There are no absolute cutoffs for what is considered low/medium/high risk. If students at an institution pass 90% of their classes on average, then there are likely students marked as medium or high risk even though they have a probability of >85% of completing their course. They just happen to b...
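A hypothetical numeric sketch of this relative-cutoff idea (the scores, names, and split logic below are illustrative assumptions, not the Predict implementation): even when every student's completion probability exceeds 85%, the lowest-ranked students are still labeled High risk, because levels are assigned by rank among peers rather than against an absolute threshold.

```python
# Hypothetical illustration: relative risk levels with no absolute cutoff.
# All probabilities here exceed 0.85, yet some students still land in
# Medium/High because the split is by rank, not by raw score.

def risk_level(student, scores, splits=(0.10, 0.30)):
    """Return High/Medium/Low for one student under a rank-based split."""
    ranked = sorted(scores, key=scores.get)  # lowest probability = riskiest
    position = ranked.index(student)
    n = len(ranked)
    if position < round(n * splits[0]):
        return "High"
    if position < round(n * (splits[0] + splits[1])):
        return "Medium"
    return "Low"

# Ten students at a high-performing institution: every probability > 0.85.
scores = {f"s{i}": p for i, p in enumerate(
    [0.86, 0.88, 0.89, 0.90, 0.91, 0.92, 0.93, 0.94, 0.95, 0.97])}

print(risk_level("s0", scores))  # High, despite an 86% completion probability
print(risk_level("s4", scores))  # Low
```

Here s0 is marked High risk with an 86% completion probability simply because it is the lowest score in its cohort.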