We’ve released a set of updates designed to give teachers more flexibility, improve assessment accuracy, and help students learn more effectively. This update includes a brand‑new marking mode for Tasks, improvements to how returned tasks behave, and the first stage of our upgraded AI marking system.
New Task marking mode
Tasks were always intended for assessment, designed to mimic a test or exam scenario: students could look at all the questions, tackle them in any order, and go back and change their minds before finally submitting their responses for marking.
Over time, thanks to the sandbox nature of Smart Revise, it became apparent that there were many more ways you could use tasks: not just for summative assessment, but in formative ways too. One typical approach is to set a task containing a number of questions but tackle them one at a time, with the student self-assessing each answer before moving on to the next question.
Tasks now support this with a new marking mode: Self assessment (after question). Teachers can select this when setting a new task.
Changes to returning a task to the student
Previously, when you returned a task to a student, they lost all their answers and marking data, essentially requiring them to retake the task. This has been changed so that their answers, marking and feedback are retained, allowing a student to reflect on and improve their answers before resubmitting the task.
If you want one or more students to retake the task from scratch, you should copy the task instead. Doing this will create two entries in your mark book, one for each task. If you don’t want to see both, use the tagging feature to give each task a tag; you can then use these tags to filter which tasks you see in your mark book.
Improvements to AI marking
We recently commissioned some research into how we can make AI marking more accurate. This update includes stage 1 of a 3-part plan.
We have upgraded to the latest OpenAI model, which is optimised for structured, rules-based tasks such as extracting key points from answers, checking correctness step by step and assigning more consistent marks. We have also significantly expanded the prompt to inform it of marking principles and how to justify the marks awarded. Marking should begin to be more accurate because the AI is now instructed to:
Ignore minor spelling errors.
Accept all valid answers that demonstrate understanding, even if they are not strictly industry-standard approaches, to compensate for the level of study.
Adopt a more structured approach, distinguishing between completely incorrect and partially correct answers.
Use a structured output schema so the feedback can be parsed more accurately.
This is just the start. We anticipate even greater improvements when stages 2 and 3 are implemented. Please remember this is an improvement, but it is not, and probably never will be, perfect.
What’s coming next?
We are continuing to analyse the live data from Goals. A second iteration of the feature will remove the “days” requirement from the Terms goal so it behaves in the same way as Quiz and Advance. Although linking the Terms goal to the Leitner system felt like a good idea when we initially analysed the requirement, feedback has told us otherwise!
Part of the work on improving Goals has also included investigating linking them more strongly to the flight path. For example, a setting could allow goals to follow the minimum expectation, target or aspirational trajectory on the flight path. Instead of setting a minimum or maximum number of questions, you would identify your intention for the whole class or for individual students. We are still exploring that possibility.
Account verification is a standard security measure used by responsible online platforms to keep student data safe and ensure that only the rightful owner can access an account. Smart Revise is no exception. Like most secure systems, we send a verification email when a new user registers. The student simply clicks the link in that message, and their account becomes verified and ready to use.
However, if the verification email never arrives—or arrives but expires—students may appear “stuck.” This article explains why verification matters, what to do when issues occur, and what not to do, so that teachers can quickly resolve the root cause without needing to involve the Smart Revise support team unnecessarily.
Why verification emails are essential
With education systems used by young people, we must ensure that an account truly belongs to the person registering. Verification helps:
Confirm identity and ownership: verification ensures the person creating or accessing the account actually controls the email address provided. This reduces impersonation and unauthorised access.
Reduce automated or fraudulent sign‑ups: requiring users to interact with an email inbox helps block bots and mass fake registrations.
Protect personal and student data: verification helps ensure progress data is tied to a legitimate, reachable account.
Enable system features: some features, such as password reset, rely on a verified email to work correctly.
Without verification, Smart Revise cannot activate the account fully, which is why unverified students may not appear correctly in class lists or may be restricted from certain actions.
The most common reason why a student never receives an email
Students often mistype their email address: a missing character from the school domain name, for example. Ask students to take extra care when entering their school email address. Students can register again using the correct email address if they need to. Smart Revise will either prevent a duplicate account, or the incorrect, unverified and unused account will be deleted automatically during the annual data deletion process.
Domains your school must whitelist
This step is essential and we cannot do it for you!
If your school filters or blocks external email, the verification message may never reach the student. In these cases, your IT team must whitelist the Smart Revise sending domains.
Schools must whitelist the email (not website) domains:
*.smartrevise.co.uk
Emails are typically sent from comms2.smartrevise.co.uk, but wildcarding ensures delivery even if a backup sender is used.
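To illustrate what the wildcard rule covers, here is a small sketch using Python’s `fnmatch`. Note that mail-filtering products each have their own wildcard semantics, so this is only an approximation; always check your filter’s documentation. The `backup` subdomain below is a hypothetical example, not a confirmed sender.

```python
from fnmatch import fnmatch

# The wildcard pattern schools are asked to whitelist.
ALLOW_PATTERN = "*.smartrevise.co.uk"

# The usual sender is covered by the wildcard...
assert fnmatch("comms2.smartrevise.co.uk", ALLOW_PATTERN)
# ...as would be any hypothetical backup sender on a subdomain.
assert fnmatch("backup.smartrevise.co.uk", ALLOW_PATTERN)
# An unrelated domain is not covered.
assert not fnmatch("smartrevise.example.com", ALLOW_PATTERN)
```

Note that with `fnmatch` semantics the pattern `*.smartrevise.co.uk` does not match the bare domain `smartrevise.co.uk`; some filtering products behave differently, which is another reason to confirm the rule with your IT team.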
If these domains are blocked, students will not receive any verification messages.
Expired links and re-sending verification emails
Verification links expire after 7 days. If a student waits too long before attempting verification, the link may no longer work—especially if the domain was only whitelisted later.
Once domains are whitelisted:
Students can request a new verification email from their Manage account page.
If there is no “Send verification email” button, then the account is already verified.
Please do not ask students to email us about verification problems
We kindly ask teachers not to direct students to contact Smart Revise support when they have not received, or cannot click, a verification link.
Why not?
Email delivery issues originate at the school, not with Smart Revise. If the domain is blocked, we cannot fix this on our end—your IT support team must adjust filtering rules.
We can only see the last five days of attempted email activity. If a student has not attempted verification recently, we have no way to trace a missing message.
Most cases are solved quickly at school by correcting the email address or updating email filters.
Instead:
Check that the student used the correct email address during registration.
Ask your IT team to confirm that *.smartrevise.co.uk and comms2.smartrevise.co.uk are whitelisted.
Ask the student to request a new verification email after whitelisting.
When teachers should contact us
Once your school’s IT team has confirmed that the domains are whitelisted and the student has recently attempted to verify their account, you may contact us if the issue persists.
Please include the student’s registered email address, as that is essential for investigation. (We cannot look up students by name or by class.)
If there is no alternative but for the Smart Revise team to manually verify your student accounts due to email restrictions at the school, we will be happy to assist. In this case, please provide us with a full list of student usernames in one communication after all the student accounts have been created.
Alternatively, use SSO
Students can instead sign in with single sign-on (SSO) via Microsoft or Google if your school systems support it.
We are thrilled to announce that Smart Revise has been recognised with the Teach Secondary Award for Curriculum Improvement 2025! 🏆
Judge Nikki commented: “The link between getting a question incorrect and being directed instantly to a resource that can help re-learn and consolidate, will prove invaluable for learners.”
This award highlights the impact of high-quality, effective resources in supporting both teachers and students. Smart Revise is designed to make learning more efficient, helping students consolidate knowledge quickly and allowing teachers to focus on what matters most – teaching.
Everyone at Craig’n’Dave would like to extend a huge thank you to all the teachers, students and schools who use Smart Revise. Your continued support makes this achievement possible, and it inspires us to keep improving and innovating.
Here’s to making learning smarter, one session at a time!
The new Goals feature brings a powerful upgrade to Smart Revise, helping students not only understand their progress but also take control of their study habits.
Until now, students could view a summary report to identify strengths and weaknesses across a course. But one key question remained unanswered: “Am I doing enough to be exam ready?” Flight paths helped by showing students their current trajectory based on activity—but they didn’t offer guidance on how to improve that trajectory.
We’ve listened to feedback and identified a common pattern: students love answering Quiz questions, but they often don’t know how many they should be doing each week. The dynamic nature of Smart Revise means there’s no fixed endpoint, which can leave students unsure of how to pace themselves. Meanwhile, Terms and Advance—which require deeper engagement—are frequently overlooked.
That’s where Goals come in.
Every Saturday at midnight, Smart Revise now sets a personalised weekly goal for each student across the three modes: Quiz, Terms, and Advance (if enabled by the teacher). This ensures:
✅ Balanced revision: Encouraging students to engage with all aspects of Smart Revise, not just the easiest.
📈 Clear expectations: Students know exactly how much they should be doing each week.
While Quiz is excellent for tackling the forgetting curve, true exam readiness requires more. Regular review, self-assessment, and practice with challenging questions and mark schemes are essential. Goals guide students toward that well-rounded approach.
A new way to set homework
Previously, we recommended that students complete 35 quiz questions each week as homework. While effective, this approach could be difficult to track. Now, with the introduction of Goals, setting homework is much simpler—just tell students to “complete your goals.” Progress can then be easily monitored using the Goals Overview Analytics Report, streamlining both assignment and tracking.
How are goals calculated?
Quiz
The total number of questions in the course is multiplied by 3 (a question has to be answered correctly three times to be considered mastered). This is divided by the number of weeks until the last exam to arrive at a weekly diet (a). The number of questions available to the student is then calculated and also multiplied by three (b). Finally, a workload cap is set at 100 questions (c). The lowest of the three values a, b and c is the minimum weekly goal.
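As a minimal sketch, the Quiz calculation above can be expressed as follows. This is an illustration of the published rule, not Smart Revise’s actual code; in particular, how the division is rounded is an assumption here.

```python
def weekly_quiz_goal(total_questions: int, available_questions: int,
                     weeks_until_last_exam: int, cap: int = 100) -> int:
    """Minimum weekly Quiz goal: the lowest of the weekly diet (a),
    the accessible workload (b) and the fixed cap (c)."""
    a = (total_questions * 3) // weeks_until_last_exam  # weekly diet to master every question
    b = available_questions * 3                         # bounded by what the student can access
    c = cap                                             # fixed workload cap of 100 questions
    return min(a, b, c)
```

For example, a 300-question course with 30 weeks to the last exam gives a weekly diet of 30 questions, well under the cap.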
Terms
The position of the student on their flight path is calculated as 0–5. This determines the minimum number of days a student should engage with Terms each week, providing a gentle ramp at the beginning of the course and less daunting targets for students who are not engaging, with greater demands towards the end of the course and as engagement rises. The cards chosen each day are based on the Leitner system to ensure spaced learning. The number of cards to assess in any one day is the lower of the number of cards available and 20.
Update: the days to complete has now been reduced to 1, 2 or 3 to reduce the workload for students.
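A rough sketch of the Terms goal as described above. The exact mapping from flight path position to days is not published, so the clamp used here is an assumption, constrained to 1–3 days per the update:

```python
def weekly_terms_goal(flight_path_position: int, cards_available: int,
                      daily_cap: int = 20) -> tuple[int, int]:
    """Return (days_per_week, cards_per_day) for the Terms goal.
    flight_path_position is 0-5; the mapping to days is illustrative only,
    clamped to 1-3 days per the update above."""
    days = min(max((flight_path_position + 1) // 2, 1), 3)  # assumed 0-5 -> 1-3 mapping
    cards_per_day = min(cards_available, daily_cap)          # lower of cards available and 20
    return days, cards_per_day
```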
Advance
The number of questions available to the student based on the current topic filters is calculated (a). From the course start date and the date of the last exam, five milestone dates are calculated throughout the course. At each milestone the number of Advance questions a student should be answering each week rises: 3, 5, 7, 10 and 12 (b). The lower of a and b is used to set the weekly goal.
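The Advance rule can be sketched like this. Even spacing of the five milestones between the course start and the last exam is an assumption; the weekly targets 3, 5, 7, 10 and 12 come from the description above.

```python
from datetime import date, timedelta

def weekly_advance_goal(course_start: date, last_exam: date, today: date,
                        questions_available: int) -> int:
    """Weekly Advance goal: the lower of the questions available (a) and
    the milestone-based target (b)."""
    span = (last_exam - course_start).days
    # Five milestone dates, assumed evenly spaced across the course.
    milestones = [course_start + timedelta(days=span * i // 5) for i in range(1, 6)]
    targets = [3, 5, 7, 10, 12]
    b = targets[0]
    for milestone, target in zip(milestones, targets):
        if today >= milestone:
            b = target
    return min(questions_available, b)
```

Early in the course the goal is a gentle 3 questions a week, rising to 12 as the last exam approaches (or fewer, if the topic filters expose fewer questions).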
Flight paths and completing the course
It’s important to note that Goals aren’t designed to guarantee students will reach their flight path target cone or complete all content in Smart Revise. Instead, Goals offer manageable weekly targets that encourage students to engage with all areas of Smart Revise regularly. This “little and often” approach is widely recommended for effective revision throughout the course—not just in the final weeks. As a result, students are more likely to stay on track and even exceed their target cone.
Goal messages
Alongside the weekly targets, each Goal includes a short motivational message to encourage students. For example, the initial message for Quiz might be “Take the first step”, while Terms says, “Flip some cards”, and Advance prompts with “Give Advance a try.” These messages update weekly and are tailored to each student’s position on their flight path, offering timely encouragement that aligns with their progress.
See-saw effect
Tasks also play a key role in helping students meet their Goals. The new Goals feature is designed to support—not replace—the teacher’s guidance. So, when students complete Tasks, they’re also making progress toward their Goals. This dual benefit not only reinforces the work you assign but also gives priority to Tasks over Goals. The same applies if you use Quiz as a starter or “do-now” activity in class—it contributes to their Goals too.
The more students engage during lessons, the less they’ll need to do outside of class. This creates a flexible balance: when you take an active role in guiding students, Smart Revise steps back, but when you give students more independence, Smart Revise steps in to provide structure and support. It’s a dynamic partnership that adapts to your teaching style and your students’ needs.
Monitoring goals
Students can view their goal history, but missed goals cannot be retroactively completed. Likewise, any progress beyond a weekly goal doesn’t carry over to the next week. This encourages a spaced learning approach, which is proven to support long-term memory retention. While Goals don’t restrict how much students can do, they also don’t reward cramming—reinforcing consistent, balanced study habits.
Teachers benefit from a new Goals Analytics Report, which shows student progress over the past four weeks, including the current week (highlighted in blue). With simple colour coding, it’s easy to spot which students are meeting their goals and who might need a little extra encouragement.
Looking ahead
We expect to fine-tune the algorithms behind Goals and workload caps as we learn more about how they influence student study habits. Currently, teachers can only disable Goals by turning off the entire mode—Quiz, Terms, or Advance. If a mode is inaccessible to a student, no Goals will be set for it. In future updates, we may introduce independent toggles for Goals, allowing teachers to enable or disable them separately from mode access. This would support more flexible implementation, such as introducing Goals gradually throughout the course.
Although Goals are personalised using each student’s data, the workload caps are currently fixed: 100 for Quiz, 20 for Terms, and 12 for Advance. We’re considering giving teachers the ability to adjust these limits to better suit individual learners and school contexts.
Conclusion
The introduction of Goals in Smart Revise marks a significant step forward in supporting student revision. By combining personalised targets with teacher-led direction, Goals encourage consistent engagement and help students build strong study habits over time. Whether used to reinforce classroom activities or guide independent learning, Goals offer a flexible framework that adapts to both student needs and teaching styles. As we continue to refine this feature, our aim remains the same: to make revision smarter, more effective, and easier to manage—for everyone.
Also included in this release
Added answer counters to Quiz, Terms and Advance so that the student can see how much progress they have made towards their goal while using the mode.
Unassessed cards in Terms are always shown every day so that new content is always prioritised.
Terms no longer starts at the same card each day for a given card set. For example, if the red cards a, b and c are shown on day 1, the playlist will continue with d, e and f on day 2 rather than resetting back to a, b and c. This prevents some cards from never being seen due to the Goals workload cap.
Task analytics can no longer be seen for a task whose marks have not been released after marking. This prevents a student from changing the page URL to see their marks before they have been released.
Updated message on summary reports when there is no data to display.
The time zone was not always saved correctly when a student created an account or accessed a task for the first time. This has now been fixed.
Archived tasks no longer appear in a student’s task list
It’s a fact: some students don’t complete their Smart Revise tasks! While you might not have enforced a deadline, time has now passed, and you don’t expect the student to complete the work. For example, at the beginning of year 11 you may want to wipe the slate clean of anything outstanding from year 10. Archiving tasks now provides that facility, because archived tasks no longer show on the student’s task list.
If any students could still have their marks released, you will be prompted to decide whether this should be done before the task is archived.
New visibility toggle for a student’s name when marking or viewing a task answer
Teachers will often want to project and share a student’s answer to a question for reflection with the class. This is useful for engaging with students to address misconceptions or exam technique, or to discuss how an answer could be improved. However, it can be a little embarrassing for the student whose work is being observed.
There is now a toggle on the teacher task marking screen to hide the student name.
Task analytics reports for students
Students now have the same reports as teachers for analysing the outcomes of a task. The only difference is that they can only see their own data. This includes:
Question analysis: see the top 10 least and most well answered questions in the task. Click the question to navigate to it.
Question matrix: see the marks for each question at a glance with colour coding to indicate stronger and weaker questions.
Topic matrix: see the strengths and weaknesses by topic for those tasks that span more than one topic.
Select and unselect all students in a class when assigning a task
A checkbox has been added to the page where you assign students to a task. Ticking this will select all students who have not already had the task assigned. Unticking this will unselect all the students who have not already had the task assigned. By unticking the box you can easily select just one or two students which is ideal for assigning intervention tasks.
Goals feature update
We are currently testing and refining our next big-ticket feature, “Goals”. This feature automatically sets weekly targets for students based on their data and the stage of the course, solving the problem of students not knowing how much Quiz, Terms and Advance they should be doing. It will be available soon.
In an age of AI-generated content and endless question banks, Smart Revise takes a different path—one grounded in cognitive science and the best classroom practice. Designed to help students retain knowledge long-term, Smart Revise uses three distinct modes—Quiz, Terms and Advance—to deliver an experience that is both focused and effective.
Rather than overwhelming students with novelty, Smart Revise embraces repetition of high-quality, human-written questions to strengthen memory. This approach is not only intentional—it’s backed by decades of research.
The science behind Smart Revise
Smart Revise is built on two of the most powerful, evidence-based learning strategies:
Retrieval Practice: Research by Roediger & Karpicke (2006) shows that actively recalling information improves retention.
Spaced Repetition: First described by Ebbinghaus in the 19th century and refined in modern cognitive science, this technique involves reviewing material at increasing intervals to combat the forgetting curve. Studies (Cepeda et al., 2006) confirm that spacing out revision leads to better long-term retention.
Smart Revise integrates both strategies into its core design, helping students move knowledge from short-term to long-term memory—a critical factor for exam success.
Why fewer questions work
Unlike many platforms that use AI to generate thousands of questions, Smart Revise uses carefully curated content written by experienced teachers and examiners. This ensures alignment with specifications and avoids the pitfalls of poorly constructed, irrelevant or even inaccurate questions.
While this means the number of questions is finite, it’s a deliberate strength. Repeated exposure to the same questions helps students build familiarity, reinforce neural pathways, and ultimately master the material.
The three modes of Smart Revise
Quiz Mode: Targeted Retrieval Practice
Quiz mode presents a fixed set of multiple-choice questions, typically three or more per specification point. These questions are designed to:
Assess understanding of key concepts.
Highlight misconceptions.
Reinforce distinctions between similar ideas.
Smart Revise uses algorithms to prioritise questions students get wrong and deprioritise those they’ve mastered, ensuring that revision is always focused on the student’s current needs. Its low-stakes nature aligns with the principle of desirable difficulty—the idea that learning is most effective when it’s effortful but achievable.
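As a sketch only, a prioritisation scheme of this kind could look like the following. The weights and the mastery threshold here are illustrative assumptions, not Smart Revise’s actual algorithm:

```python
import random

def pick_next_question(stats: dict[str, dict[str, int]]) -> str:
    """Pick the next Quiz question, weighting past mistakes more heavily.
    A question answered correctly three times is treated as mastered and
    deprioritised. All weights are illustrative assumptions."""
    weights = {}
    for qid, s in stats.items():
        if s["correct"] >= 3:            # mastered: shown only rarely
            weights[qid] = 1
        elif s["wrong"] > 0:             # answered wrongly before: prioritised
            weights[qid] = 10 * s["wrong"]
        else:                            # unseen or in progress
            weights[qid] = 5
    qids = list(weights)
    return random.choices(qids, weights=[weights[q] for q in qids])[0]
```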
Terms Mode: Vocabulary Mastery
Terms mode focuses on subject-specific terminology drawn directly from the course specification. Using the built-in Leitner system, students rate their confidence using a red-amber-green system, which feeds into a spaced repetition cycle.
This mode supports semantic memory development, helping students internalise the language of the subject—essential for understanding exam questions and constructing accurate responses.
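The Leitner mechanic behind Terms can be sketched as follows. The interval values are illustrative only, not Smart Revise’s actual schedule:

```python
# Illustrative Leitner-style scheduling: a red (low-confidence) rating brings a
# card back sooner; green (high-confidence) pushes it further into the future.
INTERVALS = {"red": 1, "amber": 3, "green": 7}  # days until next review (assumed values)

def next_review_day(today: int, rating: str) -> int:
    """Return the day index on which a card rated today should reappear."""
    return today + INTERVALS[rating]
```

The effect is that weak cards cycle frequently while confident cards are spaced out, which is exactly the spacing behaviour the forgetting-curve research recommends.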
Advance Mode: Exam Technique and Application
Advance mode offers written-answer style questions. Unlike Quiz and Terms, this library grows over time, offering increasing variety while still allowing for repetition.
Advance questions help students:
Practise applying knowledge in context.
Develop familiarity with command words and mark schemes.
Build confidence in structuring extended responses.
All the modes support interleaving—the practice of mixing topics—which has been shown to improve transfer of learning and adaptability in exam scenarios (Rohrer, 2012).
Tasks: integrated, intentional practice
Teachers can assign Tasks that give the same questions, drawn from all three modes, to every student. Repetition is expected—and desirable—in Quiz and Terms. In Advance, repetition will decrease as the library grows, but it remains a valuable part of the learning process.
Marking from Tasks writes back to and overrides data in Quiz, Terms and Advance, updating each student’s Quiz question stream priority, their RAG ratings in Terms, and the marks awarded in self-assessed Advance questions.
Challenging the “more is better” myth
Many tools pride themselves on offering tens of thousands of questions or AI generation to avoid repetition, but repetition of well-designed questions is far more effective.
Smart Revise helps students practise, improve and eventually master a defined set of questions—an approach that builds confidence and leads to better exam outcomes.
Conclusion: Smart Revise, smarter learning
Smart Revise is more than an assessment and revision tool—it’s a research-informed system that values quality over quantity, mastery over coverage and memory over cramming. By combining expert-written content with proven cognitive strategies, it offers students and teachers a smarter way to prepare for exams.
In the last update we introduced some additional question filters for students when using Quiz and Advance. These enabled a student to exclude mastered questions or only include questions they have answered incorrectly in the past in Quiz. This change made it easier for the students to correct mistakes and master questions. In addition, the new difficulty filters in Advance enabled students to select easy, medium and hard questions.
While these filters have been useful for students, especially those who are not joined to a class, we know some teachers want students to always be in “Smart mode”, where Smart Revise chooses the next question for the student based on its spaced learning and retrieval strategy algorithms. This update puts control back in the teacher’s hands, enabling three levels of control for each mode per class: off, Smart mode only, and Question filters.
These settings can be changed in the class configuration.
When a mode is off, students cannot access it. Smart mode only is a good default in the first year of a course. Question filters allows students to filter questions without affecting the topic filters, a useful setting in the second year of a course.
It is worth mentioning that the topic filters give students different levels of control too. Teacher controlled means students are locked into full Smart mode. Teacher guided means students can change topics, but only within those selected by the teacher. Student controlled gives students full access to change their topics. In both teacher guided and student controlled, students can select individual pie charts from their summary report to revise a single topic.
Change to Terms deck builder
The Terms deck builder has been changed so it matches the new question filter options for Quiz and Advance. Nothing has fundamentally changed here, but it brings consistency to the student experience across Quiz, Terms and Advance.
Change to the flight path end date
Originally, we planned the flight path to end two weeks before the first exam because we wanted students to be “exam ready” by this point. However, it probably shouldn’t surprise us that students are working hard on Quiz, Terms and Advance just days before their exam so we have extended the flight path end date to match the date of the last exam instead. Students can now see the progress they are making right up to the last moment!
Update to command words for Quiz questions in a Task
Until now Quiz questions set in a Task would always say, “Select the correct answer” at the end of the question. However, this does not always match the command word in the specification. Some courses use “Identify” instead of “Select”. This is a minor change that students probably won’t notice but multiple-choice Quiz questions now also match the published command word for the relevant specification.
Accessibility updates
Some users reported that the colour blind Tritanopia theme was not using the best colour palette. This has now been modified and checked with the WhoCanUse website. Please continue to give us your feedback.
We are currently reviewing all questions for the 2028 GCSE courses and 2027 A level courses to replace images of data with tables instead.
OCR A level H046-H446 and AQA GCSE 8525 courses
The new OCR A level 2027 course will be one combined course for H046 (AS level) and H446 (full A level) with students able to use just one course and progress from one to the other by unlocking topics. The number of sub-topics has also been increased significantly.
On Monday 31 March 2025, the Department for Education (DfE) announced updates to the Computer Science subject content. Examination boards have the choice to adopt the suggested changes to their current specifications.
Although there are no changes to OCR GCSE J277 or Pearson Edexcel 1CP2, changes are coming to the AQA GCSE 8525 specification for teaching from September 2025 and exams from 2027 onwards.
We’ve been working very closely with the team at AQA and can reassure you that Smart Revise has been fully updated. However, if you are currently using the AQA GCSE 2027 course in Smart Revise we need to migrate you to this new course. We will be in touch directly once the new course has been released.
Live from Saturday 29th March, lots of different parts of the product have had some magic dust sprinkled on them in this update.
Question filters in Quiz and Advance for students
Paving the way for the next big feature, “Goals” are some new question filters for students to use in Quiz and Advance mode. Just above the question students will now see, “Filters” with the option to exclude mastered questions or focus on incorrectly answered questions only.
In Advance, students can choose to see easy, medium or hard questions.
Smart mode will select the next question for the student based on their data. This is still the best option for everyday spaced and interleaved learning, but the increased flexibility of filters will reduce the frustration that some students experience when the algorithms are in control!
Delete and add questions from a task for teachers
Teachers now have the option to delete and add individual questions when setting a task. Swapping one question for another has always been possible, but this new feature allows teachers to do much more, even manually creating a task by choosing every question they want. This is just the first iteration towards an even better interface that will allow you to search and prioritise questions for selection. As a quick hack, remember that CTRL-F will “find in page” on most browsers; if you want to find a question about sorting, try searching for “sort”. Just remember you will need to have selected the relevant topics to see these questions.
Improved AI marking messages for teachers
Before this update Tasks that included questions that could not be marked by AI (no robot icon) reported that marking was complete once AI had finished doing what it could. Now, if your task does include questions that are not markable by AI, or it trips up during the marking process, it will report, “AI Complete (you to mark)”. A teacher, the student or a peer can then complete the marking for those remaining questions.
Please do bear in mind that AI marking is not 100% accurate. That’s one reason why Ofqual have said that exam boards cannot use AI marking for real examinations. It is getting better all the time, but it is unrealistic to expect it to be perfect. Unlike classic algorithms that are deterministic, AI algorithms are probabilistic. They also hallucinate and the content at GCSE and A level is often abstracted for the level of study, so this is an added complication too.
Tab navigation on desktop
Using tab on the keyboard will now navigate sensibly around the page. We still recommend you use a mouse or touch wherever possible, but this is a useful accessibility option.
Bug fix: difference in data between CSV & analytics reports on-screen
Due to rounding errors, the data shown on-screen in the analytics reports was sometimes different from the data downloaded as a CSV file. This has now been corrected so that the data is consistent no matter how you choose to view it.
Exploit fix: students can no longer assign negative marks to an answer
If you want to fully test your product, put it into the hands of your students. They always seem to find something that we just hadn’t thought of! It transpires that by manipulating the client-side code students were able to input negative numbers into the “mark” box, giving themselves (or a peer) a negative mark for a question. We have now patched this with more server-side validation.
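The principle behind the fix is simple: never trust a value sent by the browser. A minimal sketch of the kind of server-side check involved (illustrative only; function and parameter names are our own, not Smart Revise’s actual code):

```python
def validate_mark(mark, max_mark):
    """Reject marks outside the question's valid range,
    regardless of what the client submitted."""
    if not isinstance(mark, int):
        raise ValueError("mark must be a whole number")
    if mark < 0 or mark > max_mark:
        raise ValueError(f"mark must be between 0 and {max_mark}")
    return mark
```

Client-side checks remain useful for instant feedback, but only a check on the server can guarantee that a manipulated request (such as a negative mark) is rejected.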
Rebranding
If you haven’t already noticed, we’ve got a brand-new main Craig’n’Dave website. We are slowly migrating all our resources to this new look, and that includes Smart Revise. Nothing major here: a new image on the landing page and in each page header, but it helps to tie all our products together.
The Smart Revise update on 28th February 2025 included the following changes:
Update to the task marking interface
The question and answer are now on the same tab, reducing the need to switch between panels and making it easier to mark questions in a Task.
The whitespace in the mark scheme panel has been reduced to increase the number of points (and tick boxes) shown on screen, reducing scrolling.
Answers to Quiz questions in a Task are now displayed with all the possible answers shown and the one selected by the student highlighted, making it easier to see whether a student gave the correct answer. This also facilitates discussion about the incorrect answers.
Automatic release of marks to students
The default when setting a task is now “AI marking” with the option to change this to teacher, self or peer assessment instead. “Automatic” is now also the default option for releasing marks to students.
That means unless you change it, Tasks that contain only multiple-choice Quiz questions or AI-markable longer-answer questions will be marked as soon as they are submitted, with the marks released to students immediately, eliminating the need for teacher intervention.
In addition to students receiving their marks in a timely fashion, it also means that they will always be looking at the latest data on their flight path and in their summary report.
You might be wondering why this wasn’t always the default option. Initially we wanted a safety net for AI feedback: until lots of questions had been marked by AI, we couldn’t be absolutely sure the response would always be appropriate. We were unnecessarily cautious, but it was the right decision at the time. Tasks were also intended as an assessment mode where you may want to hold back the marks for a test or mock exam until a particular lesson. You can still do this, but that is no longer the main use of tasks for most teachers, so it makes sense to flip the default option.
Small update to AI marking with more to follow
The AI marking now has a greater understanding of the course, not just the level of study. This will make marking more accurate in some cases. We have also identified a path for future development towards greater accuracy for questions that AI marking frequently gets wrong. This is a minor update, so don’t expect a sudden increase in accuracy, but it does pave the way for more significant improvements in the future.
Historical task and flight path update bug fixed
If a task was set and marked in the past but the marks were only recently released to students, the data update was tagged against the day the task was marked instead of the day the marks were released. This only affected the Quiz flight path, but it caused incorrect peaks in the flight path line, making it look more like a mountain range! This has been fixed.
Corrected misleading text in the Task creation process
The messaging suggested that copying a task to another year group was not possible (it is). The task creation wizard has also been updated to make copying tasks across year groups easier.
Understanding flight paths
Students have three flight paths that show their progress with the content in Smart Revise. The blue line shows progress with Quiz (multiple choice), pink is Terms (definitions) and green is Advance (written answers).
Selecting “expand” will show the student a more detailed daily breakdown too. Each dot represents a day they logged in and used Smart Revise.
Teachers can see the flight path for each student in their analytics reports, and the more detailed view by selecting “Load full data”. A top tip is to hover over a dot to see the date and the progress on that date.
Students should aim to be in the green “target cone” at all times, and the teacher can set the parameters for that in the class and individual student settings.
Quiz progress
Every time a student answers a Quiz question correctly the count for that question increases by one. The algorithms will reprioritise the question in the queue, but each question can be answered correctly up to three times to count towards flight path data. So, think of each question being worth 0-3 points. If a question is answered incorrectly, it is given a much higher priority, so will be more likely to be shown again in the near future but will also have its count reset to zero. This is the reason that flight paths can go down too.
If a course has 600 questions and the student has answered every question correctly three times, that would be 1800 points, or 100%. Imagine a student who has mastered 75 questions (that means correct three times in a row), answered 20 questions correctly twice, 10 questions correctly once, and 8 questions incorrectly. Their progress on Quiz can be calculated as:
(75 * 3) + (20 * 2) + (10 * 1) = 275 points. With 600 questions in the set the maximum is 1800 points, so 275 points is roughly 15%.
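For the technically minded, the calculation above can be sketched in Python (names are illustrative, not Smart Revise’s actual code):

```python
def quiz_progress(counts, total_questions, max_count=3):
    """Percentage progress on Quiz.

    counts: how many times each attempted question has been answered
    correctly so far. Each question contributes at most max_count points;
    unattempted questions simply contribute nothing.
    """
    points = sum(min(c, max_count) for c in counts)
    return 100 * points / (total_questions * max_count)

# The worked example from the text:
# 75 mastered, 20 correct twice, 10 correct once, 8 incorrect
counts = [3] * 75 + [2] * 20 + [1] * 10 + [0] * 8
quiz_progress(counts, 600)  # 275 of 1800 points, about 15.3%
```

Note that a question answered incorrectly has its count reset to zero, which is why a student’s flight path can go down as well as up.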
Remember that Quiz questions asked in a task also contribute to flight path progress when the marks are released to the students.
Terms progress
Each term can be self-assessed by the student as red, amber or green. Only green terms count for the flight path so if there are 200 terms and the student has 30 marked green then their progress on Terms would be 15%.
When term definitions are asked in a task, the student must score full marks for the term to be recorded as green and have a positive effect on the flight path. Anything less and the answer won’t count.
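The Terms calculation is the simplest of the three. A minimal sketch (illustrative names only):

```python
def terms_progress(statuses, total_terms):
    """Percentage of terms self-assessed as green.
    Amber and red terms do not count towards the flight path."""
    greens = sum(1 for s in statuses if s == "green")
    return 100 * greens / total_terms

# The example from the text: 30 green terms out of 200
statuses = ["green"] * 30 + ["amber"] * 12 + ["red"] * 5
terms_progress(statuses, 200)  # 15.0
```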
Advance progress
Advance is much more complex. Questions are tagged as easy, medium or hard, and students must have answered a range of questions across the three difficulties to achieve 100%. They do not need to answer every question to achieve 100%, but they cannot, for example, answer only the easy questions without their progress being capped.
We aim for students to be “exam ready”. In classic revision that would mean having attempted and received high marks across a range of past papers. Smart Revise captures this by ensuring students must have achieved a given number of marks in easy, medium and hard questions to achieve 100% progress. For example, that could be 100 marks of easy questions, 100 marks of medium questions and 100 marks of hard questions. Every mark is worth a point. The harder questions are worth more marks, and therefore there are more flight path points to be gained.
The target cone
The green area on the flight path is known as “the target cone”. The cone starts at a date decided by the teacher and extends to the end of the course. Students should be aiming to be within this green target cone at all times. By default, the line at the bottom of the target cone, known as the minimum expectation line, is set to 60%, and the line at the top, known as the aspiration line, is set to 80% at the end of the course. That means to be within the target cone at the end of the course students must have completed 60-80% of the content within Smart Revise.
The flight path start date, minimum expectation and aspiration target can be set by the teacher at a whole class level or it can be different for each student. Read our helpdesk article to find out more:
How much engagement do students need to be within the target cone?
This is a difficult question to answer because it depends on the minimum expectation and aspiration targets. The more students use Smart Revise, the greater their progress will be. Generally speaking, 35-50 Quiz questions per week for a GCSE student and 40-70 questions for an A level student should enable them to achieve 100% by the end of the course, provided they begin within the first two months of the course. Essentially, if a student is below the target cone they should do more, and if they are within the top half of the target cone they are doing well. So the target is really, “be within your green target cone”. If this isn’t challenging enough, increase the minimum expectation in the class configuration settings.
All the questions students answer in Tasks also count towards progress on their flight path once the marks are released to students. This may cause their position to go up or down!
Progress and grades
There is a clear positive correlation between the number of marks a student achieves in Smart Revise and the number of marks they are likely to achieve in a real exam. It stands to reason: the more revision and question practice they do, the better a student will perform in an exam. However, progress in Smart Revise and grade boundaries are not related. It is not possible to say that 60% progress is the same as 60% in an exam, so we advise caution when interpreting the data in that way. See flight paths more as an indicator of how much students are using Smart Revise and how well they are answering questions, i.e. the amount of content they have seen and their confidence.
Of course, it is possible to set flight path targets to match target grades based on exam grade boundaries, and this is a good starting point, but always raise expectations with the flight path. Set the minimum and aspiration targets in Smart Revise to be higher than target grades.